mirror of
https://github.com/System-End/cdn.git
synced 2026-04-19 16:18:17 +00:00
Codebase Rewrite – Slack Bot, Backblaze B2 Migration, API v3
This update is a full rewrite of the codebase with major improvements 💪:
- **Slack Bot Integration** – A Slack bot is now built in!
- **Backblaze B2 Migration** – Switched from Vercel to B2, cutting storage / egress costs by around 90%.
- **API v3** – The new version includes file hashes, sizes, and additional metadata.
- **API Token Requirement** – ⚠️ All older API versions (v1, v2) now require authentication tokens. ⚠️

-- Deployor 💜
This commit is contained in:
parent
8d073b004b
commit
09441c495b
22 changed files with 919 additions and 437 deletions
8
.gitignore
vendored

@@ -1,3 +1,5 @@
.env
.vercel
.vscode
/node_modules/
/splitfornpm/
/.idea/
/.env
/package-lock.json
285
README.md

@@ -1,44 +1,241 @@

<h1 align="center">CDN</h1>
<p align="center"><i>Deep under the waves and storms there lies a <a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">vault</a>...</i></p>
<p align="center"><img alt="Raft icon" src="http://cloud-pxma0a3yi.vercel.app/underwater.png"></p>
<p align="center">Illustration above by <a href="https://gh.maxwofford.com">@maxwofford</a>.</p>

---

CDN powers the [#cdn](https://app.slack.com/client/T0266FRGM/C016DEDUL87) channel in the [Hack Club Slack](https://hackclub.com/slack).

## Version 2 <img alt="Version 2" src="https://cloud-b46nncb23.vercel.app/0v2.png" align="right" width="300">

Post this JSON...

```js
[
  "website.com/somefile.png",
  "website.com/somefile.gif",
]
```

And it'll return the following:

```js
{
  "0somefile.png": "cdnlink.vercel.app/0somefile.png",
  "1somefile.gif": "cdnlink.vercel.app/1somefile.gif"
}
```

## Version 1 <img alt="Version 1" src="https://cloud-6gklvd3ci.vercel.app/0v1.png" align="right" width="300">

Post this JSON...

```js
[
  "website.com/somefile.png",
  "website.com/somefile.gif",
]
```

And it'll return the following:

```js
[
  "cdnlink.vercel.app/0somefile.png",
  "cdnlink.vercel.app/1somefile.gif"
]
```

<div align="center">
  <img src="https://assets.hackclub.com/flag-standalone.svg" width="100">
  <h1>CDN</h1>
  <p>A CDN solution for Hack Club!</p>
</div>

<p align="center"><i>Deep under the waves and storms there lies a <a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">vault</a>...</i></p>

<div align="center">
  <img src="https://files.catbox.moe/6fpj0x.png" width="100%">
  <p align="center">Banner illustration by <a href="https://gh.maxwofford.com">@maxwofford</a>.</p>

  <a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">
    <img alt="Slack Channel" src="https://img.shields.io/badge/slack-%23cdn-blue.svg?style=flat&logo=slack">
  </a>
</div>

## 🚀 Features

- **Multi-version API Support** (v1, v2, v3)
- **Slack Bot Integration**
  - Upload up to 10 files per message
  - Automatic file sanitization
  - File organization
- **Secure API Endpoints**
- **Cost-Effective Storage** (87-98% cost reduction vs. Vercel CDN)
- **File Deduplication** (hash-based filenames prevent duplicate uploads)
- **Organized Storage Structure**

## 🔧 Setup

### 1. Slack App Configuration

1. Create a new Slack App at [api.slack.com](https://api.slack.com/apps)
2. Enable Socket Mode in the app settings
3. Add the following Bot Token Scopes:
   - `channels:history`
   - `channels:read`
   - `chat:write`
   - `files:read`
   - `files:write`
   - `groups:history`
   - `reactions:write`
4. Enable Event Subscriptions and subscribe to the `file_shared` event
5. Install the app to your workspace

### 2. CDN Configuration (Cloudflare + Backblaze)

1. Create a Backblaze B2 bucket
2. Set up Cloudflare DNS:
   - Add a CNAME record pointing to your B2 endpoint (e.g., `f003.backblazeb2.com`); you can find it by uploading a file and checking its file info
   - Enable the Cloudflare proxy
3. Configure SSL/TLS:
   - Set the SSL mode to "Full (strict)"
   - ⚠️ **WARNING**: This setting may break other configurations on your domain! Consider using a separate domain.
4. Create a Transform Rule:
   - Filter: `hostname equals "your-cdn.example.com"`
   - Rewrite to: `concat("/file/(bucket name)", http.request.uri.path)` (replace `(bucket name)` with your actual bucket name)
   - Preserve the query string

### 3. Environment Setup

Create a `.env` file with:

```env
# Slack
SLACK_BOT_TOKEN=xoxb- # From OAuth & Permissions
SLACK_SIGNING_SECRET= # From Basic Information
SLACK_APP_TOKEN=xapp- # From Basic Information (for Socket Mode)
SLACK_CHANNEL_ID=channel-id # Channel where the bot operates

# Backblaze (Public Bucket)
B2_APP_KEY_ID=key-id # From B2 Application Keys
B2_APP_KEY=app-key # From B2 Application Keys
B2_BUCKET_ID=bucket-id # From B2 Bucket Settings
B2_CDN_URL=https://cdn.example.com

# API
API_TOKEN=beans # Set a secure random string
PORT=3000
```

### 4. Installation & Running

```bash
npm install
node index.js
```

For production, feel free to run it under a process manager such as pm2.

## 📡 API Usage

⚠️ **IMPORTANT SECURITY NOTE**:
- All API endpoints require authentication via an `Authorization: Bearer api-token` header
- This includes all versions (v1, v2, v3) - no exceptions!
- Use the `API_TOKEN` from your environment configuration
- Requests without a valid token receive a 401 Unauthorized response
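The token check is the same for every version. A minimal sketch of how the `Authorization` header is parsed and compared, following the `validateToken` helper included in this rewrite (the header values here are illustrative):

```js
// Bearer-token check applied to all API versions.
// "Bearer <token>" -> "<token>"; undefined if the header is missing or malformed.
const validateToken = (authorizationHeader, expectedToken) => {
  const token = authorizationHeader?.split('Bearer ')[1];
  if (!token || token !== expectedToken) {
    return { status: 401, body: { error: 'Unauthorized - Invalid or missing API token' } };
  }
  return { status: 200 };
};
```

Mounted as Express middleware, this runs before every route, so a missing or wrong token short-circuits with a 401 before any upload work happens.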

### V3 API (Latest)
<img alt="Version 3" src="https://files.catbox.moe/e3ravk.png" align="right" width="300">

**Endpoint:** `POST https://e2.deployor.hackclub.app/api/v3/new`

**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```

**Request Example:**
```bash
curl --location 'https://e2.deployor.hackclub.app/api/v3/new' \
--header 'Authorization: Bearer beans' \
--header 'Content-Type: application/json' \
--data '[
  "https://assets.hackclub.com/flag-standalone.svg",
  "https://assets.hackclub.com/flag-orpheus-left.png",
  "https://assets.hackclub.com/icon-progress-marker.svg"
]'
```

**Response:**
```json
{
  "files": [
    {
      "deployedUrl": "https://cdn.deployor.dev/s/v3/3e48b91a4599a3841c028e9a683ef5ce58cea372_flag-standalone.svg",
      "file": "0_16361167e11b0d172a47e726b40d70e9873c792b_upload_1736985095691",
      "sha": "16361167e11b0d172a47e726b40d70e9873c792b",
      "size": 90173
    }
  ],
  "cdnBase": "https://cdn.deployor.dev"
}
```

<details>
<summary>V2 API</summary>

<img alt="Version 2" src="https://files.catbox.moe/uuk1vm.png" align="right" width="300">

**Endpoint:** `POST https://e2.deployor.hackclub.app/api/v2/new`

**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```

**Request Example:**
```json
[
  "https://assets.hackclub.com/flag-standalone.svg",
  "https://assets.hackclub.com/flag-orpheus-left.png",
  "https://assets.hackclub.com/icon-progress-marker.svg"
]
```

**Response:**
```json
{
  "flag-standalone.svg": "https://cdn.deployor.dev/s/v2/flag-standalone.svg",
  "flag-orpheus-left.png": "https://cdn.deployor.dev/s/v2/flag-orpheus-left.png",
  "icon-progress-marker.svg": "https://cdn.deployor.dev/s/v2/icon-progress-marker.svg"
}
```
</details>

<details>
<summary>V1 API</summary>

<img alt="Version 1" src="https://files.catbox.moe/tnzdfe.png" align="right" width="300">

**Endpoint:** `POST https://e2.deployor.hackclub.app/api/v1/new`

**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```

**Request Example:**
```json
[
  "https://assets.hackclub.com/flag-standalone.svg",
  "https://assets.hackclub.com/flag-orpheus-left.png",
  "https://assets.hackclub.com/icon-progress-marker.svg"
]
```

**Response:**
```json
[
  "https://cdn.deployor.dev/s/v1/0_flag-standalone.svg",
  "https://cdn.deployor.dev/s/v1/1_flag-orpheus-left.png",
  "https://cdn.deployor.dev/s/v1/2_icon-progress-marker.svg"
]
```
</details>
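The three response shapes above come from a single version-aware formatter; a minimal sketch following the `formatResponse` helper in this rewrite (the `results` objects here are illustrative, and the CDN base is a stand-in for the configured URL):

```js
// Version-specific response shaping: v1 is a bare array of URLs, v2 is a
// {"<index><filename>": url} map, v3 (the default) returns full metadata.
const formatResponse = (results, version) => {
  switch (version) {
    case 1:
      return results.map(r => r.url);
    case 2:
      return results.reduce((acc, r, i) => {
        acc[`${i}${r.url.split('/').pop()}`] = r.url;
        return acc;
      }, {});
    default:
      return {
        files: results.map((r, i) => ({
          deployedUrl: r.url,
          file: `${i}_${r.url.split('/').pop()}`,
          sha: r.sha,
          size: r.size,
        })),
        cdnBase: 'https://cdn.example.com', // stand-in for the configured CDN base
      };
  }
};
```

This is why all three versions can share one upload pipeline: only the final serialization differs.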

## 🤖 Slack Bot Features

- **Multi-file Upload:** Upload up to 10 files in a single message (no more than 3 messages are processed at a time)
- **File Organization:** Files are stored as `/s/{slackUserId}/{timestamp}_{sanitizedFilename}`
- **Error Handling:** Failed uploads are reported back in Slack with an error reaction
- **File Sanitization:** Automatic filename cleaning
- **Size Limits:** Enforces a 2GB per-file limit
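The filename cleaning mentioned above is a single regex pass, following the `sanitizeFileName` helper included in this rewrite:

```js
// Filename sanitization applied before storing uploads:
// keep letters, digits, dots, and dashes; replace everything else with "_".
function sanitizeFileName(fileName) {
  let sanitized = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
  if (!sanitized) {
    // Nothing survived sanitization: fall back to a timestamped placeholder.
    sanitized = 'upload_' + Date.now();
  }
  return sanitized;
}
```

For example, `hello world!.png` becomes `hello_world_.png`, so spaces and punctuation can never leak into storage paths or URLs.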

## Legacy API Notes

- The v1 and v2 APIs are maintained for backwards compatibility
- All versions now require authentication via a Bearer token
- We recommend the v3 API for new implementations

## Technical Details

- **Storage Structure:** `/s/v3/{sha}_{sanitizedFilename}`
- **File Naming:** `/s/{slackUserId}/{unix}_{sanitizedFilename}`
- **Cost Efficiency:** Uses B2 storage for significant cost savings
- **Security:** Token-based authentication for API access

## 💻 Slack Bot Behavior

- Reacts to file uploads with status emojis:
  - ⏳ Processing
  - ✅ Success
  - ❌ Error
- Supports up to 10 files per message
- Processes at most 3 messages concurrently
- Maximum file size: 2GB per file
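Before any of this happens, the bot filters incoming `file_shared` events: events from before the bot started and events from other channels are ignored. A minimal sketch of that filter, following the event handler in this rewrite:

```js
// Decide whether a file_shared event should be processed:
// drop stale events (from before the bot started) and events
// from any channel other than the configured one.
const shouldProcess = (event, botStartTime, targetChannelId) => {
  if (parseFloat(event.event_ts) < botStartTime) return false; // stale event
  return event.channel_id === targetChannelId; // only the configured channel
};
```

The timestamp check matters because Slack can redeliver older events on reconnect; without it, a restart could re-upload everything already processed.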

## 💰 Cost Optimization

- Uses the Cloudflare CDN in front of Backblaze B2 storage
- Free egress thanks to the Bandwidth Alliance between Cloudflare and Backblaze
- 87-98% cost reduction compared to Vercel CDN

<div align="center">
  <br>
  <p>Made with 💜 for Hack Club</p>
  <p>All illustrations by <a href="https://gh.maxwofford.com">@maxwofford</a></p>
</div>
128
api/v1/new.ts

@@ -1,128 +0,0 @@
```ts
import { urlParse } from "https://deno.land/x/url_parse/mod.ts";

const endpoint = (path: string) => {
  // https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
  let url = "https://api.vercel.com/" + path;
  if (Deno.env.get("ZEIT_TEAM")) {
    url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
  }
  return url;
};

const deploy = async (
  files: { sha: string; file: string; path: string; size: number }[],
) => {
  const req = await fetch(endpoint("v12/now/deployments"), {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "cloud",
      files: files.map((f) => ({
        sha: f.sha,
        file: f.file,
        size: f.size,
      })),
      projectSettings: {
        framework: null,
      },
    }),
  });
  const json = await req.text();
  console.log(json);
  const baseURL = JSON.parse(json).url;
  const fileURLs = files.map((f) => "https://" + baseURL + "/" + f.path);

  return { status: req.status, fileURLs };
};

export default async (req: Request) => {
  if (req.method == "OPTIONS") {
    return new Response(
      JSON.stringify(
        { status: "YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED." },
      ),
      {
        status: 204
      },
    );
  }
  if (req.method == "GET") {
    return new Response(
      JSON.stringify(
        { error: "*GET outta here!* (Method not allowed, use POST)" },
      ),
      {
        status: 405
      },
    );
  }
  if (req.method == "PUT") {
    return new Response(
      JSON.stringify(
        { error: "*PUT that request away!* (Method not allowed, use POST)" },
      ),
      {
        status: 405,
      },
    );
  }
  if (req.method != "POST") {
    return new Response(
      JSON.stringify({ error: "Method not allowed, use POST" }),
      {
        status: 405,
      },
    );
  }

  const decoder = new TextDecoder();
  console.log(req);
  const buf = await req.arrayBuffer();
  console.log(decoder.decode(buf));
  console.log(buf);
  const fileURLs = JSON.parse(decoder.decode(buf));
  console.log(fileURLs);
  console.log(fileURLs.length);
  console.log(typeof fileURLs);
  if (!Array.isArray(fileURLs) || fileURLs.length < 1) {
    return new Response(
      JSON.stringify({ error: "Empty file array" }),
      { status: 422 }
    );
  }

  const authorization = req.headers.get("Authorization");

  const uploadedURLs = await Promise.all(fileURLs.map(async (url, index) => {
    const { pathname } = urlParse(url);
    const filename = index + pathname.substr(pathname.lastIndexOf("/") + 1);

    const headers = {
      "Content-Type": "application/json",
      "Authorization": ""
    };
    if (authorization) {
      headers['Authorization'] = authorization;
    }
    const res = await (await fetch("https://cdn.hackclub.com/api/newSingle", {
      method: "POST",
      headers,
      body: url,
    })).json();

    res.file = "public/" + filename;
    res.path = filename;

    return res;
  }));

  const result = await deploy(uploadedURLs);

  return new Response(
    JSON.stringify(result.fileURLs),
    { status: result.status }
  );
};
```

@@ -1,64 +0,0 @@
```ts
import { Hash } from "https://deno.land/x/checksum@1.4.0/mod.ts";

const endpoint = (path: string) => {
  // https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
  let url = "https://api.vercel.com/" + path;
  if (Deno.env.get("ZEIT_TEAM")) {
    url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
  }
  return url;
};

const uploadFile = async (url: string, authorization: string|null) => {
  const options = {
    method: 'GET', headers: { 'Authorization': "" }
  };
  if (authorization) {
    options.headers = { 'Authorization': authorization };
  }
  const req = await fetch(url, options);
  const data = new Uint8Array(await req.arrayBuffer());
  const sha = new Hash("sha1").digest(data).hex();
  const size = data.byteLength;

  await fetch(endpoint("v2/now/files"), {
    method: "POST",
    headers: {
      "Content-Length": size.toString(),
      "x-now-digest": sha,
      "Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
    },
    body: data.buffer,
  });

  return {
    sha,
    size,
  };
};

export default async (req: Request) => {
  if (req.method != "POST") {
    return new Response(
      JSON.stringify({ error: "Method not allowed, use POST" }),
      {
        status: 405,
      },
    );
  }

  const decoder = new TextDecoder();
  const buf = await req.arrayBuffer();
  const singleFileURL = decoder.decode(buf);
  if (typeof singleFileURL != "string") {
    return new Response(
      JSON.stringify({ error: "newSingle only accepts a single URL" }),
      {
        status: 422
      },
    );
  }
  const uploadedFileURL = await uploadFile(singleFileURL, req.headers.get("Authorization"));

  return new Response(JSON.stringify(uploadedFileURL))
};
```

@@ -1,40 +0,0 @@
```ts
import { endpoint } from "./utils.ts";

// Other functions can import this function to call this serverless endpoint
export const deployEndpoint = async (
  files: { sha: string; file: string; size: number }[],
) => {
  return await deploy(files);
};

const deploy = async (
  files: { sha: string; file: string; size: number }[],
) => {
  const req = await fetch(endpoint("v12/now/deployments"), {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "cloud",
      files: files.map((f) => ({
        sha: f.sha,
        file: f.file,
        size: f.size,
      })),
      projectSettings: {
        framework: null,
      },
    }),
  });
  const json = await req.json();
  const deployedFiles = files.map((file) => ({
    deployedUrl: `https://${json.url}/public/${file.file}`,
    ...file,
  }));

  return { status: req.status, files: deployedFiles };
};

export default deploy;
```

@@ -1,37 +0,0 @@
```ts
import { urlParse } from "https://deno.land/x/url_parse/mod.ts";
import { uploadEndpoint } from "./upload.ts";
import { deployEndpoint } from "./deploy.ts";
import { ensurePost, parseBody } from "./utils.ts";

export default async (req: Request) => {
  if (!ensurePost(req)) return null;

  const body = new TextDecoder().decode(await req.arrayBuffer());
  const fileURLs = JSON.parse(body);

  if (!Array.isArray(fileURLs) || fileURLs.length < 1) {
    return new Response(
      JSON.stringify({ error: "Empty/invalid file array" }),
      { status: 422 }
    );
  }

  const authorization = req.headers.get('Authorization');

  const uploadArray = await Promise.all(fileURLs.map(f => uploadEndpoint(f, authorization)));

  const deploymentFiles = uploadArray.map(
    (file: { url: string; sha: string; size: number }, index: number) => {
      const { pathname } = urlParse(file.url);
      const filename = index + pathname.substr(pathname.lastIndexOf("/") + 1);
      return { sha: file.sha, file: filename, size: file.size };
    },
  );

  const deploymentData = await deployEndpoint(deploymentFiles);

  return new Response(
    JSON.stringify(deploymentData.files),
    { status: deploymentData.status }
  );
};
```

@@ -1,53 +0,0 @@
```ts
import { Hash } from "https://deno.land/x/checksum@1.4.0/hash.ts";
import { endpoint, ensurePost, parseBody } from "./utils.ts";

// Other functions can import this function to call this serverless endpoint
export const uploadEndpoint = async (url: string, authorization: string | null) => {
  const options = { method: 'POST', body: url, headers: {} };
  if (authorization) {
    options.headers = { 'Authorization': authorization };
  }
  console.log({ options });
  const response = await fetch("https://cdn.hackclub.com/api/v2/upload", options);
  const result = await response.json();
  console.log({ result });

  return result;
};

const upload = async (url: string, authorization: string | null) => {
  const options = { headers: {} };
  if (authorization) {
    options.headers = { 'Authorization': authorization };
  }
  const req = await fetch(url, options);
  const reqArrayBuffer = await req.arrayBuffer();
  const data = new Uint8Array(reqArrayBuffer);
  const sha = new Hash("sha1").digest(data).hex();
  const size = data.byteLength;

  await fetch(endpoint("v2/now/files"), {
    method: "POST",
    headers: {
      "Content-Length": size.toString(),
      "x-now-digest": sha,
      "Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
    },
    body: data.buffer,
  });

  return {
    url,
    sha,
    size,
  };
};

export default async (req: Request) => {
  if (!ensurePost(req)) return null;

  const body = new TextDecoder().decode(await req.arrayBuffer());
  const uploadedFileUrl = await upload(body, req.headers.get("Authorization"));

  return new Response(JSON.stringify(uploadedFileUrl));
};
```

@@ -1,57 +0,0 @@
```ts
export const endpoint = (path: string) => {
  // https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
  let url = "https://api.vercel.com/" + path;
  if (Deno.env.get("ZEIT_TEAM")) {
    url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
  }
  return url;
};

export const parseBody = async (body: Request["body"]) => {
  const decoder = new TextDecoder();
  const buf = await Deno.readAll(body);
  const result = decoder.decode(buf);
  return result;
};

export const ensurePost = (req: Request) => {
  if (req.method == "OPTIONS") {
    return new Response(
      JSON.stringify(
        { status: "YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED." },
      ),
      {
        status: 204
      },
    );
  }
  if (req.method == "GET") {
    return new Response(
      JSON.stringify(
        { error: "*GET outta here!* (Method not allowed, use POST)" },
      ),
      {
        status: 405
      },
    );
  }
  if (req.method == "PUT") {
    return new Response(
      JSON.stringify(
        { error: "*PUT that request away!* (Method not allowed, use POST)" },
      ),
      {
        status: 405,
      },
    );
  }
  if (req.method != "POST") {
    return new Response(
      JSON.stringify({ error: "Method not allowed, use POST" }),
      {
        status: 405,
      },
    );
  }
  return true;
};
```
88
index.js
Normal file

@@ -0,0 +1,88 @@
```js
const dotenv = require('dotenv');
dotenv.config();

const logger = require('./src/config/logger');
const {App} = require('@slack/bolt');
const fileUpload = require('./src/fileUpload');
const express = require('express');
const cors = require('cors');
const apiRoutes = require('./src/api/index.js');

const BOT_START_TIME = Date.now() / 1000;

const app = new App({
  token: process.env.SLACK_BOT_TOKEN,
  signingSecret: process.env.SLACK_SIGNING_SECRET,
  socketMode: true,
  appToken: process.env.SLACK_APP_TOKEN
});

// API server
const expressApp = express();
expressApp.use(cors());
expressApp.use(express.json());
expressApp.use(express.urlencoded({ extended: true }));

// Log ALL incoming requests for debugging
expressApp.use((req, res, next) => {
  logger.info(`Incoming request: ${req.method} ${req.path}`);
  next();
});

// Log statement before mounting the API routes
logger.info('Mounting API routes');

// Mount API for all versions
expressApp.use('/api', apiRoutes);

// Error handling middleware
expressApp.use((err, req, res, next) => {
  logger.error('API Error:', err);
  res.status(500).json({ error: 'Internal server error' });
});

// Fallback route for unhandled paths
expressApp.use((req, res, next) => {
  logger.warn(`Unhandled route: ${req.method} ${req.path}`);
  res.status(404).json({ error: 'Not found' });
});

// Event listener for file_shared events
app.event('file_shared', async ({event, client}) => {
  logger.debug(`Received file_shared event: ${JSON.stringify(event)}`);

  if (parseFloat(event.event_ts) < BOT_START_TIME) {
    logger.info(`Ignoring file event from before bot start: ${new Date(parseFloat(event.event_ts) * 1000).toISOString()}`);
    return;
  }

  const targetChannelId = process.env.SLACK_CHANNEL_ID;
  const channelId = event.channel_id;

  if (channelId !== targetChannelId) {
    logger.info(`Ignoring file shared in channel: ${channelId}`);
    return;
  }

  try {
    await fileUpload.handleFileUpload(event, client);
  } catch (error) {
    logger.error(`Error processing file upload: ${error.message}`);
  }
});

// Slack bot and API server
(async () => {
  try {
    await fileUpload.initialize();
    await app.start();
    const port = parseInt(process.env.API_PORT || '4553', 10);
    expressApp.listen(port, () => {
      logger.info(`⚡️ Slack app is running in Socket Mode!`);
      logger.info(`🚀 API server is running on port ${port}`);
    });
  } catch (error) {
    logger.error('Failed to start:', error);
    process.exit(1);
  }
})();
```
9
logger.js
Normal file

@@ -0,0 +1,9 @@
```js
const winston = require('winston');

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.combine(winston.format.colorize(), winston.format.simple()),
  transports: [new winston.transports.Console()],
});

module.exports = logger;
```
22
package.json
Normal file

@@ -0,0 +1,22 @@
```json
{
  "name": "cdn-v2-hackclub",
  "version": "1.0.0",
  "description": "Slack app to upload files to Backblaze B2 with unique URLs",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "@slack/bolt": "^4.2.0",
    "@slack/web-api": "^7.8.0",
    "backblaze-b2": "^1.3.0",
    "cors": "^2.8.5",
    "dotenv": "^10.0.0",
    "express": "^4.21.2",
    "multer": "^1.4.5-lts.1",
    "node-fetch": "^2.6.1",
    "p-limit": "^6.2.0",
    "winston": "^3.17.0"
  },
  "author": "deployor",
  "license": "MIT"
}
```

Note: `express` is required directly by `index.js`, so it is declared here explicitly rather than relied on as a transitive dependency of `@slack/bolt`.
17
src/api.js
Normal file

@@ -0,0 +1,17 @@
```js
const express = require('express');
const multer = require('multer');
const router = express.Router();
const upload = multer({dest: 'uploads/'});

router.post('/upload', upload.single('file'), (req, res) => {
  if (!req.file) {
    return res.status(400).send('No file uploaded.');
  }

  // Handle the uploaded file
  console.log('Uploaded file:', req.file);

  res.send('File uploaded successfully.');
});

module.exports = router;
```
26
src/api/deploy.js
Normal file

@@ -0,0 +1,26 @@
```js
const logger = require('../config/logger');
const {generateApiUrl, getCdnUrl} = require('./utils');

const deployEndpoint = async (files) => {
  try {
    const deployedFiles = files.map(file => ({
      deployedUrl: generateApiUrl('v3', file.file),
      cdnUrl: getCdnUrl(),
      ...file
    }));

    return {
      status: 200,
      files: deployedFiles,
      cdnBase: getCdnUrl()
    };
  } catch (error) {
    logger.error('Deploy error:', error);
    return {
      status: 500,
      files: []
    };
  }
};

module.exports = {deployEndpoint};
```
85
src/api/index.js
Normal file

@@ -0,0 +1,85 @@
```js
const express = require('express');
const {validateToken, validateRequest, getCdnUrl} = require('./utils');
const {uploadEndpoint, handleUpload} = require('./upload');
const logger = require('../config/logger');

const router = express.Router();

// Require valid API token for all routes
router.use((req, res, next) => {
  const tokenCheck = validateToken(req);
  if (tokenCheck.status !== 200) {
    return res.status(tokenCheck.status).json(tokenCheck.body);
  }
  next();
});

// Health check route
router.get('/health', (req, res) => {
  res.status(200).json({ status: 'ok' });
});

// Format response based on API version compatibility
const formatResponse = (results, version) => {
  switch (version) {
    case 1:
      return results.map(r => r.url);
    case 2:
      return results.reduce((acc, r, i) => {
        const fileName = r.url.split('/').pop();
        acc[`${i}${fileName}`] = r.url;
        return acc;
      }, {});
    default:
      return {
        files: results.map((r, i) => ({
          deployedUrl: r.url,
          file: `${i}_${r.url.split('/').pop()}`,
          sha: r.sha,
          size: r.size
        })),
        cdnBase: getCdnUrl()
      };
  }
};

// Handle bulk file uploads with version-specific responses
const handleBulkUpload = async (req, res, version) => {
  try {
    const urls = req.body;
    // Basic validation
    if (!Array.isArray(urls) || !urls.length) {
      return res.status(422).json({error: 'Empty/invalid file array'});
    }

    // Process all URLs concurrently
    logger.debug(`Processing ${urls.length} URLs`);
    const results = await Promise.all(
      urls.map(url => uploadEndpoint(url, req.headers?.authorization))
    );

    res.json(formatResponse(results, version));
  } catch (error) {
    logger.error('Bulk upload failed:', error);
    res.status(500).json({error: 'Internal server error'});
  }
};

// API Routes
router.post('/v1/new', (req, res) => handleBulkUpload(req, res, 1)); // Legacy support
router.post('/v2/new', (req, res) => handleBulkUpload(req, res, 2)); // Legacy support
router.post('/v3/new', (req, res) => handleBulkUpload(req, res, 3)); // Current version
router.post('/new', (req, res) => handleBulkUpload(req, res, 3)); // Alias for v3 (latest)

// Single file upload endpoint
router.post('/upload', async (req, res) => {
  try {
    const result = await handleUpload(req);
    res.status(result.status).json(result.body);
  } catch (error) {
    logger.error('Upload handler error:', error);
    res.status(500).json({error: 'Internal server error'});
  }
});

module.exports = router;
```
56
src/api/upload.js
Normal file

@@ -0,0 +1,56 @@
```js
const fetch = require('node-fetch');
const crypto = require('crypto');
const {uploadToBackblaze} = require('../backblaze');
const {generateUrl, getCdnUrl} = require('./utils');
const logger = require('../config/logger');

// Sanitize file name for storage
function sanitizeFileName(fileName) {
  let sanitizedFileName = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
  if (!sanitizedFileName) {
    sanitizedFileName = 'upload_' + Date.now();
  }
  return sanitizedFileName;
}

// Handle remote file upload to B2 storage
const uploadEndpoint = async (url, authorization = null) => {
  try {
    logger.debug(`Downloading: ${url}`);
    const response = await fetch(url, {
      headers: authorization ? {'Authorization': authorization} : {}
    });

    if (!response.ok) throw new Error(`Download failed: ${response.statusText}`);

    // Generate unique filename using SHA1 (hash) of file contents
    const buffer = await response.buffer();
    const sha = crypto.createHash('sha1').update(buffer).digest('hex');
    const originalName = url.split('/').pop();
    const sanitizedFileName = sanitizeFileName(originalName);
    const fileName = `${sha}_${sanitizedFileName}`;

    // Upload to B2 storage
    logger.debug(`Uploading: ${fileName}`);
    const uploaded = await uploadToBackblaze('s/v3', fileName, buffer);
    if (!uploaded) throw new Error('Storage upload failed');

    return {
      url: generateUrl('s/v3', fileName),
      sha,
      size: buffer.length
    };
  } catch (error) {
    logger.error('Upload failed:', error);
    throw error;
  }
};

// Express request handler for file uploads
const handleUpload = async (req) => {
  const url = req.body || await req.text();
  const result = await uploadEndpoint(url, req.headers?.authorization);
  return {status: 200, body: result};
};

module.exports = {uploadEndpoint, handleUpload};
```
src/api/utils.js (new file, 45 lines)

```js
const logger = require('../config/logger');

const getCdnUrl = () => process.env.CDN_URL;

const generateUrl = (version, fileName) => {
  return `${getCdnUrl()}/${version}/${fileName}`;
};

const validateToken = (req) => {
  const token = req.headers.authorization?.split('Bearer ')[1];
  if (!token || token !== process.env.API_TOKEN) {
    return {
      status: 401,
      body: {error: 'Unauthorized - Invalid or missing API token'}
    };
  }
  return {status: 200};
};

const validateRequest = (req) => {
  // First check the token
  const tokenCheck = validateToken(req);
  if (tokenCheck.status !== 200) {
    return tokenCheck;
  }

  // Then check the method (response bodies kept from the old API, in case someone relies on the status text rather than the code)
  if (req.method === 'OPTIONS') {
    return {status: 204, body: {status: 'YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED.'}};
  }
  if (req.method !== 'POST') {
    return {
      status: 405,
      body: {error: 'Method not allowed, use POST'}
    };
  }
  return {status: 200};
};

module.exports = {
  validateRequest,
  validateToken,
  generateUrl,
  getCdnUrl
};
```
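A quick sanity check of the Bearer-token gate above; the token value and request objects are made up for illustration:

```javascript
process.env.API_TOKEN = 'example-token'; // hypothetical value for the demo

// Standalone copy of the validateToken check
const validateToken = (req) => {
  const token = req.headers.authorization?.split('Bearer ')[1];
  if (!token || token !== process.env.API_TOKEN) {
    return {status: 401, body: {error: 'Unauthorized - Invalid or missing API token'}};
  }
  return {status: 200};
};

console.log(validateToken({headers: {}}).status);                                      // 401: no header
console.log(validateToken({headers: {authorization: 'Bearer wrong'}}).status);         // 401: bad token
console.log(validateToken({headers: {authorization: 'Bearer example-token'}}).status); // 200
```

Note that the optional chaining makes a missing `Authorization` header fall through to the 401 branch rather than throwing.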
src/backblaze.js (new file, 32 lines)

```js
const B2 = require('backblaze-b2');
const logger = require('./config/logger');

const b2 = new B2({
  applicationKeyId: process.env.B2_APP_KEY_ID,
  applicationKey: process.env.B2_APP_KEY
});

async function uploadToBackblaze(userDir, uniqueFileName, buffer) {
  try {
    await b2.authorize();
    const {data} = await b2.getUploadUrl({
      bucketId: process.env.B2_BUCKET_ID
    });

    await b2.uploadFile({
      uploadUrl: data.uploadUrl,
      uploadAuthToken: data.authorizationToken,
      fileName: `${userDir}/${uniqueFileName}`,
      data: buffer
    });

    return true;
  } catch (error) {
    logger.error('B2 upload failed:', error.message);
    return false;
  }
}

module.exports = {uploadToBackblaze};

// So easy i love it!
```
src/config/logger.js (new file, 23 lines)

```js
const winston = require('winston');

const consoleFormat = winston.format.combine(
  winston.format.colorize(),
  winston.format.timestamp(),
  winston.format.printf(({level, message, timestamp}) => {
    return `${timestamp} ${level}: ${message}`;
  })
);

const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: consoleFormat,
  transports: [
    new winston.transports.Console()
  ]
});

logger.on('error', error => {
  console.error('Logger error:', error);
});

module.exports = logger;
```
src/fileUpload.js (new file, 230 lines)

```js
const fetch = require('node-fetch');
const path = require('path');
const crypto = require('crypto');
const logger = require('./config/logger');
const {uploadToBackblaze} = require('./backblaze');
const {generateFileUrl} = require('./utils');

const MAX_FILE_SIZE = 2 * 1024 * 1024 * 1024; // 2GB in bytes
const CONCURRENT_UPLOADS = 3; // Max concurrent uploads (messages)

// Messages that have already been processed
const processedMessages = new Map();

let uploadLimit;

async function initialize() {
  const pLimit = (await import('p-limit')).default;
  uploadLimit = pLimit(CONCURRENT_UPLOADS);
}

// Skip messages older than 24 hours (e.g. posted while the bot was offline)
function isMessageTooOld(eventTs) {
  const eventTime = parseFloat(eventTs) * 1000;
  const currentTime = Date.now();
  const timeDifference = currentTime - eventTime;
  const maxAge = 24 * 60 * 60 * 1000; // 24 hours in milliseconds
  return timeDifference > maxAge;
}

// Check whether the message has already been processed
function isMessageProcessed(messageTs) {
  return processedMessages.has(messageTs);
}

function markMessageAsProcessing(messageTs) {
  processedMessages.set(messageTs, true);
}

// Add a "processing" reaction
async function addProcessingReaction(client, event, fileMessage) {
  try {
    await client.reactions.add({
      name: 'beachball',
      timestamp: fileMessage.ts,
      channel: event.channel_id
    });
  } catch (error) {
    logger.error('Failed to add processing reaction:', error.message);
  }
}

// Sanitize file names and ensure the result is not empty (probably impossible, but let's be safe)
function sanitizeFileName(fileName) {
  let sanitizedFileName = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
  if (!sanitizedFileName) {
    sanitizedFileName = 'upload_' + Date.now();
  }
  return sanitizedFileName;
}

// Generate a unique, non-guessable file name
function generateUniqueFileName(fileName) {
  const sanitizedFileName = sanitizeFileName(fileName);
  const uniqueFileName = `${Date.now()}-${crypto.randomBytes(16).toString('hex')}-${sanitizedFileName}`;
  return uniqueFileName;
}

// Upload files to the /s/ directory
async function processFiles(fileMessage, client) {
  const uploadedFiles = [];
  const failedFiles = [];

  const files = fileMessage.files || [];
  for (const file of files) {
    if (file.size > MAX_FILE_SIZE) {
      failedFiles.push(file.name);
      continue;
    }

    try {
      const buffer = await fetch(file.url_private, {
        headers: {Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}`}
      }).then(res => res.buffer());

      const uniqueFileName = generateUniqueFileName(file.name);
      const userDir = `s/${fileMessage.user}`;

      const success = await uploadLimit(() => uploadToBackblaze(userDir, uniqueFileName, buffer));
      if (success) {
        const url = generateFileUrl(userDir, uniqueFileName);
        uploadedFiles.push({name: uniqueFileName, url});
      } else {
        failedFiles.push(file.name);
      }
    } catch (error) {
      logger.error(`Failed to process file ${file.name}:`, error.message);
      failedFiles.push(file.name);
    }
  }

  return {uploadedFiles, failedFiles};
}

// Update reactions based on success
async function updateReactions(client, event, fileMessage, success) {
  try {
    await client.reactions.remove({
      name: 'beachball',
      timestamp: fileMessage.ts,
      channel: event.channel_id
    });
    await client.reactions.add({
      name: success ? 'white_check_mark' : 'x',
      timestamp: fileMessage.ts,
      channel: event.channel_id
    });
  } catch (error) {
    logger.error('Failed to update reactions:', error.message);
  }
}

// Find the message a file was shared in
async function findFileMessage(event, client) {
  try {
    const fileInfo = await client.files.info({
      file: event.file_id,
      include_shares: true
    });

    if (!fileInfo.ok || !fileInfo.file) {
      throw new Error('Could not get file info');
    }

    const channelShare = fileInfo.file.shares?.public?.[event.channel_id] ||
      fileInfo.file.shares?.private?.[event.channel_id];

    if (!channelShare || !channelShare.length) {
      throw new Error('No share info found for this channel');
    }

    // Get the exact message using the ts from the share info
    const messageTs = channelShare[0].ts;

    const messageInfo = await client.conversations.history({
      channel: event.channel_id,
      latest: messageTs,
      limit: 1,
      inclusive: true
    });

    if (!messageInfo.ok || !messageInfo.messages.length) {
      throw new Error('Could not find original message');
    }

    return messageInfo.messages[0];
  } catch (error) {
    logger.error('Error finding file message:', error);
    return null;
  }
}

async function sendResultsMessage(client, channelId, fileMessage, uploadedFiles, failedFiles) {
  let message = `Hey <@${fileMessage.user}>, `;
  if (uploadedFiles.length > 0) {
    message += `here ${uploadedFiles.length === 1 ? 'is your link' : 'are your links'}:\n`;
    message += uploadedFiles.map(f => `• ${f.name}: ${f.url}`).join('\n');
  }
  if (failedFiles.length > 0) {
    message += `\n\nFailed to process: ${failedFiles.join(', ')}`;
  }

  await client.chat.postMessage({
    channel: channelId,
    thread_ts: fileMessage.ts,
    text: message
  });
}

async function handleError(client, channelId, fileMessage, reactionAdded) {
  if (fileMessage && reactionAdded) {
    try {
      await client.reactions.remove({
        name: 'beachball',
        timestamp: fileMessage.ts,
        channel: channelId
      });
    } catch (cleanupError) {
      if (cleanupError.data?.error !== 'no_reaction') {
        logger.error('Cleanup error:', cleanupError);
      }
    }
    try {
      await client.reactions.add({
        name: 'x',
        timestamp: fileMessage.ts,
        channel: channelId
      });
    } catch (cleanupError) {
      logger.error('Cleanup error:', cleanupError);
    }
  }
}

async function handleFileUpload(event, client) {
  let fileMessage = null;
  let reactionAdded = false;

  try {
    if (isMessageTooOld(event.event_ts)) return;

    fileMessage = await findFileMessage(event, client);
    if (!fileMessage || isMessageProcessed(fileMessage.ts)) return;

    markMessageAsProcessing(fileMessage.ts);
    await addProcessingReaction(client, event, fileMessage);
    reactionAdded = true;

    const {uploadedFiles, failedFiles} = await processFiles(fileMessage, client);
    await sendResultsMessage(client, event.channel_id, fileMessage, uploadedFiles, failedFiles);

    await updateReactions(client, event, fileMessage, failedFiles.length === 0);

  } catch (error) {
    logger.error('Upload failed:', error.message);
    await handleError(client, event.channel_id, fileMessage, reactionAdded);
    throw error;
  }
}

module.exports = { handleFileUpload, initialize };
```
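The 24-hour cutoff operates on Slack's string timestamps (seconds since the epoch, with a fractional part). A standalone version of the check, with made-up timestamps:

```javascript
// Standalone copy of the 24-hour age check; Slack event timestamps are
// strings like "1700000000.123456" (seconds since the epoch)
function isMessageTooOld(eventTs) {
  const eventTime = parseFloat(eventTs) * 1000; // seconds -> milliseconds
  const maxAge = 24 * 60 * 60 * 1000;
  return Date.now() - eventTime > maxAge;
}

const nowSec = Date.now() / 1000;
console.log(isMessageTooOld(String(nowSec - 60)));        // false: one minute old
console.log(isMessageTooOld(String(nowSec - 25 * 3600))); // true: 25 hours old
```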
src/upload.js (new file, 32 lines)

```js
const fs = require('fs');
const path = require('path');
const {uploadToBackblaze} = require('./backblaze');
const {generateUrl} = require('./api/utils');
const logger = require('./config/logger');

// Handle individual file upload
const handleUpload = async (file) => {
  try {
    const buffer = fs.readFileSync(file.path);
    const fileName = path.basename(file.originalname);
    const uniqueFileName = `${Date.now()}-${fileName}`;

    // Upload to B2 storage
    logger.debug(`Uploading: ${uniqueFileName}`);
    const uploaded = await uploadToBackblaze('s/v3', uniqueFileName, buffer);
    if (!uploaded) throw new Error('Storage upload failed');

    return {
      name: fileName,
      url: generateUrl('s/v3', uniqueFileName)
    };
  } catch (error) {
    logger.error('Upload failed:', error);
    throw error;
  } finally {
    // Clean up the temporary file
    fs.unlinkSync(file.path);
  }
};

module.exports = {handleUpload};
```
src/utils.js (new file, 8 lines)

```js
// Build the CDN URL for an uploaded file

function generateFileUrl(userDir, uniqueFileName) {
  const cdnUrl = process.env.B2_CDN_URL;
  return `${cdnUrl}/${userDir}/${uniqueFileName}`;
}

module.exports = {generateFileUrl};
```
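For example, with a hypothetical `B2_CDN_URL` and Slack user ID, a bot upload ends up at a per-user path:

```javascript
process.env.B2_CDN_URL = 'https://cdn.example.com'; // hypothetical value

// Standalone copy of generateFileUrl
function generateFileUrl(userDir, uniqueFileName) {
  const cdnUrl = process.env.B2_CDN_URL;
  return `${cdnUrl}/${userDir}/${uniqueFileName}`;
}

// "U012ABCDEF" stands in for a Slack user ID
console.log(generateFileUrl('s/U012ABCDEF', '1700000000-abcd1234-photo.png'));
// https://cdn.example.com/s/U012ABCDEF/1700000000-abcd1234-photo.png
```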
vercel.json (deleted, 11 lines)

```json
{
  "version": 2,
  "functions": {
    "api/**/*.[jt]s": { "runtime": "vercel-deno@3.0.0" }
  },
  "redirects": [
    { "source": "/", "destination": "https://github.com/hackclub/cdn" },
    { "source": "/api/new", "destination": "/api/v1/new", "permanent": false },
    { "source": "/api/newSingle", "destination": "/api/v1/newSingle", "permanent": false }
  ]
}
```