Merge pull request #9 from hackclub/v3

V3 deployment
This commit is contained in:
Max Wofford 2025-02-24 22:00:27 -05:00 committed by GitHub
commit 937eb02e04
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
25 changed files with 1527 additions and 437 deletions

17
.env.example Normal file

@ -0,0 +1,17 @@
# Slack
SLACK_BOT_TOKEN=xoxb- # From OAuth & Permissions
SLACK_SIGNING_SECRET= # From Basic Information
SLACK_APP_TOKEN=xapp- # From Basic Information (for Socket Mode)
SLACK_CHANNEL_ID=channel-id # Channel where bot operates
# S3 config (Cloudflare R2 in this example)
AWS_ACCESS_KEY_ID=1234567890abcdef
AWS_SECRET_ACCESS_KEY=abcdef1234567890
AWS_BUCKET_NAME=my-cdn-bucket
AWS_REGION=auto
AWS_ENDPOINT=https://<accountid>.r2.cloudflarestorage.com
AWS_CDN_URL=https://cdn.beans.com
# API
API_TOKEN=beans # Set a secure random string
PORT=3000

10
.gitignore vendored

@ -1,3 +1,7 @@
.env
.vercel
.vscode
/node_modules/
/splitfornpm/
/.idea/
/.env
/bun.lockb
/package-lock.json
/.history

23
Dockerfile Normal file

@ -0,0 +1,23 @@
# Use the official Bun image as base
FROM oven/bun:1
# Install curl and wget (curl is used by the Coolify healthcheck)
RUN apt-get update && apt-get install -y curl wget
# Set working directory
WORKDIR /app
# Copy package.json and bun.lockb (if exists)
COPY package*.json bun.lockb* ./
# Install dependencies
RUN bun install
# Copy the rest of the application
COPY . .
# Expose the port your Express server runs on
EXPOSE 3000
# Start the server
CMD ["bun", "run", "start"]

329
README.md

@ -1,44 +1,285 @@
<h1 align="center">CDN</h1>
<p align="center"><i>Deep under the waves and storms there lies a <a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">vault</a>...</i></p>
<p align="center"><img alt="Raft icon" src="http://cloud-pxma0a3yi.vercel.app/underwater.png"></p>
<p align="center">Illustration above by <a href="https://gh.maxwofford.com">@maxwofford</a>.</p>
---
CDN powers the [#cdn](https://app.slack.com/client/T0266FRGM/C016DEDUL87) channel in the [Hack Club Slack](https://hackclub.com/slack).
## Version 2 <img alt="Version 2" src="https://cloud-b46nncb23.vercel.app/0v2.png" align="right" width="300">
Post this JSON...
```js
[
"website.com/somefile.png",
"website.com/somefile.gif",
]
```
And it'll return the following:
```js
{
"0somefile.png": "cdnlink.vercel.app/0somefile.png",
"1somefile.gif": "cdnlink.vercel.app/1somefile.gif"
}
```
## Version 1 <img alt="Version 1" src="https://cloud-6gklvd3ci.vercel.app/0v1.png" align="right" width="300">
Post this JSON...
```js
[
"website.com/somefile.png",
"website.com/somefile.gif",
]
```
And it'll return the following:
```js
[
"cdnlink.vercel.app/0somefile.png",
"cdnlink.vercel.app/1somefile.gif"
]
```
<div align="center">
<img src="https://assets.hackclub.com/flag-standalone.svg" width="100" alt="flag">
<h1>CDN</h1>
<p>A CDN solution for Hack Club!</p>
</div>
<p align="center"><i>Deep under the waves and storms there lies a <a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">vault</a>...</i></p>
<div align="center">
<img src="https://files.catbox.moe/6fpj0x.png" width="100%" alt="Banner">
<p align="center">Banner illustration by <a href="https://gh.maxwofford.com">@maxwofford</a>.</p>
<a href="https://app.slack.com/client/T0266FRGM/C016DEDUL87">
<img alt="Slack Channel" src="https://img.shields.io/badge/slack-%23cdn-blue.svg?style=flat&logo=slack">
</a>
</div>
## 🚀 Features
- **Multi-version API Support** (v1, v2, v3)
- **Slack Bot Integration**
  - Upload up to 10 files per message
  - Automatic file sanitization
  - Organized file storage
- **Secure API Endpoints**
- **Cost-Effective Storage** (87-98% cost reduction vs. Vercel CDN)
- **File Deduplication** (content-hash naming prevents duplicate storage)
- **Organized Storage Structure**
## 🔧 Setup
### 1. Slack App Configuration
1. Create a new Slack App at [api.slack.com](https://api.slack.com/apps)
2. Enable Socket Mode in the app settings
3. Add the following Bot Token Scopes:
- `channels:history`
- `channels:read`
- `chat:write`
- `files:read`
- `files:write`
- `groups:history`
- `reactions:write`
4. Enable Event Subscriptions and subscribe to `file_shared` event
5. Install the app to your workspace
### 2. Storage Configuration
This CDN supports any S3-compatible storage service. Here's how to set it up using Cloudflare R2 as an example:
#### Setting up Cloudflare R2 (Example)
1. **Create R2 Bucket**
- Go to Cloudflare Dashboard > R2
- Click "Create Bucket"
- Name your bucket
- Enable public access
2. **Generate API Credentials**
- Go to R2
- Click "Manage API tokens" in API
- Click "Create API Token"
- Permissions: "Object Read & Write"
- Save both Access Key ID and Secret Access Key (S3)
3. **Get Your URL**
- Go to R2
- Click "Use R2 with APIs" in API
- Select S3 Compatible API
- The URL is your Endpoint
4. **Configure Custom Domain (Optional)**
- Go to R2 > Bucket Settings > Custom Domains
- Add your domain (e.g., cdn.beans.com)
- Follow DNS configuration steps
### 3. Environment Setup
Check out the `.env.example` file to get started!
### **4. Installation & Running**
#### **Install Dependencies**
Make sure you have [Bun](https://bun.sh/) installed, then run:
```bash
bun install
```
#### **Run the Application**
You can start the application using any of the following methods:
```bash
# Using Node.js
node index.js
# Using Bun
bun index.js
# Using Bun with script
bun run start
```
#### **Using PM2 (Optional)**
For auto-starting the application, you can use PM2:
```bash
pm2 start bun --name "HC-CDN1" -- run start
# Optionally, save the process list
pm2 save
# Optionally, generate startup script
pm2 startup
```
## 📡 API Usage
⚠️ **IMPORTANT SECURITY NOTE**:
- All API endpoints require authentication via `Authorization: Bearer api-token` header
- This includes all versions (v1, v2, v3) - no exceptions!
- Use the API_TOKEN from your environment configuration
- Failure to include a valid token will result in 401 Unauthorized responses
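On the server side, the check is a plain Bearer-token comparison. This sketch mirrors `validateToken` in `src/api/utils.js`; the `apiToken` parameter stands in for `process.env.API_TOKEN`:

```javascript
// Bearer-token check applied to every /api route
// (mirrors validateToken in src/api/utils.js).
function validateToken(req, apiToken) {
  const token = req.headers.authorization?.split("Bearer ")[1];
  if (!token || token !== apiToken) {
    return {
      status: 401,
      body: { error: "Unauthorized - Invalid or missing API token" },
    };
  }
  return { status: 200 };
}
```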
### V3 API (Latest)
<img alt="Version 3" src="https://files.catbox.moe/e3ravk.png" align="right" width="300">
**Endpoint:** `POST https://cdn.hackclub.com/api/v3/new`
**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```
**Request Example:**
```bash
curl --location 'https://cdn.hackclub.com/api/v3/new' \
--header 'Authorization: Bearer beans' \
--header 'Content-Type: application/json' \
--data '[
"https://assets.hackclub.com/flag-standalone.svg",
"https://assets.hackclub.com/flag-orpheus-left.png",
"https://assets.hackclub.com/icon-progress-marker.svg"
]'
```
**Response:**
```json
{
"files": [
{
"deployedUrl": "https://cdn.example.dev/s/v3/3e48b91a4599a3841c028e9a683ef5ce58cea372_flag-standalone.svg",
"file": "0_16361167e11b0d172a47e726b40d70e9873c792b_upload_1736985095691",
"sha": "16361167e11b0d172a47e726b40d70e9873c792b",
"size": 90173
},
{
"deployedUrl": "https://cdn.example.dev/s/v3/4e48b91a4599a3841c028e9a683ef5ce58cea372_flag-orpheus-left.png",
"file": "1_16361167e11b0d172a47e726b40d70e9873c792b_upload_1736985095692",
"sha": "16361167e11b0d172a47e726b40d70e9873c792b",
"size": 80234
},
{
"deployedUrl": "https://cdn.example.dev/s/v3/5e48b91a4599a3841c028e9a683ef5ce58cea372_icon-progress-marker.svg",
"file": "2_16361167e11b0d172a47e726b40d70e9873c792b_upload_1736985095693",
"sha": "16361167e11b0d172a47e726b40d70e9873c792b",
"size": 70345
},
{
"deployedUrl": "https://cdn.example.dev/s/v3/6e48b91a4599a3841c028e9a683ef5ce58cea372_flag-orpheus-right.png",
"file": "3_16361167e11b0d172a47e726b40d70e9873c792b_upload_1736985095694",
"sha": "16361167e11b0d172a47e726b40d70e9873c792b",
"size": 60456
}
],
"cdnBase": "https://cdn.example.dev"
}
```
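For programmatic use, a client can be as small as the sketch below. The helper names are illustrative, and it assumes Node 18+ (global `fetch`) plus your own token and deployment URL:

```javascript
// Hypothetical v3 client helper: POST an array of source URLs and
// resolve with the parsed v3 response body.
async function uploadUrls(urls, token, cdnUrl = "https://cdn.hackclub.com") {
  const res = await fetch(`${cdnUrl}/api/v3/new`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(urls),
  });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res.json();
}

// Pull just the permanent links out of a v3 response body.
function extractLinks(v3Response) {
  return v3Response.files.map((f) => f.deployedUrl);
}
```

The same request against v1 or v2 returns a different shape (an array or a filename-keyed object), so `extractLinks` applies only to v3 responses.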
<details>
<summary>V2 API</summary>
<img alt="Version 2" src="https://files.catbox.moe/uuk1vm.png" align="right" width="300">
**Endpoint:** `POST https://cdn.hackclub.com/api/v2/new`
**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```
**Request Example:**
```json
[
"https://assets.hackclub.com/flag-standalone.svg",
"https://assets.hackclub.com/flag-orpheus-left.png",
"https://assets.hackclub.com/icon-progress-marker.svg"
]
```
**Response:**
```json
{
"flag-standalone.svg": "https://cdn.example.dev/s/v2/flag-standalone.svg",
"flag-orpheus-left.png": "https://cdn.example.dev/s/v2/flag-orpheus-left.png",
"icon-progress-marker.svg": "https://cdn.example.dev/s/v2/icon-progress-marker.svg"
}
```
</details>
<details>
<summary>V1 API</summary>
<img alt="Version 1" src="https://files.catbox.moe/tnzdfe.png" align="right" width="300">
**Endpoint:** `POST https://cdn.hackclub.com/api/v1/new`
**Headers:**
```
Authorization: Bearer api-token
Content-Type: application/json
```
**Request Example:**
```json
[
"https://assets.hackclub.com/flag-standalone.svg",
"https://assets.hackclub.com/flag-orpheus-left.png",
"https://assets.hackclub.com/icon-progress-marker.svg"
]
```
**Response:**
```json
[
"https://cdn.example.dev/s/v1/0_flag-standalone.svg",
"https://cdn.example.dev/s/v1/1_flag-orpheus-left.png",
"https://cdn.example.dev/s/v1/2_icon-progress-marker.svg"
]
```
</details>
## 🤖 Slack Bot Features
- **Multi-file Upload:** Upload up to 10 files in a single message, with at most 3 messages processed concurrently
- **File Organization:** Files are stored as `/s/{slackUserId}/{timestamp}_{sanitizedFilename}`
- **Error Handling:** Status reactions and clear error messages when uploads fail
- **File Sanitization:** Automatic filename cleaning
- **Size Limits:** Enforces a 2 GB per-file maximum
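Filename cleaning is a single regex pass; this sketch reproduces `sanitizeFileName` from `src/api/upload.js`:

```javascript
// Anything outside [a-zA-Z0-9.-] becomes an underscore; a name that
// sanitizes to nothing falls back to a timestamped placeholder.
function sanitizeFileName(fileName) {
  let sanitized = fileName.replace(/[^a-zA-Z0-9.-]/g, "_");
  if (!sanitized) {
    sanitized = "upload_" + Date.now();
  }
  return sanitized;
}

sanitizeFileName("my photo (1).png"); // → "my_photo__1_.png"
```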
## Legacy API Notes
- V1 and V2 APIs are maintained for backwards compatibility
- All versions now require authentication via Bearer token
- We recommend using V3 API for new implementations
## Technical Details
- **Storage Structure:** `/s/v3/{sha}_{sanitizedFilename}`
- **File Naming:** `/s/{slackUserId}/{unix}_{sanitizedFilename}`
- **Cost Efficiency:** Uses object storage for significant cost savings
- **Security:** Token-based authentication for API access
## 💻 Slack Bot Behavior
- Reacts to file uploads with status emojis:
- ⏳ Processing
- ✅ Success
- ❌ Error
- Supports up to 10 files per message
- Processes at most 3 messages concurrently
- Maximum file size: 2GB per file
## 💰 Cost Optimization
- Uses S3-compatible object storage
- 87-98% cost reduction compared to Vercel CDN
<div align="center">
<br>
<p>Made with 💜 for Hack Club</p>
<p>All illustrations by <a href="https://gh.maxwofford.com">@maxwofford</a></p>
</div>


@ -1,128 +0,0 @@
import { urlParse } from "https://deno.land/x/url_parse/mod.ts";
const endpoint = (path: string) => {
// https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
let url = "https://api.vercel.com/" + path;
if (Deno.env.get("ZEIT_TEAM")) {
url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
}
return url;
};
const deploy = async (
files: { sha: string; file: string; path: string; size: number }[],
) => {
const req = await fetch(endpoint("v12/now/deployments"), {
method: "POST",
headers: {
"Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
name: "cloud",
files: files.map((f) => ({
sha: f.sha,
file: f.file,
size: f.size,
})),
projectSettings: {
framework: null,
},
}),
});
const json = await req.text();
console.log(json)
const baseURL = JSON.parse(json).url;
const fileURLs = files.map((f) => "https://" + baseURL + "/" + f.path);
return { status: req.status, fileURLs };
};
export default async (req: Request) => {
if (req.method == "OPTIONS") {
return new Response(
JSON.stringify(
{ status: "YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED." },
),
{
status: 204
},
);
}
if (req.method == "GET") {
return new Response(
JSON.stringify(
{ error: "*GET outta here!* (Method not allowed, use POST)" },
),
{
status: 405
},
);
}
if (req.method == "PUT") {
return new Response(
JSON.stringify(
{ error: "*PUT that request away!* (Method not allowed, use POST)" },
),
{
status: 405,
},
);
}
if (req.method != "POST") {
return new Response(
JSON.stringify({ error: "Method not allowed, use POST" }),
{
status: 405,
},
);
}
const decoder = new TextDecoder();
console.log(req)
const buf = await req.arrayBuffer();
console.log(decoder.decode(buf))
console.log(buf)
const fileURLs = JSON.parse(decoder.decode(buf));
console.log(fileURLs)
console.log(fileURLs.length)
console.log(typeof fileURLs)
if (!Array.isArray(fileURLs) || fileURLs.length < 1) {
return new Response(
JSON.stringify({ error: "Empty file array" }),
{ status: 422 }
);
}
const authorization = req.headers.get("Authorization");
const uploadedURLs = await Promise.all(fileURLs.map(async (url, index) => {
const { pathname } = urlParse(url);
const filename = index + pathname.substr(pathname.lastIndexOf("/") + 1);
const headers = {
"Content-Type": "application/json",
"Authorization": ""
}
if (authorization) {
headers['Authorization'] = authorization;
}
const res = await (await fetch("https://cdn.hackclub.com/api/newSingle", {
method: "POST",
headers,
body: url,
})).json();
res.file = "public/" + filename;
res.path = filename;
return res;
}));
const result = await deploy(uploadedURLs);
return new Response(
JSON.stringify(result.fileURLs),
{ status: result.status }
);
};


@ -1,64 +0,0 @@
import { Hash } from "https://deno.land/x/checksum@1.4.0/mod.ts";
const endpoint = (path: string) => {
// https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
let url = "https://api.vercel.com/" + path;
if (Deno.env.get("ZEIT_TEAM")) {
url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
}
return url;
};
const uploadFile = async (url: string, authorization: string|null) => {
const options = {
method: 'GET', headers: { 'Authorization': "" }
}
if (authorization) {
options.headers = { 'Authorization': authorization }
}
const req = await fetch(url, options);
const data = new Uint8Array(await req.arrayBuffer());
const sha = new Hash("sha1").digest(data).hex();
const size = data.byteLength;
await fetch(endpoint("v2/now/files"), {
method: "POST",
headers: {
"Content-Length": size.toString(),
"x-now-digest": sha,
"Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
},
body: data.buffer,
});
return {
sha,
size,
};
};
export default async (req: Request) => {
if (req.method != "POST") {
return new Response(
JSON.stringify({ error: "Method not allowed, use POST" }),
{
status: 405,
},
);
}
const decoder = new TextDecoder();
const buf = await req.arrayBuffer();
const singleFileURL = decoder.decode(buf);
if (typeof singleFileURL != "string") {
return new Response(
JSON.stringify({ error: "newSingle only accepts a single URL" }),
{
status: 422
},
);
}
const uploadedFileURL = await uploadFile(singleFileURL, req.headers.get("Authorization"));
return new Response(JSON.stringify(uploadedFileURL))
};


@ -1,40 +0,0 @@
import { endpoint } from "./utils.ts";
// Other functions can import this function to call this serverless endpoint
export const deployEndpoint = async (
files: { sha: string; file: string; size: number }[],
) => {
return await deploy(files);
};
const deploy = async (
files: { sha: string; file: string; size: number }[],
) => {
const req = await fetch(endpoint("v12/now/deployments"), {
method: "POST",
headers: {
"Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
name: "cloud",
files: files.map((f) => ({
sha: f.sha,
file: f.file,
size: f.size,
})),
projectSettings: {
framework: null,
},
}),
});
const json = await req.json();
const deployedFiles = files.map((file) => ({
deployedUrl: `https://${json.url}/public/${file.file}`,
...file,
}));
return { status: req.status, files: deployedFiles };
};
export default deploy;


@ -1,37 +0,0 @@
import { urlParse } from "https://deno.land/x/url_parse/mod.ts";
import { uploadEndpoint } from "./upload.ts";
import { deployEndpoint } from "./deploy.ts";
import { ensurePost, parseBody } from "./utils.ts";
export default async (req: Request) => {
if (!ensurePost(req)) return null;
const body = new TextDecoder().decode(await req.arrayBuffer());
const fileURLs = JSON.parse(body);
if (!Array.isArray(fileURLs) || fileURLs.length < 1) {
return new Response(
JSON.stringify({ error: "Empty/invalid file array" }),
{ status: 422 }
);
}
const authorization = req.headers.get('Authorization')
const uploadArray = await Promise.all(fileURLs.map(f => uploadEndpoint(f, authorization)));
const deploymentFiles = uploadArray.map(
(file: { url: string; sha: string; size: number }, index: number) => {
const { pathname } = urlParse(file.url);
const filename = index + pathname.substr(pathname.lastIndexOf("/") + 1);
return { sha: file.sha, file: filename, size: file.size };
},
);
const deploymentData = await deployEndpoint(deploymentFiles);
return new Response(
JSON.stringify(deploymentData.files),
{ status: deploymentData.status }
);
};


@ -1,53 +0,0 @@
import { Hash } from "https://deno.land/x/checksum@1.4.0/hash.ts";
import { endpoint, ensurePost, parseBody } from "./utils.ts";
// Other functions can import this function to call this serverless endpoint
export const uploadEndpoint = async (url: string, authorization: string | null) => {
const options = { method: 'POST', body: url, headers: {} }
if (authorization) {
options.headers = { 'Authorization': authorization }
}
console.log({ options})
const response = await fetch("https://cdn.hackclub.com/api/v2/upload", options);
const result = await response.json();
console.log({result})
return result;
};
const upload = async (url: string, authorization: string | null) => {
const options = { headers: {} }
if (authorization) {
options.headers = { 'Authorization': authorization }
}
const req = await fetch(url, options);
const reqArrayBuffer = await req.arrayBuffer();
const data = new Uint8Array(reqArrayBuffer);
const sha = new Hash("sha1").digest(data).hex();
const size = data.byteLength;
await fetch(endpoint("v2/now/files"), {
method: "POST",
headers: {
"Content-Length": size.toString(),
"x-now-digest": sha,
"Authorization": `Bearer ${Deno.env.get("ZEIT_TOKEN")}`,
},
body: data.buffer,
});
return {
url,
sha,
size,
};
};
export default async (req: Request) => {
if (!ensurePost(req)) return null;
const body = new TextDecoder().decode(await req.arrayBuffer());
const uploadedFileUrl = await upload(body, req.headers.get("Authorization"));
return new Response(JSON.stringify(uploadedFileUrl));
};


@ -1,57 +0,0 @@
export const endpoint = (path: string) => {
// https://vercel.com/docs/api#api-basics/authentication/accessing-resources-owned-by-a-team
let url = "https://api.vercel.com/" + path;
if (Deno.env.get("ZEIT_TEAM")) {
url += ("?teamId=" + Deno.env.get("ZEIT_TEAM"));
}
return url;
};
export const parseBody = async (body: Request["body"]) => {
const decoder = new TextDecoder();
const buf = await Deno.readAll(body);
const result = decoder.decode(buf);
return result;
};
export const ensurePost = (req: Request) => {
if (req.method == "OPTIONS") {
return new Response(
JSON.stringify(
{ status: "YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED." },
),
{
status: 204
},
);
}
if (req.method == "GET") {
return new Response(
JSON.stringify(
{ error: "*GET outta here!* (Method not allowed, use POST)" },
),
{
status: 405
},
);
}
if (req.method == "PUT") {
return new Response(
JSON.stringify(
{ error: "*PUT that request away!* (Method not allowed, use POST)" },
),
{
status: 405,
},
);
}
if (req.method != "POST") {
return new Response(
JSON.stringify({ error: "Method not allowed, use POST" }),
{
status: 405,
},
);
}
return true;
};

86
index.js Normal file

@ -0,0 +1,86 @@
const dotenv = require('dotenv');
dotenv.config();
const logger = require('./src/config/logger');
logger.info('Starting CDN application 🚀');
// const {App} = require('@slack/bolt');
const fileUpload = require('./src/fileUpload');
const express = require('express');
const cors = require('cors');
const apiRoutes = require('./src/api/index.js');
const BOT_START_TIME = Date.now() / 1000;
// const app = new App({
// token: process.env.SLACK_BOT_TOKEN,
// signingSecret: process.env.SLACK_SIGNING_SECRET,
// socketMode: false,
// appToken: process.env.SLACK_APP_TOKEN
// });
// API server
const expressApp = express();
expressApp.use(cors());
expressApp.use(express.json());
expressApp.use(express.urlencoded({ extended: true }));
// Mount API for all versions
expressApp.use('/api', apiRoutes);
// redirect route to "https://github.com/hackclub/cdn"
expressApp.get('/', (req, res) => {
res.redirect('https://github.com/hackclub/cdn');
});
// Error handling middleware
expressApp.use((err, req, res, next) => {
logger.error('API Error:', {
error: err.message,
stack: err.stack,
path: req.path,
method: req.method
});
res.status(500).json({ error: 'Internal server error' });
});
// Fallback route for unhandled paths
expressApp.use((req, res, next) => {
logger.warn(`Unhandled route: ${req.method} ${req.path}`);
res.status(404).json({ error: 'Not found' });
});
// Event listener for file_shared events
// app.event('file_shared', async ({event, client}) => {
// if (parseFloat(event.event_ts) < BOT_START_TIME) return;
// if (event.channel_id !== process.env.SLACK_CHANNEL_ID) return;
// try {
// await fileUpload.handleFileUpload(event, client);
// } catch (error) {
// logger.error(`Upload failed: ${error.message}`);
// }
// });
// Startup LOGs
(async () => {
try {
await fileUpload.initialize();
// await app.start();
const port = parseInt(process.env.PORT || '4553', 10);
expressApp.listen(port, () => {
logger.info('CDN started successfully 🔥', {
slackMode: 'Socket Mode',
apiPort: port,
startTime: new Date().toISOString()
});
});
} catch (error) {
logger.error('Failed to start application:', {
error: error.message,
stack: error.stack
});
process.exit(1);
}
})();

9
logger.js Normal file

@ -0,0 +1,9 @@
const winston = require('winston');
const logger = winston.createLogger({
level: 'info',
format: winston.format.combine(winston.format.colorize(), winston.format.simple()),
transports: [new winston.transports.Console()],
});
module.exports = logger;

22
package.json Normal file

@ -0,0 +1,22 @@
{
"name": "cdn-v2-hackclub",
"version": "1.0.0",
"description": "Slack app and API to upload files to S3-compatible storage with unique URLs",
"main": "index.js",
"scripts": {
"start": "node index.js"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.478.0",
"@slack/bolt": "^4.2.0",
"@slack/web-api": "^7.8.0",
"cors": "^2.8.5",
"dotenv": "^10.0.0",
"multer": "^1.4.5-lts.1",
"node-fetch": "^2.6.1",
"p-limit": "^6.2.0",
"winston": "^3.17.0"
},
"author": "",
"license": "MIT"
}

17
src/api.js Normal file

@ -0,0 +1,17 @@
const express = require('express');
const multer = require('multer');
const router = express.Router();
const upload = multer({dest: 'uploads/'});
router.post('/upload', upload.single('file'), (req, res) => {
if (!req.file) {
return res.status(400).send('No file uploaded.');
}
// Handle the uploaded file
console.log('Uploaded file:', req.file);
res.send('File uploaded successfully.');
});
module.exports = router;

27
src/api/deploy.js Normal file

@ -0,0 +1,27 @@
const logger = require('../config/logger');
const {generateApiUrl, getCdnUrl} = require('./utils');
const deployEndpoint = async (files) => {
try {
const deployedFiles = files.map(file => ({
deployedUrl: generateApiUrl('v3', file.file),
cdnUrl: getCdnUrl(),
contentType: file.contentType || 'application/octet-stream',
...file
}));
return {
status: 200,
files: deployedFiles,
cdnBase: getCdnUrl()
};
} catch (error) {
logger.error('S3 deploy error:', error);
return {
status: 500,
files: []
};
}
};
module.exports = {deployEndpoint};

85
src/api/index.js Normal file

@ -0,0 +1,85 @@
const express = require('express');
const {validateToken, validateRequest, getCdnUrl} = require('./utils');
const {uploadEndpoint, handleUpload} = require('./upload');
const logger = require('../config/logger');
const router = express.Router();
// Require valid API token for all routes
router.use((req, res, next) => {
const tokenCheck = validateToken(req);
if (tokenCheck.status !== 200) {
return res.status(tokenCheck.status).json(tokenCheck.body);
}
next();
});
// Health check route
router.get('/health', (req, res) => {
res.status(200).json({ status: 'ok' });
});
// Format response based on API version compatibility
const formatResponse = (results, version) => {
switch (version) {
case 1:
return results.map(r => r.url);
case 2:
return results.reduce((acc, r, i) => {
const fileName = r.url.split('/').pop();
acc[`${i}${fileName}`] = r.url;
return acc;
}, {});
default:
return {
files: results.map((r, i) => ({
deployedUrl: r.url,
file: `${i}_${r.url.split('/').pop()}`,
sha: r.sha,
size: r.size
})),
cdnBase: getCdnUrl()
};
}
};
// Handle bulk file uploads with version-specific responses
const handleBulkUpload = async (req, res, version) => {
try {
const urls = req.body;
// Basic validation
if (!Array.isArray(urls) || !urls.length) {
return res.status(422).json({error: 'Empty/invalid file array'});
}
// Process all URLs concurrently
logger.debug(`Processing ${urls.length} URLs`);
const results = await Promise.all(
urls.map(url => uploadEndpoint(url, req.headers?.authorization))
);
res.json(formatResponse(results, version));
} catch (error) {
logger.error('Bulk upload failed:', error);
res.status(500).json({error: 'Internal server error'});
}
};
// API Routes
router.post('/v1/new', (req, res) => handleBulkUpload(req, res, 1)); // Legacy support
router.post('/v2/new', (req, res) => handleBulkUpload(req, res, 2)); // Legacy support
router.post('/v3/new', (req, res) => handleBulkUpload(req, res, 3)); // Current version
router.post('/new', (req, res) => handleBulkUpload(req, res, 3)); // Alias for v3 (latest)
// Single file upload endpoint
router.post('/upload', async (req, res) => {
try {
const result = await handleUpload(req);
res.status(result.status).json(result.body);
} catch (error) {
logger.error('S3 upload handler error:', error);
res.status(500).json({error: 'Storage upload failed'});
}
});
module.exports = router;

93
src/api/upload.js Normal file

@ -0,0 +1,93 @@
const fetch = require('node-fetch');
const crypto = require('crypto');
const {uploadToStorage} = require('../storage');
const {generateUrl, getCdnUrl} = require('./utils');
const logger = require('../config/logger');
// Sanitize file name for storage
function sanitizeFileName(fileName) {
let sanitizedFileName = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
if (!sanitizedFileName) {
sanitizedFileName = 'upload_' + Date.now();
}
return sanitizedFileName;
}
// Handle remote file upload to S3 storage
const uploadEndpoint = async (url, authorization = null) => {
try {
logger.debug('Starting download', { url });
const response = await fetch(url, {
headers: authorization ? {'Authorization': authorization} : {}
});
if (!response.ok) {
const error = new Error(`Download failed: ${response.statusText}`);
error.statusCode = response.status;
throw error;
}
// Generate unique filename using SHA1 (hash) of file contents
const buffer = await response.buffer();
const sha = crypto.createHash('sha1').update(buffer).digest('hex');
const originalName = url.split('/').pop();
const sanitizedFileName = sanitizeFileName(originalName);
const fileName = `${sha}_${sanitizedFileName}`;
// Upload to S3 storage
logger.debug(`Uploading: ${fileName}`);
const uploadResult = await uploadToStorage('s/v3', fileName, buffer, response.headers.get('content-type'));
if (uploadResult.success === false) {
throw new Error(`Storage upload failed: ${uploadResult.error}`);
}
return {
url: generateUrl('s/v3', fileName),
sha,
size: buffer.length,
type: response.headers.get('content-type')
};
} catch (error) {
logger.error('Upload process failed', {
url,
error: error.message,
statusCode: error.statusCode,
stack: error.stack
});
// Format error (pain)
const statusCode = error.statusCode || 500;
const errorResponse = {
error: {
message: error.message,
code: error.code || 'INTERNAL_ERROR',
details: error.details || null
},
success: false
};
throw { statusCode, ...errorResponse };
}
};
// Express request handler for file uploads
const handleUpload = async (req) => {
try {
const url = req.body || await req.text();
const result = await uploadEndpoint(url, req.headers?.authorization);
return { status: 200, body: result };
} catch (error) {
return {
status: error.statusCode || 500,
body: {
error: error.error || {
message: 'Internal server error',
code: 'INTERNAL_ERROR'
},
success: false
}
};
}
};
module.exports = {uploadEndpoint, handleUpload};

45
src/api/utils.js Normal file

@ -0,0 +1,45 @@
const logger = require('../config/logger');
const getCdnUrl = () => process.env.AWS_CDN_URL;
const generateUrl = (version, fileName) => {
return `${getCdnUrl()}/${version}/${fileName}`;
};
const validateToken = (req) => {
const token = req.headers.authorization?.split('Bearer ')[1];
if (!token || token !== process.env.API_TOKEN) {
return {
status: 401,
body: {error: 'Unauthorized - Invalid or missing API token'}
};
}
return {status: 200};
};
const validateRequest = (req) => {
// First check token
const tokenCheck = validateToken(req);
if (tokenCheck.status !== 200) {
return tokenCheck;
}
// Then check method (kept from the old API in case a client depends on these exact status responses)
if (req.method === 'OPTIONS') {
return {status: 204, body: {status: 'YIPPE YAY. YOU HAVE CLEARANCE TO PROCEED.'}};
}
if (req.method !== 'POST') {
return {
status: 405,
body: {error: 'Method not allowed, use POST'}
};
}
return {status: 200};
};
module.exports = {
validateRequest,
validateToken,
generateUrl,
getCdnUrl
};

19
src/config/logger.js Normal file

@ -0,0 +1,19 @@
const winston = require('winston');
const logger = winston.createLogger({
level: 'info',
format: winston.format.combine(
winston.format.timestamp(),
winston.format.colorize(),
winston.format.printf(({ level, message, timestamp, ...meta }) => {
let output = `${timestamp} ${level}: ${message}`;
if (Object.keys(meta).length > 0) {
output += ` ${JSON.stringify(meta)}`;
}
return output;
})
),
transports: [new winston.transports.Console()]
});
module.exports = logger;

137
src/config/messages.js Normal file

@ -0,0 +1,137 @@
const messages = {
success: {
singleFile: "Hey <@{userId}>, here's your link:",
multipleFiles: "Hey <@{userId}>, here are your links:",
alternateSuccess: [
"thanks!",
"thanks, i'm gonna sell these to adfly!",
"tysm!",
"file away!"
]
},
fileTypes: {
gif: [
"_gif_ that file to me and i'll upload it",
"_gif_ me all your files!"
],
heic: [
"What the heic???"
],
mov: [
"I'll _mov_ that to a permanent link for you"
],
html: [
"Oh, launching a new website?",
"uwu, what's this site?",
"WooOOAAah hey! Are you serving a site?",
"h-t-m-ello :wave:"
],
rar: [
".rawr xD",
"i also go \"rar\" sometimes!"
]
},
errors: {
tooBig: {
messages: [
"File too big!",
"That's a chonky file!",
"_orpheus struggles to lift the massive file_",
"Sorry, that file's too thicc for me to handle!"
],
images: [
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/2too_big_4.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/3too_big_2.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/4too_big_1.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/6too_big_5.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/7too_big_3.png"
]
},
generic: {
messages: [
"_orpheus sneezes and drops the files on the ground before blowing her nose on a blank jpeg._",
"_orpheus trips and your files slip out of her hands and into an inconveniently placed sewer grate._",
"_orpheus accidentally slips the files into a folder in her briefcase labeled \"homework\". she starts sweating profusely._"
],
images: [
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/0generic_3.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/1generic_2.png",
"https://cloud-3tq9t10za-hack-club-bot.vercel.app/5generic_1.png"
]
}
}
};
function getRandomItem(array) {
return array[Math.floor(Math.random() * array.length)];
}
function getFileTypeMessage(fileExtension) {
const ext = fileExtension.toLowerCase();
return messages.fileTypes[ext] ? getRandomItem(messages.fileTypes[ext]) : null;
}
function formatErrorMessage(failedFiles, isSizeError = false) {
const errorType = isSizeError ? messages.errors.tooBig : messages.errors.generic;
const errorMessage = getRandomItem(errorType.messages);
const errorImage = getRandomItem(errorType.images);
return [
errorMessage,
`Failed files: ${failedFiles.join(', ')}`,
'',
`<${errorImage}|image>`
].join('\n');
}
function formatSuccessMessage(userId, files, failedFiles = [], sizeFailedFiles = []) {
const messageLines = [];
const baseMessage = files.length === 1 ?
messages.success.singleFile :
messages.success.multipleFiles;
messageLines.push(baseMessage.replace('{userId}', userId), '');
const fileGroups = new Map();
files.forEach(file => {
const ext = file.originalName.split('.').pop();
const typeMessage = getFileTypeMessage(ext);
const key = typeMessage || 'noType';
if (!fileGroups.has(key)) {
fileGroups.set(key, []);
}
fileGroups.get(key).push(file);
});
fileGroups.forEach((groupFiles, typeMessage) => {
if (typeMessage !== 'noType') {
messageLines.push('', typeMessage);
}
groupFiles.forEach(file => {
messageLines.push(`${file.originalName}: ${file.url}`);
});
});
if (sizeFailedFiles.length > 0) {
messageLines.push(formatErrorMessage(sizeFailedFiles, true));
}
if (failedFiles.length > 0) {
messageLines.push(formatErrorMessage(failedFiles, false));
}
if (files.length > 0) {
messageLines.push('', `_${getRandomItem(messages.success.alternateSuccess)}_`);
}
return messageLines.join('\n');
}
module.exports = {
messages,
getFileTypeMessage,
formatSuccessMessage,
formatErrorMessage,
getRandomItem
};
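A standalone sketch of the `{userId}` template substitution the success messages above rely on; the names here are local to the example and simply mirror the strings in `messages.success`:

```javascript
// Minimal re-creation of the greeting templates: the {userId} placeholder
// is swapped for a Slack mention string at format time.
const templates = {
  singleFile: "Hey <@{userId}>, here's your link:",
  multipleFiles: "Hey <@{userId}>, here are your links:"
};

function renderGreeting(userId, fileCount) {
  const base = fileCount === 1 ? templates.singleFile : templates.multipleFiles;
  return base.replace('{userId}', userId);
}

console.log(renderGreeting('U012AB3CD', 1));
// Hey <@U012AB3CD>, here's your link:
console.log(renderGreeting('U012AB3CD', 3));
// Hey <@U012AB3CD>, here are your links:
```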

302
src/fileUpload.js Normal file
@@ -0,0 +1,302 @@
const fetch = require('node-fetch');
const crypto = require('crypto');
const logger = require('./config/logger');
const storage = require('./storage');
const {generateFileUrl} = require('./utils');
const path = require('path');
const {
messages,
formatSuccessMessage,
formatErrorMessage,
getFileTypeMessage
} = require('./config/messages');
const MAX_FILE_SIZE = 50 * 1024 * 1024; // 50MB in bytes
const CONCURRENT_UPLOADS = 3; // Max concurrent uploads (messages)
const processedMessages = new Map();
let uploadLimit;
async function initialize() {
const pLimit = (await import('p-limit')).default;
uploadLimit = pLimit(CONCURRENT_UPLOADS);
}
// Message bookkeeping helpers
function isMessageTooOld(eventTs) {
const eventTime = parseFloat(eventTs) * 1000;
return (Date.now() - eventTime) > 24 * 60 * 60 * 1000;
}
function isMessageProcessed(messageTs) {
return processedMessages.has(messageTs);
}
function markMessageAsProcessing(messageTs) {
processedMessages.set(messageTs, true);
}
// File processing
function sanitizeFileName(fileName) {
let sanitized = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
return sanitized || `upload_${Date.now()}`;
}
function generateUniqueFileName(fileName) {
return `${Date.now()}-${crypto.randomBytes(16).toString('hex')}-${sanitizeFileName(fileName)}`;
}
// upload functionality
async function processFiles(fileMessage, client) {
const uploadedFiles = [];
const failedFiles = [];
const sizeFailedFiles = [];
const fileTypeResponses = new Set();
logger.info(`Processing ${fileMessage.files?.length || 0} files`);
for (const file of fileMessage.files || []) {
try {
if (file.size > MAX_FILE_SIZE) {
sizeFailedFiles.push(file.name);
continue;
}
// Get file extension message if applicable
const ext = path.extname(file.name).slice(1);
const typeMessage = getFileTypeMessage(ext);
if (typeMessage) fileTypeResponses.add(typeMessage);
const response = await fetch(file.url_private, {
headers: {Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}`}
});
if (!response.ok) throw new Error('Download failed');
const buffer = await response.buffer();
const uniqueFileName = generateUniqueFileName(file.name);
const userDir = `s/${fileMessage.user}`;
const success = await uploadLimit(() =>
storage.uploadToStorage(userDir, uniqueFileName, buffer, file.mimetype)
);
if (!success) throw new Error('Upload failed');
uploadedFiles.push({
name: uniqueFileName,
originalName: file.name,
url: generateFileUrl(userDir, uniqueFileName),
contentType: file.mimetype
});
} catch (error) {
logger.error(`Failed: ${file.name} - ${error.message}`);
failedFiles.push(file.name);
}
}
return {
uploadedFiles,
failedFiles,
sizeFailedFiles,
isSizeError: sizeFailedFiles.length > 0
};
}
// Slack interaction
async function addProcessingReaction(client, event, fileMessage) {
try {
await client.reactions.add({
name: 'beachball',
timestamp: fileMessage.ts,
channel: event.channel_id
});
} catch (error) {
logger.error('Failed to add reaction:', error.message);
}
}
async function updateReactions(client, event, fileMessage, totalFiles, failedCount) {
try {
await client.reactions.remove({
name: 'beachball',
timestamp: fileMessage.ts,
channel: event.channel_id
});
// Choose a reaction based on how many files failed or succeeded
let reactionName;
if (failedCount === totalFiles) {
reactionName = 'x'; // All files failed
} else if (failedCount > 0) {
reactionName = 'warning'; // Some files failed
} else {
reactionName = 'white_check_mark'; // All files succeeded
}
await client.reactions.add({
name: reactionName,
timestamp: fileMessage.ts,
channel: event.channel_id
});
} catch (error) {
logger.error('Failed to update reactions:', error.message);
}
}
async function findFileMessage(event, client) {
try {
const fileInfo = await client.files.info({
file: event.file_id,
include_shares: true
});
if (!fileInfo.ok || !fileInfo.file) {
throw new Error('Could not get file info');
}
const channelShare = fileInfo.file.shares?.public?.[event.channel_id] ||
fileInfo.file.shares?.private?.[event.channel_id];
if (!channelShare || !channelShare.length) {
throw new Error('No share info found for this channel');
}
// Get the EXACT message using the ts from share info (channelShare)
const messageTs = channelShare[0].ts;
const messageInfo = await client.conversations.history({
channel: event.channel_id,
latest: messageTs,
limit: 1,
inclusive: true
});
if (!messageInfo.ok || !messageInfo.messages.length) {
throw new Error('Could not find original message');
}
return messageInfo.messages[0];
} catch (error) {
logger.error('Error finding file message:', error);
return null;
}
}
async function sendResultsMessage(client, channelId, fileMessage, uploadedFiles, failedFiles, sizeFailedFiles) {
try {
let message;
if (uploadedFiles.length === 0 && (failedFiles.length > 0 || sizeFailedFiles.length > 0)) {
// All files failed - use appropriate error type
message = formatErrorMessage(
[...failedFiles, ...sizeFailedFiles],
sizeFailedFiles.length > 0 && failedFiles.length === 0 // Use the size-error copy only when every failure is size-related
);
} else {
// Mixed success/failure or all success
message = formatSuccessMessage(
fileMessage.user,
uploadedFiles,
failedFiles,
sizeFailedFiles
);
}
const lines = message.split('\n');
const attachments = [];
let textBuffer = '';
for (const line of lines) {
if (line.match(/^<.*\|image>$/)) {
const imageUrl = line.replace(/^<|>$/g, '').replace('|image', '');
attachments.push({
image_url: imageUrl,
fallback: 'Error image'
});
} else {
textBuffer += line + '\n';
}
}
await client.chat.postMessage({
channel: channelId,
thread_ts: fileMessage.ts,
text: textBuffer.trim(),
attachments: attachments.length > 0 ? attachments : undefined
});
} catch (error) {
logger.error('Failed to send results message:', error);
throw error;
}
}
async function handleError(client, channelId, fileMessage, reactionAdded) {
if (fileMessage && reactionAdded) {
try {
await client.reactions.remove({
name: 'beachball',
timestamp: fileMessage.ts,
channel: channelId
});
} catch (cleanupError) {
if (cleanupError.data?.error !== 'no_reaction') {
logger.error('Cleanup error:', cleanupError);
}
}
try {
await client.reactions.add({
name: 'x',
timestamp: fileMessage.ts,
channel: channelId
});
} catch (cleanupError) {
logger.error('Cleanup error:', cleanupError);
}
}
}
async function handleFileUpload(event, client) {
let fileMessage = null;
let reactionAdded = false;
try {
if (isMessageTooOld(event.event_ts)) return;
fileMessage = await findFileMessage(event, client);
if (!fileMessage || isMessageProcessed(fileMessage.ts)) return;
markMessageAsProcessing(fileMessage.ts);
await addProcessingReaction(client, event, fileMessage);
reactionAdded = true;
const {uploadedFiles, failedFiles, sizeFailedFiles} = await processFiles(fileMessage, client);
const totalFiles = uploadedFiles.length + failedFiles.length + sizeFailedFiles.length;
const failedCount = failedFiles.length + sizeFailedFiles.length;
await sendResultsMessage(
client,
event.channel_id,
fileMessage,
uploadedFiles,
failedFiles,
sizeFailedFiles
);
await updateReactions(
client,
event,
fileMessage,
totalFiles,
failedCount
);
} catch (error) {
logger.error(`Upload failed: ${error.message}`);
await handleError(client, event.channel_id, fileMessage, reactionAdded);
throw error;
}
}
module.exports = { handleFileUpload, initialize };

310
src/storage.js Normal file
@@ -0,0 +1,310 @@
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const fetch = require('node-fetch'); // response.buffer() below is node-fetch-specific; the global fetch Response lacks it
const path = require('path');
const crypto = require('crypto');
const logger = require('./config/logger');
const {generateFileUrl} = require('./utils');
const MAX_FILE_SIZE = 2 * 1024 * 1024 * 1024; // 2GB in bytes
const CONCURRENT_UPLOADS = 3; // Max concurrent uploads (messages)
// processed messages
const processedMessages = new Map();
let uploadLimit;
async function initialize() {
const pLimit = (await import('p-limit')).default;
uploadLimit = pLimit(CONCURRENT_UPLOADS);
}
// Skip messages older than 24 hours (e.g. backlog delivered after the bot was offline)
function isMessageTooOld(eventTs) {
const eventTime = parseFloat(eventTs) * 1000;
const currentTime = Date.now();
const timeDifference = currentTime - eventTime;
const maxAge = 24 * 60 * 60 * 1000; // 24 hours in milliseconds
return timeDifference > maxAge;
}
// check if the message has already been processed
function isMessageProcessed(messageTs) {
return processedMessages.has(messageTs);
}
function markMessageAsProcessing(messageTs) {
processedMessages.set(messageTs, true);
}
// Processing reaction
async function addProcessingReaction(client, event, fileMessage) {
try {
await client.reactions.add({
name: 'beachball',
timestamp: fileMessage.ts,
channel: event.channel_id
});
} catch (error) {
logger.error('Failed to add processing reaction:', error.message);
}
}
// Sanitize file names and fall back to a timestamped name if the result is empty
function sanitizeFileName(fileName) {
let sanitizedFileName = fileName.replace(/[^a-zA-Z0-9.-]/g, '_');
if (!sanitizedFileName) {
sanitizedFileName = 'upload_' + Date.now();
}
return sanitizedFileName;
}
// Generate a unique file name
function generateUniqueFileName(fileName) {
const sanitizedFileName = sanitizeFileName(fileName);
const uniqueFileName = `${Date.now()}-${crypto.randomBytes(16).toString('hex')}-${sanitizedFileName}`;
return uniqueFileName;
}
// upload files to the /s/ directory
async function processFiles(fileMessage, client) {
const uploadedFiles = [];
const failedFiles = [];
logger.debug('Starting file processing', {
userId: fileMessage.user,
fileCount: fileMessage.files?.length || 0
});
const files = fileMessage.files || [];
for (const file of files) {
logger.debug('Processing file', {
name: file.name,
size: file.size,
type: file.mimetype,
id: file.id
});
if (file.size > MAX_FILE_SIZE) {
logger.warn('File exceeds size limit', {
name: file.name,
size: file.size,
limit: MAX_FILE_SIZE
});
failedFiles.push(file.name);
continue;
}
try {
logger.debug('Fetching file from Slack', {
name: file.name,
url: file.url_private
});
const response = await fetch(file.url_private, {
headers: {Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}`}
});
if (!response.ok) {
throw new Error(`Slack download failed: ${response.status} ${response.statusText}`);
}
const buffer = await response.buffer();
const contentType = file.mimetype || 'application/octet-stream';
const uniqueFileName = generateUniqueFileName(file.name);
const userDir = `s/${fileMessage.user}`;
const uploadResult = await uploadLimit(() =>
uploadToStorage(userDir, uniqueFileName, buffer, contentType)
);
// uploadToStorage returns a boolean, so check it directly
if (!uploadResult) {
throw new Error('Storage upload failed');
}
const url = generateFileUrl(userDir, uniqueFileName);
uploadedFiles.push({
name: uniqueFileName,
url,
contentType
});
} catch (error) {
logger.error('File processing failed', {
fileName: file.name,
error: error.message,
stack: error.stack,
slackFileId: file.id,
userId: fileMessage.user
});
failedFiles.push(file.name);
}
}
logger.debug('File processing complete', {
successful: uploadedFiles.length,
failed: failedFiles.length
});
return {uploadedFiles, failedFiles};
}
// update reactions based on success
async function updateReactions(client, event, fileMessage, success) {
try {
await client.reactions.remove({
name: 'beachball',
timestamp: fileMessage.ts,
channel: event.channel_id
});
await client.reactions.add({
name: success ? 'white_check_mark' : 'x',
timestamp: fileMessage.ts,
channel: event.channel_id
});
} catch (error) {
logger.error('Failed to update reactions:', error.message);
}
}
// find a file message
async function findFileMessage(event, client) {
try {
const fileInfo = await client.files.info({
file: event.file_id,
include_shares: true
});
if (!fileInfo.ok || !fileInfo.file) {
throw new Error('Could not get file info');
}
const channelShare = fileInfo.file.shares?.public?.[event.channel_id] ||
fileInfo.file.shares?.private?.[event.channel_id];
if (!channelShare || !channelShare.length) {
throw new Error('No share info found for this channel');
}
// Get the exact message using the ts from share info
const messageTs = channelShare[0].ts;
const messageInfo = await client.conversations.history({
channel: event.channel_id,
latest: messageTs,
limit: 1,
inclusive: true
});
if (!messageInfo.ok || !messageInfo.messages.length) {
throw new Error('Could not find original message');
}
return messageInfo.messages[0];
} catch (error) {
logger.error('Error finding file message:', error);
return null;
}
}
async function sendResultsMessage(client, channelId, fileMessage, uploadedFiles, failedFiles) {
let message = `Hey <@${fileMessage.user}>, `;
if (uploadedFiles.length > 0) {
message += `here ${uploadedFiles.length === 1 ? 'is your link' : 'are your links'}:\n`;
message += uploadedFiles.map(f => `${f.name}: ${f.url}`).join('\n');
}
if (failedFiles.length > 0) {
message += `\n\nFailed to process: ${failedFiles.join(', ')}`;
}
await client.chat.postMessage({
channel: channelId,
thread_ts: fileMessage.ts,
text: message
});
}
async function handleError(client, channelId, fileMessage, reactionAdded) {
if (fileMessage && reactionAdded) {
try {
await client.reactions.remove({
name: 'beachball',
timestamp: fileMessage.ts,
channel: channelId
});
} catch (cleanupError) {
if (cleanupError.data?.error !== 'no_reaction') {
logger.error('Cleanup error:', cleanupError);
}
}
try {
await client.reactions.add({
name: 'x',
timestamp: fileMessage.ts,
channel: channelId
});
} catch (cleanupError) {
logger.error('Cleanup error:', cleanupError);
}
}
}
async function handleFileUpload(event, client) {
let fileMessage = null;
let reactionAdded = false;
try {
if (isMessageTooOld(event.event_ts)) return;
fileMessage = await findFileMessage(event, client);
if (!fileMessage || isMessageProcessed(fileMessage.ts)) return;
markMessageAsProcessing(fileMessage.ts);
await addProcessingReaction(client, event, fileMessage);
reactionAdded = true;
const {uploadedFiles, failedFiles} = await processFiles(fileMessage, client);
await sendResultsMessage(client, event.channel_id, fileMessage, uploadedFiles, failedFiles);
await updateReactions(client, event, fileMessage, failedFiles.length === 0);
} catch (error) {
logger.error('Upload failed:', error.message);
await handleError(client, event.channel_id, fileMessage, reactionAdded);
throw error;
}
}
const s3Client = new S3Client({
region: process.env.AWS_REGION,
endpoint: process.env.AWS_ENDPOINT,
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
}
});
async function uploadToStorage(userDir, uniqueFileName, buffer, contentType = 'application/octet-stream') {
try {
const params = {
Bucket: process.env.AWS_BUCKET_NAME,
Key: `${userDir}/${uniqueFileName}`,
Body: buffer,
ContentType: contentType,
CacheControl: 'public, immutable, max-age=31536000'
};
logger.info(`Uploading: ${uniqueFileName}`);
await s3Client.send(new PutObjectCommand(params));
return true;
} catch (error) {
logger.error(`Upload failed: ${error.message}`, {
path: `${userDir}/${uniqueFileName}`,
error: error.message
});
return false;
}
}
module.exports = {
handleFileUpload,
initialize,
uploadToStorage
};
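Both this module and fileUpload.js gate uploads through `p-limit(CONCURRENT_UPLOADS)`. As a dependency-free illustration of the same idea (an assumption-level sketch, not the p-limit implementation), a small counting semaphore behaves equivalently:

```javascript
// Tiny concurrency limiter: at most `max` callbacks run at once, the rest queue.
function makeLimiter(max) {
  let active = 0;
  const queue = [];
  const next = () => {
    if (active >= max || queue.length === 0) return;
    active++;
    const { fn, resolve, reject } = queue.shift();
    Promise.resolve()
      .then(fn)
      .then(resolve, reject)
      .finally(() => { active--; next(); });
  };
  return fn => new Promise((resolve, reject) => {
    queue.push({ fn, resolve, reject });
    next();
  });
}

// Usage: push four fake "uploads" through a 2-wide limiter and record
// the highest number running at the same time.
const limit = makeLimiter(2);
let running = 0, peak = 0;
const job = () => {
  running++; peak = Math.max(peak, running);
  return new Promise(r => setTimeout(() => { running--; r(); }, 10));
};
Promise.all([job, job, job, job].map(j => limit(j)))
  .then(() => console.log('peak concurrency:', peak)); // peak concurrency: 2
```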

35
src/upload.js Normal file
@@ -0,0 +1,35 @@
const fs = require('fs');
const path = require('path');
const {uploadToStorage} = require('./storage');
const {generateFileUrl} = require('./utils');
const logger = require('./config/logger');
// Handle individual file upload
const handleUpload = async (file) => {
try {
const buffer = fs.readFileSync(file.path);
const fileName = path.basename(file.originalname);
// content type detection for S3
const contentType = file.mimetype || 'application/octet-stream';
const uniqueFileName = `${Date.now()}-${fileName}`;
// Upload to S3
logger.debug(`Uploading: ${uniqueFileName}`);
const uploaded = await uploadToStorage('s/v3', uniqueFileName, buffer, contentType);
if (!uploaded) throw new Error('Storage upload failed');
return {
name: fileName,
url: generateFileUrl('s/v3', uniqueFileName),
contentType
};
} catch (error) {
logger.error('Upload failed:', error);
throw error;
} finally {
// Clean up the temporary file; ignore errors if it's already gone
try { fs.unlinkSync(file.path); } catch {}
}
};
module.exports = {handleUpload};

8
src/utils.js Normal file
@@ -0,0 +1,8 @@
// Make the CDN URL
function generateFileUrl(userDir, uniqueFileName) {
const cdnUrl = process.env.AWS_CDN_URL;
return `${cdnUrl}/${userDir}/${uniqueFileName}`;
}
module.exports = {generateFileUrl};
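A quick check of the URL shape this helper produces, with `AWS_CDN_URL` stubbed to a value like the one in .env.example (the function body below mirrors the module):

```javascript
// Stub the env var for the example; in the bot it comes from .env.
process.env.AWS_CDN_URL = 'https://cdn.example.com';

function generateFileUrl(userDir, uniqueFileName) {
  const cdnUrl = process.env.AWS_CDN_URL;
  return `${cdnUrl}/${userDir}/${uniqueFileName}`;
}

console.log(generateFileUrl('s/U012AB3CD', '1700000000000-abc123-photo.png'));
// https://cdn.example.com/s/U012AB3CD/1700000000000-abc123-photo.png
```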

@@ -1,11 +0,0 @@
{
"version": 2,
"functions": {
"api/**/*.[jt]s": { "runtime": "vercel-deno@3.0.0" }
},
"redirects": [
{ "source": "/", "destination": "https://github.com/hackclub/cdn" },
{ "source": "/api/new", "destination": "/api/v1/new", "permanent": false },
{ "source": "/api/newSingle", "destination": "/api/v1/newSingle", "permanent": false }
]
}