Cloudflare Worker that watches RSS feeds and posts updates to Discord webhooks

Initial commit: RSS/Atom feed monitor posting to Discord

Cloudflare Worker that polls feeds on a 5-minute cron, detects new
entries using KV-stored timestamps, and posts them as Discord embeds.
Uses feedsmith for multi-format feed parsing (RSS, Atom, RDF, JSON Feed).

dietrich ayala d30bb6ee

7 files changed, +287

.gitignore  +3

.dev.vars
node_modules/
.wrangler/

README.md  +86

# bzzrt

A Cloudflare Worker that polls RSS and Atom feeds for new entries and posts them to Discord channels via webhooks.

## Features

- **Multi-format feed support** — Handles RSS, Atom, RDF, and JSON Feed via [feedsmith](https://github.com/macieklamberski/feedsmith)
- **Multi-feed, multi-channel** — Monitor any number of feeds and route each to one or more Discord channels
- **Deduplication** — Tracks the last-seen entry date per feed in Cloudflare KV to avoid reposting
- **Cron-driven** — Runs automatically every 5 minutes via Cloudflare's cron triggers
- **Manual trigger** — `GET /check` to trigger a feed check on demand
- **Discord embeds** — Posts entries as rich embeds with title, link, summary, timestamp, and author

## Architecture

```
feeds.json    — Feed URL → Discord channel mapping
worker.js     — Main worker: cron handler, fetch handler, feed checking, Discord posting
wrangler.toml — Cloudflare Worker configuration
.dev.vars     — Local-only secrets (WEBHOOKS)
```

**Flow:**

1. Every 5 minutes (or on `GET /check`), the worker iterates over feeds defined in `feeds.json`
2. For each feed, it fetches the URL, parses it with `parseFeed()`, and normalizes entries into a common shape
3. It compares entry dates against the last-posted timestamp stored in Cloudflare KV
4. New entries are posted as embeds to the configured Discord webhook(s)
5. The latest entry date is saved back to KV

## Setup

### Prerequisites

- Node.js
- A Cloudflare account with Workers enabled
- A Cloudflare KV namespace
- One or more Discord webhook URLs

### Configuration

**feeds.json** — Map feed URLs to Discord channel names:

```json
{
  "https://example.com/feed.xml": ["general"],
  "https://example.com/blog/atom.xml": ["blog", "general"]
}
```

**WEBHOOKS secret** — A JSON object mapping channel names to Discord webhook URLs. Set it as a Cloudflare secret:

```bash
npx wrangler secret put WEBHOOKS
# Paste: {"general":"https://discord.com/api/webhooks/...","blog":"https://discord.com/api/webhooks/..."}
```

For local development, put the same value in `.dev.vars`:

```
WEBHOOKS = {"general":"https://discord.com/api/webhooks/..."}
```

**wrangler.toml** — Update the KV namespace ID if creating a new one:

```bash
npx wrangler kv namespace create KV
# Update the id in wrangler.toml
```

## Development

```bash
npm install
npx wrangler dev --test-scheduled
```

The worker runs locally at `http://localhost:8787`. Visit `/check` to trigger a feed check, or `/__scheduled` to simulate a cron trigger.

## Deployment

```bash
npx wrangler deploy
```

Wrangler bundles the worker and deploys it to Cloudflare with the cron trigger active.

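With `npx wrangler dev --test-scheduled` running, a quick local smoke test might look like the following sketch, assuming the default dev port 8787 and the cron expression from `wrangler.toml`:

```bash
# Trigger a manual feed check via the worker's fetch handler
curl http://localhost:8787/check

# Simulate the scheduled (cron) handler exposed by --test-scheduled
curl "http://localhost:8787/__scheduled?cron=*/5+*+*+*+*"
```
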
feeds.json  +3

{
  "https://tangled.org/burrito.space/peek/blob/main/docs/feed.xml": ["peek"]
}

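The single channel name here, `peek`, has to match a key in the `WEBHOOKS` secret described in the README. A hypothetical way to set it (webhook URL elided):

```bash
npx wrangler secret put WEBHOOKS
# Paste: {"peek":"https://discord.com/api/webhooks/..."}
```
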
package-lock.json  +68

{
  "name": "bzzrt",
  "version": "1.0.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "bzzrt",
      "version": "1.0.0",
      "license": "MIT",
      "dependencies": {
        "feedsmith": "^2.9.0"
      }
    },
    "node_modules/entities": {
      "version": "7.0.1",
      "resolved": "https://registry.npmjs.org/entities/-/entities-7.0.1.tgz",
      "integrity": "sha512-TWrgLOFUQTH994YUyl1yT4uyavY5nNB5muff+RtWaqNVCAK408b5ZnnbNAUEWLTCpum9w6arT70i1XdQ4UeOPA==",
      "license": "BSD-2-Clause",
      "engines": {
        "node": ">=0.12"
      },
      "funding": {
        "url": "https://github.com/fb55/entities?sponsor=1"
      }
    },
    "node_modules/fast-xml-parser": {
      "version": "5.3.5",
      "resolved": "https://registry.npmjs.org/fast-xml-parser/-/fast-xml-parser-5.3.5.tgz",
      "integrity": "sha512-JeaA2Vm9ffQKp9VjvfzObuMCjUYAp5WDYhRYL5LrBPY/jUDlUtOvDfot0vKSkB9tuX885BDHjtw4fZadD95wnA==",
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/NaturalIntelligence"
        }
      ],
      "license": "MIT",
      "dependencies": {
        "strnum": "^2.1.2"
      },
      "bin": {
        "fxparser": "src/cli/cli.js"
      }
    },
    "node_modules/feedsmith": {
      "version": "2.9.0",
      "resolved": "https://registry.npmjs.org/feedsmith/-/feedsmith-2.9.0.tgz",
      "integrity": "sha512-TYucytOx4bTrD4ON0iuJG9y0Me7fiT0EZ+7MIE0xptvd8TL6nY0Z1jVPa9W39WMJUtPqV2r27TQxL/z5DCCmdA==",
      "license": "MIT",
      "dependencies": {
        "entities": "^7.0.0",
        "fast-xml-parser": "^5.3.3"
      }
    },
    "node_modules/strnum": {
      "version": "2.1.2",
      "resolved": "https://registry.npmjs.org/strnum/-/strnum-2.1.2.tgz",
      "integrity": "sha512-l63NF9y/cLROq/yqKXSLtcMeeyOfnSQlfMSlzFt/K73oIaD8DGaQWd7Z34X9GPiKqP5rbSh84Hl4bOlLcjiSrQ==",
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/NaturalIntelligence"
        }
      ],
      "license": "MIT"
    }
  }
}

package.json  +15

{
  "name": "bzzrt",
  "version": "1.0.0",
  "description": "Discord webhook which polls RSS feeds for changes and posts updates to channels",
  "license": "MIT",
  "author": "Dietrich Ayala <me@burrito.space> (https://metafluff.com/)",
  "type": "module",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "dependencies": {
    "feedsmith": "^2.9.0"
  }
}

worker.js  +97

import { parseFeed } from 'feedsmith';
import feeds from './feeds.json' with { type: "json" };

export default {
  async scheduled(event, env, ctx) {
    ctx.waitUntil(checkAllFeeds(env));
  },

  async fetch(request, env) {
    if (new URL(request.url).pathname === '/check') {
      await checkAllFeeds(env);
      return new Response('Checked all feeds');
    }
    return new Response('Multi-feed monitor. GET /check to trigger manually.');
  }
};

async function checkAllFeeds(env) {
  const webhooks = JSON.parse(env.WEBHOOKS);
  for (const [feedUrl, channels] of Object.entries(feeds)) {
    try {
      await checkFeed(env, feedUrl, channels, webhooks);
    } catch (e) {
      console.error(`Error checking ${feedUrl}:`, e.message);
    }
  }
}

async function checkFeed(env, feedUrl, channels, webhooks) {
  const res = await fetch(feedUrl);
  if (!res.ok) return console.error(`Feed fetch failed: ${res.status}`);

  const text = await res.text();
  const entries = parseEntries(text);
  if (!entries.length) return;

  const kvKey = `last:${new URL(feedUrl).pathname}`;
  const lastPosted = await env.KV.get(kvKey);
  const lastDate = lastPosted ? new Date(lastPosted) : new Date(0);

  const newEntries = entries
    .filter(e => e.date && new Date(e.date) > lastDate)
    .sort((a, b) => new Date(a.date) - new Date(b.date));

  if (!newEntries.length) return;

  for (const entry of newEntries) {
    for (const channel of channels) {
      const webhook = webhooks[channel];
      if (webhook) await postToDiscord(webhook, entry, feedUrl);
    }
  }

  const latestDate = newEntries[newEntries.length - 1].date;
  await env.KV.put(kvKey, latestDate);
}

function parseEntries(text) {
  const { format, feed } = parseFeed(text);

  if (format === 'atom') {
    return (feed.entries || []).map(e => ({
      title: e.title,
      link: e.links?.find(l => !l.rel || l.rel === 'alternate')?.href || e.links?.[0]?.href,
      summary: e.summary || e.content,
      date: e.updated || e.published,
      author: e.authors?.[0]?.name,
    }));
  }

  // RSS, RDF, JSON
  return (feed.items || []).map(e => ({
    title: e.title,
    link: e.link,
    summary: e.description,
    date: e.pubDate,
    author: e.authors?.[0],
  }));
}

async function postToDiscord(webhook, entry, feedUrl) {
  const feedName = new URL(feedUrl).pathname.split('/')[1] || 'Feed';
  await fetch(webhook, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      embeds: [{
        title: entry.title || 'New activity',
        url: entry.link || feedUrl,
        description: entry.summary?.slice(0, 300) || '',
        color: 0x5865F2,
        timestamp: entry.date || new Date().toISOString(),
        footer: { text: entry.author ? `${feedName} • ${entry.author}` : feedName }
      }]
    })
  });
}

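To make the deduplication state concrete, here is a small sketch (plain JavaScript, with a hypothetical date value) of the KV key that `checkFeed()` derives for the feed configured in `feeds.json`, and the kind of value stored under it:

```js
// Mirror the key derivation in checkFeed(): "last:" + the feed URL's pathname
const feedUrl = 'https://tangled.org/burrito.space/peek/blob/main/docs/feed.xml';
const kvKey = `last:${new URL(feedUrl).pathname}`;
console.log(kvKey); // -> "last:/burrito.space/peek/blob/main/docs/feed.xml"

// After a run that posts new entries, env.KV.put(kvKey, latestDate) stores the newest
// entry's date string, e.g. "2026-01-01T00:00:00Z" (hypothetical value).
```
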
wrangler.toml  +15

name = "bzzrt"
main = "worker.js"
compatibility_date = "2024-01-01"

[[kv_namespaces]]
binding = "KV"
id = "2fa0fe539e8c4f1c9848206e8f6d4003"

[triggers]
crons = ["*/5 * * * *"]

# Optional: add rules to include the JSON config
# [[rules]]
# type = "Data"
# globs = ["feeds.json"]

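After deploying, one way to confirm the cron trigger is firing is to stream logs with Wrangler's standard tail command (a sketch, not something this commit configures):

```bash
npx wrangler deploy
# Watch live invocations; any console.error output from checkAllFeeds also appears here
npx wrangler tail bzzrt
```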