# HN Digest
A cron agent that fetches the top Hacker News stories three times a day, filters for AI and LLM relevance, reads the most interesting articles, and posts a formatted digest to Slack.
This cookbook assumes you've already installed the CLI and set up an API key.
## 1. Scaffold the Agent

```bash
swarmlord init hn-digest \
  -s "0 13,18,23 * * *" \
  -p "Fetch today's top Hacker News stories using the fetch_hn tool. Focus on AI, AI agents, LLMs, and AI use cases. Summarize the most notable stories into a concise, well-formatted Slack digest." \
  -m "anthropic/claude-haiku-4.5"
```

## 2. Configure Schedule, Output, and Tools
Replace `swarmlord.jsonc` — this agent needs `webfetch` (to read article content) and a custom tool, but nothing else:

```jsonc
{
  "$schema": "https://swarmlord.ai/config.json",
  "name": "hn-digest",
  "description": "Scheduled Hacker News digest — fetches top stories, summarizes AI and agent news, posts to Slack",
  "model": "anthropic/claude-haiku-4.5",
  "schedule": {
    "cron": "0 13,18,23 * * *",
    "prompt": "Fetch today's top Hacker News stories using the fetch_hn tool. Focus on AI, AI agents, LLMs, and AI use cases. Summarize the most notable stories into a concise, well-formatted Slack digest.",
  },
  "outputs": [
    {
      "provider": "slack",
      "action": "message",
      "config": {
        "token": "{{secrets.SLACK_BOT_TOKEN}}",
        "channel": "hn-ai-digest",
      },
      "template": "{{result}}",
    },
  ],
  "tools": {
    "bash": false,
    "read": false,
    "write": false,
    "skill": false,
    "websearch": false,
    "webfetch": true,
    "browser": false,
    "todowrite": false,
    "todoread": false,
  },
  "permission": { "*": "allow" },
}
```

Key choices:
- `schedule.cron` — fires at 1 PM, 6 PM, and 11 PM UTC daily. Cron expressions are always UTC; adjust the hours for your team's timezone.
- `outputs` — posts the agent's final result to the `#hn-ai-digest` Slack channel. The `{{secrets.SLACK_BOT_TOKEN}}` reference is resolved at runtime from encrypted secrets. `{{result}}` is the agent's final output.
- Locked-down tools — only `webfetch` is enabled (for reading full articles). The custom `fetch_hn` tool is always available. No bash, file I/O, or browser needed.
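Since cron hours are always UTC, it can help to compute them from the local hours you actually want. A small sketch — the `localHoursToUtcCron` helper is illustrative, not part of swarmlord:

```typescript
// Map desired local run hours to the UTC hours a cron expression needs.
// utcOffsetHours is the timezone's offset from UTC (e.g. -5 for US Eastern in winter).
function localHoursToUtcCron(localHours: number[], utcOffsetHours: number): string {
  const utcHours = localHours
    .map(h => ((h - utcOffsetHours) % 24 + 24) % 24) // normalize into 0–23
    .sort((a, b) => a - b);
  return `0 ${utcHours.join(",")} * * *`;
}

// 8 AM, 1 PM, and 6 PM Eastern become the 13,18,23 UTC hours used above.
console.log(localHoursToUtcCron([8, 13, 18], -5)); // "0 13,18,23 * * *"
```

Note this ignores daylight-saving shifts; a fixed UTC cron will drift an hour against local wall-clock time when DST changes.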
## 3. Write the Instructions

Edit `SOUL.md`:

```markdown
# HN Digest Agent
You deliver curated Hacker News digests focused on AI, agents, LLMs, and AI
use cases.
## Workflow
1. Call `fetch_hn` with count 50 to get the current top stories
2. Filter for AI/ML/agent/LLM relevance based on title and domain
3. For the top 5–8 relevant stories, use `webfetch` to read the article content
4. Compose a Slack digest in mrkdwn format
## Digest Format
- Linked title with score and comment count
- One-line summary of each story
- "Also Trending" section with 3–5 notable non-AI stories (title + link only)
- Keep total output under 3000 characters for clean Slack rendering
## Guidelines
- On slow news days, keep the digest short — don't pad with irrelevant content
- Use Slack mrkdwn: `*bold*`, `<url|title>` for links, `>` for blockquotes
```

## 4. Add the Custom Tool
Create `tools/fetch_hn.ts`:

```typescript
import { tool } from "swarmlord/tool";
import { z } from "zod";

interface HNItem {
  id: number;
  title: string;
  url?: string;
  score: number;
  by: string;
  time: number;
  descendants?: number;
  type: string;
}

export default tool({
  description:
    "Fetch the current top stories from Hacker News. Returns titles, URLs, scores, " +
    "and comment counts for the top N stories (default 30). Use this to get a snapshot " +
    "of what's trending on HN right now.",
  input: z.object({
    count: z.number().min(1).max(100).optional().describe("Number of top stories to fetch (default 30)"),
  }),
  timeout: 30_000,
  retries: 1,
  handler: async ({ count = 30 }, ctx) => {
    const idsRes = await fetch("https://hacker-news.firebaseio.com/v0/topstories.json");
    if (!idsRes.ok) {
      throw new Error(`Failed to fetch top stories: HTTP ${idsRes.status}`);
    }
    const allIds: number[] = await idsRes.json();
    const ids = allIds.slice(0, count);

    // Hydrate story items in batches so we don't fire all requests at once.
    const batchSize = 10;
    const items: HNItem[] = [];
    for (let i = 0; i < ids.length; i += batchSize) {
      const batch = ids.slice(i, i + batchSize);
      const results = await Promise.all(
        batch.map(async id => {
          const res = await fetch(`https://hacker-news.firebaseio.com/v0/item/${id}.json`);
          if (!res.ok) {
            ctx.log(`Failed to fetch item ${id}: HTTP ${res.status}`);
            return null; // skip items that fail instead of aborting the run
          }
          return (await res.json()) as HNItem;
        })
      );
      for (const item of results) {
        if (item) items.push(item);
      }
    }

    // Keep only stories (drop jobs/polls); text posts fall back to their HN discussion link.
    const stories = items
      .filter(item => item.type === "story")
      .map(item => ({
        title: item.title,
        url: item.url ?? `https://news.ycombinator.com/item?id=${item.id}`,
        score: item.score,
        comments: item.descendants ?? 0,
        by: item.by,
        hn_link: `https://news.ycombinator.com/item?id=${item.id}`,
      }));

    return JSON.stringify({ count: stories.length, stories });
  },
});
```

The tool fetches story IDs from the HN API, then hydrates them in batches of 10 to avoid hammering the API. The `timeout` and `retries` fields handle transient network failures.
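The filtering and formatting steps from SOUL.md happen inside the model at run time, but a deterministic sketch shows their shape. The keyword pattern and helper names below are illustrative assumptions, not part of the agent:

```typescript
interface Story {
  title: string;
  url: string;
  score: number;
  comments: number;
}

// Rough relevance screen — the agent does this with model judgment;
// a word-boundary regex only approximates it.
const AI_PATTERN = /\b(ai|llms?|agents?|gpt|claude|machine learning)\b/i;

function isAiRelevant(title: string): boolean {
  return AI_PATTERN.test(title);
}

// One digest line in Slack mrkdwn: linked title, then score and comment count.
function mrkdwnLine(s: Story): string {
  return `<${s.url}|${s.title}> (${s.score} points, ${s.comments} comments)`;
}

const sample: Story[] = [
  { title: "Show HN: An LLM agent that reviews PRs", url: "https://example.com/a", score: 120, comments: 45 },
  { title: "Rust 1.80 released", url: "https://example.com/b", score: 300, comments: 210 },
];

const digest = sample.filter(s => isAiRelevant(s.title)).map(mrkdwnLine);
console.log(digest.join("\n"));
```

Only the first sample story survives the filter; the second would land in the "Also Trending" section instead.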
## 5. Add Dependencies

Create `tools/package.json`:

```json
{
  "name": "hn-digest-agent",
  "private": true,
  "type": "module",
  "devDependencies": {
    "swarmlord": "latest",
    "zod": "^4.3.6"
  }
}
```

The CLI installs these automatically at deploy time. `swarmlord` provides the `tool()` helper and `zod` provides input validation.
## 6. Set Up Secrets

The Slack output references `{{secrets.SLACK_BOT_TOKEN}}`. Store it with:

```bash
swarmlord secret put SLACK_BOT_TOKEN
```

You'll be prompted to paste the token. It's encrypted at rest and injected at runtime — it never appears in your config or logs.
## 7. Final Project Structure

```
hn-digest/
├── swarmlord.jsonc
├── SOUL.md
└── tools/
    ├── package.json
    └── fetch_hn.ts
```

## 8. Test It
Run the agent manually to verify the tool and output format:
```bash
cd hn-digest
swarmlord run "Fetch today's top Hacker News stories using the fetch_hn tool. Focus on AI, AI agents, LLMs, and AI use cases. Summarize the most notable stories into a concise, well-formatted Slack digest."
```

The agent will call `fetch_hn`, read a few articles with `webfetch`, and produce a Slack-formatted digest. Verify the output looks right before deploying.
To test the scheduled trigger path without waiting for cron:
```bash
swarmlord trigger hn-digest
```

## 9. Deploy
```bash
swarmlord deploy
```

```
✓ Bundled hn-digest (1 custom tool)
✓ hn-digest deployed
  Schedule: 0 13,18,23 * * * (next: today 13:00 UTC)
```

The agent now runs three times daily. Each run fetches fresh stories, filters for AI relevance, and posts to Slack.