We wanted a simple workflow: grab the latest AI updates every day and push a clean 10-item briefing to Telegram.
The catch: many “search the web” integrations require paid keys (for example, Brave Search). So we designed a version that works without any paid search API. It’s not “search everything on the internet,” but it is stable, free, and surprisingly effective—especially if your focus is a handful of sources like Google, OpenAI, and Anthropic, plus GitHub’s daily trending repos.
What “news grabbing” means in OpenClaw
OpenClaw doesn’t need a special “news” tool. The workflow is:
- Fetch trusted sources (RSS or official newsroom pages).
- Extract titles + links + timestamps.
- Summarize into a short briefing.
- (Optional) Schedule it to send automatically via cron.
In practice, the most important design choice is the source layer.
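The four steps above can be sketched as one small orchestrator. The function names (fetch, extract, summarize, deliver) are illustrative stand-ins, not OpenClaw APIs — each step is pluggable, which is exactly what makes the source layer the important choice.

```python
# A minimal sketch of the fetch → extract → summarize → deliver workflow.
# All step names are hypothetical placeholders, not OpenClaw functions.
def run_briefing(sources, fetch, extract, summarize, deliver, top_n=10):
    items = []
    for url in sources:
        items.extend(extract(fetch(url)))  # fetch + extract per source
    briefing = summarize(items[:top_n])    # cap at a 10-item briefing
    deliver(briefing)                      # e.g., post to Telegram
    return briefing
```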
Source strategy: use official feeds first
Search engines are great for breadth, but official feeds are great for signal.
1) OpenAI: Official RSS (most reliable)
OpenAI provides an RSS feed that’s easy to parse:
https://openai.com/news/rss.xml
Why we like it:
- It’s machine-readable.
- It contains title, description, link, and pubDate.
- It changes slowly (rarely breaks).
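Because the feed is standard RSS 2.0, the standard library is enough to parse it — here is a sketch using xml.etree, with the network fetch separated from the (testable) parsing step:

```python
# Parse an RSS 2.0 feed (e.g., https://openai.com/news/rss.xml) with the
# standard library. Each <item> carries title, link, and pubDate children.
import xml.etree.ElementTree as ET
from urllib.request import urlopen, Request

def parse_rss(xml_text: str, limit: int = 10) -> list[dict]:
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": (item.findtext("title") or "").strip(),
            "link": (item.findtext("link") or "").strip(),
            "pubDate": (item.findtext("pubDate") or "").strip(),
        })
    return items[:limit]

def fetch_openai_news(limit: int = 10) -> list[dict]:
    # A User-Agent header avoids some servers rejecting bare urllib requests.
    req = Request("https://openai.com/news/rss.xml",
                  headers={"User-Agent": "news-briefing/0.1"})
    with urlopen(req, timeout=15) as resp:
        return parse_rss(resp.read().decode("utf-8"), limit)
```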
2) Anthropic: Newsroom listing
Anthropic’s newsroom works well as a lightweight “latest updates” source:
https://www.anthropic.com/news
It’s not RSS-first, but it is official and typically clean enough for HTML extraction.
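For a non-RSS newsroom page, a lightweight approach is to collect anchor tags whose href looks like an article path. The "/news/" prefix below is an assumption about Anthropic's URL layout, not a documented contract — verify it against the live page before relying on it:

```python
# Generic sketch: extract article links from a newsroom listing page.
# The path prefix ("/news/") is an assumed convention, not a stable API.
from html.parser import HTMLParser

class NewsLinkParser(HTMLParser):
    def __init__(self, path_prefix: str):
        super().__init__()
        self.path_prefix = path_prefix
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        # Keep each matching article link once, in page order.
        if href.startswith(self.path_prefix) and href not in self.links:
            self.links.append(href)

def extract_news_links(html: str, path_prefix: str = "/news/") -> list[str]:
    parser = NewsLinkParser(path_prefix)
    parser.feed(html)
    return parser.links
```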
3) Google: two angles (research + dev ecosystem)
For Google we used two high-signal entry points:
- DeepMind blog/news list: https://deepmind.google/blog/
- Google Developers Blog search results for Gemini: https://developers.googleblog.com/en/search/?query=Gemini
This covers both:
- model / research announcements (DeepMind)
- developer-facing updates and tooling (DevBlog)
4) GitHub: Daily Trending (to catch what builders are shipping)
GitHub Trending is an underrated “AI news” source because it captures what developers actually adopt:
https://github.com/trending?since=daily
From this page we can extract:
- repo name (owner/name)
- short description
- stars today
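Extracting those three fields means scraping HTML, and GitHub's trending markup is not a stable API. The pattern below (repo links inside h2 headings) matches the page structure at the time of writing, but treat it as an assumption that may need updating:

```python
# Sketch: pull owner/name pairs out of github.com/trending?since=daily HTML.
# The <h2>…<a href="/owner/name"> pattern is an assumption about GitHub's
# current markup, which can change without notice.
import re

REPO_RE = re.compile(r'<h2[^>]*>\s*<a[^>]*href="/([\w.-]+/[\w.-]+)"', re.S)

def extract_trending_repos(html: str, limit: int = 25) -> list[str]:
    return REPO_RE.findall(html)[:limit]
```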
Why fetch failed happens (and why it’s usually not the model)
When we started integrating multiple providers, we hit fetch failed. The key lesson:
In Codex/OpenAI integration, fetch failed is most often a network path issue, not “the model is down.”
The top three causes we observed:
- Model switching window: during rapid switching (Gemini ⇄ OpenAI), there can be a short failure window.
- Proxy link jitter: if your proxy bridge is unstable (for us it was host.orb.internal:7897), requests can intermittently time out.
- Upstream transient timeout / rate limiting: a retry after a few seconds often succeeds.
This matters because it changes the best response from “reconfigure everything” to “retry + quick network self-check.”
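The "retry" half of that response can be a generic helper: attempt the call a few times with exponential backoff before giving up. The delays below (2s, 4s, ...) are a reasonable default, not a tuned value:

```python
# "Retry + backoff" for transient fetch failed errors: try fn() up to
# `attempts` times, sleeping base_delay * 2**i between failures.
import time
from urllib.request import urlopen, Request

def retry(fn, attempts: int = 3, base_delay: float = 2.0):
    last_err = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as err:
            last_err = err
            if i < attempts - 1:
                time.sleep(base_delay * (2 ** i))  # 2s, 4s, ...
    raise last_err

def fetch_with_retry(url: str) -> bytes:
    req = Request(url, headers={"User-Agent": "news-briefing/0.1"})
    return retry(lambda: urlopen(req, timeout=15).read())
```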
Turning it into an automated Telegram push (cron)
Once you can generate a briefing on-demand, automation is just scheduling.
Conceptually:
- A cron job runs daily (e.g., 09:00 Asia/Shanghai).
- It triggers an isolated agent run.
- The agent fetches sources, summarizes, and posts to Telegram.
We recommend isolated sessions for scheduled runs because they are predictable and don’t depend on whatever we were chatting about earlier.
Example cron shape (conceptual)
- schedule: 0 9 * * *
- payload: an agentTurn prompt like:
  - “Fetch OpenAI RSS, Anthropic Newsroom, DeepMind blog list, DevBlog Gemini search, GitHub daily trending; output top 10 items with links and 1–2 line summaries.”
- delivery: announce to Telegram
(We purposely keep API keys and tokens out of the post—your OpenClaw config should hold secrets, not your blog.)
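The delivery step itself is a single call to the Telegram Bot API's sendMessage method. BOT_TOKEN and CHAT_ID below are placeholders — load the real values from your OpenClaw config or environment, exactly as the note above says:

```python
# Format the 10-item briefing and post it via the Telegram Bot API
# (sendMessage). Token and chat id are placeholders, never hardcoded.
import json
from urllib.request import urlopen, Request

def format_briefing(items: list[dict]) -> str:
    lines = []
    for i, item in enumerate(items[:10], start=1):
        lines.append(f"{i}. {item['title']}\n{item['link']}")
    return "\n\n".join(lines)

def send_to_telegram(text: str, bot_token: str, chat_id: str) -> None:
    url = f"https://api.telegram.org/bot{bot_token}/sendMessage"
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode("utf-8")
    req = Request(url, data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req, timeout=15) as resp:
        resp.read()
```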
A small improvement: add more free RSS sources later
If you want more breadth without paid search, you can add more RSS feeds over time:
- arXiv (topic searches)
- Hacker News
- vendor RSS feeds (e.g., company blogs)
The model can deduplicate and rank items after ingestion.
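Even before the model sees the items, a cheap mechanical dedup pass helps: normalize titles and keep the first occurrence. Fuzzier merging and ranking can then stay with the model:

```python
# Sketch of post-ingestion dedup: case/whitespace-normalized title as key,
# first occurrence wins. Cross-source near-duplicates are left to the model.
def dedupe(items: list[dict]) -> list[dict]:
    seen = set()
    out = []
    for item in items:
        key = item["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            out.append(item)
    return out
```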
Takeaway
If your goal is “daily AI news,” you don’t need an expensive search API on day one.
We can get a solid daily briefing by:
- prioritizing official RSS/newsroom sources (high signal)
- adding GitHub Trending (what developers are building)
- optionally scheduling everything with cron to deliver a Telegram push
It’s a simple pipeline—but once it’s running, it feels like having a tiny newsroom on standby.