Cloudflare Developer Platform vs Langfuse
Workers, Pages, D1, R2, KV — edge-native serverless
vs. Open-source LLM engineering platform — observability, prompts, evals
Pricing tiers
Cloudflare Developer Platform
Free
Workers: 100k req/day, 10ms CPU/req. D1: 5M reads/day. KV: 100k reads/day. R2: 10 GB/mo. Durable Objects: 100k req/day.
Free
Workers Paid
$5/month minimum. 10M Workers req/mo + overages. Generous D1, KV, R2 allowances. All services unlocked.
$5/mo
Enterprise
Custom. Higher SLAs, dedicated support, account managers.
Custom
Langfuse
Hobby (Cloud Free)
Free. 50k units/month included. 30 days data access. 2 users. Community support.
Free
Self-Hosted (OSS)
MIT-licensed. Deploy via Docker Compose or Kubernetes. No usage limits; you operate the infrastructure.
$0 base (usage-based)
Core
$29/month. 100k units included ($8 per 100k overage). 90 days retention. Unlimited users. In-app support.
$29/mo
Pro
$199/month. 100k units included + same overage. 3 years retention. Unlimited annotation queues. High rate limits.
$199/mo
Teams Add-on
+$300/month. Adds Enterprise SSO + fine-grained RBAC + dedicated Slack support to Pro.
$300/mo
Enterprise
$2,499/month. Everything + custom rate limits, uptime SLA, dedicated support engineer. Yearly options.
$2,499/mo
Free-tier quotas head-to-head
Comparing the Cloudflare Developer Platform Free tier against the Langfuse Hobby tier.
| Metric | Cloudflare Developer Platform | Langfuse |
|---|---|---|
| D1 reads | 5M rows/day | — |
| D1 storage | 5 GB | — |
| D1 writes | 100k rows/day | — |
| Durable Objects compute | 13,000 GB-sec/day | — |
| Durable Objects requests | 100k req/day | — |
| KV reads | 100k reads/day | — |
| KV storage | 1 GB | — |
| KV writes | 1k writes/day | — |
| R2 Class A ops | 1M ops/month | — |
| R2 Class B ops | 10M ops/month | — |
| R2 egress | Free (unlimited) | — |
| R2 storage | 10 GB/month | — |
| Workers CPU time | 10 ms/invocation | — |
| Workers Logs events | 200k events/day | — |
| Workers requests | 100k req/day | — |
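The quota table turns into a quick capacity check. A minimal sketch using the Workers free-tier request quota from the table above (quotas change; reconfirm the current limit on Cloudflare's pricing page before planning around it):

```typescript
// Compare a projected steady request rate against the Workers free-tier
// daily request quota (100k req/day, per the table above). This is a
// back-of-envelope check, not a billing calculator.
const WORKERS_FREE_REQ_PER_DAY = 100_000;

function fitsWorkersFreeTier(reqPerSecond: number): boolean {
  const reqPerDay = reqPerSecond * 60 * 60 * 24;
  return reqPerDay <= WORKERS_FREE_REQ_PER_DAY;
}
```

At a steady 1 req/s (86,400/day) a workload still fits the free tier; at 2 req/s it exceeds the quota and needs the $5/mo Workers Paid plan.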
Features
Cloudflare Developer Platform · 18 features
- AI Gateway — Proxy for OpenAI/Anthropic/Gemini with caching, analytics, retries, and rate limiting.
- D1 — Managed serverless SQLite via sqlite-wasm. Read replicas via replicas keyword.
- Durable Objects — Strongly-consistent objects with storage, pinned to a region. Good for sync/state coordination.
- Email Routing — Catchall email routing + Email Workers for programmatic handling.
- Hyperdrive — Connection pooler + cache for Postgres. Makes your DB edge-fast.
- Images — Store, resize, transform images. Polish on-the-fly.
- KV — Eventually-consistent key-value at every POP. Good for config, caching.
- Pages — Static + SSR framework hosting (Next.js via OpenNext, Remix, Nuxt, SvelteKit).
- Pages Functions — Workers integrated into Pages for backend logic.
- Queues — Persistent message queues with batched consumers and at-least-once delivery.
- R2 — S3-compatible object storage with **zero egress fees**.
- Stream — Video upload, encoding, adaptive streaming.
- Vectorize — Vector DB at the edge. For RAG + semantic search.
- Workers — V8-isolate serverless at 300+ POPs. Sub-ms cold starts. JS/TS/Rust/Python (beta).
- Workers AI — Run LLMs (Llama, Mistral) plus image, speech, and embedding models at the edge via @cf/... model identifiers.
- Workers Logs — Structured logs with 3-day retention on paid.
- Workflows — Durable step-functions for long-running tasks.
- Zaraz — Third-party script management at edge (analytics, marketing tags).
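Several of these pieces compose inside a single Worker. As a rough sketch (the `CONFIG` binding name is hypothetical; real bindings are declared in `wrangler.toml`), a fetch handler that uses KV as a read-through cache:

```typescript
// Sketch of a Worker using a KV binding as a read-through cache.
// The CONFIG binding name is hypothetical; in a real project it is
// declared under [[kv_namespaces]] in wrangler.toml. The interface
// is narrowed to the two KV methods used, so any stub satisfies it.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

interface Env {
  CONFIG: KVLike;
}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const key = new URL(request.url).pathname.slice(1) || "default";
    const cached = await env.CONFIG.get(key);
    if (cached !== null) {
      // KV is eventually consistent: a recent put may not yet be
      // visible at every POP, which is fine for config/cache data.
      return new Response(cached, { headers: { "x-cache": "hit" } });
    }
    const value = `generated:${key}`; // stand-in for real work
    await env.CONFIG.put(key, value, { expirationTtl: 60 });
    return new Response(value, { headers: { "x-cache": "miss" } });
  },
};

export default worker;
```

Because the handler only depends on the narrow `KVLike` interface, it can be unit-tested against an in-memory stub before deploying with Wrangler.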
Langfuse · 16 features
- Annotation Queues — Human reviewers rate traces. Unlimited on Pro+.
- Dashboards — Aggregate metrics, cost, quality across projects.
- Datasets — Curate test sets from production traces. Run experiments.
- EU Cloud Region — GDPR-compliant hosting in EU.
- Evaluations — LLM-as-judge, manual scores, custom model-graded evaluators.
- LLM Cost Tracking — Automatic cost calculation per provider/model.
- OpenTelemetry Native — OTel SDK → Langfuse endpoint works out of box.
- Playground — Test prompts + models + variables live.
- Prompt Management — Version, tag, label prompts. Reference from code by label.
- Public API — Full REST API for ingest, query, prompt management.
- Python @observe decorator — One-line decorator to trace any function.
- Self-Hosting — Docker Compose + k8s Helm chart.
- Sessions — Group related traces (conversations, agent runs).
- Tracing — Capture every LLM call, tool call, nested span with inputs/outputs/cost.
- Users Tracking — Segment traces by user ID, track per-user cost.
- Webhooks — Subscribe to trace completion events.
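Under the hood, the SDKs and the OTel path converge on Langfuse's public ingestion endpoint, authenticated with Basic auth (public key as username, secret key as password). A rough sketch of the request an SDK assembles — the event shape here is an assumption drawn from the public API docs and may lag the current schema:

```typescript
// Assemble (but do not send) a Langfuse ingestion request. The
// batch/event structure is an assumption based on the public API
// (POST /api/public/ingestion); check the current API reference.
interface TraceEvent {
  id: string;        // unique event id, doubles as an idempotency key
  timestamp: string; // ISO-8601
  type: "trace-create";
  body: { id: string; name: string; input?: unknown; output?: unknown };
}

function buildIngestionRequest(
  host: string,
  publicKey: string,
  secretKey: string,
  events: TraceEvent[],
): { url: string; headers: Record<string, string>; payload: string } {
  const auth = Buffer.from(`${publicKey}:${secretKey}`).toString("base64");
  return {
    url: `${host}/api/public/ingestion`,
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    payload: JSON.stringify({ batch: events }),
  };
}
```

In practice the langfuse-js and langfuse-python SDKs handle batching, retries, and flushing for you; this only illustrates what travels over the wire.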
Developer interfaces
| Kind | Cloudflare Developer Platform | Langfuse |
|---|---|---|
| CLI | Wrangler CLI | — |
| SDK | — | langfuse-js, langfuse-python |
| REST | Cloudflare REST API, D1 HTTP (via Worker), R2 S3-compatible API | Langfuse REST API |
| MCP | Cloudflare MCP | Langfuse MCP Server |
| OTHER | Workers Runtime (V8) | Langfuse Dashboard, OpenTelemetry endpoint |
Staxly is an independent catalog of developer platforms. Outbound links to Cloudflare Developer Platform and Langfuse are plain references to their official websites. Pricing is verified against vendor pages at publication time — reconfirm before buying.
Want this comparison in your AI agent's context? Install the free Staxly MCP server.