Quick Answer
Use Next.js 15 + TanStack Table + Recharts, backed by ClickHouse or PostgreSQL + TimescaleDB for fast time-series queries. Add an AI layer that translates natural-language questions into SQL. Self-host on Coolify for near-zero cost compared with paid analytics.
- Time to MVP: 1-2 weeks
- Cost: $10-50/mo (vs $200-2000 for Mixpanel/Amplitude)
- Stack: Next.js, ClickHouse/Timescale, Recharts
What You'll Need
- Next.js 15, TypeScript
- ClickHouse or PostgreSQL + TimescaleDB
- Recharts or Tremor for charts
- AI API for NL-to-SQL
- Event source (PostHog, your own tracker)
Steps
- Set up the event store. Run ClickHouse in Docker: docker run -d -p 8123:8123 clickhouse/clickhouse-server. Create an events table with event_name, user_id, properties (JSON), and timestamp columns.
- Ingest events. A simple endpoint: POST /api/track → insert into ClickHouse. Client-side: a tiny JS snippet on your site.
- Pre-aggregate common queries. Materialized views for DAU/WAU/MAU, funnel steps, top events. ClickHouse handles millions of rows/sec.
- Build dashboard layout. Ask AI: "Generate a Tremor dashboard with 4 stat cards (DAU, MAU, events today, conversion), a line chart for 30-day trend, a funnel for signup → first-action."
- Time-series chart. Recharts <LineChart> fed from a query like SELECT toDate(timestamp) AS day, COUNT(*) FROM events WHERE event_name = 'signup' GROUP BY day.
- Filters & segments. Date range, user cohort, event type, property filter. URL-sync state so dashboards are shareable.
- Add NL-to-SQL. User types "How many signups last week from India?" → AI generates SQL using schema context → execute → return result + chart. Always validate AI SQL (no writes, table allowlist).
- AI auto-insights. Cron job: analyze yesterday's data, detect anomalies (>2σ from rolling mean), summarize in plain English, surface on dashboard.
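The ingest step above can be sketched as a pure helper: normalize a tracked event into one JSONEachRow line, the line-delimited JSON format ClickHouse's HTTP interface accepts via INSERT INTO events FORMAT JSONEachRow. Field names match the table from step 1; the TrackEvent shape is an assumption.

```typescript
// Assumed event shape for POST /api/track — adapt to your tracker.
interface TrackEvent {
  event_name: string;
  user_id: string;
  properties?: Record<string, unknown>;
  timestamp?: string; // ISO 8601; defaults to "now" if the client omits it
}

// One event → one JSONEachRow line for ClickHouse's HTTP insert.
function toJsonEachRow(e: TrackEvent): string {
  return JSON.stringify({
    event_name: e.event_name,
    user_id: e.user_id,
    // Store properties as a JSON string column; parse lazily at query time.
    properties: JSON.stringify(e.properties ?? {}),
    timestamp: e.timestamp ?? new Date().toISOString(),
  });
}
```

Batching several lines separated by newlines into one HTTP request keeps insert overhead low.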
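The auto-insight anomaly check (>2σ from a rolling mean) is a few lines of math. A minimal sketch — pure function, so it runs the same whether the daily counts come from ClickHouse or a test fixture:

```typescript
// Flag `latest` as anomalous when it sits more than `sigmas` standard
// deviations from the mean of the trailing window `history`.
function isAnomaly(history: number[], latest: number, sigmas = 2): boolean {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const std = Math.sqrt(variance);
  if (std === 0) return latest !== mean; // flat history: any change is notable
  return Math.abs(latest - mean) > sigmas * std;
}
```

A cron job would pull the last ~30 daily counts per metric, run this, and pass any flagged metric to the AI for a plain-English summary.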
Common Mistakes
- Using Postgres for millions of events: works at first, dies at scale. Use ClickHouse or Timescale from day one.
- No index on (timestamp, event_name): queries slow down fast. In ClickHouse, make this the table's ORDER BY key.
- Querying raw events every time: Pre-aggregate hot paths.
- Unsafe NL-to-SQL: Never allow DELETE/UPDATE; allowlist tables, parameterize values.
- Pretty but useless dashboards: Every chart must answer a specific decision. Delete the rest.
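The unsafe NL-to-SQL warning deserves a concrete guard. A minimal sketch: reject anything that is not a single SELECT, anything containing a write keyword, and any table outside an allowlist. This is a first line of defense only — in production, also run AI queries as a read-only database user; table names here are assumptions.

```typescript
// Assumed table names — replace with your own schema.
const ALLOWED_TABLES = new Set(["events", "daily_counts"]);

function isSafeSelect(sql: string): boolean {
  const s = sql.trim().replace(/;+\s*$/, "");
  if (s.includes(";")) return false; // no stacked statements
  if (!/^select\b/i.test(s)) return false; // reads only
  // Conservative keyword screen (will also reject these words in string
  // literals — acceptable for an analytics guard).
  if (/\b(insert|update|delete|drop|alter|truncate|grant)\b/i.test(s)) {
    return false;
  }
  // Every FROM/JOIN target must be allowlisted.
  const tables = [...s.matchAll(/\b(?:from|join)\s+([a-z_][a-z0-9_]*)/gi)];
  return tables.every((m) => ALLOWED_TABLES.has(m[1].toLowerCase()));
}
```

A proper SQL parser is stricter than this regex screen, but even the sketch blocks the obvious failure modes.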
Top Tools
| Tool | Best For | Price |
| --- | --- | --- |
| ClickHouse | Event analytics DB | Free |
| Timescale | PG extension for time-series | Free |
| Tremor | React dashboard components | Free |
| Recharts | Charts | Free |
| PostHog (self-hosted) | Analytics as a service | Free |
FAQs
Q: ClickHouse vs Timescale vs Postgres?
Postgres: <10M events. Timescale: 10M-1B. ClickHouse: 1B+ or heavy aggregations.
Q: Can I replace Google Analytics?
Yes — GA4 is bloated & privacy-hostile. Self-hosted Plausible or your own dashboard wins.
Q: Real-time vs batch?
Batch (5-min intervals) is fine for 95% of dashboards. Real-time for ops monitoring.
Q: How do I handle privacy / GDPR?
Don't store PII unless needed. Anonymize IPs. Respect DNT. Provide data export/deletion.
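One way to anonymize IPs before storage, per the answer above: zero the last IPv4 octet and truncate IPv6 to its first three groups (roughly the approach Google Analytics' IP anonymization took). A sketch, not a full parser:

```typescript
// Anonymize an IP before it ever reaches the event store.
function anonymizeIp(ip: string): string {
  if (ip.includes(":")) {
    // IPv6: keep the first three groups, zero the rest.
    return ip.split(":").slice(0, 3).join(":") + "::";
  }
  // IPv4: zero the last octet.
  const octets = ip.split(".");
  octets[3] = "0";
  return octets.join(".");
}
```

Anonymizing at the ingest endpoint, before insert, means raw IPs never touch disk.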
Q: Can I sell this as a service?
Yes — plenty of niches underserved (Shopify analytics, YouTube channel analytics, etc.).
Q: How do I share dashboards?
URL-state encoding + read-only tokens. Embed via iframe for stakeholders.
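The URL-state encoding mentioned above can be as simple as a round-trip through URLSearchParams. The Filters shape is an assumption matching the filters listed in the Steps section:

```typescript
// Assumed dashboard filter state — adjust fields to your dashboard.
interface Filters {
  from: string; // ISO date
  to: string;
  event?: string;
  cohort?: string;
}

// Filters → query string, for copy-pasteable dashboard links.
function encodeFilters(f: Filters): string {
  const params = new URLSearchParams();
  for (const [k, v] of Object.entries(f)) {
    if (v !== undefined) params.set(k, String(v));
  }
  return params.toString();
}

// Query string → Filters, for restoring state on page load.
function decodeFilters(qs: string): Filters {
  const p = new URLSearchParams(qs);
  return {
    from: p.get("from") ?? "",
    to: p.get("to") ?? "",
    event: p.get("event") ?? undefined,
    cohort: p.get("cohort") ?? undefined,
  };
}
```

In Next.js, pair this with useSearchParams / router.replace so every filter change updates the shareable URL.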
Conclusion
Self-hosted analytics in 2026 beats paid tools on cost, privacy, and customization. Start with your own product's metrics, add NL queries via AI, and you'll never pay Mixpanel again. Our SEO guide pairs well for measuring content ROI.