Why Berserk
How Berserk approaches observability for the AI era
Software is producing more telemetry than ever, and with every deploy the volume and the noise grow. AI agents make the problem worse: their text-heavy outputs capture the decisions driving our businesses and carry the same operational signals, but they don't fit structured telemetry schemas.
What we need isn't an auxiliary system for AI logs. We need a unified system that can correlate logs, metrics, traces, and AI outputs.
Berserk is built for telemetry in the AI era. It is schemaless, fast, and designed to handle large text-heavy logs alongside traditional telemetry — while remaining exceptionally affordable, even at petabyte scale.
Open Standards with OpenTelemetry
Berserk speaks OpenTelemetry natively — OTLP over gRPC and HTTP with gzip, zstd, and lz4 compression. No proprietary agents, no vendor lock-in. Your existing instrumentation works out of the box. Traces, metrics, and logs flow through a single pipeline with full semantic convention support.
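To make this concrete, here is a minimal sketch of an OTLP/HTTP JSON trace payload such as any OpenTelemetry-compatible backend accepts. The payload shape follows the OTLP JSON encoding; the service name, span data, and the commented-out endpoint URL (port 4318 is the conventional OTLP/HTTP port) are illustrative assumptions, not Berserk-specific values.

```python
import json
import time

# Minimal OTLP/HTTP JSON trace payload (shape per the OTLP JSON encoding).
# All IDs and names below are illustrative.
now_ns = time.time_ns()
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "checkout"}}
        ]},
        "scopeSpans": [{
            "spans": [{
                "traceId": "5b8efff798038103d269b633813fc60c",
                "spanId": "eee19b7ec3c1b174",
                "name": "charge-card",
                "kind": 2,  # SPAN_KIND_SERVER
                "startTimeUnixNano": str(now_ns - 5_000_000),
                "endTimeUnixNano": str(now_ns),
            }]
        }]
    }]
}

body = json.dumps(payload).encode()
# To actually ship it (endpoint is an assumption, adjust to your deployment):
# urllib.request.urlopen(urllib.request.Request(
#     "http://localhost:4318/v1/traces", data=body,
#     headers={"Content-Type": "application/json"}))
```

Because this is plain OTLP, the same payload works whether it comes from the official SDKs, a collector, or hand-rolled instrumentation.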
AI Monitoring and Analysis
Every prompt, completion, token count, and latency measurement lands in Berserk without schema planning. Track model performance, detect regressions, and debug hallucinations across your entire AI stack. The schemaless architecture means new fields appear instantly — no migration needed when you add a new model or change your prompt format.
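As a sketch of what "no schema planning" means in practice: the event below is an illustrative AI log record (all field names are hypothetical, not a Berserk-defined schema). Adding a field later is just adding a key.

```python
import json

# Illustrative AI telemetry event; every field name here is an assumption.
# A schemaless backend accepts whatever JSON shape you send.
event = {
    "model": "gpt-4o",
    "prompt": "Summarize today's error budget burn.",
    "completion": "Error budget burn is at 12% ...",
    "prompt_tokens": 9,
    "completion_tokens": 42,
    "latency_ms": 830,
}

# A new model version starts reporting an extra metric.
# No migration, no schema edit -- just a new key on the next event:
event["reasoning_tokens"] = 128

line = json.dumps(event)  # ship it as an ordinary JSON log line
```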
Trace-Joins Across Spans
Traditional trace backends let you view one trace at a time. Berserk's Trace-Joins let you query across all your traces as a single dataset — join spans from different services, correlate error patterns across deployments, and find the needle in a haystack of distributed systems. Think SQL joins, but across your entire trace corpus.
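The idea behind a trace-join can be sketched in plain Python: treat the spans of two services as tables and join them on `trace_id`. The data and field names below are illustrative, not Berserk's actual span format or query syntax.

```python
# Spans from two services, viewed as one dataset. Data is illustrative.
frontend = [
    {"trace_id": "t1", "service": "web", "status": "error"},
    {"trace_id": "t2", "service": "web", "status": "ok"},
]
payments = [
    {"trace_id": "t1", "service": "payments", "error": "card_declined"},
    {"trace_id": "t2", "service": "payments", "error": None},
]

# Index the right-hand side by the join key.
by_trace = {s["trace_id"]: s for s in payments}

# "Which frontend errors line up with a payments error?" -- answered as a
# join across the whole corpus, not by opening one trace at a time.
joined = [
    (f["trace_id"], by_trace[f["trace_id"]]["error"])
    for f in frontend
    if f["status"] == "error" and f["trace_id"] in by_trace
]
```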
Zero Schema Migrations
Add new fields, rename services, change tag cardinality — Berserk adapts automatically. There's no schema to define, no migrations to run, no downtime windows to schedule. Your data shape evolves with your code.
AI Tooling Friendly
Berserk's KQL query language and structured JSON API are designed for AI agents and automation. Feed query results directly to LLMs for root cause analysis, generate dashboards programmatically, and build AI-powered runbooks that investigate incidents autonomously.
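A sketch of the pattern: structured JSON results drop straight into an LLM prompt. The query string is KQL-flavored but illustrative, the result shape is an assumption rather than Berserk's documented API response, and the LLM client is deliberately left out.

```python
import json

# Hypothetical query result, shaped the way a structured JSON API might
# return it. Both the query string and the row format are assumptions.
result = {
    "query": "logs | where level == 'error' | summarize count() by service",
    "rows": [
        {"service": "payments", "count": 412},
        {"service": "search", "count": 7},
    ],
}

# Because the rows are already structured JSON, no parsing layer is needed
# between the telemetry backend and the model.
prompt = (
    "You are an SRE assistant. Given these error counts per service,\n"
    "suggest where to start the investigation.\n\n"
    + json.dumps(result["rows"], indent=2)
)
# response = llm_client.complete(prompt)  # any LLM client; out of scope here
```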
Blazing Query Performance
Berserk's Rust-powered query engine uses columnar segment files with lazy deserialization — only the columns you query get read. Combined with aggressive predicate pushdown and parallel execution across time slices, you get sub-second results even at petabyte scale. No pre-aggregation required.
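A toy model of why columnar layout plus predicate pushdown is fast, in a few lines of Python (this illustrates the technique, not Berserk's internals): each column lives separately, so a filter on one column never touches the others, and the predicate is evaluated during the scan itself.

```python
# Toy columnar segment: one list per column. A query that filters on
# "latency_ms" never reads the large "body" column at all.
segment = {
    "status":     ["ok", "error", "ok", "error"],
    "latency_ms": [12, 950, 40, 1200],
    "body":       ["<large text>", "<large text>", "<large text>", "<large text>"],
}

def scan(seg, column, predicate):
    # Predicate pushdown: apply the filter while scanning the single
    # column we need, returning matching row indices, not whole rows.
    return [i for i, v in enumerate(seg[column]) if predicate(v)]

slow_rows = scan(segment, "latency_ms", lambda ms: ms > 500)
```

In a real engine the same idea applies per segment file and in parallel across time slices; only the matched row indices fan out to fetch other columns.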
Unlimited Retention
Berserk stores data on your own S3, GCS, Azure Blob, or self-hosted MinIO. There are no retention limits — keep a decade of logs if you want. Cold data costs pennies per GB per month, and queries remain fast thanks to smart indexing and segment-level metadata. Your data, your storage, your rules.
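To make "pennies per GB per month" concrete, a back-of-the-envelope estimate, assuming an illustrative archival object-storage price (real prices vary by provider, tier, and region):

```python
# Illustrative cost estimate; the per-GB price is an assumption, not a quote.
gb = 100 * 1024              # 100 TB of cold logs
price_per_gb_month = 0.004   # e.g. an archival object-storage tier, in USD
monthly_cost = gb * price_per_gb_month  # 102400 GB * $0.004 = $409.60/month
```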