Does it work alongside my existing analytics?

Yes. Velum complements tools like Amplitude, Mixpanel, and PostHog. It sits on top of your existing stack, uses the same events you already track, and adds a healing layer: detecting hidden friction patterns and telling you what to fix first.

What LLMs does it support?

Any OpenAI-compatible API. Groq is the default (free tier available), but you can use OpenAI, Together, Mistral, Fireworks, or self-hosted models like Ollama and vLLM. Just set the provider and api_key in config. AI is optional — the core pattern detection is fully deterministic.
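As a sketch, a minimal provider configuration might look like the following. The file layout and key names here are assumptions for illustration; check the Velum docs for the canonical format:

```yaml
# Hypothetical config sketch -- section and key names are assumptions.
llm:
  provider: groq          # or: openai, together, mistral, fireworks, ollama, vllm
  api_key: ${GROQ_API_KEY}
# Omit the llm block entirely to run deterministic pattern detection only.
```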

How much data can it handle?

Velum processes event batches per API call — send hundreds to thousands of events per request. The pipeline is stateless and fast: pattern detection is deterministic (no LLM calls), and the AI layers fire only for unknown vocabulary and final summaries. PostgreSQL handles baseline storage. It's built for production traffic.
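As an illustrative client sketch, you can chunk your event stream into batches and send one POST per batch. The endpoint path and payload shape below are assumptions for the example, not Velum's documented API:

```python
import json
import urllib.request

def chunk(events, batch_size=500):
    """Split a list of events into batches of at most batch_size."""
    return [events[i:i + batch_size] for i in range(0, len(events), batch_size)]

def send_batch(batch, url="http://localhost:8000/v1/events"):
    """POST one batch of events (hypothetical endpoint path)."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"events": batch}).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

events = [{"event": "page_view", "user_id": f"u{i}"} for i in range(1200)]
batches = chunk(events)
print(len(batches))  # 1200 events -> 3 batches of at most 500
```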

Is my data sent anywhere?

When self-hosted, nothing leaves your network. Your events stay in your PostgreSQL database. The only external calls are to your configured LLM provider (Groq, OpenAI, etc.) for AI features — and those are optional. Velum has no telemetry, no analytics, no phone-home.

Do I need to define an event schema?

No. Velum is truly zero-config. Send any JSON events and the Context Enricher (Layer 0) auto-detects which field is the event name, user ID, timestamp, etc. Event properties are automatically classified as dimensions, targets, conditions, or measures.
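The idea behind that auto-detection can be shown with a toy heuristic. This is a simplified illustration of the concept, not Velum's actual Layer 0 logic:

```python
def guess_roles(event: dict) -> dict:
    """Guess which keys in an arbitrary JSON event play which role.

    Toy heuristic: match on key names only. A real enricher would also
    inspect value types and formats (e.g. ISO-8601 strings for timestamps).
    """
    roles = {}
    for key in event:
        k = key.lower()
        if "time" in k:                                  # timestamp, event_time, ...
            roles["timestamp"] = key
        elif "user" in k:                                # user_id, user, ...
            roles["user_id"] = key
        elif k in ("event", "event_name", "name", "action"):
            roles["event_name"] = key
    return roles

event = {"event": "signup", "user_id": "u42", "timestamp": "2024-01-01T00:00:00Z"}
print(guess_roles(event))
# {'event_name': 'event', 'user_id': 'user_id', 'timestamp': 'timestamp'}
```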

Can I plug Velum into my AI agent?

Absolutely. Velum is a single POST endpoint that returns structured JSON — patterns, severity scores, confidence levels, and AI hypotheses. Your agent can call it, parse the response, and act on it: trigger alerts, file tickets, adjust feature flags, or feed results into other tools. No SDK, no UI interaction, no query language — just one API call and a machine-readable response.
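For example, an agent could parse the response and branch on severity. The response shape sketched below is a guess based on the fields named above (patterns, severity, confidence, hypotheses), so treat the field names as assumptions:

```python
def triage(result: dict, severity_threshold: float = 0.8) -> list:
    """Turn a parsed Velum-style response into follow-up actions.

    Assumed (hypothetical) response shape:
    {"patterns": [{"name": ..., "severity": 0-1, "confidence": 0-1,
                   "hypothesis": ...}]}
    """
    actions = []
    for pattern in result.get("patterns", []):
        # Act only on high-severity findings; route the rest to logs.
        if pattern.get("severity", 0.0) >= severity_threshold:
            actions.append(f"file_ticket:{pattern['name']}")
    return actions

sample = {"patterns": [
    {"name": "checkout_rage_clicks", "severity": 0.92, "confidence": 0.81},
    {"name": "slow_search", "severity": 0.40, "confidence": 0.90},
]}
print(triage(sample))  # ['file_ticket:checkout_rage_clicks']
```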

What license is Velum under?

MIT. Fully open source, free to use, modify, and deploy commercially. No usage limits, no feature gates.