# LLM Observability & Monitoring
Tools for tracing, debugging, monitoring, and optimizing LLM-powered applications in production. Track costs, latency, token usage, and prompt performance.
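As a concrete picture of what these tools capture per request, here is a minimal stdlib-only sketch of wrapping an LLM call to record latency, token usage, and estimated cost. The model call is stubbed and the per-token prices are made-up placeholders, not any provider's real rates.

```python
import time

# Hypothetical per-1K-token prices (USD); real prices vary by model and provider.
PRICE_PER_1K = {"prompt": 0.0005, "completion": 0.0015}

def fake_llm_call(prompt: str) -> dict:
    """Stand-in for a real provider call; returns text plus token counts."""
    return {"text": "ok", "prompt_tokens": len(prompt.split()), "completion_tokens": 1}

def traced_call(prompt: str) -> dict:
    """Wrap an LLM call and record latency, token usage, and estimated cost."""
    start = time.perf_counter()
    response = fake_llm_call(prompt)
    latency_s = time.perf_counter() - start
    cost = (response["prompt_tokens"] * PRICE_PER_1K["prompt"]
            + response["completion_tokens"] * PRICE_PER_1K["completion"]) / 1000
    return {"latency_s": latency_s, "cost_usd": cost, **response}

record = traced_call("Summarize this document")
```

Every tool below automates some richer version of this loop — capturing the trace at a proxy, via an SDK decorator, or through OpenTelemetry — and adds storage, dashboards, and alerting on top.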
## Quick Comparison
| Tool | Pricing | Open Source | Self-Hosted | Maturity |
|---|---|---|---|---|
| Arize Phoenix | open-source | Yes | Yes | growing |
| Braintrust | freemium | No | No | growing |
| Helicone | freemium | Yes | Yes | growing |
| Langfuse | freemium | Yes | Yes | growing |
| LangSmith | freemium | No | Yes | established |
## Arize Phoenix

Open-source LLM observability with ML monitoring roots. Maturity: growing; open source with a free tier.

- LLM tracing with OpenTelemetry
- Retrieval evaluation (RAG)
- Good fit for teams already using Arize for ML monitoring
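Phoenix ships its own RAG evaluators; as a sketch of the underlying idea, here is the precision@k metric that retrieval evaluation builds on. The function and document IDs below are illustrative, not Phoenix's API.

```python
def precision_at_k(retrieved_ids: list, relevant_ids: set, k: int) -> float:
    """Fraction of the top-k retrieved documents that are actually relevant."""
    top_k = retrieved_ids[:k]
    if not top_k:
        return 0.0
    hits = sum(1 for doc_id in top_k if doc_id in relevant_ids)
    return hits / len(top_k)

# Two of the top three retrieved documents are relevant.
score = precision_at_k(["d1", "d7", "d3", "d9"], {"d1", "d3"}, k=3)
```

In practice Phoenix computes metrics like this (and LLM-judged relevance) over traced retrieval spans, so you see retrieval quality alongside latency and cost.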
## Braintrust

Enterprise AI product platform with an eval-first approach. Maturity: growing; freemium with a free tier.

- LLM evaluation and scoring
- Prompt experimentation
- AI proxy with logging
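The eval-first workflow boils down to running a task over labeled cases and averaging a scorer's results. The sketch below mirrors that shape with hypothetical names and a stubbed task; it is not Braintrust's actual SDK.

```python
def exact_match(output: str, expected: str) -> float:
    """Score 1.0 when the model output matches the expected answer exactly."""
    return 1.0 if output.strip() == expected.strip() else 0.0

def run_eval(cases: list, task, scorer) -> float:
    """Apply the task to each case's input and average the scorer's results."""
    scores = [scorer(task(case["input"]), case["expected"]) for case in cases]
    return sum(scores) / len(scores)

cases = [
    {"input": "2+2", "expected": "4"},
    {"input": "capital of France", "expected": "Paris"},
]
# Stub task; a real eval would call the model under test here.
avg_score = run_eval(cases, task=lambda q: "4" if q == "2+2" else "Paris",
                     scorer=exact_match)
```

Eval-first platforms then version these runs, so a prompt or model change is judged by its score delta rather than by eyeballing outputs.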
## Helicone

LLM observability platform with one-line integration. Maturity: growing; open source; freemium with a free tier.

- Quick-setup request logging
- Cost monitoring and optimization
- Rate limiting and caching
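The "one-line integration" is a proxy pattern: point an OpenAI-compatible client's base URL at Helicone's gateway and authenticate with a `Helicone-Auth` header, and requests are logged on their way through. A minimal sketch, assuming the standard `oai.helicone.ai` gateway; the env-var names and placeholder keys are my own.

```python
import os

# Routing traffic through Helicone's OpenAI gateway: the one-line change is
# swapping base_url to the proxy. Placeholder keys are illustrative only.
client_config = {
    "base_url": "https://oai.helicone.ai/v1",
    "api_key": os.environ.get("OPENAI_API_KEY", "sk-placeholder"),
    "default_headers": {
        "Helicone-Auth": f"Bearer {os.environ.get('HELICONE_API_KEY', '')}",
    },
}
# e.g. openai.OpenAI(**client_config) — requests then flow through Helicone,
# which logs cost, latency, and usage before forwarding to the provider.
```

Because the integration lives at the network layer, rate limiting and response caching can also be enabled per request via headers, with no further code changes.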
## Langfuse

Open-source LLM engineering platform. Maturity: growing; open source; freemium with a free tier.

- LLM tracing and debugging
- Prompt management
- Cost tracking
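Prompt management means storing versioned templates centrally and filling in variables at call time, so prompts can be updated without redeploying code. The sketch below mimics the shape of that fetch-then-compile flow with a stdlib dict as the store; it is not Langfuse's API.

```python
# Hypothetical central store of versioned prompt templates.
PROMPT_STORE = {
    ("summarize", 2): "Summarize the following text in {max_words} words:\n{text}",
}

def get_prompt(name: str, version: int) -> str:
    """Fetch a prompt template by name and version from the central store."""
    return PROMPT_STORE[(name, version)]

def compile_prompt(template: str, **variables) -> str:
    """Fill template variables in at call time."""
    return template.format(**variables)

prompt = compile_prompt(get_prompt("summarize", version=2),
                        max_words=50, text="...")
```

Tying the prompt version into each trace is what makes debugging practical: a regression can be pinned to the exact template revision that produced it.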
## LangSmith

Developer platform for the LLM application lifecycle. Maturity: established; freemium with a free tier.

- Best suited to LangChain ecosystem users
- LLM application testing and evaluation
- Prompt debugging and iteration
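For LangChain apps, tracing is typically switched on through environment variables rather than code changes. `LANGCHAIN_TRACING_V2` and `LANGCHAIN_API_KEY` are the documented variables; the project name and placeholder key below are example values.

```python
import os

# Enable LangSmith tracing for a LangChain application via environment
# variables; no changes to chains or agents are required.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = os.environ.get("LANGCHAIN_API_KEY",
                                                 "ls-placeholder")
os.environ["LANGCHAIN_PROJECT"] = "my-llm-app"  # traces are grouped by project
```

Once set, LangChain runs are traced automatically, which is why the platform is a particularly low-friction choice for teams already inside that ecosystem.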