TraceVerde

The most comprehensive OpenTelemetry auto-instrumentation library for LLM/GenAI applications.

TraceVerde provides production-ready, zero-code instrumentation for GenAI applications. Install, set two environment variables, and get complete observability across 19+ LLM providers, 8 multi-agent frameworks, and 20+ MCP tools.

Key Features

  • Zero-Code Setup - Just install and set env vars, or add one line of code
  • 19+ LLM Providers - OpenAI, Anthropic, Google AI, AWS Bedrock, Azure, Cohere, Mistral, Together AI, Groq, Ollama, and more
  • 8 Multi-Agent Frameworks - CrewAI, LangGraph, Google ADK, AutoGen, OpenAI Agents SDK, Pydantic AI, Haystack, DSPy
  • Automatic Cost Tracking - 1,050+ model pricing database with per-request cost breakdown
  • GPU Metrics - Real-time NVIDIA and AMD GPU monitoring (utilization, memory, temperature, power)
  • MCP Tool Instrumentation - Databases, caches, vector DBs, message queues, object storage
  • Built-in Evaluation - PII detection, toxicity, bias, prompt injection, restricted topics, hallucination detection
  • OpenTelemetry Native - Works with any OTel-compatible backend (Grafana, Jaeger, Datadog, etc.)
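
Per-request cost tracking amounts to multiplying token counts by per-model rates. A minimal sketch of the idea in Python — the rate values and the `request_cost` helper here are illustrative, not TraceVerde's actual pricing database or API:

```python
# Illustrative per-1K-token rates in USD; TraceVerde ships its own
# pricing database covering 1,050+ models.
PRICING_PER_1K = {
    "gpt-4": {"input": 0.03, "output": 0.06},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request from its token usage."""
    rates = PRICING_PER_1K[model]
    return (input_tokens / 1000) * rates["input"] \
         + (output_tokens / 1000) * rates["output"]

# e.g. a gpt-4 call with 500 prompt tokens and 200 completion tokens
cost = request_cost("gpt-4", input_tokens=500, output_tokens=200)
```

In the instrumented library this breakdown is attached to each span automatically, so no such helper needs to be written by hand.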

Quick Start

pip install genai-otel-instrument

import genai_otel
genai_otel.instrument()

# Your existing code works unchanged
import openai
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
# Traces, metrics, and costs are captured automatically

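Because the instrumentation is OpenTelemetry-native, the export destination can typically be chosen with the standard OTEL_* environment variables defined by the OpenTelemetry specification. A sketch (the endpoint and service name are placeholders; TraceVerde's own two setup variables are documented by the project):

```shell
# Standard OpenTelemetry exporter settings -- part of the OTel spec,
# not specific to TraceVerde.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
export OTEL_SERVICE_NAME="my-genai-app"

# Run the application unchanged; spans go to the configured backend.
python app.py
```

Any OTLP-compatible backend (Grafana, Jaeger, Datadog, etc.) can receive the data this way.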
Next Steps

Examples

90+ ready-to-run examples for every provider, framework, and evaluation feature.

Browse all examples in the examples/ directory.

Community