Use Python instrumentation for LLM observability

Supported provider APIs and libraries

OpenLLMetry automatically instruments the following provider APIs and libraries:

  • OpenAI / Azure OpenAI
  • Anthropic
  • Cohere
  • Ollama
  • Mistral AI
  • HuggingFace
  • Bedrock (AWS)
  • SageMaker (AWS)
  • Replicate
  • Vertex AI (GCP)
  • Google Generative AI (Gemini)
  • IBM Watsonx AI
  • Together AI
  • Aleph Alpha
  • Groq

Supported vector databases

The following vector databases are supported:

  • Chroma
  • Pinecone
  • Qdrant
  • Weaviate
  • Milvus
  • Marqo
  • LanceDB

Supported frameworks

The following LLM frameworks are supported:

  • LangChain
  • LlamaIndex
  • Haystack
  • LiteLLM
  • CrewAI

Miscellaneous

  • Model Context Protocol

Install and configure the Python instrumentation

Perform the following tasks:

  1. Add the Traceloop SDK as a dependency:

     pip install traceloop-sdk

  2. Initialize the SDK (see the full example after these steps):

     from traceloop.sdk import Traceloop

     Traceloop.init(
         app_name="<YOUR_SERVICE_NAME>",
         telemetry_enabled=False
     )
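Once initialized, OpenLLMetry traces calls to any supported library automatically. The following is a minimal sketch, assuming the openai package is installed and OPENAI_API_KEY is set; the service name, model, and prompt are placeholders:

    from openai import OpenAI
    from traceloop.sdk import Traceloop

    # Initialize once, before any instrumented client is created
    Traceloop.init(app_name="my-llm-service", telemetry_enabled=False)

    client = OpenAI()

    # This call is traced automatically by the OpenAI instrumentation
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(response.choices[0].message.content)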

Set environment variables

Set the following environment variables:

  • TRACELOOP_BASE_URL (required): the OTLP endpoint. Example: http://<YOUR_OBSERVE_AGENT_HOSTNAME>:4318
  • TRACELOOP_TRACE_CONTENT (optional): true or false. Enables or disables extraction of inputs and outputs from LLM calls.
  • OTEL_RESOURCE_ATTRIBUTES (optional): a list of key=value resource attributes you wish to add to your spans. Example: deployment.environment=dev,service.namespace=inference
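For example, you might set these in a shell before starting your service; the agent hostname and attribute values below are placeholders:

    export TRACELOOP_BASE_URL="http://observe-agent.internal:4318"
    export TRACELOOP_TRACE_CONTENT="true"
    export OTEL_RESOURCE_ATTRIBUTES="deployment.environment=dev,service.namespace=inference"
    python main.py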

Add attributes at runtime

To annotate your span data with custom attributes such as customer_id or user_id, we recommend OpenLLMetry's set_association_properties method. For example:

Traceloop.set_association_properties({ "user_id": "user12345", "chat_id": "chat12345" })
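In a request-scoped service, association properties are typically set at the start of each request so that every span created afterwards carries them. A sketch using the SDK's workflow decorator; the workflow name, function, and IDs are placeholders:

    from traceloop.sdk import Traceloop
    from traceloop.sdk.decorators import workflow

    Traceloop.init(app_name="my-llm-service", telemetry_enabled=False)

    @workflow(name="handle_chat")
    def handle_chat(user_id: str, chat_id: str, message: str) -> str:
        # Spans created from here on are annotated with these properties
        Traceloop.set_association_properties({"user_id": user_id, "chat_id": chat_id})
        # ... call your LLM provider here; those spans inherit the properties
        return "done"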

Frequently asked questions

Instrumentation does not work in a Gunicorn-based app

This can happen because servers like Gunicorn use a pre-fork process model. The recommended approach for Gunicorn is to initialize the instrumentation in the post_fork hook. For example, modify your gunicorn_conf.py to add the following:

from typing import Any

from traceloop.sdk import Traceloop

def post_fork(server: Any, worker: Any) -> None:
    # Initialize OpenLLMetry in each worker process after the fork
    Traceloop.init(
        app_name="<YOUR_SERVICE_NAME>",
        telemetry_enabled=False
    )
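Gunicorn picks up the hook when started with that configuration file, for example (myapp:app is a placeholder for your WSGI module and application):

    gunicorn -c gunicorn_conf.py myapp:app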

Interoperability with OpenTelemetry zero-code or automatic instrumentation

To ensure OpenLLMetry does not conflict with OpenTelemetry, ensure the following:

  • If you use zero-code instrumentation via opentelemetry-instrument, switch to a programmatic approach. To ensure the correct ordering of instrumentations, auto_instrumentation.initialize() must be called after Traceloop.init(...), as shown in the sketch after this list.
  • Add an environment variable to tell the OpenTelemetry SDK to ignore the instrumentations that OpenLLMetry already provides:
    OTEL_PYTHON_DISABLED_INSTRUMENTATIONS=pinecone_client,qdrant_client,mistralai,haystack-ai,chromadb,llama-index,crewai,lancedb,openai,langchain,milvus,aleph_alpha_client,google_generativeai,weaviate_client,marqo,cohere,replicate,groq,ibm-watson-machine-learning,together,google_cloud_aiplatform,ollama,mcp
  • The app_name in your Traceloop.init(...) call must match your OpenTelemetry SDK service identifiers or the OTEL_SERVICE_NAME environment variable.
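A minimal sketch of the programmatic ordering described above; the service name is a placeholder:

    from opentelemetry.instrumentation import auto_instrumentation
    from traceloop.sdk import Traceloop

    # Initialize OpenLLMetry first, then hand off to the OpenTelemetry
    # auto-instrumentation so the remaining instrumentations load afterwards.
    Traceloop.init(app_name="<YOUR_SERVICE_NAME>", telemetry_enabled=False)
    auto_instrumentation.initialize()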