LLM instrumentation
Instrument with OpenLLMetry
To use LLM Observability, you must instrument your application code with OpenLLMetry, an open-source SDK that builds on OpenTelemetry auto-instrumentation.
Perform the following tasks:
- Install the Observe Agent in your environment.
- Follow the relevant instrumentation guide:
  - Python: Send Python application data to Observe.
  - Node.js: Send Node.js application data to Observe.
  - Other: For all other languages, refer to these instructions to instrument your LLM-powered app.
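As a minimal Python sketch of the second step, an application instrumented with OpenLLMetry initializes the Traceloop SDK and points it at the Observe Agent's OTLP endpoint. The endpoint and app name below are placeholders, not values from this guide; adjust them to your deployment.

from traceloop.sdk import Traceloop

# Initialize OpenLLMetry. "my-llm-app" is an arbitrary example name, and the
# endpoint assumes the Observe Agent's OTLP/HTTP receiver listens on the
# standard port 4318 on the local host.
Traceloop.init(
    app_name="my-llm-app",
    api_endpoint="http://localhost:4318",
)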
Frequently asked questions
Dropped spans or missing attributes
This may occur when a span contains too many prompts or completions. Increase the default attribute limit (128) by setting the OTEL_ATTRIBUTE_COUNT_LIMIT environment variable. Note, however, that spans or attributes can still be dropped if the new limit is exceeded or the Observe ingest payload limit is breached.
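The variable must be visible before the OpenTelemetry tracer provider is created. You can export it in the environment that launches the application, or, assuming you control the startup code, set it before calling Traceloop.init(), as sketched below (the value 512 is an example, not a recommendation):

import os

# Example value only; raise it just enough to cover your largest prompt/completion spans.
# This must run before the tracer provider (and Traceloop.init()) is created.
os.environ["OTEL_ATTRIBUTE_COUNT_LIMIT"] = "512"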
"Missing span" warning in Observe trace UI
This may happen due to incorrect instrumentation, or because gen_ai.is_llm_root is not set in mixed instrumentation scenarios, such as using OpenTelemetry auto-instrumentation alongside OpenLLMetry for LLM observability. In the latter case, you can resolve it by setting the attribute manually or by using a span processor. The following Python example marks the root span based on the presence of a context key injected by OpenLLMetry:
from typing import Optional

from opentelemetry import context
from opentelemetry.context import Context
from opentelemetry.sdk.trace import Span
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.metric_exporter import OTLPMetricExporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceloop.sdk import Traceloop


class LLMExplorerSpanProcessor(BatchSpanProcessor):
    """Processor to mark root spans for LLM Explorer."""

    def on_start(self, span: Span, parent_context: Optional[Context] = None) -> None:
        """Triggered on every span start."""
        # OpenLLMetry injects "association_properties" into the context of LLM spans.
        association_properties = context.get_value("association_properties", parent_context)
        parent_is_llm_span = context.get_value("IS_LLM_SPAN", parent_context)
        is_llm_span = "False"
        if association_properties is not None:
            is_llm_span = "True"
            # Mark this span as the LLM root only if its parent is not already an LLM span.
            if not parent_is_llm_span or (
                isinstance(parent_is_llm_span, str) and parent_is_llm_span.lower() == "false"
            ):
                span.set_attribute("gen_ai.is_llm_root", True)
        else:
            is_llm_span = "False"
        context.attach(context.set_value("IS_LLM_SPAN", is_llm_span, parent_context))


# Later, specify the processor in the traceloop-sdk or the OpenTelemetry tracer provider
Traceloop.init(
    ...
    processor=LLMExplorerSpanProcessor(OTLPSpanExporter()),
    metrics_exporter=OTLPMetricExporter(),
    ...
)
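If you configure OpenTelemetry directly rather than through the Traceloop SDK, the same processor can be registered on the tracer provider with add_span_processor().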