LLM Observability with Langfuse and OpenLLMetry
Langfuse provides a backend built on OpenTelemetry for ingesting trace data, and you can use different instrumentation libraries to export traces from your applications. This guide shows how to instrument an OpenAI application with OpenLLMetry and send the resulting traces to Langfuse.
What is Langfuse?
Langfuse (GitHub) is an open-source platform for LLM engineering. It provides tracing and monitoring capabilities for AI agents, helping developers debug, analyze, and optimize their products. Langfuse integrates with various tools and frameworks via native integrations, OpenTelemetry, and SDKs.
Step 1: Install Dependencies
Begin by installing the necessary Python packages. In this example, we need the openai library to interact with OpenAI's API and the traceloop-sdk for enabling OpenLLMetry instrumentation.
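A minimal install cell, assuming a Jupyter notebook (in a regular shell, drop the %pip magic and use pip directly):

```python
# Install the OpenAI client and the OpenLLMetry instrumentation (traceloop-sdk).
# The %pip magic assumes a Jupyter notebook; in a shell, run: pip install openai traceloop-sdk
%pip install openai traceloop-sdk
```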
Step 2: Set Up Environment Variables
Before initiating any requests, configure your environment with the necessary credentials and endpoints. Here, we establish Langfuse authentication by combining your public and secret keys into a Base64-encoded token. Additionally, specify the Langfuse endpoint based on your preferred geographical region (EU or US) and provide your OpenAI API key.
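A sketch of this configuration, assuming the traceloop-sdk reads its exporter endpoint and headers from the TRACELOOP_BASE_URL and TRACELOOP_HEADERS environment variables; the key values are placeholders to replace with your own:

```python
import os
import base64

# Your Langfuse project keys (placeholders; replace with your own).
LANGFUSE_PUBLIC_KEY = "pk-lf-..."
LANGFUSE_SECRET_KEY = "sk-lf-..."

# Combine the public and secret keys into a Base64-encoded Basic Auth token.
LANGFUSE_AUTH = base64.b64encode(
    f"{LANGFUSE_PUBLIC_KEY}:{LANGFUSE_SECRET_KEY}".encode()
).decode()

# Point the traceloop-sdk exporter at the Langfuse OpenTelemetry endpoint
# for your data region (EU or US).
os.environ["TRACELOOP_BASE_URL"] = "https://cloud.langfuse.com/api/public/otel"  # EU region
# os.environ["TRACELOOP_BASE_URL"] = "https://us.cloud.langfuse.com/api/public/otel"  # US region
os.environ["TRACELOOP_HEADERS"] = f"Authorization=Basic {LANGFUSE_AUTH}"

# OpenAI credentials for the example request (placeholder).
os.environ["OPENAI_API_KEY"] = "sk-..."
```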
Step 3: Initialize OpenLLMetry Instrumentation
Proceed to initialize the OpenLLMetry instrumentation using the traceloop-sdk. It is advisable to set disable_batch=True if you are executing this code in a notebook, so traces are sent immediately rather than waiting for batching. Once initialized, any action performed using the OpenAI SDK (such as a chat completion request) will be automatically traced and forwarded to Langfuse.
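A minimal sketch of the initialization followed by a traced OpenAI call; the model name and prompt are placeholders:

```python
from traceloop.sdk import Traceloop
from openai import OpenAI

# Initialize OpenLLMetry. disable_batch=True flushes each trace immediately,
# which is useful in short-lived notebook cells.
Traceloop.init(disable_batch=True)

# Any OpenAI SDK call made after initialization is traced automatically
# and exported to the Langfuse endpoint configured in Step 2.
client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "What is LLM observability?"}],
)
print(completion.choices[0].message.content)
```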
Step 4: Analyze the Trace in Langfuse
After executing the above code, you can examine the generated trace in your Langfuse dashboard.