Open-source Observability for LLMs with OpenTelemetry
Start now with just 2 lines of code
Python

import os

from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# One init call is all that is needed to start emitting OpenTelemetry traces
Traceloop.init(app_name="joke_generation_service")

@workflow(name="joke_creation")
def create_joke():
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    )
    return completion.choices[0].message.content

Typescript
import * as traceloop from "@traceloop/node-server-sdk";
import OpenAI from "openai";

// One init call is all that is needed to start emitting OpenTelemetry traces
traceloop.initialize({ appName: "joke_generation_service" });

const openai = new OpenAI();

class MyLLM {
  @traceloop.workflow("joke_creation")
  async create_joke() {
    const completion = await openai.chat.completions.create({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Tell me a joke about opentelemetry" }],
    });
    return completion.choices[0].message.content;
  }
}