Observability for OpenAI with Dynatrace
OpenLLMetry lets you trace your OpenAI prompt and embedding calls. With five minutes of work you can get a complete view of your system directly in Dynatrace. See how below.
Step 1
Install the Traceloop SDK and initialize it. It automatically traces all calls to OpenAI, capturing prompts and completions as separate spans.
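A minimal setup sketch, assuming the `traceloop-sdk` package from PyPI and an app name of your choosing:

```python
# pip install traceloop-sdk
from traceloop.sdk import Traceloop

# Call once at application startup; every subsequent OpenAI call
# is traced automatically, with prompts and completions recorded.
Traceloop.init(app_name="my-llm-app")
```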
Step 2
Go to your Dynatrace environment and create a new access token under Manage Access Tokens.
The access token needs the following permission scopes, which allow ingesting OpenTelemetry spans, metrics, and logs: `openTelemetryTrace.ingest`, `metrics.ingest`, `logs.ingest`.
Set the following environment variables.
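As a sketch of the configuration (the environment ID and token are placeholders you must replace with values from your own Dynatrace environment):

```shell
# Point the SDK at your Dynatrace OTLP ingest endpoint
export TRACELOOP_BASE_URL="https://<env-id>.live.dynatrace.com/api/v2/otlp"
# Authenticate with the access token created above
# (the space after "Api-Token" is URL-encoded as %20)
export TRACELOOP_HEADERS="Authorization=Api-Token%20<your-access-token>"
```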
Discover use cases
Trace prompts and completions
Call OpenAI and see the prompts, completions, and token usage for each call.
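For example, a plain chat completion (assuming `OPENAI_API_KEY` is set and the SDK was initialized as in Step 1) produces a span carrying the prompt, the completion, and token counts:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is traced automatically by the Traceloop SDK.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a joke about OpenTelemetry"}],
)
print(response.choices[0].message.content)
print(response.usage.total_tokens)
```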
Trace your RAG retrieval pipeline
Build a RAG pipeline with Chroma and OpenAI. See the vectors returned from Chroma, and the full prompt and response from OpenAI.
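A minimal RAG sketch, assuming the `chromadb` and `openai` packages and an `OPENAI_API_KEY` in the environment; the document texts, model, and collection name are illustrative:

```python
import chromadb
from openai import OpenAI

# Both the Chroma query and the OpenAI call below are traced:
# the retrieved documents appear on the retrieval span, and the
# full prompt and response appear on the completion span.
chroma = chromadb.Client()
collection = chroma.create_collection("docs")
collection.add(
    ids=["1", "2"],
    documents=[
        "OpenLLMetry is built on top of OpenTelemetry.",
        "Dynatrace ingests OTLP traces, metrics, and logs.",
    ],
)

question = "What protocol does Dynatrace ingest?"
results = collection.query(query_texts=[question], n_results=1)
context = "\n".join(results["documents"][0])

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```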