Observability for OpenAI with Highlight

OpenLLMetry lets you trace your OpenAI prompt, completion, and embedding calls. With about 5 minutes of work, you can get a complete view of your system directly in Highlight. See how below.

Step 1

Install the Traceloop SDK and initialize it. It will automatically log all calls to OpenAI, with prompts and completions as separate spans.
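A minimal sketch of that setup in Python (the `app_name` value and the `my-llm-service` label are placeholders):

```python
# Install the SDK and the OpenAI client first:
#   pip install traceloop-sdk openai

from traceloop.sdk import Traceloop

# Initialize once at application startup. From this point on,
# calls made through the official OpenAI client are traced automatically,
# with prompts and completions captured as separate spans.
Traceloop.init(app_name="my-llm-service")
```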

Step 2

Route traces to Highlight's OTLP endpoint and set your Highlight project ID in the headers:
```
TRACELOOP_BASE_URL=https://otel.highlight.io:4318
TRACELOOP_HEADERS="x-highlight-project=<YOUR_HIGHLIGHT_PROJECT_ID>"
```
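If you prefer configuring the exporter in code rather than through environment variables, a sketch along these lines should work; it assumes the `api_endpoint` and `headers` parameters of the current traceloop-sdk `init` signature:

```python
from traceloop.sdk import Traceloop

# Equivalent configuration in code instead of environment variables
# (parameter names assume the current traceloop-sdk init signature).
Traceloop.init(
    app_name="my-llm-service",
    api_endpoint="https://otel.highlight.io:4318",
    headers={"x-highlight-project": "<YOUR_HIGHLIGHT_PROJECT_ID>"},
)
```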

Discover use cases

Trace prompts and completions

Call OpenAI and see the prompt, completion, and token usage for each call.
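A minimal sketch of such a call; the model name and prompt are placeholders:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-service")
client = OpenAI()

# The SDK records the prompt, the completion, and token usage
# as span attributes on this call.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```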

Trace your RAG retrieval pipeline

Build a RAG pipeline with Chroma and OpenAI. See the vectors returned from Chroma, the full prompt sent to OpenAI, and the response.
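A minimal RAG sketch under those assumptions; the collection name, documents, and question are all placeholders, and Chroma's default embedding function is used for simplicity:

```python
import chromadb
from openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="rag-demo")

openai_client = OpenAI()
chroma = chromadb.Client()
collection = chroma.get_or_create_collection("docs")

# Index a few documents (Chroma embeds them with its default embedding function).
collection.add(
    ids=["1", "2"],
    documents=[
        "Highlight ingests OpenTelemetry traces over OTLP.",
        "OpenLLMetry instruments OpenAI and vector DB calls.",
    ],
)

question = "How do traces reach Highlight?"

# Retrieval: the Chroma query is traced, including the returned documents.
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])

# Generation: the OpenAI span shows the full prompt (with retrieved context) and the response.
answer = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```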