Observability for LlamaIndex with Highlight

OpenLLMetry lets you trace workflows and RAG pipelines built with LlamaIndex. With five minutes of work you can get a complete view of your system directly in Highlight. See how below.

Step 1

Install and initialize the SDK in your code. It will automatically detect the workflow structure and build traces for you.
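
For example, after installing the traceloop-sdk Python package, initialize it once at application startup. A minimal sketch; the app_name value is an arbitrary label you choose, not something required by Highlight:

from traceloop.sdk import Traceloop

# Initialize once at startup; from here on OpenLLMetry auto-instruments
# LlamaIndex and the underlying LLM and vector store clients it calls.
Traceloop.init(app_name="llamaindex_app")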

Step 2

Route traces to Highlight's OTLP endpoint and set the Highlight project ID in the headers.
TRACELOOP_BASE_URL=https://otel.highlight.io:4318
TRACELOOP_HEADERS="x-highlight-project=<YOUR_HIGHLIGHT_PROJECT_ID>"
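
These are plain environment variables, so you can also set them from code as long as it happens before the SDK is initialized. A minimal sketch; the project ID is a placeholder you replace with your own:

import os
from traceloop.sdk import Traceloop

# Must be set before Traceloop.init() so the exporter picks them up.
os.environ["TRACELOOP_BASE_URL"] = "https://otel.highlight.io:4318"
os.environ["TRACELOOP_HEADERS"] = "x-highlight-project=<YOUR_HIGHLIGHT_PROJECT_ID>"

Traceloop.init(app_name="llamaindex_app")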

Discover use cases

Trace a LlamaIndex-based RAG pipeline

Build a RAG pipeline with Chroma and LlamaIndex. See the vectors returned from Chroma, the full prompts sent to OpenAI, and the responses, all as traces.
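
A minimal sketch of such a pipeline. It assumes the llama-index-vector-stores-chroma integration package, an OpenAI API key in the environment, and a ./data directory of documents to index; those details are assumptions for illustration, not part of the Highlight setup:

import chromadb
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.chroma import ChromaVectorStore
from traceloop.sdk import Traceloop

# Instruments LlamaIndex, Chroma, and OpenAI so every step below is traced.
Traceloop.init(app_name="llamaindex_rag")

# Load local documents and index them into an in-memory Chroma collection.
documents = SimpleDirectoryReader("./data").load_data()
chroma_client = chromadb.EphemeralClient()
collection = chroma_client.create_collection("quickstart")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Each query produces a trace: the Chroma retrieval plus the OpenAI call.
query_engine = index.as_query_engine()
response = query_engine.query("What do these documents say about pricing?")
print(response)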

Trace a LlamaIndex-based Agent

Build an agent with LlamaIndex. See everything your agent does, including calls to tools, HTTP requests, and database calls, as full traces.
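
A minimal sketch of such an agent, assuming a LlamaIndex release that ships ReActAgent.from_tools and the llama-index-llms-openai package; the multiply tool is a hypothetical example used only for illustration:

from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI
from traceloop.sdk import Traceloop

# Instruments the agent so each reasoning step, tool call, and LLM request
# shows up as a span in the trace.
Traceloop.init(app_name="llamaindex_agent")

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

# A ReAct-style agent; tool invocations appear alongside the LLM calls.
llm = OpenAI(model="gpt-4o-mini")
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
print(agent.chat("What is 21 times 2?"))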