Observability for LlamaIndex with Instana

OpenLLMetry lets you trace the prompts and embedding calls of LlamaIndex. With five minutes of work you can get a complete view of your system directly in Instana. See how below.

Step 1

Install and initialize the SDK in your code. It will automatically detect the LlamaIndex workflow structure and build traces from it.

Step 2

Configure your Instana agent to receive traces and metrics from OpenLLMetry.
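One way to wire the two together is to point OpenLLMetry's exporter at the agent's local OTLP endpoint. The host and port below are the standard OTLP gRPC defaults and assume OpenTelemetry ingestion is enabled on your Instana agent; adjust them to your deployment:

```shell
# Export traces to the Instana agent's OTLP endpoint
# (assumes the agent runs locally with OpenTelemetry ingestion enabled).
export TRACELOOP_BASE_URL="http://localhost:4317"
```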

Discover use cases

Trace a LlamaIndex-based Agent

Build an agent with LlamaIndex. See everything your agent does, including calls to tools, HTTP requests, and database calls, as full traces.

Trace a LlamaIndex-based RAG pipeline

Build a RAG pipeline with Chroma and LlamaIndex. See the vectors returned from Chroma, the full prompts sent to OpenAI, and the responses, all as traces.