Observability for LlamaIndex with Splunk

OpenLLMetry lets you trace workflows and RAG pipelines built with LlamaIndex. With about 5 minutes of work you can get a complete view of your system directly in Splunk Observability Cloud. See how below.

Step 1

Install and initialize the SDK in your code. It automatically detects the LlamaIndex workflow structure and builds traces from it.
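
A minimal sketch of this step, assuming the Python SDK (the traceloop-sdk package) and an app name of your choosing:

```python
# Install the SDK first:
#   pip install traceloop-sdk llama-index

from traceloop.sdk import Traceloop

# Initializing the SDK instruments LlamaIndex automatically;
# no further code changes are needed for basic tracing.
Traceloop.init(app_name="llamaindex-app")
```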

Step 2

Configure your OpenTelemetry Collector to receive traces and export them to Splunk Observability Cloud, then point OpenLLMetry at your collector's OTLP endpoint.
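
As a sketch, assuming a local collector with its OTLP/HTTP receiver on the default port 4318 and already configured to export to Splunk Observability Cloud, you can point OpenLLMetry at it with the api_endpoint argument (or the TRACELOOP_BASE_URL environment variable):

```python
from traceloop.sdk import Traceloop

# Send traces to the local collector instead of the Traceloop backend.
# Equivalent to setting TRACELOOP_BASE_URL=http://localhost:4318.
Traceloop.init(
    app_name="llamaindex-app",
    api_endpoint="http://localhost:4318",
)
```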

Discover use cases

Trace a LlamaIndex-based RAG pipeline

Build a RAG pipeline with Chroma and LlamaIndex, as sketched below. See the vectors returned by Chroma, the full prompts sent to OpenAI, and the responses, all as traces.
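
A minimal sketch of such a pipeline, assuming recent llama-index and chromadb packages, an OPENAI_API_KEY in the environment, and a ./data directory with documents to index:

```python
import chromadb
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.chroma import ChromaVectorStore
from traceloop.sdk import Traceloop

# Instrument LlamaIndex, Chroma, and OpenAI calls.
Traceloop.init(app_name="llamaindex-rag")

# Set up an in-memory Chroma collection as the vector store.
chroma_client = chromadb.EphemeralClient()
chroma_collection = chroma_client.create_collection("docs")
vector_store = ChromaVectorStore(chroma_collection=chroma_collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Index local documents and run a query; retrieval and LLM calls are traced.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
query_engine = index.as_query_engine()
print(query_engine.query("What does the document say about observability?"))
```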

Trace a LlamaIndex-based Agent

Build an agent with LlamaIndex, as sketched below. See everything your agent does, including calls to tools, HTTP requests, and database calls, as full traces.
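
A minimal sketch of an agent whose tool calls appear in the trace, assuming llama-index with the OpenAI integration and an OPENAI_API_KEY in the environment (the multiply tool and the model name are illustrative choices):

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI
from traceloop.sdk import Traceloop

# Instrument the agent, its tools, and the underlying LLM calls.
Traceloop.init(app_name="llamaindex-agent")

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

# Wrap the function as a tool the agent can call; each call is traced.
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = OpenAI(model="gpt-4o-mini")  # any supported OpenAI model
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)

print(agent.chat("What is 12.3 times 4.56?"))
```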