Observability for LlamaIndex with Traceloop
OpenLLMetry lets you trace workflows and RAG pipelines built with LlamaIndex. With about five minutes of work you can get a complete view of your system directly in Traceloop. See how below.
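Getting started is mostly a matter of installing the SDK and calling its init function before your LlamaIndex code runs. A minimal sketch (the `app_name` value is illustrative):

```python
# Install the SDK first: pip install traceloop-sdk
from traceloop.sdk import Traceloop

# Initialize OpenLLMetry before constructing your LlamaIndex objects.
# The API key can also be supplied via the TRACELOOP_API_KEY environment variable.
Traceloop.init(app_name="llama_index_app")
```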
Discover use cases
Trace a LlamaIndex-based RAG pipeline
Build a RAG pipeline with Chroma and LlamaIndex. See the vectors returned from Chroma, the full prompts sent to OpenAI, and the responses, all as traces.
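A minimal sketch of such a pipeline, assuming documents in a local `./data` directory, the `llama-index-vector-stores-chroma` integration package, and API keys provided via environment variables (the paths, collection name, and query are illustrative):

```python
# A sketch of a Chroma-backed RAG pipeline instrumented by OpenLLMetry.
# Assumes OPENAI_API_KEY and TRACELOOP_API_KEY are set in the environment.
import chromadb
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.chroma import ChromaVectorStore
from traceloop.sdk import Traceloop

Traceloop.init(app_name="chroma_rag_pipeline")

# Create (or open) a Chroma collection to hold the document embeddings.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("docs")

# Index the documents into Chroma through LlamaIndex.
documents = SimpleDirectoryReader("./data").load_data()
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Each query now produces a trace: the Chroma retrieval, the full OpenAI
# prompt, and the model's response show up as spans in Traceloop.
query_engine = index.as_query_engine()
response = query_engine.query("What does the documentation say about tracing?")
print(response)
```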
Trace a LlamaIndex-based Agent
Build an agent with LlamaIndex. See everything your agent does, including calls to tools, HTTP requests, and database calls, as full traces.
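A minimal sketch of a traced agent, assuming a 0.10-style LlamaIndex API with the `llama-index-llms-openai` package installed; the `multiply` tool and model name are placeholders for your own tools and LLM:

```python
# A sketch of a traced LlamaIndex agent. The multiply tool stands in for
# whatever tools (HTTP clients, database calls, ...) your agent uses.
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="llama_index_agent")


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)
llm = OpenAI(model="gpt-4o-mini")

# Every reasoning step, tool invocation, and LLM call the agent makes
# is captured as a span in the resulting trace.
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
print(agent.chat("What is 21 multiplied by 2?"))
```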