Observability for OpenAI with Splunk

OpenLLMetry lets you trace your OpenAI prompt and embedding calls. With about five minutes of work you can get a complete view of your system directly in Splunk. See how below.

Step 1

Install the Traceloop SDK and initialize it. It will automatically log all calls to OpenAI, with prompts and completions as separate spans.
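A minimal sketch of the setup, assuming the `traceloop-sdk` package and an application name of your choice:

```python
# Install the SDK and the OpenAI client first:
#   pip install traceloop-sdk openai

from traceloop.sdk import Traceloop

# Initialize once at application startup. OpenAI calls made afterwards
# are instrumented automatically, including prompts and completions.
Traceloop.init(app_name="my-llm-app")
```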

Step 2

Configure an OpenTelemetry Collector to receive traces and export them to Splunk Observability Cloud, then point OpenLLMetry at that collector.
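One way to point OpenLLMetry at your collector is to override the SDK's export endpoint before initializing it. The endpoint below is a placeholder; use whatever address your Splunk-configured collector listens on:

```python
import os

# Send traces to your collector's OTLP endpoint instead of the default
# Traceloop backend. Replace the URL with your collector's address.
os.environ["TRACELOOP_BASE_URL"] = "http://localhost:4318"

from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-app")
```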

Discover use cases

Trace prompts and completions

Call OpenAI and see the prompts, completions, and token usage for each call.
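A short sketch of a traced chat completion; the model name is just an example:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-llm-app")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is captured as a span carrying the prompt, the completion,
# and token usage, which your collector forwards to Splunk.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a joke about observability"}],
)
print(response.choices[0].message.content)
```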

Trace your RAG retrieval pipeline

Build a RAG pipeline with Chroma and OpenAI. See the vectors returned from Chroma, the full prompt sent to OpenAI, and the responses it generates.
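A minimal sketch of such a pipeline, assuming the `chromadb` package, an in-memory collection, and placeholder documents and model name:

```python
import chromadb
from openai import OpenAI
from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-rag-app")

# Index a few documents in an in-memory Chroma collection.
chroma = chromadb.Client()
collection = chroma.create_collection(name="docs")
collection.add(
    ids=["1", "2"],
    documents=[
        "OpenLLMetry traces LLM calls with OpenTelemetry.",
        "Splunk Observability Cloud ingests OTLP traces.",
    ],
)

# Retrieve context for the question; the retrieval shows up as its own span.
question = "How do I get LLM traces into Splunk?"
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])

# The chat call is traced as well, with the full prompt and the response.
openai_client = OpenAI()
answer = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```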