# Observability for Ollama with Splunk
OpenLLMetry lets you trace any model running locally or remotely on Ollama. With five minutes of work, you can get a complete view of your system directly in Splunk. See how below.
## Step 1
Install the Traceloop SDK and initialize it. It will automatically log all calls to Ollama, whether you're running a model locally or remotely, with prompts and completions as separate spans.
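A minimal setup sketch is shown below. It assumes you have installed the SDK (`pip install traceloop-sdk ollama`), that an Ollama server is reachable, and that your Splunk environment exposes an OTLP-compatible trace endpoint; the endpoint URL and token below are placeholders you would replace with your own values.

```python
import os

# Hypothetical values -- point these at your own Splunk OTLP endpoint and token.
os.environ["TRACELOOP_BASE_URL"] = "https://your-splunk-otlp-endpoint:4318"
os.environ["TRACELOOP_HEADERS"] = "X-SF-Token=<your-access-token>"

from traceloop.sdk import Traceloop

# Initialize once at application startup; this auto-instruments Ollama calls.
Traceloop.init(app_name="ollama-demo")

import ollama

# Any subsequent call is traced, with the prompt and completion on separate spans.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```

After this runs, traces for each Ollama call should appear in Splunk under the `ollama-demo` service name.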