Introduction
Monitor, debug, and test the quality of your LLM outputs
Traceloop automatically monitors the quality of your LLM outputs. It helps you debug and test changes to your models and prompts.
- Get real-time alerts about your model’s quality
- Trace the execution of every request
- Gradually roll out changes to models and prompts
- Debug and re-run production issues in your IDE
Need help using Traceloop? Ping us at [email protected]
Get Started with the OpenLLMetry SDK or Traceloop Hub
Traceloop uses OpenTelemetry to monitor and trace your LLM application. You can install the OpenLLMetry SDK in your application, or use Traceloop Hub as a smart proxy for all your LLM calls.
To get started, pick the language you are using and follow the instructions.
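As a rough sketch of the SDK route in Python, setup comes down to installing the package and exporting your API key. This assumes the package is published as `traceloop-sdk` and that the SDK reads the key from a `TRACELOOP_API_KEY` environment variable; the key value below is a placeholder.

```shell
# Install the OpenLLMetry SDK for Python (assumed package name: traceloop-sdk)
pip install traceloop-sdk

# Point the SDK at your Traceloop account (assumed env var; value is a placeholder)
export TRACELOOP_API_KEY="<your-api-key>"
```

With the SDK route, instrumentation happens in-process via OpenTelemetry; with the Hub route, your LLM traffic is routed through the proxy instead, so no in-process SDK is required.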