Getting Started with Traceloop Hub
Set up Hub as a smart proxy to all your LLM calls.
Hub is a next-generation smart proxy for LLM applications. It centralizes control and tracing of all your LLM calls in one place. It's built in Rust, so it's fast and efficient, and it's completely open source and free to use.
Installation
Local
- Clone the repo.
- Copy the `config-example.yaml` file to `config.yaml` and set the correct values (see below for more information).
- Run the hub with `cargo run` in the root directory.
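Putting these steps together, a local setup might look like the following shell sketch. The repository URL is an assumption (presumed to match the `traceloop/hub` Docker image name); substitute the actual URL if it differs:

```bash
# Assumed repo location; substitute the actual URL if it differs
git clone https://github.com/traceloop/hub.git
cd hub

# Create your config from the provided example, then edit in your values
cp config-example.yaml config.yaml

# Build and run the hub from the root directory
cargo run
```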
With Docker
Traceloop Hub is available as a Docker image named `traceloop/hub`. Make sure to create a `config.yaml` file following the configuration instructions.
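A run command might look like the sketch below. The listening port (3000) and the in-container config path (`/etc/hub/config.yaml`) are assumptions, not confirmed defaults; check the image documentation for the exact values:

```bash
# Mount the local config.yaml into the container; the in-container path
# and the port mapping are assumptions, so adjust them to the image docs
docker run --rm \
  -p 3000:3000 \
  -v "$(pwd)/config.yaml:/etc/hub/config.yaml" \
  traceloop/hub
```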
Connecting to Hub
After running the hub and configuring it, you can start using it to invoke available LLM providers. Its API is the standard OpenAI API, so you can use it as a drop-in replacement for your LLM calls.
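For instance, with the official OpenAI Python SDK you only need to repoint the base URL at Hub. A minimal sketch follows; the host, port, and `/api/v1` path are assumptions that depend on your Hub deployment, and `gpt-4o` stands in for whatever model key your config defines:

```python
from openai import OpenAI

# Point the SDK at Hub instead of api.openai.com; host, port, and the
# /api/v1 path are assumptions that depend on your Hub deployment
client = OpenAI(
    base_url="http://localhost:3000/api/v1",
    api_key="unused",  # placeholder: Hub holds the real provider keys
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model key defined in your config.yaml
    messages=[{"role": "user", "content": "Hello from Hub!"}],
)
print(response.choices[0].message.content)
```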
You can invoke different pipelines by passing the `x-traceloop-pipeline` header. If none is specified, the default pipeline will be used.
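Continuing the Python sketch above, a per-request header can be attached through the SDK's `extra_headers` option; the pipeline name `dev` here is hypothetical:

```python
# Route this call through a specific pipeline; "dev" is a hypothetical
# pipeline name defined in your Hub configuration
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Which pipeline handled me?"}],
    extra_headers={"x-traceloop-pipeline": "dev"},
)
```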