This is still in beta. Give us feedback at [email protected]

1. Install the SDK

Run the following command in your terminal:
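
gem install traceloop-sdk   # gem name inferred from the require "traceloop/sdk" used below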

In your LLM app, initialize the Traceloop tracer like this:

If you’re using Rails, put this initialization code in config/initializers/traceloop.rb.

require "traceloop/sdk"

traceloop = Traceloop::SDK::Traceloop.new

2. Log your prompts

For now, we don’t automatically instrument libraries on Ruby, as we do for Python and JavaScript. This will change in future versions.

This means that you’ll need to manually log your prompts and completions.

require "openai"

client = OpenAI::Client.new

# This tracks the latency of the call and the response
traceloop.workflow("joke_generator") do
  traceloop.llm_call(provider: "openai", model: "gpt-3.5-turbo") do |tracer|
    # Log the prompt
    tracer.log_prompt(user_prompt: "Tell me a joke about OpenTelemetry")

    # Or use the OpenAI Format
    # tracer.log_messages([{ role: "user", content: "Tell me a joke about OpenTelemetry" }])

    # Call OpenAI like you normally would
    response = client.chat(
      parameters: {
        model: "gpt-3.5-turbo",
        messages: [{ role: "user", content: "Tell me a joke about OpenTelemetry" }]
      })

    # Pass the response from OpenAI as-is to log the completion and token usage
    tracer.log_response(response)
  end
end

3. Configure trace exporting

Lastly, you’ll need to configure where your traces are exported. Two environment variables control this: TRACELOOP_API_KEY and TRACELOOP_BASE_URL.
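
For example, to export to a self-hosted OpenTelemetry collector instead of Traceloop Cloud, you could set something like this (a sketch, assuming a collector listening on its default OTLP/HTTP port):

export TRACELOOP_BASE_URL="http://localhost:4318"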

For Traceloop, read on. For other options, see Exporting.

Using Traceloop Cloud

Go to Traceloop and create a new account. Then click Environments in the left-hand navigation bar, or go directly to https://app.traceloop.com/settings/api-keys. Click Generate API Key to generate an API key for the development environment, and click Copy API Key to copy it over.

Make sure to copy it as it won’t be shown again.

Set the copied API key as an environment variable named TRACELOOP_API_KEY in your app.
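
For example, in your shell (the key value is a placeholder):

export TRACELOOP_API_KEY="<your-api-key>"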

Done! You’ll get instant visibility into everything that’s happening with your LLM. If you’re calling a vector DB, or any other external service or database, you’ll also see it in the Traceloop dashboard.