We’ve all been there. You followed all the instructions, but you’re not seeing any traces. Let’s fix this.

1. Disable batch sending

Sending traces in batches is useful in production, but it can be confusing when you’re working locally: the batch processor buffers spans and exports them periodically, so traces from a short-lived run may show up late or get lost if the process exits first. Make sure you’ve disabled batch sending.
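
In the TypeScript SDK this is typically the disableBatch option (a minimal sketch; the option name may differ in other SDKs or versions):

import * as traceloop from "@traceloop/traceloop";

// Export each span immediately instead of buffering it in a batch,
// so traces show up right away during local development.
traceloop.initialize({
  appName: "app",
  disableBatch: true,
});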

2. Check the logs

When Traceloop initializes, it logs a message to the console specifying the endpoint it exports to. If you don’t see a line like the following, you’re probably not initializing the SDK properly.

Traceloop exporting traces to https://api.traceloop.com

3. (TS/JS only) Fix known instrumentation issues

If you’re using TypeScript or JavaScript, make sure to import traceloop before any other LLM libraries. Traceloop instruments those libraries by patching them as they load, and it can only do that if it’s imported first.

// Import the Traceloop SDK first so it can patch the LLM libraries below
import * as traceloop from "@traceloop/traceloop";
import OpenAI from "openai";
...

If that doesn’t work, you may need to manually instrument the libraries you’re using. See the manual instrumentation guide for more details.

// Pass explicit references to the modules you want instrumented
import * as traceloop from "@traceloop/traceloop";
import OpenAI from "openai";
import * as LlamaIndex from "llamaindex";

traceloop.initialize({
  appName: "app",
  instrumentModules: {
    openAI: OpenAI,
    llamaIndex: LlamaIndex,
    // Add or omit other modules you'd like to instrument
  },
});
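
Passing module references explicitly this way also sidesteps import-order issues, since the SDK patches the exact objects you hand it.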

4. Is your library supported yet?

Check the OpenLLMetry and OpenLLMetry-JS README files to see which libraries and versions are currently supported. Contributions are always welcome! If you want to add support for a library, please open a PR.

5. Try outputting traces to the console

Configure the SDK with a ConsoleExporter and check whether traces are printed to the console, as in the sketch below.
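
A minimal sketch, assuming the SDK accepts any OpenTelemetry SpanExporter through its exporter option:

import * as traceloop from "@traceloop/traceloop";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-base";

// Print spans to stdout instead of sending them to a backend.
traceloop.initialize({
  appName: "app",
  disableBatch: true,
  exporter: new ConsoleSpanExporter(),
});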

If you see traces in the console, instrumentation is working, so the problem is most likely your exporter configuration. Check the integration guide again, and make sure you’re using the right endpoint and API key.

6. Talk to us!

We’re here to help. Reach out any time over Slack or email, and we’d love to assist you.