When Traceloop initializes, it logs a message to the console, specifying the endpoint that it uses.
If you don’t see that, you might not be initializing the SDK properly.
```
Traceloop exporting traces to https://api.traceloop.com
```
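As a quick sanity check, a minimal initialization in TypeScript looks roughly like the sketch below (the appName value is illustrative, and the API key is assumed to come from the TRACELOOP_API_KEY environment variable); the log line above should appear shortly after initialize runs:

```typescript
// Minimal sketch: initialize Traceloop before any LLM calls are made.
// Assumes the API key is supplied via the TRACELOOP_API_KEY environment variable.
import * as traceloop from "@traceloop/traceloop";

traceloop.initialize({ appName: "app" });
```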
If you’re using TypeScript or JavaScript, make sure to import traceloop before any other LLM libraries.
This is because traceloop needs to instrument the libraries you’re using, and it can only do that if it’s imported first.
```typescript
import * as traceloop from "@traceloop/traceloop";
import OpenAI from "openai";
...
```
If that doesn’t work, you may need to manually instrument the libraries you’re using.
See the manual instrumentation guide for more details.
```typescript
import * as traceloop from "@traceloop/traceloop";
import OpenAI from "openai";
import * as LlamaIndex from "llamaindex";

traceloop.initialize({
  appName: "app",
  instrumentModules: {
    openAI: OpenAI,
    llamaIndex: LlamaIndex,
    // Add or omit other modules you'd like to instrument
  },
});
```
Check out the OpenLLMetry or OpenLLMetry-JS README files to see which libraries and versions are currently supported.
Contributions are always welcome! If you want to add support for a library, please open a PR.
Use the ConsoleExporter and check if you see traces in the console.
```python
from traceloop.sdk import Traceloop
from opentelemetry.sdk.trace.export import ConsoleSpanExporter

Traceloop.init(exporter=ConsoleSpanExporter())
```
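If you’re on the TypeScript/JavaScript SDK, the equivalent check is a sketch along these lines; it assumes the initialize options accept an exporter instance, so verify that against your SDK version:

```typescript
// Sketch only: route spans to the console instead of the Traceloop endpoint.
// The `exporter` option is an assumption; check your SDK version's initialize options.
import * as traceloop from "@traceloop/traceloop";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-base";

traceloop.initialize({
  appName: "app",
  exporter: new ConsoleSpanExporter(),
});
```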
If you see traces in the console, then you probably haven’t configured the exporter properly.
Check the integration guide again, and make sure you’re using the right endpoint and API key.
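For example, the endpoint and API key can be passed to the SDK explicitly rather than through the TRACELOOP_BASE_URL and TRACELOOP_API_KEY environment variables; the baseUrl and apiKey option names in this sketch are assumptions, so verify them against your SDK version:

```typescript
// Sketch: configure the Traceloop endpoint and API key explicitly.
// The baseUrl/apiKey option names are assumptions; verify against your SDK version.
import * as traceloop from "@traceloop/traceloop";

traceloop.initialize({
  appName: "app",
  baseUrl: "https://api.traceloop.com",
  apiKey: process.env.TRACELOOP_API_KEY,
});
```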