Quickstart
Get started with Laminar Tracing in 2 minutes.
Getting Started in 2 Minutes
1. Get Your Project API Key
To get your Project API Key, navigate to your project settings page in the Laminar dashboard and create a new project API key.
Next, you’ll need to set this key as an environment variable in your project. Create a .env file in the root of your project (if you don’t have one already) and add the following line:
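For example (assuming the SDK reads the key from the LMNR_PROJECT_API_KEY environment variable):

```
LMNR_PROJECT_API_KEY=your_project_api_key_here
```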
Replace your_project_api_key_here with the actual key you copied.
2. Initialize Laminar in Your Application
Adding just two lines to your application enables comprehensive tracing:
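For example, in Python (a minimal sketch using the lmnr package; it assumes Laminar.initialize picks up LMNR_PROJECT_API_KEY from the environment):

```python
from lmnr import Laminar

# Initialize once at application startup. Reads LMNR_PROJECT_API_KEY from
# the environment; you can also pass project_api_key=... explicitly.
Laminar.initialize()
```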
Laminar should be initialized once in your application. This could be at the server startup, or in the entry point of your application.
This will automatically instrument all major LLM provider SDKs, LLM frameworks including LangChain and LlamaIndex, and calls to vector databases.
For Node.js setups, you need to manually pass the modules you want to instrument, such as OpenAI. See the section on manual instrumentation.
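A sketch of what this looks like (assuming the @lmnr-ai/lmnr initializer accepts an instrumentModules map; treat the exact option name as an assumption and confirm it against the manual instrumentation docs):

```typescript
import { Laminar } from '@lmnr-ai/lmnr';
import OpenAI from 'openai';

Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY,
  // Pass the modules you want instrumented (assumed option name).
  instrumentModules: { OpenAI },
});
```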
For more information, refer to the instrumentation docs.
3. That’s it! Your LLM API Calls Are Now Traced
Once initialized, Laminar automatically traces LLM API calls. For example, after initialization, this standard OpenAI call:
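For example (a sketch using the OpenAI Python SDK; the model and prompt are placeholders):

```python
from openai import OpenAI

client = OpenAI()  # uses OPENAI_API_KEY from the environment

# A completely standard OpenAI call -- no Laminar-specific code needed here.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)
```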
will automatically create a span in your Laminar dashboard.
Laminar automatically captures important LLM metrics including latency, token usage, and cost calculations based on the specific model used.
Tracing Custom Functions
Beyond automatic LLM tracing, you can instrument specific functions by wrapping them in observe() (or, in Python, by adding the @observe() decorator). This is especially helpful when you want to trace particular functions, or to group separate functions into a single trace.
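For example, in Python (a sketch; my_function is a hypothetical name, and the OpenAI call inside it is the one traced automatically):

```python
from lmnr import Laminar, observe
from openai import OpenAI

Laminar.initialize()  # assumes LMNR_PROJECT_API_KEY is set
client = OpenAI()

@observe()  # creates a span named after the function
def my_function(topic: str) -> str:
    # This call produces a child span nested under my_function's span.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Tell me about {topic}"}],
    )
    return response.choices[0].message.content

my_function("observability")
```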
We are now recording my_function and the OpenAI call, which is nested inside it, in the same trace. Notice that the OpenAI span is a child of my_function. Parent-child relationships are automatically detected and visualized as a tree hierarchy.
You can nest as many spans as you want inside each other. By observing both your functions and the LLM/vector DB calls, you get a clearer view of the execution flow, which is useful for debugging and for understanding how your application behaves.
Input arguments to the function are automatically recorded as inputs of the span. The return value is automatically recorded as the output of the span.
Passing arguments to the function in TypeScript is slightly non-obvious. Example:
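A TypeScript sketch (assuming the observe wrapper from @lmnr-ai/lmnr, where the wrapped function's arguments are passed to observe after the function itself rather than by calling the wrapped function directly):

```typescript
import { observe } from '@lmnr-ai/lmnr';

// Arguments for the wrapped function go after the function itself,
// as trailing arguments to observe().
const result = await observe(
  { name: 'myFunction' },
  async (a: string, b: number) => {
    return `${a}:${b}`;
  },
  'input', // becomes argument `a`
  42,      // becomes argument `b`
);
```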
Next Steps
- Explore our integrations to see how Laminar works with your favorite tools.
- Continue to Trace Structure to learn more about adding structure to your traces.
- Explore Browser agent observability to trace browser sessions and agent execution steps.
- If you want to get into details on OpenTelemetry, check out the in-depth OpenTelemetry guide.