Glossary

  • Span – a unit of work representing a single operation in your application. A span appears as a single “block” on the “waterfall” view of a trace.
  • Trace – a collection of spans involved in processing a request in your LLM application. It consists of one or more nested spans. The root span is the first span in a trace and marks the beginning and end of the trace. A trace holds its spans and metadata aggregated from them.
  • Event – a key-value pair of data with a timestamp, representing an event within your application. An event must occur within a span.
  • Session – a collection of traces that served the same user or the same interaction.

Concept

Laminar offers comprehensive tracing and analytics for your entire application. The full execution trace is logged for every run, so the information available in the logs includes, but is not limited to:

  • Total execution time
  • Total execution tokens and cost
  • Span-level execution time and token counts
  • Inputs and outputs of each span

Getting started

Prerequisites

Make sure to install the package and get your API key from the Laminar dashboard. Read more in Installation.
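
For example, assuming the PyPI package name matches the `lmnr` import used below, installation and key setup look roughly like:

```shell
# Install the Laminar Python SDK (package name assumed to match the import)
pip install lmnr

# Expose the project API key from the Laminar dashboard to your application
export LMNR_PROJECT_API_KEY="<your-project-api-key>"
```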

Annotate your code

Below is an example of how to initialize Laminar and instrument your code.

You can instrument your code by adding the @observe() decorator to your functions and setting the instruments parameter to the libraries you want to be automatically instrumented.

import os
from openai import OpenAI

from lmnr import observe, Laminar as L, Instruments

L.initialize(
    project_api_key=os.environ["LMNR_PROJECT_API_KEY"],
    instruments={Instruments.OPENAI},
)
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@observe()  # annotate all functions you want to trace
def poem_writer(topic="turbulence"):
    prompt = f"write a poem about {topic}"

    # OpenAI calls are automatically instrumented because `Instruments.OPENAI` was passed to `initialize`
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    )

    poem = response.choices[0].message.content

    return poem

if __name__ == "__main__":
    print(poem_writer(topic="laminar flow"))

Learn more about instrumenting your code by checking our Python SDK and JavaScript/TypeScript SDK.

Accessing traces

  1. Go to the traces page from the navbar on the left side of the page.
  2. Click on each row to see the detailed breakdown and waterfall of each trace in the sidebar.
  3. Click “Filter” and filter by the required criteria.
Example traces page

Viewing more details

Simply click on any of the rows in the logs page and you will see the details in the side panel.

Click on each span to see its details, including inputs, outputs and metadata, as well as associated events.

OpenTelemetry compatibility

Laminar is compatible with OpenTelemetry tracing via OpenLLMetry.

This means that you can use OpenTelemetry SDKs to send traces to Laminar, and they will be displayed in the Laminar UI.

To get started, in your application, set the OpenTelemetry exporter to the Laminar gRPC endpoint: https://api.lmnr.ai:8443/v1/traces.
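
As a minimal sketch with the Python OpenTelemetry SDK (the `authorization` header name and bearer-token format are assumptions — check the Laminar docs for the exact authentication scheme):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Export spans over gRPC to Laminar's OTLP endpoint
exporter = OTLPSpanExporter(
    endpoint="https://api.lmnr.ai:8443/v1/traces",
    headers={"authorization": "Bearer <LMNR_PROJECT_API_KEY>"},  # assumed auth header
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Spans created through this tracer are now shipped to Laminar
tracer = trace.get_tracer("my-app")
with tracer.start_as_current_span("example-operation"):
    pass
```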

Read on to the Next section to learn more about the OpenTelemetry objects and attributes that Laminar uses.