Manual code instrumentation and best practices for tracing with Laminar
- `lmnr.span.type` – must be set to `'LLM'`
- `gen_ai.response.model` – must be set to the model name returned by an LLM API (e.g. `gpt-4o-mini`)
- `gen_ai.system` – must be set to the provider name (e.g. `openai`, `anthropic`)
- `gen_ai.usage.input_tokens` – number of tokens used in the input
- `gen_ai.usage.output_tokens` – number of tokens used in the output
- `llm.usage.total_tokens` – total number of tokens used in the call
- `gen_ai.usage.input_cost`, `gen_ai.usage.output_cost`, `gen_ai.usage.cost` – can all be reported explicitly. However, Laminar calculates the cost for the major providers from the values of `gen_ai.usage.input_tokens`, `gen_ai.usage.output_tokens`, and `gen_ai.response.model`.
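Collected in one place, the attributes above for a manually instrumented LLM call might look like this (the model name and token counts are illustrative values, not defaults):

```python
input_tokens, output_tokens = 150, 30

# Attribute names are from the list above; values are illustrative.
llm_span_attributes = {
    "lmnr.span.type": "LLM",                 # required for LLM spans
    "gen_ai.system": "openai",               # provider name
    "gen_ai.response.model": "gpt-4o-mini",  # model name from the API response
    "gen_ai.usage.input_tokens": input_tokens,
    "gen_ai.usage.output_tokens": output_tokens,
    "llm.usage.total_tokens": input_tokens + output_tokens,
}
```

With the model name and token counts present, Laminar can derive the cost for supported providers, so the explicit `gen_ai.usage.*cost` attributes are optional.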
For the list of supported providers and models, see this section.

Use `observe` to create a span and `Laminar.setSpanAttributes` to set its attributes. Use `Laminar.start_as_current_span` if you want to record a chunk of your code using a `with` statement.
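As a minimal sketch of the decorator approach, assuming the `lmnr` package is installed (the import is kept inside a factory function so the sketch loads without it):

```python
def make_traced_double():
    # `lmnr` is assumed to be installed; importing inside the function
    # keeps this sketch loadable without the package.
    from lmnr import observe

    @observe()  # creates a span around each call to the function
    def double(x: int) -> int:
        return x * 2

    return double
```

Every call to the returned function is then recorded as a span with its arguments as the input and its return value as the output.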
`Laminar.start_as_current_span` is a context manager that creates a new span and sets it as the current span. Under the hood it uses the bare OpenTelemetry `start_as_current_span` method, but it also sets some Laminar-specific attributes.
Parameters:

- `name` (`str`): name of the span.
- `input` (`Any`): input to the span. It will be serialized to JSON and recorded as the span input.
- `span_type` (`Literal['DEFAULT'] | Literal['LLM']`): type of the span. If not specified, it will be `'DEFAULT'`.

Returns:

- `Span`: the span object. It is a bare OpenTelemetry span, so you can use it to set attributes.
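A minimal sketch of the `with` usage, assuming the `lmnr` package is installed (the import is inside the function so the sketch loads without it; the span name and work done are placeholders):

```python
def answer(question: str) -> str:
    # `lmnr` is assumed to be installed.
    from lmnr import Laminar

    with Laminar.start_as_current_span(
        name="answer",       # span name
        input=question,      # serialized to JSON and recorded as span input
        span_type="DEFAULT",
    ):
        result = question.upper()        # stand-in for real work
        Laminar.set_span_output(result)  # record the span output
        return result
```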
We still recommend using `Laminar.set_span_output` and `Laminar.set_span_attributes` to set the output and attributes, unless you want to set some custom attributes.

In JavaScript/TypeScript, use `Laminar.startSpan()` and `Laminar.withSpan()`.
`Laminar.withSpan()` is a thin convenience wrapper around `opentelemetry.context.with`. The `Span` type can be imported from `@lmnr-ai/lmnr` or directly from `@opentelemetry/api`.

Pass `true` as the third argument (`endOnExit`) to the last `Laminar.withSpan` call. This will create `outer` as the top level span, where any spans created with `use_span`/`Laminar.withSpan` will be children of `outer`.
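The original example's code is not reproduced here; as a sketch, the equivalent parent–child pattern with OpenTelemetry's Python `use_span` (assuming `opentelemetry-api` is installed; the span names mirror the example):

```python
def run_pipeline():
    # Assumes `opentelemetry-api` is installed; the import is inside
    # the function so this sketch loads without it.
    from opentelemetry import trace

    tracer = trace.get_tracer(__name__)
    outer = tracer.start_span("outer")  # top level span, ended manually below

    # Enter `outer`'s context without ending it, so `foo` becomes its child.
    with trace.use_span(outer, end_on_exit=False):
        with tracer.start_as_current_span("foo"):
            pass  # e.g. an auto-instrumented OpenAI call would happen here

    # Re-enter `outer` for `bar`, and end `outer` on exit this time.
    with trace.use_span(outer, end_on_exit=True):
        with tracer.start_as_current_span("bar"):
            pass  # a custom span created manually
```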
In this example, `outer` is the top level span, `foo` called OpenAI (which got auto-instrumented), and `bar` is a custom span that we created manually.
`LaminarSpanContext` is simply a typed record, so you can pass its stringified JSON representation. You can use `Laminar.serializeLaminarSpanContext` to create and serialize the span context. `serializeLaminarSpanContext` optionally takes a span object and returns a string; if no span is provided, it will use the current active span.

This will record `firstHandler` as the top level span, and `secondHandler` as a child span.

Set the `gen_ai.system` attribute manually to one of the following values. For the model names, we use the names from the provider's API.
| Provider | Names | Example model name | Link to other model names |
|---|---|---|---|
| Anthropic | `anthropic` | `claude-3-5-sonnet`, `claude-3-5-sonnet-20241022`, or `claude-3-5-sonnet-20241022-v2:0` | docs.anthropic.com |
| Azure OpenAI | `azure-openai` | `gpt-4o-mini` or `gpt-4o-mini-2024-07-18` | learn.microsoft.com |
| Bedrock Anthropic | `bedrock-anthropic` | `claude-3-5-sonnet` or `claude-3-5-sonnet-20241022-v2:0` | docs.aws.amazon.com |
| Gemini | `gemini`, `google-genai` | `models/gemini-1.5-pro` | ai.google.dev |
| Groq | `groq` | `llama-3.1-70b-versatile` | console.groq.com |
| Mistral | `mistral` | `mistral-large-2407` | docs.mistral.ai |
| OpenAI | `openai` | `gpt-4o` or `gpt-4o-2024-11-20` | platform.openai.com |
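For a provider not covered by automatic cost calculation, the cost attributes mentioned earlier can be reported explicitly. A sketch with hypothetical per-token prices (substitute your provider's actual rates):

```python
input_tokens, output_tokens = 1200, 300

# Hypothetical prices in USD per 1M tokens; not real provider rates.
input_price, output_price = 0.15, 0.60

cost_attributes = {
    "gen_ai.usage.input_cost": input_tokens / 1_000_000 * input_price,
    "gen_ai.usage.output_cost": output_tokens / 1_000_000 * output_price,
}
# Total cost is the sum of the two.
cost_attributes["gen_ai.usage.cost"] = (
    cost_attributes["gen_ai.usage.input_cost"]
    + cost_attributes["gen_ai.usage.output_cost"]
)
```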