Troubleshooting
Troubleshooting common issues with Laminar tracing
My JS auto-instrumentation is not working.
Problem
I have instrumented my JavaScript code with Laminar, and I can see the manual spans in the Laminar UI, but I don’t see the auto-instrumented spans.
Cause
Due to how JavaScript bundles and resolves modules, in some runtimes the LLM modules need to be imported after Laminar's initialization so that the auto-instrumentation can patch them.
Solution
There are two options:
- Importing the LLM modules after Laminar initialization. This may not always work.
- [Recommended] Passing the imported modules via the `instrumentModules` option.
Option 1. Importing the LLM modules after Laminar initialization
This option is good for quick testing, but is against linting best practices.
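A minimal sketch of this approach, assuming an ESM entry point and the OpenAI client as the library being instrumented (the client choice is illustrative):

```typescript
import { Laminar } from '@lmnr-ai/lmnr';

// Initialize Laminar before any LLM client module is loaded.
Laminar.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });

// Import the LLM module only after initialization, e.g. via a dynamic import.
const { OpenAI } = await import('openai');

const openai = new OpenAI();
await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }],
});
```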
Option 2. Passing the imported modules via the `instrumentModules` option
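A sketch of the recommended setup, assuming OpenAI and Anthropic as the instrumented clients; the exact keys accepted by `instrumentModules` are listed in its doc comment:

```typescript
import { Laminar } from '@lmnr-ai/lmnr';
import { OpenAI } from 'openai';
import * as anthropic from '@anthropic-ai/sdk';

// Pass the imported modules explicitly so Laminar can patch them,
// regardless of import order or bundler behavior.
Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY,
  instrumentModules: {
    openAI: OpenAI,
    anthropic: anthropic,
  },
});
```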
For a full list of supported modules, see the `instrumentModules` doc comment or read this section of the instrumentation docs.
My JS traces are not showing up in the Laminar UI.
Problem
I have instrumented my JavaScript code with Laminar, but I don’t see any traces in the Laminar UI.
Cause
This often happens in Edge runtimes, Lambda functions, or in one-off scripts that are not running in a long-lived process.
JavaScript's OpenTelemetry batch span processor runs in a background async function, and if the process exits before that function has a chance to send the traces, they are lost. In theory, the pending spans could be force-flushed in `onShutdown`, but this is not implemented in the OpenTelemetry JS SDK; apparently, doing so may cause subsequently incoming spans to block the process.
See the originating commit, where `onShutdown` was only implemented for browser contexts, but not for Node.js.
Solution
Use Laminar's `shutdown()` function to ensure that the traces are sent before the process exits.
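A minimal sketch for a one-off script, assuming the `shutdown()` method on the `Laminar` class:

```typescript
import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });

async function main() {
  // ... your instrumented code ...
}

main()
  .catch(console.error)
  .finally(async () => {
    // Flush any pending spans before the process exits.
    await Laminar.shutdown();
  });
```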
My Python auto-instrumentation is not working.
Problem
I have instrumented my Python code with Laminar, and I can see the manual spans in the Laminar UI, but I don’t see the auto-instrumented spans.
Cause
Most likely, you have not installed the right extras that enable the auto-instrumentations.
Solution
Install the extras for the LLM providers you are using and wish to instrument.
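For instance, assuming you call OpenAI and Anthropic (the extra names here are an assumption; check the installation docs for the exact list):

```bash
pip install 'lmnr[anthropic,openai]'
```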
Read the installation docs for more information.
I can see traces and spans in the UI, but they are not showing inputs and/or outputs.
Problem
I see traces and spans in the Laminar UI, but they are not showing the inputs and/or outputs. They have attributes and duration, but no inputs or outputs.
Cause
One possible reason is that we send span inputs and outputs as span attributes, and OpenTelemetry supports a limited set of attribute types, namely `string`, `number`, `boolean`, and arrays of each. If the input or output is not one of these types, it will not be sent as an attribute.
To work around this, we serialize the input and output to JSON and send it as a string. We do our best to serialize the inputs and outputs, but the serialization may still fail in some cases.
Solution
Make sure inputs and outputs to your functions or spans are serializable to JSON. If you are using custom objects, make sure their fields are serializable.
In Python, that may mean implementing a `default()` method on some of them, or on the parent object.
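A minimal sketch of the safest approach, assuming you control the observed function: convert custom objects to plain JSON-serializable types before returning them (the class and field names here are illustrative):

```python
import dataclasses

from lmnr import observe


@dataclasses.dataclass
class SearchResult:
    title: str
    score: float


@observe()
def search(query: str) -> list[dict]:
    results = [SearchResult(title="hello", score=0.9)]
    # Return plain dicts instead of custom objects so the span output
    # serializes to JSON cleanly.
    return [dataclasses.asdict(r) for r in results]
```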
[Next.js] Many of my Next.js route requests are instrumented in the Laminar UI, even though I don’t want that.
Problem
I see many traces and spans in the UI, but most of them are just noise; they are merely route requests.
Some examples of noise spans:
- GET /my/route
- resolve page component
- executing api route (app) /my/route
- start response
Cause
Next.js telemetry sends numerous spans, which may seem noisy in Laminar’s dashboard.
Laminar ignores Next.js spans by default, but this behavior was only introduced in `@lmnr-ai/lmnr@0.4.29`.
Solution
Upgrade to `@lmnr-ai/lmnr@0.4.29` or higher.
If you still want to see Next.js spans, set `preserveNextjsSpans` to `true` in `Laminar.initialize()` after upgrading.
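A sketch of opting back in, assuming the option is passed directly to `Laminar.initialize()`:

```typescript
import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY,
  // Keep Next.js telemetry spans, which are ignored by default since 0.4.29.
  preserveNextjsSpans: true,
});
```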
Many of my JS route requests are instrumented in the Laminar UI, even though I don’t want that.
Problem
I see many traces and spans in the UI, but most of them are just noise; they are merely route requests.
Some examples of noise spans:
- GET /my/route
- POST /my/other/route
- dns.lookup
- middleware - query
- tcp - connect
- fs statSync
Cause
Even though Laminar only enables OpenLLMetry-specific instrumentations, i.e. just model calls, some other dependency in your code may have enabled other instrumentations for Node libraries. In particular, we have seen this issue when such initialization is done before Laminar’s initialization. For example:
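A sketch of one way this can happen, where another tracing setup registers the full set of OpenTelemetry Node auto-instrumentations before Laminar is initialized (the specific packages your dependency uses may differ):

```typescript
import { registerInstrumentations } from '@opentelemetry/instrumentation';
import { getNodeAutoInstrumentations } from '@opentelemetry/auto-instrumentations-node';
import { Laminar } from '@lmnr-ai/lmnr';

// Some other dependency, or your own tracing setup, registers every
// Node auto-instrumentation: http, fs, dns, undici, express, ...
registerInstrumentations({
  instrumentations: [getNodeAutoInstrumentations()],
});

// Laminar is initialized afterwards, so the extra spans above end up
// in the same trace pipeline and show up in the UI.
Laminar.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });
```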
Read more about Node.js auto-instrumentations in the OpenTelemetry documentation.
Solution
The most effective solution is to disable the unwanted instrumentations by exporting the `OTEL_NODE_DISABLED_INSTRUMENTATIONS` environment variable.
The noisiest ones are `fs`, `http`, `dns`, `undici` (GET/POST requests), and `express` (if you are using Express.js).
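For example, to disable the sources listed above (a sketch; trim the list to whatever you actually want to keep):

```bash
export OTEL_NODE_DISABLED_INSTRUMENTATIONS="fs,http,dns,undici,express"
```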
However, if you still want to use the basic instrumentations above, you could:
- Initialize Laminar first, and then the OpenTelemetry SDK.
- Explicitly enable only the instrumentations you need in the OpenTelemetry SDK.
Observe decorator is not working in Python with async generators, i.e. model stream responses
Problem
The `@observe` decorator in Python is not correctly capturing spans when I use async generators, e.g. when I am streaming responses from a model.
Cause
Async generators only end when they are exhausted, i.e. when the `async for` loop is done.
Given the nature of Python's async, the chance of interruptions within a stream, and the way OpenTelemetry's contexts work, there is no way to ensure that `@observe` works in 100% of cases.
Solution
One possible workaround is to observe only the outer function that calls the async generator and handles the final result, rather than the generator itself.
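A minimal sketch of this workaround, assuming an OpenAI-style streaming call (the client and model names are illustrative):

```python
from lmnr import Laminar, observe
from openai import AsyncOpenAI

Laminar.initialize(project_api_key="...")
client = AsyncOpenAI()


async def stream_completion(prompt: str):
    # The async generator itself is NOT decorated with @observe.
    stream = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    async for chunk in stream:
        yield chunk.choices[0].delta.content or ""


@observe()
async def answer(prompt: str) -> str:
    # Observe the outer function that exhausts the stream and
    # returns the final result.
    parts = [part async for part in stream_completion(prompt)]
    return "".join(parts)
```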
However, if you want to track partial stream results as well, you can create the spans manually. Refer to manually creating spans for more information.
ERR_BUFFER_OUT_OF_BOUNDS in Node.js
Problem
I see an `ERR_BUFFER_OUT_OF_BOUNDS` error in my Node.js application.
Cause
This is most likely due to a bug in Node.js versions 22.6 and 22.7 when handling UTF-8 buffers. See more in the Node.js issue.
Solution
Upgrade to Node.js version 22.8.0 or higher. You can also downgrade to 22.5.