ERR_BUFFER_OUT_OF_BOUNDS in Node.js

Problem

I see the following error in my Node.js application:

node:internal/buffer:1066
      throw new ERR_BUFFER_OUT_OF_BOUNDS('length');
      ^

Cause

This is most likely caused by a bug in Node.js versions 22.6 and 22.7 when handling UTF-8 buffers. See more in the Node.js issue.

Solution

Upgrade to Node.js version 22.8.0 or higher. You can also downgrade to 22.5.

My JS traces are not showing up in the Laminar UI.

Problem

I have instrumented my JavaScript code with Laminar, but I don’t see any traces in the Laminar UI.

Cause

This often happens in Edge runtimes, Lambda functions, or in one-off scripts that are not running in a long-lived process.

JavaScript’s OpenTelemetry batch span processor runs in a background async function, and if the process exits before that function has a chance to send the traces, they are lost. In theory, pending spans could be force flushed in onShutdown, but this is not implemented in the OpenTelemetry JS SDK, apparently because doing so may cause subsequent incoming spans to block the process.

See the originating commit, where onShutdown was implemented only for browser contexts, not for Node.js.

Solution

Use Laminar’s shutdown() function to ensure that the traces are sent before the process exits.

import { Laminar as L } from '@lmnr-ai/lmnr';

L.initialize({ projectApiKey: process.env.LMNR_PROJECT_API_KEY });

// your code here

// at the end of your script, when you are done tracing
await L.shutdown(); // NOTE: don't forget to await

I can see my trace in the Laminar UI, but it is not showing the spans.

Problem

Our UI relies on the presence of a “parent” span in every trace. If the parent span is missing, the trace will not be correctly displayed.

Cause

The parent span may not have been sent correctly from the SDK. This typically happens when an exception occurs before the parent span is properly ended.

Common causes:

- The observe() function is not awaited in JavaScript, so the parent span never ends.
- A manually created span in Python is not ended, for example because it was created outside a with statement and an exception occurred before it was closed.

If this happens elsewhere, especially in Python’s @observe decorator, it is likely a bug in the SDK. Please report it in our Discord server.

Solution

Make sure that the observe() function is awaited in JavaScript, and that the with statement is used in Python for manual span creation.
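
For the Python side, here is a minimal sketch of manual span creation. It assumes the Python SDK exposes Laminar.start_as_current_span as a context manager and a Laminar.set_span_output helper; the query.upper() call is just a stand-in for your own logic:

from lmnr import Laminar

Laminar.initialize(project_api_key="...")

def handle_request(query: str) -> str:
    # The with block ends the span even if an exception is raised inside it,
    # so the parent span is always sent.
    with Laminar.start_as_current_span(name="handle_request", input={"query": query}):
        result = query.upper()  # stand-in for your own logic
        Laminar.set_span_output(result)
        return result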

If you are sure that the spans are created correctly, but the issue persists, please report it in our Discord server.

Observe decorator is not working in Python with async generators, e.g. streaming model responses

Problem

The @observe decorator in Python does not correctly capture spans when I use async generators, e.g. when I am streaming responses from a model.

Cause

Async generators only end when they are exhausted, i.e. when the async for loop consuming them is done. Given the nature of Python’s async execution, the possibility of interruptions within the stream, and the way OpenTelemetry contexts work, there is no way to ensure that @observe works in 100% of cases.

Solution

One possible workaround is to observe only the function that consumes the async generator and handles the final result, rather than the generator itself.
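
As an illustration, a sketch of this workaround, assuming the Python SDK’s observe decorator; stream_completion is a hypothetical async generator standing in for a model stream:

from lmnr import Laminar, observe

Laminar.initialize(project_api_key="...")

async def stream_completion(prompt: str):
    # Hypothetical stand-in for a model streaming response.
    for chunk in ("Hello", ", ", "world"):
        yield chunk

@observe()  # observe the consumer, not the async generator itself
async def run_completion(prompt: str) -> str:
    chunks = []
    async for chunk in stream_completion(prompt):
        chunks.append(chunk)
    # The span ends here, once the full result has been assembled.
    return "".join(chunks)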

However, if you want to track partial stream results as well, you can create the spans manually.

Refer to manually creating spans for more information.
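
As a rough sketch of that approach, again assuming Laminar.start_as_current_span and Laminar.set_span_output from the Python SDK (stream_completion is the same hypothetical generator as above):

from lmnr import Laminar

async def run_completion_manual(prompt: str) -> str:
    chunks = []
    # The span stays open for the whole lifetime of the stream, so you decide
    # what to record and when it ends.
    with Laminar.start_as_current_span(name="stream_completion", input={"prompt": prompt}):
        async for chunk in stream_completion(prompt):
            chunks.append(chunk)
        result = "".join(chunks)
        Laminar.set_span_output(result)
    return result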