Troubleshooting non-Laminar SDKs
Troubleshooting for using Laminar as the backend for OpenTelemetry.
This page is relevant for users who are not using Laminar’s SDK for tracing but are sending their OpenTelemetry traces to Laminar directly. For a getting started guide, please refer to the OpenTelemetry page.
[Node.js / TypeScript] Error: 16 UNAUTHENTICATED: Failed to authenticate request
Problem
I am using the OpenTelemetry Node.js SDK and I see the following error:
I have checked that my LMNR_PROJECT_API_KEY is correct, and I set the headers when initializing the exporter.
Cause
One common cause is that the authorization header on the gRPC exporter is set as a raw HTTP header instead of as a gRPC metadata header. Read the authorization section for more information.
Solution
Make sure that the authorization header is passed to the exporter as part of the metadata object.
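As a rough sketch (assuming the gRPC endpoint https://api.lmnr.ai:8443 and the Bearer scheme; check the OpenTelemetry page for the current values), the metadata object can be built with @grpc/grpc-js and handed to the exporter like this:

```typescript
import { Metadata } from '@grpc/grpc-js';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-grpc';

// Put the authorization entry into gRPC metadata rather than a raw HTTP header.
const metadata = new Metadata();
metadata.set('authorization', `Bearer ${process.env.LMNR_PROJECT_API_KEY}`);

const exporter = new OTLPTraceExporter({
  url: 'https://api.lmnr.ai:8443', // assumed gRPC endpoint; see the OpenTelemetry page
  metadata,
});
```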
[Node.js / TypeScript] Error: Parse Error: Expected HTTP/
Problem
I am using the OpenTelemetry Node.js SDK and I see the following error:
Cause
This error occurs when the HTTP exporter sends traces to the gRPC endpoint. The error message reflects the mismatch in HTTP versions: gRPC uses HTTP/2.0, while the HTTP exporter uses HTTP/1.1.
Solution
Make sure that you are importing the gRPC exporter from the @opentelemetry/exporter-trace-otlp-grpc package.
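As a quick check, the import in a working setup looks like this (a minimal sketch):

```typescript
// The HTTP/1.1 exporters ('@opentelemetry/exporter-trace-otlp-http' and
// '@opentelemetry/exporter-trace-otlp-proto') produce "Parse Error: Expected HTTP/"
// when pointed at the gRPC endpoint, which speaks HTTP/2.
// Import the gRPC exporter instead:
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-grpc';
```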
[Node.js / TypeScript] OTLPExporterError: Not Found
Problem
I am using the OpenTelemetry Node.js SDK and I see the following error:
Cause
The OTLP HTTP exporter called the correct base URL, but without the /v1/traces path.
Solution
We recommend using the gRPC exporter, as it is more reliable and faster. Make sure you are importing OTLPTraceExporter from @opentelemetry/exporter-trace-otlp-grpc.
If you have to use the HTTP exporter, make sure that you are using the correct endpoint. The endpoint for HTTP is https://api.lmnr.ai:443/v1/traces (port 443).
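If you do stay on the HTTP exporter, a configuration along these lines includes the full path (a sketch; the Bearer scheme for the Authorization header is assumed here):

```typescript
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';

const exporter = new OTLPTraceExporter({
  // The /v1/traces path must be part of the URL, not just the host and port.
  url: 'https://api.lmnr.ai:443/v1/traces',
  headers: {
    Authorization: `Bearer ${process.env.LMNR_PROJECT_API_KEY}`,
  },
});
```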
[Node.js / TypeScript] OTLPExporterError: Internal Server Error
Problem
I am using the OpenTelemetry Node.js SDK and I see the following error:
Cause
While this may indicate any issue with the Laminar server, one of the most common causes is attempting to send HTTP/json traces to the HTTP/proto endpoint. Laminar does NOT support HTTP/json traces.
Solution
We recommend using the gRPC exporter, as it is more reliable and faster. Make sure you are importing OTLPTraceExporter from @opentelemetry/exporter-trace-otlp-grpc.
If you have to use the HTTP exporter, make sure that you are importing OTLPTraceExporter from @opentelemetry/exporter-trace-otlp-proto, NOT from @opentelemetry/exporter-trace-otlp-http, as the latter is for HTTP/json traces.
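The two packages differ only in the wire format they send; a sketch of the distinction:

```typescript
// HTTP/json, which Laminar does not accept:
// import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

// HTTP/proto (protobuf over HTTP/1.1), which Laminar accepts:
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';
```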
[Python] TypeError: not all arguments converted during string formatting
Problem
I am using the OpenTelemetry Python SDK and I see the following error:
Cause
This error indicates that some of the headers you passed to the gRPC trace exporter are not within the allowed set of keys. For Laminar users, this is most likely the authorization header.
Solution
Check that the headers you pass to the trace exporter are valid. Most importantly, make sure that the authorization header key starts with a lowercase a, since gRPC metadata keys must be lowercase.
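A minimal sketch of a header configuration that the gRPC exporter accepts (the endpoint value and the Bearer scheme are assumptions; check the OpenTelemetry page):

```python
import os

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# gRPC metadata keys must be lowercase, so the key is "authorization", not "Authorization".
exporter = OTLPSpanExporter(
    endpoint="https://api.lmnr.ai:8443",  # assumed gRPC endpoint
    headers={"authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"},
)
```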
[Python] ConnectionResetError: [Errno 54] Connection reset by peer
Problem
I am using the OpenTelemetry Python SDK and I see the following error:
Cause
This error indicates that the connection was reset at Laminar’s backend. Most likely, it points to an HTTP version mismatch; one common cause is using the HTTP exporter against the gRPC endpoint.
Solution
Laminar accepts traces via both gRPC and HTTP. We recommend using the gRPC exporter, as it is more reliable and faster. Make sure you are importing OTLPSpanExporter from opentelemetry.exporter.otlp.proto.grpc.trace_exporter and not from opentelemetry.exporter.otlp.proto.http.trace_exporter.
If you have to use the HTTP exporter, make sure that you are using the correct endpoint. The endpoint for HTTP is https://api.lmnr.ai:443/v1/traces (port 443).
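As a quick check, a working gRPC setup looks roughly like this (a sketch; the endpoint value and the Bearer scheme are assumptions to verify against the OpenTelemetry page):

```python
import os

# Wrong: the HTTP exporter pointed at the gRPC endpoint resets the connection
# from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Right: the gRPC exporter
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://api.lmnr.ai:8443",  # assumed gRPC endpoint
            headers={"authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"},
        )
    )
)
```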
[Python] Failed to export batch code: 404, reason:
Problem
I am using the OpenTelemetry Python SDK and I see the following error:
Cause
The OTLP HTTP exporter called the correct base URL, but without the /v1/traces path.
Solution
We recommend using the gRPC exporter, as it is more reliable and faster. Make sure you are importing OTLPSpanExporter from opentelemetry.exporter.otlp.proto.grpc.trace_exporter and not from opentelemetry.exporter.otlp.proto.http.trace_exporter.
If you have to use the HTTP exporter, make sure that you are using the correct endpoint. The endpoint for HTTP is https://api.lmnr.ai:443/v1/traces (port 443).
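If you do stay on the HTTP exporter, a sketch of a configuration with the full path (the Bearer scheme for the Authorization header is assumed here):

```python
import os

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    # The /v1/traces path must be part of the endpoint, not just the host and port.
    endpoint="https://api.lmnr.ai:443/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"},
)
```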