The Vercel AI SDK is a great library for adding LLM features to your JS/TS applications, and it supports tracing via OpenTelemetry.

Laminar tracing is based on OpenTelemetry, so it is fully compatible with Vercel AI SDK tracing and you can start sending Vercel AI SDK traces to Laminar right away.
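Assuming an npm-based project, the packages used in the snippets below can be installed with:

```shell
npm install @lmnr-ai/lmnr ai @ai-sdk/openai
```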

1. Initialize Laminar

Initialize Laminar with your project API key.

import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({
    projectApiKey: process.env.LMNR_API_KEY,
});
2. Enable Vercel AI SDK tracing

Set experimental_telemetry.isEnabled to true in the Vercel AI SDK call.

Generate text
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

await generateText({
    model: openai('gpt-4o-mini'),
    messages: [
        {
            role: 'user',
            content: 'What is the capital of France?'
        }
    ],
    experimental_telemetry: {
        isEnabled: true
    }
});
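Beyond isEnabled, the AI SDK's telemetry settings also accept a functionId to name the span and a metadata object whose entries are attached to it as attributes, which makes traces easier to filter in Laminar. A minimal sketch (the specific functionId and metadata values here are made up):

```typescript
// Hypothetical example: richer telemetry options for an AI SDK call.
// `functionId` names the span; `metadata` entries become span attributes.
const telemetry = {
    isEnabled: true,
    functionId: 'capital-lookup',       // hypothetical span name
    metadata: { userId: 'user-123' },   // hypothetical attribute
};

console.log(telemetry.functionId); // prints "capital-lookup"
```

Pass an object like this as experimental_telemetry in the generateText call above.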
3. Next.js

If you are using the AI SDK in a Next.js route handler, your code might look like this:

import { Laminar } from '@lmnr-ai/lmnr';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

Laminar.initialize({
    projectApiKey: process.env.LMNR_API_KEY,
});

export async function GET() {

    const response = await generateText({
        model: openai('gpt-4o-mini'),
        messages: [
            {
                role: 'user',
                content: 'What is the capital of France?'
            }
        ],
        experimental_telemetry: {
            isEnabled: true
        }
    });

    return Response.json(response);
}
4. Results

And that’s it! You should now see your Vercel AI SDK traces in Laminar.