The Vercel AI SDK is a great library for adding LLM features to your JavaScript/TypeScript applications, and it supports tracing via OpenTelemetry. Laminar's tracing is built on OpenTelemetry, so it is fully compatible with Vercel AI SDK tracing, and you can start sending Vercel AI SDK traces to Laminar right away.
Initialize Laminar
Initialize Laminar with your project API key.
```typescript
import { Laminar } from '@lmnr-ai/lmnr';

Laminar.initialize({
  projectApiKey: process.env.LMNR_API_KEY,
});
```
Enable Vercel AI SDK tracing
Enable `experimental_telemetry` in the Vercel AI SDK call options.
```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

await generateText({
  model: openai('gpt-4o-mini'),
  messages: [
    {
      role: 'user',
      content: 'What is the capital of France?',
    },
  ],
  experimental_telemetry: {
    isEnabled: true,
  },
});
```
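Beyond `isEnabled`, the AI SDK's `experimental_telemetry` option also accepts a `functionId` and a `metadata` object, which are recorded as attributes on the resulting span so you can filter and group traces. A minimal sketch; the `functionId` and `metadata` values here are hypothetical placeholders:

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

await generateText({
  model: openai('gpt-4o-mini'),
  messages: [
    { role: 'user', content: 'What is the capital of France?' },
  ],
  experimental_telemetry: {
    isEnabled: true,
    // Hypothetical identifiers for illustration:
    functionId: 'capital-lookup',        // names the operation in the trace
    metadata: { userId: 'user-123' },    // arbitrary key-value pairs on the span
  },
});
```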
Next.js
If you are using the AI SDK in a Next.js route handler, your code might look like this:
```typescript
import { Laminar } from '@lmnr-ai/lmnr';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

Laminar.initialize({
  projectApiKey: process.env.LMNR_API_KEY,
});

export async function GET() {
  const response = await generateText({
    model: openai('gpt-4o-mini'),
    messages: [
      {
        role: 'user',
        content: 'What is the capital of France?',
      },
    ],
    experimental_telemetry: {
      isEnabled: true,
    },
  });

  return Response.json(response);
}
```
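Calling `Laminar.initialize` at the top of a route module works, but in Next.js you can also initialize it once on server startup from an `instrumentation.ts` file at the project root, which Next.js runs via its `register` hook. A sketch of that approach (on older Next.js versions you may need to enable `experimental.instrumentationHook` in `next.config.js`):

```typescript
// instrumentation.ts (project root).
// Next.js calls register() once when the server starts, so Laminar is
// initialized a single time rather than in every route module.
export async function register() {
  const { Laminar } = await import('@lmnr-ai/lmnr');

  Laminar.initialize({
    projectApiKey: process.env.LMNR_API_KEY,
  });
}
```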
Results
And that’s it! You should now see traces in Laminar.