Overview

Next.js is a popular React framework for building web applications.

For a full example app, see the Next.js guide and the Next.js + AI SDK guide.

Getting Started

1. Install Laminar

npm add @lmnr-ai/lmnr

2. Update your next.config.ts

Add the following to your next.config.ts file:

next.config.ts
const nextConfig = {
  serverExternalPackages: ['@lmnr-ai/lmnr'],
};

export default nextConfig;

This is needed because Laminar depends on OpenTelemetry, which relies on Node.js-specific functionality, so Next.js must be told not to bundle these packages for the server. Learn more in the Next.js docs.

3. Initialize Laminar

To instrument your entire Next.js app, place Laminar initialization in an instrumentation.{ts,js} file. Learn more about instrumentation.{ts,js} here.

instrumentation.ts
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    const { Laminar } = await import('@lmnr-ai/lmnr');
    Laminar.initialize({
      projectApiKey: process.env.LMNR_API_KEY,
    });
  }
}
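The register hook above reads the project API key from the environment. In local development you would typically set it in an env file; the value below is a placeholder for your own key:

```shell
# .env.local
LMNR_API_KEY=your-project-api-key
```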

instrumentation.ts is experimental in Next.js < 15.

If you use Next.js < 15, add the following to your next.config.js:

next.config.js
module.exports = {
  experimental: { instrumentationHook: true },
};

4. Patch LLM SDKs

The AI SDK is already instrumented implicitly, but you need to direct it to use the Laminar tracer:

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
import { getTracer } from '@lmnr-ai/lmnr';

const { text } = await generateText({
  model: openai('gpt-4.1-nano'),
  prompt: 'What is Laminar flow?',
  experimental_telemetry: {
    isEnabled: true,
    tracer: getTracer(),
  },
});
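The same pattern applies to streaming calls. A minimal sketch using the AI SDK's streamText (the model and prompt are illustrative):

```typescript
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { getTracer } from '@lmnr-ai/lmnr';

// Pass the Laminar tracer via experimental_telemetry, exactly as with generateText.
const result = streamText({
  model: openai('gpt-4.1-nano'),
  prompt: 'What is Laminar flow?',
  experimental_telemetry: {
    isEnabled: true,
    tracer: getTracer(),
  },
});

// Consume the stream; the span is recorded once the stream completes.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```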

5. Grouping traces within one route

If your app makes multiple LLM calls within one route, you may want to group them together.

You may get this behavior by default if your app is already instrumented with OpenTelemetry through a Next.js integration such as @vercel/otel or @sentry/nextjs.
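For example, with @vercel/otel a minimal instrumentation.ts looks like this (the serviceName is illustrative):

```typescript
// instrumentation.ts
import { registerOTel } from '@vercel/otel';

export function register() {
  // Registers an OpenTelemetry tracer provider; spans created inside a single
  // route handler then share one trace, grouping multiple LLM calls together.
  registerOTel({ serviceName: 'next-app' });
}
```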

Otherwise, you can achieve this with the observe function wrapper, for example:

app/api/chat/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
import { getTracer, observe } from '@lmnr-ai/lmnr';

export async function GET(req: NextRequest) {
  const tracer = getTracer();

  // Both generateText calls run inside one observe span,
  // so they are grouped into a single trace.
  const { firstText, secondText } = await observe(
    { name: 'GET /api/chat' },
    async () => {
      const { text: firstText } = await generateText({
        model: openai('gpt-4.1-nano'),
        prompt: 'What is Laminar flow?',
        experimental_telemetry: {
          isEnabled: true,
          tracer,
        },
      });

      const { text: secondText } = await generateText({
        model: openai('gpt-4.1-nano'),
        prompt: 'What is Laminar flow?',
        experimental_telemetry: {
          isEnabled: true,
          tracer,
        },
      });

      return { firstText, secondText };
    }
  );

  return NextResponse.json({ firstText, secondText });
}

Learn more about the observe function wrapper here.