Next.js + AI SDK Instrumentation Example
Next.js Integration with Laminar Tracing for AI Applications
Overview
We’ll explore a simple emotional support chat application built with Next.js that:
- Provides a chat interface for users seeking emotional support
- Uses OpenAI’s GPT model to generate empathetic responses
- Traces the entire process with Laminar
- Utilizes Vercel’s AI SDK for AI interactions
Setup
You can use the example app from our GitHub repo.
Alternatively, for a clean install, follow these steps:
Initialize a Next.js app
Learn more in the Next.js docs.
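If you are starting from scratch, one typical way to scaffold the app is with create-next-app (the app name my-app is just a placeholder; the --typescript and --app flags select TypeScript and the App Router):

```shell
npx create-next-app@latest my-app --typescript --app
cd my-app
```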
Install Dependencies
Let’s now install the required packages:
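Based on the packages used later in this guide, the install step looks roughly like this (package names are the published Laminar SDK and Vercel AI SDK packages):

```shell
npm install @lmnr-ai/lmnr ai @ai-sdk/openai
```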
Environment Setup
Then fill in the .env.local file.
Get a Laminar project API key.
Get an OpenAI API key.
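A minimal .env.local might look like this (the variable names below are the ones the Laminar SDK and the AI SDK's OpenAI provider conventionally read; replace the placeholders with your own keys):

```
LMNR_PROJECT_API_KEY=<your-laminar-project-api-key>
OPENAI_API_KEY=<your-openai-api-key>
```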
Project Structure
The project has the following structure:
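Based on the files discussed below, the layout looks roughly like this:

```
├── app/
│   ├── api/
│   │   └── chat/
│   │       └── route.ts
│   ├── layout.tsx
│   └── page.tsx
├── components/
│   └── chat-ui.tsx
├── instrumentation.ts
└── .env.local
```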
Implementation
Let’s look at the key components of our Next.js application with Laminar tracing:
1. instrumentation.ts
This file is crucial as it initializes Laminar for tracing. Next.js automatically loads this file during initialization.
Laminar only works in the ‘nodejs’ runtime of Next.js.
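A minimal sketch of this file, assuming the @lmnr-ai/lmnr package and the LMNR_PROJECT_API_KEY environment variable from the setup step:

```typescript
// instrumentation.ts — Next.js calls register() once at startup
export async function register() {
  // Laminar requires the Node.js runtime, so skip the edge runtime
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    const { Laminar } = await import('@lmnr-ai/lmnr');
    Laminar.initialize({
      projectApiKey: process.env.LMNR_PROJECT_API_KEY,
    });
  }
}
```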
Learn more about the instrumentation.ts file in the Next.js docs.
Laminar must be initialized at the entry point of the application.
For Next.js, the instrumentation.ts file is ideal for this purpose as it’s loaded early in the application lifecycle.
2. app/page.tsx
This file contains the main page layout for our chat application:
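A minimal sketch of the page, assuming a ChatUI component exported from components/chat-ui.tsx (the wrapper markup and class names are illustrative):

```typescript
// app/page.tsx — renders the chat interface on the main page
import ChatUI from '@/components/chat-ui';

export default function Home() {
  return (
    <main className="flex min-h-screen items-center justify-center">
      <ChatUI />
    </main>
  );
}
```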
3. app/api/chat/route.ts
This file contains the API route handler that processes chat messages and communicates with the OpenAI API:
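A sketch of the route handler, assuming the AI SDK's streamText function with the @ai-sdk/openai provider; the model name and system prompt here are illustrative, and the exact response helper (e.g. toDataStreamResponse) can vary between AI SDK versions:

```typescript
// app/api/chat/route.ts — handles chat requests via the AI SDK
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Laminar instrumentation requires the Node.js runtime
export const runtime = 'nodejs';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    system:
      'You are a compassionate emotional support assistant. ' +
      'Respond with empathy and care.',
    messages,
    // Enables AI SDK telemetry, which Laminar records as spans
    experimental_telemetry: { isEnabled: true },
  });

  return result.toDataStreamResponse();
}
```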
The experimental_telemetry option is enabled, which allows the AI SDK to send telemetry data. This works seamlessly with Laminar’s tracing capabilities.
4. components/chat-ui.tsx
This component handles the chat interface and manages the chat state.
Feel free to modify the UI as you see fit; this is just an example.
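A bare-bones sketch of the component, assuming the AI SDK's useChat hook (imported from ai/react in AI SDK v3/v4; in newer versions it may live in @ai-sdk/react); the markup and placeholder text are illustrative:

```typescript
'use client';

// components/chat-ui.tsx — minimal chat interface backed by /api/chat
import { useChat } from 'ai/react';

export default function ChatUI() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div className="w-full max-w-md">
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role === 'user' ? 'You' : 'Assistant'}:</strong>{' '}
          {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="How are you feeling today?"
        />
      </form>
    </div>
  );
}
```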
Running the Application
Start the Next.js development server:
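With a standard create-next-app setup, this is:

```shell
npm run dev
```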
Testing the Application
- Navigate to http://localhost:3000 in your browser
- Interact with the chat interface by typing messages
Viewing Traces
After interacting with the chat, you can view the traces in your Laminar dashboard at https://www.lmnr.ai. The trace will show:
- The Next.js API route execution
- The OpenAI API call made through Vercel’s AI SDK
- Token usage and response details
Key Features Demonstrated
- Next.js Instrumentation: Using Next.js’s instrumentation API to initialize Laminar
- AI SDK Integration: Seamless integration with Vercel’s AI SDK
- OpenAI Tracing: Automatic tracing of OpenAI API calls
- Token Usage: Automatic calculation of tokens used for each OpenAI call
- Cost Estimation: Automatic estimation of the cost of each OpenAI call
Example Traces
A screenshot of a trace from the example app.
A screenshot of a similar trace with `preserveNextJsSpans: true` enabled in the `instrumentation.ts` file.
Troubleshooting
If you encounter issues:
- Check that your API keys are correctly set in the .env.local file
- Verify that Laminar is properly initialized in the instrumentation.ts file
- Ensure all dependencies are installed
- Review the Next.js development server logs for any application errors