Overview

Laminar automatically instruments the official Anthropic package with a single line of code, allowing you to trace and monitor all your Anthropic API calls without modifying your existing code. This provides complete visibility into your AI application’s performance, costs, and behavior.

Getting Started

1. Install Laminar and Anthropic

npm install @lmnr-ai/lmnr @anthropic-ai/sdk

2. Set up your environment variables

Store your API keys in a .env file:

# .env file
LMNR_PROJECT_API_KEY=your-laminar-project-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key

Then load them in your application using a package like dotenv.
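Under the hood, dotenv simply reads KEY=VALUE pairs from the .env file into process.env. A minimal sketch of that behavior (simplified; the real dotenv also handles quoting, expansion, and edge cases):

```typescript
// Simplified sketch of what dotenv does: parse KEY=VALUE lines
// from a .env file's contents and collect them into an object
// (dotenv then copies these into process.env).
function parseEnv(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    // Skip blank lines and # comments
    if (!trimmed || trimmed.startsWith("#")) continue;
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue;
    vars[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return vars;
}

const parsed = parseEnv(
  "# .env file\nLMNR_PROJECT_API_KEY=abc123\nANTHROPIC_API_KEY=xyz789\n"
);
console.log(parsed.LMNR_PROJECT_API_KEY); // "abc123"
```

In practice you never call a parser yourself — importing 'dotenv/config' once at the top of your entry file is enough.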

If you are using Anthropic with Next.js, please follow the Next.js integration guide for best practices and setup instructions.

3. Initialize Laminar

Just add a single line at the start of your application or file to instrument Anthropic with Laminar.

import { Laminar } from '@lmnr-ai/lmnr';
import Anthropic from '@anthropic-ai/sdk';
import 'dotenv/config'; // Load environment variables

// This single line instruments all Anthropic API calls
Laminar.initialize({
  instrumentModules: { anthropic: Anthropic }
});

// Initialize Anthropic client as usual
const anthropic = new Anthropic();

It is important to pass the imported Anthropic module itself to instrumentModules, as shown above — not a client instance created with new Anthropic().

4. Use Anthropic as usual

// Make API calls to Anthropic as you normally would
const response = await anthropic.messages.create({
  model: "claude-3-7-sonnet-latest",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "Hello, how are you?" }
  ],
});

console.log(response.content);

All Anthropic API calls are now automatically traced in Laminar.
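Note that response.content is an array of content blocks rather than a plain string. A sketch of pulling the text out of it (the block shape mirrors the SDK's text blocks; extractText is a hypothetical helper, not part of either SDK):

```typescript
// Anthropic responses carry content as an array of blocks such as
// { type: "text", text: "..." }; tool-use blocks have other shapes.
interface ContentBlock {
  type: string;
  text?: string;
}

// Hypothetical helper: concatenate the text of all text blocks.
function extractText(content: ContentBlock[]): string {
  return content
    .filter((block) => block.type === "text" && typeof block.text === "string")
    .map((block) => block.text)
    .join("");
}

// Example with a sample payload shaped like an Anthropic response:
const sample: ContentBlock[] = [
  { type: "text", text: "Hello! I'm doing well, thanks for asking." },
];
console.log(extractText(sample)); // "Hello! I'm doing well, thanks for asking."
```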

Beyond automatic tracing, Laminar offers features that let you build more structured traces, add context to your LLM calls, and gain deeper insights into your AI application's performance — see Advanced Features below.

Monitoring Your Anthropic Usage

After instrumenting your Anthropic calls with Laminar, you’ll be able to:

  1. View detailed traces of each Anthropic API call, including request and response
  2. Track token usage and cost across different models
  3. Monitor latency and performance metrics
  4. Open LLM spans in the Playground for prompt engineering
  5. Debug issues with failed API calls or unexpected model outputs

Visit your Laminar dashboard to view your Anthropic traces and analytics.
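Laminar computes token usage and cost for you, but the underlying arithmetic is simple. A sketch of estimating cost from a response's usage field (the per-million-token prices below are placeholders, not real Anthropic pricing — substitute the current rates for your model):

```typescript
// Anthropic responses include usage: { input_tokens, output_tokens }.
interface Usage {
  input_tokens: number;
  output_tokens: number;
}

// Placeholder prices in USD per million tokens -- look up the
// actual rates for your model on Anthropic's pricing page.
const INPUT_PRICE_PER_MTOK = 3.0;
const OUTPUT_PRICE_PER_MTOK = 15.0;

function estimateCostUsd(usage: Usage): number {
  return (
    (usage.input_tokens / 1_000_000) * INPUT_PRICE_PER_MTOK +
    (usage.output_tokens / 1_000_000) * OUTPUT_PRICE_PER_MTOK
  );
}

console.log(estimateCostUsd({ input_tokens: 1000, output_tokens: 200 }));
```

This is what the dashboard aggregates across calls; with instrumentation enabled you get it per trace without writing any of this yourself.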

Advanced Features

  • Sessions - Learn how to add session structure to your traces
  • Metadata - Discover how to add additional context to your LLM spans
  • Trace structure - Explore creating custom spans and more advanced tracing
  • Realtime Monitoring - See how to monitor your Anthropic calls in real-time