Overview

OpenRouter provides a unified API that gives you access to hundreds of AI models through a single endpoint. Many teams use it to keep one SDK while switching models or providers. With Laminar, you can trace OpenRouter traffic by using the OpenAI SDK pointed at the OpenRouter base URL.
OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
Laminar automatically instruments the OpenAI SDK. If you set the OpenAI SDK base URL to OpenRouter, Laminar will capture those calls as LLM spans.

1. Install Laminar and OpenAI

npm install @lmnr-ai/lmnr openai

2. Set up your environment variables

Store your API keys in a .env file:
# .env file
LMNR_PROJECT_API_KEY=your-laminar-project-api-key
OPENROUTER_API_KEY=your-openrouter-api-key
Then load them in your application using a package like dotenv.
If you are using OpenRouter with Next.js, please follow the Next.js integration guide for best practices and setup instructions.
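Because a missing key typically surfaces later as an opaque 401 from the API, it can help to validate the environment at startup. A minimal sketch, where `requireEnv` is a hypothetical helper (not part of Laminar or OpenRouter):

```typescript
// requireEnv is a hypothetical helper: it fails fast when a required key is
// missing, so misconfiguration is caught at startup rather than mid-request.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example (after `import 'dotenv/config'` has loaded your .env):
// const openRouterApiKey = requireEnv('OPENROUTER_API_KEY');
```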

3. Initialize Laminar and OpenAI

import { Laminar } from '@lmnr-ai/lmnr';
import OpenAI from 'openai';
import 'dotenv/config';

// This single line instruments all OpenAI SDK calls
Laminar.initialize({
  instrumentModules: { OpenAI: OpenAI }
});

const openai = new OpenAI({
  baseURL: 'https://openrouter.ai/api/v1',
  apiKey: process.env.OPENROUTER_API_KEY,
  defaultHeaders: {
    'HTTP-Referer': '<YOUR_SITE_URL>', // Optional
    'X-Title': '<YOUR_SITE_NAME>', // Optional
  },
});

4. Use OpenRouter as usual

const response = await openai.chat.completions.create({
  model: 'openai/gpt-4o-mini',
  messages: [
    { role: 'user', content: 'What is the meaning of life?' }
  ],
});

console.log(response.choices[0].message.content);
All OpenRouter calls made through the OpenAI SDK are now automatically traced in Laminar.

OpenRouter SDK (Beta)

Laminar does not auto-instrument the OpenRouter SDK. Wrap your call with observe() to capture inputs/outputs and group it in a trace. LLM-specific fields (tokens, cost) are not added automatically in this mode.
import { Laminar, observe } from '@lmnr-ai/lmnr';
import { OpenRouter } from '@openrouter/sdk';

Laminar.initialize();

const openRouter = new OpenRouter({
  apiKey: '<OPENROUTER_API_KEY>',
  defaultHeaders: {
    'HTTP-Referer': '<YOUR_SITE_URL>', // Optional
    'X-Title': '<YOUR_SITE_NAME>', // Optional
  },
});

const completion = await observe({ name: 'openrouter.chat.send' }, async () =>
  openRouter.chat.send({
    model: 'openai/gpt-4o-mini',
    messages: [
      { role: 'user', content: 'What is the meaning of life?' },
    ],
    stream: false,
  })
);

console.log(completion.choices[0].message.content);

Using the OpenRouter API directly

You can use the interactive Request Builder to generate OpenRouter API requests in the language of your choice.
Direct HTTP calls are captured as custom spans via observe(). LLM-specific fields (tokens, cost) are not extracted automatically from raw responses.
import json
import requests

from lmnr import Laminar, observe

Laminar.initialize()

@observe()
def call_openrouter():
    response = requests.post(
        url="https://openrouter.ai/api/v1/chat/completions",
        headers={
            "Authorization": "Bearer <OPENROUTER_API_KEY>",
            "HTTP-Referer": "<YOUR_SITE_URL>", # Optional
            "X-Title": "<YOUR_SITE_NAME>", # Optional
        },
        data=json.dumps({
            "model": "openai/gpt-4o-mini",
            "messages": [
                {"role": "user", "content": "What is the meaning of life?"}
            ]
        })
    )
    return response.json()

print(call_openrouter())

Monitoring Your OpenRouter Usage

When you use the OpenAI SDK integration above, Laminar will create LLM spans so you can:
  1. View detailed traces of each OpenRouter request, including request and response
  2. Track token usage when OpenRouter returns usage data
  3. See cost estimates when pricing data for the model is available in Laminar
  4. Monitor latency and performance metrics
  5. Open LLM spans in Playground for prompt engineering
Visit your Laminar dashboard to view your OpenRouter traces and analytics.

Advanced Features

  • Enrich traces with sessions, user IDs, metadata, and tags via the SDK reference.
  • Wrap custom functions with observe to capture business logic alongside model calls.
  • Images sent to vision-capable models are captured automatically; see Tracing Images.