OpenRouter provides a unified API that gives you access to hundreds of AI models through a single endpoint. Many teams use it to keep one SDK while switching models or providers. With Laminar, you can trace OpenRouter traffic by using the OpenAI SDK pointed at the OpenRouter base URL.
OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
```python
from lmnr import Laminar
from openai import OpenAI

Laminar.initialize()

# Point the OpenAI SDK at OpenRouter's API endpoint
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[
        {"role": "user", "content": "What is the meaning of life?"}
    ],
)
print(response.choices[0].message.content)
```
All OpenRouter calls made through the OpenAI SDK are now automatically traced in Laminar.
Laminar does not auto-instrument the OpenRouter SDK. Wrap your call with observe() to capture inputs/outputs and group it in a trace. LLM-specific fields (tokens, cost) are not added automatically in this mode.
```typescript
import { Laminar, observe } from '@lmnr-ai/lmnr';
import { OpenRouter } from '@openrouter/sdk';

Laminar.initialize();

const openRouter = new OpenRouter({
  apiKey: '<OPENROUTER_API_KEY>',
  defaultHeaders: {
    'HTTP-Referer': '<YOUR_SITE_URL>', // Optional
    'X-Title': '<YOUR_SITE_NAME>', // Optional
  },
});

const completion = await observe({ name: 'openrouter.chat.send' }, async () =>
  openRouter.chat.send({
    model: 'openai/gpt-4o-mini',
    messages: [
      { role: 'user', content: 'What is the meaning of life?' },
    ],
    stream: false,
  })
);

console.log(completion.choices[0].message.content);
```