Overview
Laminar automatically instruments the official OpenAI package with a single line of code, allowing you to trace and monitor all your OpenAI API calls without modifying your existing code. This gives you complete visibility into your AI application’s performance, costs, and behavior.
Getting Started
1. Install Laminar and OpenAI
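The original install commands were lost in extraction; a sketch of the likely commands, assuming the published package names `@lmnr-ai/lmnr` (npm) and `lmnr` (PyPI) for the Laminar SDK:

```shell
# TypeScript
npm add @lmnr-ai/lmnr openai

# Python
pip install lmnr openai
```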
2. Set up your environment variables
Store your API keys in a .env file.
If you are using OpenAI with Next.js, please follow the Next.js integration guide for best practices and setup instructions.
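A minimal .env fragment, assuming the conventional variable names used by the Laminar and OpenAI SDKs:

```
LMNR_PROJECT_API_KEY=<your-laminar-project-api-key>
OPENAI_API_KEY=<your-openai-api-key>
```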
3. Initialize Laminar
Just add a single line at the start of your application or file to instrument OpenAI with Laminar. It is important to pass OpenAI to instrumentModules as a named export.
4. Use OpenAI as usual
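The code for these two steps was lost in extraction. A sketch of the initialization and a plain OpenAI call, assuming the `@lmnr-ai/lmnr` package's `Laminar.initialize` API and the `gpt-4o-mini` model name as an illustrative choice:

```typescript
import { Laminar } from '@lmnr-ai/lmnr';
import { OpenAI } from 'openai';

// Pass OpenAI to instrumentModules as a named export so Laminar
// can patch the client before any instances are created.
Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY,
  instrumentModules: { OpenAI },
});

// Use OpenAI as usual; calls are traced automatically.
const client = new OpenAI();

const response = await client.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(response.choices[0].message.content);
```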
Monitoring Your OpenAI Usage
After instrumenting your OpenAI calls with Laminar, you’ll be able to:
- View detailed traces of each OpenAI API call, including request and response
- Track token usage and cost across different models
- Monitor latency and performance metrics
- Open LLM span in Playground for prompt engineering
- Debug issues with failed API calls or unexpected model outputs
Advanced Features
- Enrich traces with sessions, user IDs, metadata, and tags via the SDK reference.
- Wrap custom functions with observe to capture business logic alongside model calls.
- Images sent to vision-capable models are captured automatically; see Tracing Images.
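The observe wrapper above can be sketched as follows; the span name and the wrapped function are hypothetical, and the `observe({ name }, fn, ...args)` signature is assumed from the Laminar TypeScript SDK:

```typescript
import { observe } from '@lmnr-ai/lmnr';

// Hypothetical business-logic function recorded as a span alongside
// the instrumented OpenAI calls it makes internally.
const answer = await observe(
  { name: 'answerQuestion' },
  async (question: string) => {
    // ... call OpenAI and post-process the result here ...
    return `Answering: ${question}`;
  },
  'What does Laminar trace?'
);
console.log(answer);
```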
Vision inputs (images)
Laminar automatically traces image inputs (data URLs and external URLs) sent to vision-capable OpenAI models.
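The vision example itself was lost in extraction. A sketch of a vision request whose image input Laminar would capture on the LLM span, assuming `gpt-4o-mini` as an illustrative vision-capable model and a placeholder image URL:

```typescript
import { OpenAI } from 'openai';

const client = new OpenAI();

// Both external image URLs and base64 data URLs in image_url content
// parts are captured on the trace when OpenAI is instrumented.
const response = await client.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What is in this image?' },
        {
          type: 'image_url',
          image_url: { url: 'https://example.com/photo.png' },
        },
      ],
    },
  ],
});
console.log(response.choices[0].message.content);
```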
