You’ve initialized Laminar. Your LLM calls are traced automatically. Now what? This section covers what to do when auto-instrumentation isn’t enough:
  • Trace your own functions — Your agent has logic beyond LLM calls. Wrap functions with @observe to see them in traces (see the sketch after this intro).
  • Add context — Attach user IDs, session IDs, metadata, and tags so you can filter and debug effectively.
  • Control what’s captured — Skip recording inputs/outputs for sensitive operations.
If you just want to trace LLM calls and framework operations, you don’t need this section—Integrations has you covered. Come back here when you want deeper visibility into your own code.
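
For the first item, here is a minimal sketch of wrapping one of your own functions with observe. The function name and body are illustrative; the call shape mirrors the Quickstart below, and anything that runs inside the wrapped function, including auto-instrumented LLM calls, is recorded nested under this span.

import { observe } from '@lmnr-ai/lmnr';

// Illustrative tool function. Wrapping the body with observe records this step
// as its own span, nested under whatever span is active when it runs.
async function searchDocs(query: string): Promise<string[]> {
  return observe({ name: 'searchDocs' }, async () => {
    // ... your retrieval logic, LLM calls, etc. ...
    return [`result for ${query}`];
  });
}

The same pattern applies to any non-LLM step in your agent: routing, parsing, retrieval, post-processing.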

Why Structure Matters

Without structure, each LLM call becomes an isolated trace, making it hard to:
  • Understand relationships between calls (routing → tool use → final answer)
  • Follow multi-step workflows end-to-end
  • Track conversations across turns
  • Find the right trace quickly (by user/session/metadata)
Well-structured traces show a clear tree of spans for each request, plus stable identifiers that help you search and group related work.
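
To make that tree concrete, here is a minimal sketch (step names and bodies are illustrative) in which routing, tool use, and the final answer each run inside their own observe call, so one request becomes a single trace with child spans rather than three disconnected traces.

import { observe } from '@lmnr-ai/lmnr';

// Assumes Laminar.initialize(...) has already run (see the Quickstart below).
// Each observe call creates a span, and nested calls become child spans, so one
// request produces a single trace shaped roughly like:
//   answerQuestion
//   ├── routeQuery
//   ├── searchDocs
//   └── composeAnswer
export async function answerQuestion(question: string) {
  return observe({ name: 'answerQuestion' }, async () => {
    const route = await observe({ name: 'routeQuery' }, async () => {
      // ... decide which tool to use ...
      return 'search';
    });

    const docs = await observe({ name: 'searchDocs' }, async () => {
      // ... call your retrieval tool ...
      return ['doc-1', 'doc-2'];
    });

    return observe({ name: 'composeAnswer' }, async () => {
      // ... final LLM call; auto-instrumented LLM spans also nest here ...
      return `answer to "${question}" from ${docs.length} documents (route: ${route})`;
    });
  });
}

The stable identifiers mentioned above (user ID, session ID, metadata) are added in the Quickstart that follows.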

Quickstart

Start by creating a parent span for your request/turn, then set user ID, session ID, and metadata inside that span so everything downstream inherits it.
import { Laminar, observe } from '@lmnr-ai/lmnr';

Laminar.initialize({
  projectApiKey: process.env.LMNR_PROJECT_API_KEY,
});

export async function handleRequest(userId: string, requestId: string) {
  return observe({ name: 'handleRequest' }, async () => {
    // Attach identifiers to the active trace so every nested span inherits them.
    Laminar.setTraceUserId(userId);
    Laminar.setTraceSessionId(`session-${requestId}`);
    Laminar.setTraceMetadata({ environment: process.env.NODE_ENV });

    // ... your LLM calls / tools / business logic ...
  });
}
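
A usage sketch (the IDs are illustrative): pass the same conversation identifier as the second argument on every turn, so the resulting traces carry the same session ID and can be grouped and filtered together.

// Two turns of one conversation: both traces get the session ID session-conv-42,
// so filtering by session shows the whole conversation end-to-end.
await handleRequest('user-123', 'conv-42'); // turn 1
await handleRequest('user-123', 'conv-42'); // turn 2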

Quick Reference

I want to… | Use
Trace a function I wrote | Trace Functions
Trace code that isn’t a function | Trace Parts of Your Code
Group traces by conversation/workflow | Sessions
Associate traces with users | User ID
Add key-value context to traces | Metadata
Add categorical labels to spans | Tags
Trace images sent to LLMs | Tracing Images
Continue traces across services | Continuing Traces
Track costs for custom LLM spans | LLM Cost Tracking
Ensure spans send in serverless/CLI | Flushing & Shutting Down
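
One row deserves a note even at this level: in serverless functions and CLI scripts the process can exit before buffered spans are exported, so flush before exiting. A minimal sketch, assuming the shutdown call covered on the Flushing & Shutting Down page (check that page for the exact method name):

// Flush any buffered spans before the process exits (serverless handler, CLI script).
// Laminar.shutdown() is an assumption here; see Flushing & Shutting Down for the exact API.
await handleRequest('user-123', 'req-1');
await Laminar.shutdown();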