Introduction to Laminar

Laminar is an open-source platform for tracing and evaluating AI applications.

Laminar is fully compatible with OpenTelemetry, so you can use OpenLLMetry (the Traceloop SDK) to trace your applications and send the traces to Laminar.

Laminar’s OpenTelemetry backend supports both gRPC and HTTP trace exporters.

The recommended setup is to use gRPC, as it is more efficient: create a gRPC span exporter and pass it to the Traceloop SDK, as shown in the steps below.

1. Install dependencies

pip install traceloop-sdk openai

2. Set up environment variables

To get your API key, either sign up on Laminar and get it from the project settings, or spin up Laminar locally.

import os

# Project API key from your Laminar project settings
os.environ["LMNR_PROJECT_API_KEY"] = "<YOUR_LMNR_PROJECT_API_KEY>"
# Laminar's gRPC endpoint (port 8443), used by the exporter in step 3
os.environ["LMNR_BASE_URL"] = "https://api.lmnr.ai:8443"

3. Initialize the OpenTelemetry gRPC exporter

import os
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter,
)

exporter = OTLPSpanExporter(
    endpoint=os.environ["LMNR_BASE_URL"],
    # IMPORTANT: note that "authorization" must be lowercase
    headers={
        "authorization": f"Bearer {os.environ['LMNR_PROJECT_API_KEY']}"
    }
)

4. Initialize the Traceloop SDK

from traceloop.sdk import Traceloop
Traceloop.init(exporter=exporter)
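
Traceloop.init also accepts a few options that are useful here; for example, app_name sets the service name attached to the traces and disable_batch exports each span immediately instead of batching. A minimal sketch, assuming the exporter from step 3 (the app name is illustrative):

from traceloop.sdk import Traceloop

Traceloop.init(
    app_name="laminar-quickstart",  # service name shown on the traces (illustrative)
    exporter=exporter,
    disable_batch=True,  # export each span immediately; handy while testing locally
)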

5. Run your application

from openai import OpenAI
openai_client = OpenAI()

chat_completion = openai_client.chat.completions.create(
    messages=[
        {
          "role": "user",
          "content": "What is Laminar flow?",
        }
    ],
    model="gpt-4.1-nano",
)

print(chat_completion)
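
Optionally, OpenLLMetry's decorators let you group the LLM call under a named parent span, which Laminar displays as a nested trace. A minimal sketch, assuming the client from above (the function and workflow names are illustrative):

from traceloop.sdk.decorators import workflow

@workflow(name="answer_question")  # parent span wrapping the LLM call
def answer_question(question: str) -> str:
    completion = openai_client.chat.completions.create(
        messages=[{"role": "user", "content": question}],
        model="gpt-4.1-nano",
    )
    return completion.choices[0].message.content

print(answer_question("What is Laminar flow?"))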

6. Example trace in Laminar

[Screenshot: example trace in the Laminar UI]

(Alternative) HTTP quick setup

Laminar’s backend also accepts traces over HTTP, so with a minimal configuration change you can set:

TRACELOOP_BASE_URL="https://api.lmnr.ai"
TRACELOOP_HEADERS="Authorization=<YOUR_LMNR_PROJECT_API_KEY>"

and skip step 3 (exporter setup) above.
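
If you prefer to keep configuration in code rather than in the shell, the same values can be set with os.environ before initializing the SDK; a minimal sketch, assuming the Traceloop SDK reads these variables at init time:

import os
from traceloop.sdk import Traceloop

# Point the SDK's default HTTP exporter at Laminar (note: no :8443 port here)
os.environ["TRACELOOP_BASE_URL"] = "https://api.lmnr.ai"
os.environ["TRACELOOP_HEADERS"] = "Authorization=<YOUR_LMNR_PROJECT_API_KEY>"

# No explicit exporter is passed; the SDK builds one from the variables above
Traceloop.init()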

For more information, check out the Laminar docs.