    Integrations

    LLM Observability with Service Now Cloud Observability and OpenLLMetry

    Since Service Now Cloud Observability natively supports OpenTelemetry, you only need to route traces to its ingest endpoint and set your access token, using these two environment variables:

    TRACELOOP_BASE_URL=https://ingest.lightstep.com
    TRACELOOP_HEADERS="lightstep-access-token=<YOUR_ACCESS_TOKEN>"
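
    For example, with the Python SDK from the quick start, you can set these variables before initializing OpenLLMetry. This is a minimal sketch: my_llm_app is a placeholder app name, and the variables can just as well be exported in your shell instead of being set in code.

    import os

    # Route OpenLLMetry traces to Service Now Cloud Observability's OTLP ingest
    # endpoint and authenticate with your access token. Set these before the SDK
    # is imported and initialized so they are picked up; exporting them in the
    # shell before starting the app works the same way.
    os.environ["TRACELOOP_BASE_URL"] = "https://ingest.lightstep.com"
    os.environ["TRACELOOP_HEADERS"] = "lightstep-access-token=<YOUR_ACCESS_TOKEN>"

    from traceloop.sdk import Traceloop

    # Initialize the SDK; spans from auto-instrumented LLM and vector DB calls
    # are now exported to Service Now Cloud Observability.
    Traceloop.init(app_name="my_llm_app")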
    
