
Databricks

1. What the integration does

Databricks customers can run agentic pipelines within the Databricks ecosystem (e.g., in a serverless notebook, or on an endpoint where their model is deployed), and those agents can be traced for observability. The Patronus AI SDK lets users direct the agent traces to the Patronus AI backend, giving customers real-time visibility, evaluation metrics, and alerting with no extra instrumentation code.

2. One-minute setup (Patronus SDK)

Set up tracing in two quick steps using the official patronus SDK's init() helper.

2.1 Install the SDK (and OTEL dependencies)

%pip install -U -qqqq mlflow databricks-langchain databricks-agents uv langgraph==0.3.4 openinference-instrumentation-langchain patronus

2.2 Configure the tracer provider in your notebook or job entrypoint

Example chain_config.yaml (referenced in the Python code below).

LLM_ENDPOINT_NAME: databricks-meta-llama-3-3-70b-instruct
PATRONUS_PROJECT_NAME: databricks-langchain-test-project
PATRONUS_SERVICE: langchain-service
 
############################################
# Patronus Tracing Setup
############################################
import os

import mlflow
from patronus import init
from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace
 
model_config = mlflow.models.ModelConfig(development_config="chain_config.yaml")
 
PATRONUS_API_KEY = os.environ.get("PATRONUS_API_KEY")
# Initialize Patronus with LangChain instrumentation
patronus_ctx = init(
    project_name=model_config.get("PATRONUS_PROJECT_NAME"),
    api_key=PATRONUS_API_KEY,
    service=model_config.get("PATRONUS_SERVICE"),
    integrations=[LangChainInstrumentor()]
)
trace.set_tracer_provider(patronus_ctx.tracer_provider)

After this, MLflow and any other OpenTelemetry-instrumented libraries automatically stream traces to Patronus AI.
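To confirm traces flow from a real chain, you can call the configured model endpoint through LangChain. A minimal sketch, assuming the databricks-langchain package from step 2.1 and the model_config loaded above:

from databricks_langchain import ChatDatabricks
 
# Any LangChain call made after init() is picked up by LangChainInstrumentor
# and exported to Patronus as a trace.
llm = ChatDatabricks(endpoint=model_config.get("LLM_ENDPOINT_NAME"))
response = llm.invoke("Reply with one word: pong.")
print(response.content)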

3. Smoke-test the connection (optional)

import time

import mlflow
 
with mlflow.start_span(name="databricks-smoke-test") as span:
    span.set_inputs({"check": "otel"})
    time.sleep(1)
    span.set_outputs({"status": "ok"})

Open Patronus AI → Traces and confirm the span appears under the service name you set.

4. How it works under the hood

  1. MLflow generates OTel traces as experiments run.
  2. The OTLP exporter sends those traces to Patronus’s public endpoint, https://otel.patronus.ai:4317.
  3. The Collector authenticates with your API key, tags data with pat-project-name (if provided), and forwards it to Patronus ingestion.
  4. Patronus AI renders dashboards, stores history, and triggers alerts—no on‑prem infra required.
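
For reference, steps 2 and 3 above can be approximated with the plain OpenTelemetry SDK. This is an illustrative sketch of what init() wires up for you, not something you need to write yourself; the x-api-key header name is an assumption, while pat-project-name is the tag described above.

import os

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
 
# Exporter pointed at the Patronus collector; init() configures the
# equivalent of this automatically.
exporter = OTLPSpanExporter(
    endpoint="https://otel.patronus.ai:4317",
    headers={
        "x-api-key": os.environ["PATRONUS_API_KEY"],  # assumed auth header name
        "pat-project-name": "databricks-langchain-test-project",  # optional project tag
    },
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))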

5. Troubleshooting quick fixes

  • No spans in Patronus after 5 min — Ensure the Databricks cluster has outbound access on ports 443 / 4317.
  • “UNAUTHENTICATED” error — Verify your API key is correct and active in Patronus AI → Settings → API Keys.
  • High latency or dropped spans — Batch traces by adding OTEL_BSP_SCHEDULE_DELAY=5000 (ms).
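
For example, you can set the batching delay in the notebook before calling init(). OTEL_BSP_SCHEDULE_DELAY is a standard OpenTelemetry environment variable read when the BatchSpanProcessor is created, so it must be set first:

import os

# Export spans in larger, less frequent batches (5 s schedule delay).
os.environ["OTEL_BSP_SCHEDULE_DELAY"] = "5000"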
