Integrations
Databricks
1. What the integration does
Databricks customers can run agentic pipelines inside the Databricks ecosystem (e.g., in a serverless notebook, or on an endpoint where their model is deployed), and those agents can be traced for observability. The Patronus AI SDK lets users route agent traces to the Patronus AI backend, giving customers real-time visibility, evaluation metrics, and alerting with minimal extra code.
2. One-minute setup (Patronus SDK)
Set up tracing in two quick steps using the official patronus-sdk helper.
2.1 Install the SDK (and OTEL dependencies)
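A typical install looks like the following. The exact package names are assumptions (the SDK is assumed to be published on PyPI as `patronus`, and the OTLP exporter packages are the standard OpenTelemetry ones); check the Patronus docs for the names used by your SDK version.

```shell
# Package names are assumptions; verify against the Patronus documentation.
pip install patronus opentelemetry-sdk opentelemetry-exporter-otlp
```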
2.2 Configure the tracer provider in your notebook or job entrypoint
Example chain_config.json (referenced in the Python code for step 2.2):
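The schema of chain_config.json is pipeline-specific; every key below is an illustrative assumption, not a required format.

```json
{
  "service_name": "databricks-agent",
  "project_name": "my-databricks-project",
  "llm_endpoint": "databricks-meta-llama-3-70b-instruct",
  "temperature": 0.1
}
```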
After this, MLflow (and any other OpenTelemetry-instrumented library) automatically streams traces to Patronus AI.
3. Smoke-test the connection (optional)
Open Patronus AI → Traces and confirm the span appears under the service name you set.
4. How it works under the hood
- MLflow generates OTel traces as experiments run.
- The OTLP exporter sends those traces to Patronus's public endpoint: `https://otel.patronus.ai:4317`.
- The Collector authenticates with your API key, tags data with `pat-project-name` (if provided), and forwards it to Patronus ingestion.
- Patronus AI renders dashboards, stores history, and triggers alerts; no on-prem infrastructure required.