Logging
Record events and debug information, and track the execution of your LLM applications
Logging is an essential feature of the Patronus SDK: it lets you record events and debug information, and track the execution of your LLM applications. This page covers how to set up and use logging in your code.
The Patronus SDK provides a simple logging interface that integrates with Python's standard logging module while also automatically exporting logs to the Patronus AI Platform:
```python
import patronus

patronus.init()
log = patronus.get_logger()

# Basic logging
log.info("Processing user query")

# Different log levels are available
log.debug("Detailed debug information")
log.warning("Something might be wrong")
log.error("An error occurred")
log.critical("System cannot continue")
```
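Because the returned logger plugs into Python's standard logging module, the usual `Logger` conveniences should work as well. A minimal sketch (the variable names and values are illustrative):

```python
import patronus

patronus.init()
log = patronus.get_logger()

# Lazy %-style formatting defers string interpolation until the
# record is actually emitted
user_id = "user-123"  # illustrative value
log.info("Handling request for %s", user_id)

# log.exception logs at ERROR level and captures the active traceback
try:
    raise ValueError("bad model output")
except ValueError:
    log.exception("Failed to parse model output")
```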
By default, Patronus logs are sent to the Patronus AI Platform but are not printed to the console. To display logs in your console output, you can add a standard Python logging handler:
```python
import sys
import logging

import patronus

patronus.init()
log = patronus.get_logger()

# Add a console handler to see logs in your terminal
console_handler = logging.StreamHandler(sys.stdout)
log.addHandler(console_handler)

# Now logs will appear in both console and Patronus Platform
log.info("This message appears in the console and is sent to Patronus")
```
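Any standard handler can be attached the same way. For example, to also persist a copy of each log record to a local file (the file path here is illustrative):

```python
import logging

import patronus

patronus.init()
log = patronus.get_logger()

# Write each record to a file in addition to the platform export
file_handler = logging.FileHandler("patronus_app.log")  # illustrative path
file_handler.setFormatter(
    logging.Formatter("[%(asctime)s] %(levelname)-8s: %(message)s")
)
log.addHandler(file_handler)

log.info("This message is written to the file and sent to Patronus")
```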
You can also customize the format of console logs:
```python
import sys
import logging

import patronus

patronus.init()
log = patronus.get_logger()

formatter = logging.Formatter('[%(asctime)s] %(levelname)-8s: %(message)s')
console_handler = logging.StreamHandler(sys.stdout)
console_handler.setFormatter(formatter)
log.addHandler(console_handler)

# Logs will now include timestamp and level
log.info("Formatted log message")
```
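Handlers also filter independently of the logger, so you can keep console output quiet without reducing what the logger itself records. Export to the Patronus Platform is handled separately from the handlers you attach, so a console-only threshold like the one below should not change what is sent to the platform:

```python
import sys
import logging

import patronus

patronus.init()
log = patronus.get_logger()

# The handler's level only gates what this handler prints
console_handler = logging.StreamHandler(sys.stdout)
console_handler.setLevel(logging.WARNING)  # console shows warnings and above
log.addHandler(console_handler)

log.info("Exported, but not printed to the console")
log.warning("Exported and printed to the console")
```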
Patronus integrates with Python's logging module, allowing for advanced configuration options. The SDK uses two main loggers:
- `patronus.sdk`: for client-emitted messages that are automatically exported to the Patronus AI Platform
- `patronus.core`: for library-emitted messages related to the SDK's internal operations
Here's how to configure these loggers using standard library methods:
```python
import logging

import patronus

# Initialize Patronus before configuring logging
patronus.init()

# Configure the root Patronus logger
patronus_root_logger = logging.getLogger("patronus")
patronus_root_logger.setLevel(logging.WARNING)  # Set base level for all Patronus loggers

# Add a console handler with custom formatting
console_handler = logging.StreamHandler()
formatter = logging.Formatter(
    fmt='[%(asctime)s] %(levelname)-8s %(name)s: %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S'
)
console_handler.setFormatter(formatter)
patronus_root_logger.addHandler(console_handler)

# Configure specific loggers
patronus_core_logger = logging.getLogger("patronus.core")
patronus_core_logger.setLevel(logging.WARNING)  # Only show warnings and above for internal SDK messages

patronus_sdk_logger = logging.getLogger("patronus.sdk")
patronus_sdk_logger.setLevel(logging.INFO)  # Show info and above for your application logs
```
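If your application configures logging centrally with `logging.config.dictConfig`, the equivalent setup looks roughly like this (a sketch; merge it into your existing config, and keep `disable_existing_loggers` set to `False` so the SDK's loggers are left intact):

```python
import logging.config

import patronus

patronus.init()

logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "patronus": {
            "format": "[%(asctime)s] %(levelname)-8s %(name)s: %(message)s",
            "datefmt": "%Y-%m-%d %H:%M:%S",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "patronus",
        },
    },
    "loggers": {
        # Mirrors the imperative configuration above
        "patronus": {"level": "WARNING", "handlers": ["console"]},
        "patronus.core": {"level": "WARNING"},
        "patronus.sdk": {"level": "INFO"},
    },
})
```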
Patronus logging integrates seamlessly with the tracing system, allowing you to correlate logs with specific spans in your application flow:
```python
import patronus
from patronus import traced, start_span

patronus.init()
log = patronus.get_logger()

@traced()
def process_user_query(query):
    log.info("Processing query")

    with start_span("Query Analysis"):
        log.info("Analyzing query intent")
        ...

    with start_span("Response Generation"):
        log.info("Generating LLM response")
        ...

    return "Response to: " + query

# Logs will be associated with the appropriate spans
result = process_user_query("Tell me about machine learning")
```
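The same correlation applies to error logs, which makes it easy to pinpoint the span in which a failure occurred. A sketch, where `fetch_documents` is a hypothetical helper standing in for your own retrieval code:

```python
import patronus
from patronus import traced, start_span

patronus.init()
log = patronus.get_logger()

@traced()
def answer(query):
    with start_span("Retrieval"):
        try:
            docs = fetch_documents(query)  # hypothetical helper
        except TimeoutError:
            # The traceback is recorded and, like any other log emitted
            # here, the record is associated with the "Retrieval" span
            log.exception("Document fetch timed out")
            docs = []
    return docs
```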