Prompts

Overview


The Patronus SDK provides tools to version, retrieve, and render prompts in your LLM applications.

Quick start

Creating a prompt

import patronus
from patronus.prompts import Prompt, push_prompt
 
patronus.init()
 
prompt = Prompt(
    name="support/troubleshooting/login-issues",
    body="You are a support specialist for {product_name}. Solve this {issue_type} issue: {issue_description}",
    description="Support prompt for login issues",
    metadata={"temperature": 0.7}
)
 
# Push to Patronus (returns the stored prompt as a LoadedPrompt)
loaded_prompt = push_prompt(prompt)
 
# Render with variables
rendered = prompt.render(
    product_name="CloudWorks",
    issue_type="authentication",
    issue_description="Cannot log in with correct credentials"
)
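
Pushing the same prompt name again with changed content stores a new revision rather than overwriting the old one. A minimal sketch (the revised body and metadata here are illustrative, and revision-on-change behavior is an assumption):

from patronus.prompts import Prompt, push_prompt
 
# Assumption: push_prompt stores a new revision when the body changes
revised = Prompt(
    name="support/troubleshooting/login-issues",
    body="You are a senior support specialist for {product_name}. Diagnose this {issue_type} issue: {issue_description}",
    description="Support prompt for login issues (revised wording)",
    metadata={"temperature": 0.5}
)
new_revision = push_prompt(revised)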

Loading a prompt

from patronus.prompts import load_prompt
 
# Load latest version
prompt = load_prompt(name="support/troubleshooting/login-issues")
 
# Load specific revision or label
prompt = load_prompt(name="support/troubleshooting/login-issues", revision=3)
prompt = load_prompt(name="support/troubleshooting/login-issues", label="production")
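
A loaded prompt renders the same way as one you construct locally, so template variables can be resolved right after fetching (reusing the fields from the quick start above):

from patronus.prompts import load_prompt
 
# Fetch the production-labeled revision and fill in its variables
prompt = load_prompt(name="support/troubleshooting/login-issues", label="production")
rendered = prompt.render(
    product_name="CloudWorks",
    issue_type="authentication",
    issue_description="Cannot log in with correct credentials"
)
print(rendered)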

Using with LLMs

import openai
from patronus.prompts import load_prompt
 
system_prompt = load_prompt(name="support/chat/system")
 
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": system_prompt.render(product_name="CloudWorks")},
        {"role": "user", "content": "How do I reset my password?"}
    ]
)
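
Prompt metadata (such as the temperature stored when the prompt was created above) can also drive model parameters at call time. A sketch, assuming the loaded prompt exposes the metadata dict pushed with it as a .metadata attribute:

# Assumption: the loaded prompt exposes its stored metadata dict as .metadata
temperature = (system_prompt.metadata or {}).get("temperature", 0.7)
 
response = client.chat.completions.create(
    model="gpt-4",
    temperature=temperature,
    messages=[
        {"role": "system", "content": system_prompt.render(product_name="CloudWorks")},
        {"role": "user", "content": "How do I reset my password?"}
    ]
)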
