
Creating and Versioning Prompts

Create new prompts and manage versions

Use push_prompt to create new prompts or update existing ones. Patronus automatically handles versioning and duplicate detection.

Creating prompts

Basic creation

import patronus
from patronus.prompts import Prompt, push_prompt
 
patronus.init()
 
prompt = Prompt(
    name="dev/bug-fix/python-error",
    body="Fix this Python code error: {error_message}. Code: ```python\n{code_snippet}\n```",
    description="Template for Python debugging assistance"
)
 
loaded_prompt = push_prompt(prompt)

With metadata

from patronus.prompts import Prompt, push_prompt
 
prompt = Prompt(
    name="support/troubleshooting/login-issues",
    body="You are a support specialist for {product_name}. Solve this issue: {issue_description}",
    description="Support prompt for login issues",
    metadata={
        "temperature": 0.7,
        "max_tokens": 500,
        "tone": "helpful"
    }
)
 
loaded_prompt = push_prompt(prompt)

Async creation

from patronus.prompts import apush_prompt
 
loaded_prompt = await apush_prompt(prompt)
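
For context, here is a fuller sketch of the async variant run from a synchronous entry point with asyncio. It reuses the prompt from the basic example; the wrapper itself is illustrative:

import asyncio

import patronus
from patronus.prompts import Prompt, apush_prompt

async def main():
    prompt = Prompt(
        name="dev/bug-fix/python-error",
        body="Fix this Python code error: {error_message}. Code: ```python\n{code_snippet}\n```",
        description="Template for Python debugging assistance"
    )
    # apush_prompt is the awaitable counterpart of push_prompt and
    # returns the same kind of loaded prompt object.
    return await apush_prompt(prompt)

patronus.init()
loaded_prompt = asyncio.run(main())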

How versioning works

Patronus automatically manages prompt versions:

  • Each time you push a prompt, Patronus checks if the content is identical to an existing revision
  • If the content matches an existing revision, that revision is returned (no duplicate created)
  • If the content is different, a new revision is created with an incremented revision number
  • Revisions are immutable once created

# First push creates revision 1
prompt_v1 = Prompt(
    name="support/greeting",
    body="Hello, how can I help you?"
)
loaded_v1 = push_prompt(prompt_v1)  # Creates revision 1
 
# Pushing identical content returns revision 1
loaded_again = push_prompt(prompt_v1)  # Returns existing revision 1
 
# Pushing different content creates revision 2
prompt_v2 = Prompt(
    name="support/greeting",
    body="Hello! How may I assist you today?"
)
loaded_v2 = push_prompt(prompt_v2)  # Creates revision 2
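
To verify deduplication, you can compare the revisions of the returned objects. The attribute name below is an assumption; check the SDK reference for the exact field exposed by the returned prompt:

# Assumes the object returned by push_prompt exposes its revision
# number as a `revision` attribute; verify against the SDK reference.
print(loaded_v1.revision)     # first push, e.g. 1
print(loaded_again.revision)  # same revision as loaded_v1 (no duplicate)
print(loaded_v2.revision)     # new content, incremented revision, e.g. 2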

Multi-line prompts

Use textwrap.dedent for readable multi-line prompts:

import textwrap
from patronus.prompts import Prompt, push_prompt
 
prompt = Prompt(
    name="content/blog/technical-writer",
    body=textwrap.dedent("""
        You are a technical writer for {company_name}.
 
        Topic: {topic}
        Audience: {audience_level}
        Tone: {tone}
 
        Write a {word_count}-word blog post that:
        1. Introduces the topic clearly
        2. Provides practical examples
        3. Concludes with actionable takeaways
 
        Use clear, concise language appropriate for {audience_level} readers.
        """),
    description="Technical blog post writer"
)
 
loaded_prompt = push_prompt(prompt)
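
Because the placeholders use str.format-style braces, you can preview how the body renders with plain Python string formatting. This is standard Python rather than an SDK call, and it assumes the Prompt object keeps its template text accessible as the body attribute:

# Plain-Python preview of the template (not an SDK call): fill the
# placeholders with str.format to sanity-check the wording.
preview = prompt.body.format(
    company_name="Acme Corp",
    topic="observability basics",
    audience_level="beginner",
    tone="friendly",
    word_count=800,
)
print(preview)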

Naming conventions

Use a descriptive, hierarchical naming structure:

[domain]/[use-case]/[component]/[prompt-type]

Where [prompt-type] indicates the role in an LLM conversation:

  • system - Sets overall behavior and context
  • instruction - Provides specific task instructions
  • user - User message template
  • assistant - Assistant response template
  • few-shot - Contains example input/output pairs

Examples

# System prompt for support chat
"support/troubleshooting/diagnostic-questions/system"
 
# User message template
"support/chat/user-query/user"
 
# Few-shot examples for code generation
"dev/code-generation/python-function/few-shot"
 
# Instruction prompt for content writing
"marketing/email-campaigns/follow-up-template/instruction"

Consistent prefixes

Use consistent prefixes for related prompts:

# Onboarding flow prompts
"onboarding/chat/welcome/system"
"onboarding/chat/questions/user"
"onboarding/chat/intro/assistant"
 
# Support classifier prompts
"support/classifier/system"
"support/classifier/categories/instruction"

This makes it easier to filter and manage prompt families as your library grows.
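
A minimal sketch of managing one such family under a shared prefix, using only push_prompt from the examples above (the names and bodies are illustrative):

from patronus.prompts import Prompt, push_prompt

PREFIX = "onboarding/chat"

# Map each component of the onboarding flow to its template body.
family = {
    "welcome/system": "You are an onboarding assistant for {product_name}.",
    "questions/user": "New user question: {question}",
    "intro/assistant": "Welcome! Here is how to get started with {product_name}.",
}

for suffix, body in family.items():
    push_prompt(Prompt(name=f"{PREFIX}/{suffix}", body=body))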
