post https://api.patronus.ai/v1beta1/model-integrations
Create integration to an LLM Model
OpenAI integration
To integrate with OpenAI, you need an API key and must select a model.
Optionally, you can set default parameters for the model; if these aren't specified, the model uses its default settings.
Example request body:
```json
{
  "name": "my-openai-integration",
  "spec": {
    "open_ai": {
      "api_key": "sk-my-openai-api-key",
      "selected_model": "gpt-4",
      "default_params": {
        "temperature": 0
      }
    }
  }
}
```
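As a sketch, the request above could be assembled with Python's standard library. The `X-API-KEY` header name here is an assumption for illustration; use whatever authentication scheme the Patronus API actually requires.

```python
import json
import urllib.request

# Build the integration payload from the example above.
payload = {
    "name": "my-openai-integration",
    "spec": {
        "open_ai": {
            "api_key": "sk-my-openai-api-key",
            "selected_model": "gpt-4",
            "default_params": {"temperature": 0},
        }
    },
}

# Assemble the POST request. The "X-API-KEY" header is a placeholder --
# substitute the authentication the Patronus API actually expects.
req = urllib.request.Request(
    "https://api.patronus.ai/v1beta1/model-integrations",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "X-API-KEY": "<your-key>"},
    method="POST",
)

# To actually send it:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```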
HTTP integration (webhook style)
Custom HTTP integrations let you connect to any LLM with an HTTP interface that returns a JSON response.
For security, use secrets to transmit sensitive information such as API keys.
Secrets can be referenced in the method, base URL, path, headers, and body_template
by enclosing the secret key in double braces ({{MY_SECRET}}).
Do not use the following reserved keys for secrets:
prompt
user_prompt
system_prompt
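The actual interpolation happens server-side inside Patronus, but a minimal sketch of the templating rule might look like the following. The reserved keys are rejected here on the assumption that Patronus fills them with the prompt text at request time rather than from your secrets.

```python
import re

# Keys Patronus reserves for prompt injection at request time (per the docs).
RESERVED = {"prompt", "user_prompt", "system_prompt"}

def render_template(template: str, secrets: dict[str, str]) -> str:
    """Replace {{KEY}} placeholders with secret values.

    Placeholders whose key is not in `secrets` (e.g. {{prompt}}) are left
    untouched, on the assumption they are filled in later by the server.
    """
    illegal = RESERVED & secrets.keys()
    if illegal:
        raise ValueError(f"reserved secret keys are not allowed: {illegal}")
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: secrets.get(m.group(1), m.group(0)),
        template,
    )

header_value = render_template("{{API_KEY}}", {"API_KEY": "my-secret-api-key"})
body = render_template(
    '{"user_prompt": "{{prompt}}"}', {"API_KEY": "my-secret-api-key"}
)
```

Note that `{{prompt}}` survives substitution unchanged, which is why it must not double as a secret key.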
The model integration is expected to return a JSON response.
The response_jq_path field is required to extract the model's text output from that response.
```json
{
  "name": "my-custom-integration",
  "spec": {
    "http": {
      "request_method": "POST",
      "request_base_url": "https://my-llm.example.com",
      "request_path": "/v1/completion",
      "request_headers": [
        {
          "key": "X-API-KEY",
          "value": "{{API_KEY}}"
        }
      ],
      "request_body_template": "{\"user_prompt\": \"{{prompt}}\"}",
      "response_jq_path": ".choices[0].message.content",
      "secrets": [
        {
          "key": "API_KEY",
          "value": "my-secret-api-key"
        }
      ]
    }
  }
}
```
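The response_jq_path above is a jq expression applied to the JSON response. For a simple path like `.choices[0].message.content` it reduces to plain key/index lookups, as in this sketch (which supports only `.key` and `[index]` steps, not the full jq language):

```python
import json
import re

def extract_jq_path(path: str, doc):
    """Walk a simple jq-style path such as '.choices[0].message.content'."""
    # Each match is either a '.key' step (group 1) or a '[index]' step (group 2).
    for key, index in re.findall(r"\.(\w+)|\[(\d+)\]", path):
        doc = doc[key] if key else doc[int(index)]
    return doc

# A response shaped like the OpenAI chat-completions format, for illustration.
response = json.loads('{"choices": [{"message": {"content": "Hello!"}}]}')
text = extract_jq_path(".choices[0].message.content", response)
```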