Prompt Template¶
The Prompt Template policy applies reusable, parameterized prompt templates to requests. Instead of sending a raw prompt, clients send a template reference (template://<name>?param=value) and the gateway substitutes the parameters into the configured template before forwarding the request to the LLM.
Configuration Parameters¶
| Parameter | Required | Description |
|---|---|---|
| Templates | Yes | A list of named templates. Each template has a name (identifier) and a prompt string containing placeholders in the format [[parameter-name]]. |
Template Placeholder Syntax¶
Use [[parameter-name]] inside the prompt string to mark substitution points, for example: Translate the following text from [[from]] to [[to]]: [[text]]
When a client calls template://translate?from=English&to=Spanish&text=Hello, the policy replaces each placeholder with the corresponding query parameter value.
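The resolution step described above can be sketched in Python. The `templates` map, `resolve` function, and parsing details here are illustrative assumptions, not the gateway's actual implementation:

```python
import re
from urllib.parse import urlparse, parse_qs

# Hypothetical templates map, configured as in this policy's example.
templates = {
    "translate": "Translate the following text from [[from]] to [[to]]: [[text]]",
}

def resolve(reference: str) -> str:
    """Resolve a template://<name>?param=value reference (sketch only)."""
    parsed = urlparse(reference)
    name = parsed.netloc or parsed.path.lstrip("/")
    params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    prompt = templates[name]
    # Replace each [[param]] placeholder with the matching query
    # parameter value; unmatched placeholders are left as-is here.
    return re.sub(r"\[\[([^\]]+)\]\]",
                  lambda m: params.get(m.group(1), m.group(0)),
                  prompt)

print(resolve("template://translate?from=English&to=Spanish&text=Hello"))
# Translate the following text from English to Spanish: Hello
```

In a real request, parameter values would be URL-encoded; this sketch keeps the example value simple for readability.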
Advanced Settings¶
| Parameter | Default | Description |
|---|---|---|
| JSON Path | — | JSONPath expression to limit template resolution to a specific field in the request payload. If omitted, template references are resolved across the entire request payload. |
| On Missing Template | error | Behavior when a referenced template name is not found. error returns an immediate error response; passthrough leaves the original template reference unchanged. |
| On Unresolved Placeholder | keep | Behavior when a placeholder has no matching query parameter. keep leaves the placeholder as-is; empty replaces it with an empty string; error returns an immediate error response. |
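The three On Unresolved Placeholder modes can be illustrated with a short sketch. The `fill` function and its `mode` parameter are hypothetical names for illustration, mirroring the documented keep / empty / error behaviors:

```python
import re

def fill(prompt: str, params: dict, mode: str = "keep") -> str:
    """Replace [[name]] placeholders; `mode` mirrors the
    On Unresolved Placeholder setting (sketch, not gateway code)."""
    def sub(m):
        key = m.group(1)
        if key in params:
            return params[key]
        if mode == "empty":
            return ""                 # substitute an empty string
        if mode == "error":
            raise ValueError(f"unresolved placeholder: {key}")
        return m.group(0)             # keep: leave placeholder as-is
    return re.sub(r"\[\[([^\]]+)\]\]", sub, prompt)

fill("Hi [[name]] from [[city]]", {"name": "Ada"}, "keep")
# 'Hi Ada from [[city]]'
fill("Hi [[name]] from [[city]]", {"name": "Ada"}, "empty")
# 'Hi Ada from '
```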
Add This Policy¶
- Navigate to AI Workspace > LLM Providers or LLM Proxies.
- Click on the provider or proxy name.
- Go to the Guardrails tab.
- Click + Add Guardrail and select Prompt Template from the sidebar.
- Add one or more templates, each with a name and prompt string using [[param]] placeholders.
- Optionally expand Advanced Settings to configure JSON Path, On Missing Template, or On Unresolved Placeholder behavior.
- Click Add (for providers) or Submit (for proxies).
- Deploy the provider or proxy to apply the changes.
Example: Translation Template¶
Configured template:
| Field | Value |
|---|---|
| Name | translate |
| Prompt | Translate the following text from [[from]] to [[to]]: [[text]] |
Client request body:
{
"model": "gpt-4o",
"messages": [
{
"role": "user",
"content": "template://translate?from=English&to=Spanish&text=Hello, how are you?"
}
]
}
Request forwarded to LLM:
{
"model": "gpt-4o",
"messages": [
{
"role": "user",
"content": "Translate the following text from English to Spanish: Hello, how are you?"
}
]
}
Related¶
- Policies Overview
- Prompt Decorator — Inject fixed content into prompts without client-side changes
- Policy Hub — Full policy specification and latest version