## What is a Prompt?
A prompt consists of:

- Template — The instruction text, using Liquid templating
- Model — Which LLM to use (GPT-4, Claude, etc.)
- Settings — Temperature, max tokens, and other parameters
- Status — Draft, Live, or Archived
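
Concretely, those pieces might be grouped like this. The sketch below is a hypothetical Python representation for illustration only; the class and field names are assumptions, not the product's actual schema.

```python
from dataclasses import dataclass

# Hypothetical grouping of the pieces listed above; field names are
# illustrative assumptions, not the product's actual schema.
@dataclass
class Prompt:
    template: str              # Liquid instruction text
    model: str                 # e.g. "gpt-4" or a Claude model
    temperature: float = 0.3   # sampling randomness
    max_tokens: int = 500      # response length cap
    status: str = "draft"      # "draft", "live", or "archived"

summarize_v1 = Prompt(
    template="Summarize the following text in three sentences:\n\n{{ text }}",
    model="gpt-4",
    status="live",
)
```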
## Why Prompts Are Powerful

### Versioning

Every save creates a version. Roll back instantly if something breaks.

### A/B Testing

Attach multiple prompts to one endpoint. Test which performs better.

### Safe Iteration

Test changes in Draft status before promoting to Live.

### Model Flexibility

Switch models without changing code. Try GPT-4 vs Claude instantly.
## Prompt Lifecycle
Prompts follow a status workflow:

| Status | Description | Can be Default? |
|---|---|---|
| Draft | Work in progress, for testing | No |
| Live | Production-ready | Yes |
| Archived | Deprecated, hidden from lists | No |
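
In code, the rules in the table reduce to something like the following sketch (an assumption about how the statuses behave, not the product's implementation):

```python
from enum import Enum

class PromptStatus(Enum):
    DRAFT = "draft"        # work in progress, for testing
    LIVE = "live"          # production-ready
    ARCHIVED = "archived"  # deprecated, hidden from lists

def can_be_default(status: PromptStatus) -> bool:
    # Per the table above, only Live prompts can serve as an endpoint's default.
    return status is PromptStatus.LIVE
```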
## Prompts vs Endpoints
Understanding the relationship:

| Concept | What It Is | Example |
|---|---|---|
| Endpoint | Stable API URL | /api/v1/summarize |
| Prompt | LLM instruction template | "Summarize this text…" |
| Relationship | One endpoint, many prompts | Endpoint has 3 prompt versions |
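
One way to picture the relationship: an endpoint is a stable URL that fronts a collection of prompts. The sketch below is purely illustrative; the names and structure are assumptions rather than the product's data model.

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    path: str                                    # stable API URL
    prompts: list = field(default_factory=list)  # attached prompt versions

summarize = Endpoint(
    path="/api/v1/summarize",
    prompts=[
        {"name": "v1-concise", "model": "gpt-4", "status": "live"},
        {"name": "v2-detailed", "model": "gpt-4", "status": "draft"},
        {"name": "v3-fast", "model": "gpt-3.5-turbo", "status": "live"},
    ],  # one endpoint, three prompt versions
)
```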
### Multiple Prompts per Endpoint
You can attach multiple prompts to a single endpoint (an A/B selection sketch follows the list):

- Different models — GPT-4 for quality, GPT-3.5 for speed
- Different approaches — Concise summary vs detailed analysis
- Different versions — v1, v2, v3 of your prompt
- A/B testing — Compare performance between prompts
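
As a rough illustration of the A/B case, traffic could be split across two attached prompts with a weighted random choice. This is a conceptual sketch only; the actual routing logic and any weighting options are assumptions.

```python
import random

def pick_prompt(prompts_with_weights):
    # Weighted random choice between prompts attached to the same endpoint.
    prompts, weights = zip(*prompts_with_weights)
    return random.choices(prompts, weights=weights, k=1)[0]

chosen = pick_prompt([
    ("summarize-v1-gpt4", 0.8),    # incumbent Live prompt takes most traffic
    ("summarize-v2-claude", 0.2),  # challenger prompt gets a smaller share
])
print(chosen)
```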
## The Prompt Editor
The prompt editor provides:

- Syntax highlighting for Liquid templates
- Variable autocomplete for input fields
- Live preview of rendered output
- Side-by-side testing to run and see results immediately
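
The live preview amounts to rendering the Liquid template with sample input values. Here is a minimal sketch of that step, using the third-party python-liquid package as a stand-in; the editor handles this for you, so the snippet is for illustration only.

```python
from liquid import Template  # pip install python-liquid

# Render a Liquid prompt template with sample inputs, roughly what
# the editor's live preview shows.
template = Template("Summarize the following {{ tone }} text:\n\n{{ text }}")
print(template.render(tone="formal", text="Quarterly revenue rose 12%..."))
```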
### Key Components
#### Template
The instruction text sent to the LLM. Uses Liquid templating to inject input values.
#### Model Selection
Choose from supported providers (OpenAI, Anthropic) and models (GPT-4, Claude).
#### Temperature
Controls randomness. Lower (0.1-0.3) for consistent outputs, higher (0.7-1.0) for creativity.
#### Max Tokens
Maximum length of the LLM response. Higher values allow longer outputs but cost more.
#### System Prompt
Optional system-level instruction that sets the AI’s persona or behavior (for models that support it).
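
Taken together, these components map onto a standard provider call. The sketch below uses OpenAI's Python SDK purely to show where each setting ends up; it is not the product's internal code, and the message contents are placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",        # Model Selection
    temperature=0.2,      # Temperature: low for consistent output
    max_tokens=500,       # Max Tokens: caps the response length
    messages=[
        {"role": "system", "content": "You are a concise technical summarizer."},  # System Prompt
        {"role": "user", "content": "Summarize the following text:\n\n..."},       # rendered Template
    ],
)
print(response.choices[0].message.content)
```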

