Core Concepts

Endpoints
An Endpoint is a stable API URL that your application calls. Think of it as the “contract” between your code and the LLM.
- Path — The URL path (e.g., /api/v1/summarize)
- Input Schema — What data the endpoint accepts
- Output Schema — What data the endpoint returns
- Default Prompt — Which prompt to use when called
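For a concrete picture, here is a sketch of how such an endpoint might be modeled. The TypeScript interface and field names are illustrative assumptions, not Endprompt's actual configuration format.

```typescript
// Hypothetical shape of an endpoint definition; field names are illustrative,
// not Endprompt's actual configuration format.
interface EndpointDefinition {
  path: string;          // the stable URL your application calls
  inputSchema: object;   // what data the endpoint accepts
  outputSchema: object;  // what data the endpoint returns
  defaultPrompt: string; // which prompt is used when the endpoint is called
}

const summarize: EndpointDefinition = {
  path: "/api/v1/summarize",
  inputSchema: { text: "string" },
  outputSchema: { summary: "string" },
  defaultPrompt: "summarize@v3", // swap prompt versions without changing the path
};
```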
Prompts
A Prompt is the actual instruction sent to the LLM. Prompts are attached to endpoints and can be versioned independently.
- Liquid Templating — Use {{ inputs.fieldName }} to inject request data (see the rendering sketch after this list)
- Model Selection — Choose which LLM to use (GPT-4, Claude, etc.)
- Settings — Configure temperature, max tokens, and other parameters
- Versioning — Every save creates a new version you can roll back to
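To illustrate what the templating step does, the sketch below renders a prompt containing {{ inputs.fieldName }} placeholders with request data, using the open-source liquidjs library as a stand-in. The template text and field names are made up, and Endprompt performs this rendering server-side.

```typescript
import { Liquid } from "liquidjs";

// Illustrative only: shows how {{ inputs.fieldName }} placeholders are filled
// from request data to produce the final prompt text.
const template = `Summarize the following text in {{ inputs.maxSentences }} sentences:

{{ inputs.text }}`;

const engine = new Liquid();

async function renderPrompt(inputs: { text: string; maxSentences: number }) {
  // The rendered string is the prompt text sent to the selected model.
  return engine.parseAndRender(template, { inputs });
}

renderPrompt({ text: "Endprompt decouples prompts from code.", maxSentences: 2 })
  .then(console.log);
```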
Prompt Status Lifecycle
Prompts follow a status workflow:

| Status | Description |
|---|---|
| Draft | Work in progress. Can be tested but not used as default. |
| Live | Production-ready. Can be set as endpoint default. |
| Archived | Deprecated. Hidden from lists but preserved for history. |
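As a minimal sketch, the lifecycle above could be modeled like this, assuming these three statuses are the full set; the type and helper names are hypothetical.

```typescript
// Hypothetical model of the prompt lifecycle described above.
type PromptStatus = "draft" | "live" | "archived";

// Per the table, only live prompts can be set as an endpoint's default.
function canBeEndpointDefault(status: PromptStatus): boolean {
  return status === "live";
}
```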
Input & Output Schemas
Schemas define the structure of data flowing in and out of your endpoint. Input validation happens automatically: if a request doesn’t match your schema, it’s rejected with a clear error message before reaching the LLM. An illustrative input schema is sketched below.
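For illustration, an input schema for a summarization endpoint might look like the sketch below; the JSON Schema dialect and field names are assumptions, not Endprompt's required format.

```typescript
// Hypothetical input schema for /api/v1/summarize, expressed as JSON Schema.
// Treat the dialect and field names as assumptions, not Endprompt's format.
const summarizeInputSchema = {
  type: "object",
  properties: {
    text: { type: "string", description: "The text to summarize" },
    maxSentences: { type: "integer", minimum: 1, default: 3 },
  },
  required: ["text"],
  additionalProperties: false,
} as const;

// A request body like { "text": "...", "maxSentences": 2 } passes validation;
// a body missing "text" is rejected before any LLM call is made.
```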
Request Flow
Here’s what happens when you call an Endprompt endpoint: the request is validated against the input schema, the prompt template is rendered, the model is called, and the full exchange is logged.

Template Rendering
The Liquid template is rendered with your input data, creating the final prompt text.
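Here is a hedged sketch of what the calling side might look like: the subdomain and path come from this page, while the Authorization header, request body, and response fields are assumptions.

```typescript
// Illustrative call to a stable Endprompt endpoint. The tenant subdomain and
// /api/v1/summarize path appear on this page; the auth header name and the
// request/response shapes are assumptions.
async function summarize(text: string): Promise<string> {
  const res = await fetch("https://yourcompany.api.endprompt.ai/api/v1/summarize", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.ENDPROMPT_API_KEY}`, // assumed auth scheme
    },
    body: JSON.stringify({ text, maxSentences: 2 }),
  });

  if (!res.ok) {
    // Requests that fail schema validation are rejected before the LLM is called.
    throw new Error(`Endprompt request failed: ${res.status}`);
  }

  const data = await res.json();
  return data.summary; // assumed field name from the output schema
}
```

Because the URL is the stable contract, you can iterate on the prompt behind it without touching or redeploying code like this.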
Multi-Tenancy
Each Endprompt account is a tenant with its own:
- Subdomain (e.g., yourcompany.api.endprompt.ai)
- Endpoints, prompts, and configurations
- API keys
- Team members
- Usage quotas
All your data is completely isolated from other tenants. Your prompts, logs, and API keys are never visible to others.
What Makes This Different?
Stable Contracts
Your application code calls the same URL forever. Iterate on prompts without deployments.
Type Safety
Schema validation catches errors before they reach the LLM—and before they reach your users.
Version Control
Every prompt change is versioned. Test new versions, roll back bad ones, compare performance.
Observability
See every request, response, latency, and cost. Replay requests to debug issues.

