The Admin API gives you full programmatic access to manage your Endprompt resources. Use it to automate workflows, integrate with CI/CD pipelines, or connect AI coding tools directly to your Endprompt workspace.

What Can You Do?

Manage Endpoints

Create, update, and delete endpoints. Define input and output schemas programmatically.

Manage Prompts

Create prompts, update templates, promote to Live, and manage the full prompt lifecycle.

Execute & Test

Execute endpoints with inputs and get LLM outputs — ideal for automated testing.

Monitor & Analyze

Query execution stats, browse logs, and compare prompt performance.

Two Ways to Access

The Admin API is available through two interfaces:
| Interface | Best For | Protocol |
| --- | --- | --- |
| REST API | Scripts, CI/CD, backend integrations | Standard HTTP/JSON |
| MCP Server | AI coding tools (VS Code, Claude, Cursor) | Model Context Protocol |
Both interfaces use the same authentication and provide the same capabilities.
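For the REST interface, requests carry your Admin API key in an authorization header. A minimal sketch in Python, assuming a Bearer-token scheme, a placeholder base URL, and that the endpoints collection supports GET for listing (none of which are confirmed here):

```python
from urllib.request import Request

# Placeholder values: the real base URL and key come from your dashboard.
BASE_URL = "https://api.endprompt.example"  # hypothetical host
ADMIN_API_KEY = "your-admin-api-key"

# Build (but don't send) an authenticated request to list endpoints.
req = Request(
    f"{BASE_URL}/admin/v1/endpoints",
    headers={"Authorization": f"Bearer {ADMIN_API_KEY}"},
    method="GET",
)
print(req.get_method(), req.full_url)
```

The same key authenticates the MCP Server, so a single credential covers both interfaces.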

Quick Start

1. Create an Admin API Key
Go to API Keys in your dashboard and create a key with the Admin type.

2. Choose Your Interface
Use the REST API for scripts and integrations, or the MCP Server to connect your AI coding tool.

3. Start Building
Create an endpoint, add fields, write a prompt, promote it to Live, and execute it, all without touching the dashboard.

Example: Create and Deploy an Endpoint

Here’s the typical workflow using the Admin API:
1. Create an endpoint         POST /admin/v1/endpoints
2. Add input fields           POST /admin/v1/endpoints/{id}/input-fields
3. Add output fields          POST /admin/v1/endpoints/{id}/output-fields
4. Create a prompt            POST /admin/v1/prompts
5. Promote to Live            POST /admin/v1/prompts/{id}/promote
6. Set as default             POST /admin/v1/prompts/{id}/set-default
7. Execute                    POST /api/v1/{endpoint-path}
The same workflow works with MCP tools — AI coding assistants can execute these steps conversationally.
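As a sketch, the seven steps above can be expressed as an ordered call sequence. The paths come from the list; the resource IDs and the endpoint path are illustrative placeholders, not real values:

```python
# Ordered (method, path) sequence for creating and deploying an endpoint.
# The IDs "ep_123", "pr_456" and the path "summarize" are placeholders.
def deploy_workflow(endpoint_id: str, prompt_id: str, endpoint_path: str):
    """Return the REST call sequence for the create-and-deploy workflow."""
    return [
        ("POST", "/admin/v1/endpoints"),
        ("POST", f"/admin/v1/endpoints/{endpoint_id}/input-fields"),
        ("POST", f"/admin/v1/endpoints/{endpoint_id}/output-fields"),
        ("POST", "/admin/v1/prompts"),
        ("POST", f"/admin/v1/prompts/{prompt_id}/promote"),
        ("POST", f"/admin/v1/prompts/{prompt_id}/set-default"),
        ("POST", f"/api/v1/{endpoint_path}"),
    ]

steps = deploy_workflow("ep_123", "pr_456", "summarize")
for method, path in steps:
    print(method, path)
```

Note that the final execution call goes to the public `/api/v1/` surface, not the `/admin/v1/` surface used for management.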

Resources

The Admin API manages these resources:
| Resource | Description |
| --- | --- |
| Endpoints | API paths with input/output schemas, cache and rate-limit policies |
| Prompts | Liquid templates with LLM configuration, versioning, and lifecycle |
| Prompt Versions | Immutable snapshots created automatically when templates change |
| Snippets | Reusable template fragments included via `{% include 'slug' %}` |
| Execution Logs | Read-only history with inputs, outputs, latency, tokens, and cost |
| Stats | Aggregated execution metrics per endpoint, prompt, or tenant |
| Models | Available LLM models with capabilities and pricing |
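To show how Snippets compose into Prompts, here is a toy Python stand-in for the `{% include 'slug' %}` mechanic. Endprompt's real renderer is Liquid, and the snippet slug below is hypothetical; this only illustrates how a reusable fragment expands inside a template:

```python
import re

# Hypothetical snippet library, keyed by slug.
snippets = {"tone-guidelines": "Use a neutral, concise tone."}

def expand_includes(template: str, snippets: dict) -> str:
    """Replace each {% include 'slug' %} tag with the matching snippet body."""
    return re.sub(
        r"{%\s*include\s*'([\w-]+)'\s*%}",
        lambda m: snippets[m.group(1)],
        template,
    )

template = "Summarize the text.\n{% include 'tone-guidelines' %}"
print(expand_includes(template, snippets))
```

Editing a snippet updates every prompt that includes it, which is what makes shared fragments useful for things like tone or formatting rules.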