PromptRail Knowledge Base

The topics below explain how PromptRail helps you manage your AI workflows.

Core Concepts

Prompts

Define instructions and variables, and manage versions.

Models

Configure providers, models, and cost tracking.

Credentials

Securely store and manage your API keys.

Routes

Connect components and fine-tune parameters.

Endpoints

Stable URLs to access your Routes.

Executions

Track logs, tokens, and costs in real time.

Governance

Audit trails, transparency, and compliance.

Testing

Test prompts and routes before deploying.

Going to Production

Best practices for live applications.

Integration Guide

Using the API Endpoint

Once you have created an Endpoint and connected it to a Route, you can execute the Route through that Endpoint with a simple POST request.

1. Get Endpoint Metadata (GET)

Retrieve information about the endpoint, including expected variables.

curl -X GET https://api.promptrail.io/acme-corp/execute/product-desc \
  -H "Authorization: Bearer YOUR_ENDPOINT_TOKEN"
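
The exact fields in the metadata response are not shown here; purely as an illustration (the field names below are assumptions, not a documented schema), a response could list the variables the connected Route expects:

{
  "endpoint": "product-desc",
  "variables": ["topic", "tone"]
}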

2. Execute the Request (POST)

Send variables to execute the prompt.

curl -X POST https://api.promptrail.io/acme-corp/execute/product-desc \
  -H "Authorization: Bearer YOUR_ENDPOINT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ "topic": "AI Wrappers", "tone": "Professional" }'

3. Receive the Response

{ "success": true, "execution_id": "exec_123...", "output": "Here is the generated content..." }