Quick Start Path
Follow these steps to create and use your first managed prompt:

Create ABV account and API key
- Create an ABV account (free trial available)
- Navigate to your project settings
- Create new API credentials
- Save your API key securely (starts with `sk-abv-...`)
Choose between the US region (https://app.abv.dev) and the EU region (https://eu.app.abv.dev) based on your data residency requirements.

Create your first prompt
Choose your preferred method: UI (no code required), Python SDK, TypeScript/JavaScript SDK, or Public API. All methods create the same prompt object with name, version, labels, and optional configuration (model parameters, tags, metadata). See detailed examples in the “Creating Prompts” section below.

Versioning is automatic: if you create a prompt with an existing name, ABV creates a new version instead of overwriting the existing prompt.
Fetch prompt at runtime
Your application fetches prompts at runtime using the SDK. By default, you get the version with the `production` label (the version you’ve designated for production use).

Client-side caching ensures zero latency: the SDK serves prompts from a local cache instantly, while background processes keep the cache synchronized with ABV.

Variable substitution: prompts can contain `{{variables}}` that you fill in when compiling the prompt for each request.
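A minimal sketch of the fetch-and-compile flow, assuming a shared Python client named `abv` exposing the `get_prompt()` call used later on this page; the `compile()` helper and the prompt name are assumptions:

```python
from abv import abv  # assumed shared client instance

# Served instantly from the local cache; "production" is the default label
prompt = abv.get_prompt("movie-critic")  # hypothetical prompt name

# Fill in {{variables}} for this request (assumed compile() helper)
text = prompt.compile(movie="Dune: Part Two")
```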
Link prompts to traces (optional but recommended)

Linking prompts to observability traces enables tracking metrics by prompt version: see exactly which prompt generated each response, compare quality across versions, and measure the impact of prompt changes.

Add one line of code to associate your prompt with the LLM generation span. ABV automatically tracks latency, token usage, costs, and quality scores by prompt version.
Iterate and deploy new versions
When you create a new version of your prompt, assign it labels like `staging` or `production`. Your application fetches prompts by label, so updating a label’s target version instantly changes which prompt your application uses; no code deployment required.

Roll back by reassigning the `production` label to a previous version. Compare versions in the ABV UI to see exactly what changed.

Creating Prompts
ABV supports four methods for creating prompts, all producing identical results:
ABV UI (No Code Required)
The ABV dashboard provides a visual interface for creating and editing prompts. Non-technical team members can iterate on prompts without engineering involvement.
- Navigate to the Prompts section in the ABV dashboard
- Click Create Prompt
- Choose prompt type: Text (single string template) or Chat (structured conversation with roles)
- Enter your prompt content with `{{variables}}` for dynamic substitution
- Optionally configure model parameters (temperature, max tokens, etc.)
- Assign labels like `production` to make the prompt immediately available
- Save to create version 1
Python SDK
Create prompts programmatically with the Python SDK for version-controlled prompt workflows. Install dependencies, set environment variables (create a `.env` file), then create a text or chat prompt.

Versioning: if a prompt with the same name exists, this creates a new version rather than overwriting.

When to use: infrastructure-as-code workflows, automated prompt deployment pipelines, version control integration.
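A hedged sketch of creating text and chat prompts; the package name, client constructor, environment-variable names, and `create_prompt()` signature are assumptions modeled on the fields this page describes (name, type, prompt, labels, config):

```python
import os
from abv import ABV  # assumed package and client name

# Credentials from your .env file (assumed variable names)
abv = ABV(
    api_key=os.environ["ABV_API_KEY"],  # starts with sk-abv-...
    host=os.environ.get("ABV_HOST", "https://app.abv.dev"),
)

# Text prompt: a single string template with {{variables}}
abv.create_prompt(
    name="movie-critic",
    type="text",
    prompt="As a {{criticLevel}} movie critic, rate {{movie}} out of 10.",
    labels=["production"],  # make this version immediately available
    config={"temperature": 0.7},  # optional model parameters
)

# Chat prompt: a structured conversation with roles
abv.create_prompt(
    name="movie-critic-chat",
    type="chat",
    prompt=[
        {"role": "system", "content": "You are a {{criticLevel}} movie critic."},
        {"role": "user", "content": "Rate {{movie}} out of 10."},
    ],
    labels=["staging"],
)
```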
JavaScript/TypeScript SDK
Create prompts programmatically with the TypeScript/JavaScript SDK. Install dependencies, set environment variables (create a `.env` file), then create prompts and run your application; alternatively, pass constructor parameters instead of environment variables.

Versioning: creating a prompt with an existing name creates a new version.

When to use: Node.js applications, TypeScript infrastructure, JavaScript-based deployment pipelines.
Public API
Create prompts via the HTTP API for integration with any programming language or CI/CD system.

Endpoint: `POST https://app.abv.dev/api/public/v2/prompts`

Authentication: include your API key in the `Authorization` header.

When to use: languages without ABV SDK support, CI/CD automation, webhook-triggered prompt updates.

View full API reference →
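A hedged example request using Python’s `requests` library; the body fields mirror the prompt object described above, and the bearer-token scheme is an assumption:

```python
import os
import requests

resp = requests.post(
    "https://app.abv.dev/api/public/v2/prompts",
    headers={"Authorization": f"Bearer {os.environ['ABV_API_KEY']}"},  # assumed scheme
    json={
        "name": "movie-critic",
        "type": "text",
        "prompt": "As a {{criticLevel}} movie critic, rate {{movie}} out of 10.",
        "labels": ["production"],
        "config": {"temperature": 0.7},
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # the created prompt version
```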
Fetching and Using Prompts

Once created, fetch prompts at runtime in your application:
Python SDK
Fetch and compile text prompts, fetch and compile chat prompts, pass optional parameters for version control, and access the raw prompt and config.
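A hedged sketch of these four operations, reusing the assumed client from the creation example; keyword names like `type` and `version` are assumptions, while `get_prompt()` and `label="staging"` appear elsewhere on this page:

```python
from abv import abv  # assumed shared client instance

# Fetch and compile a text prompt ("production" label by default)
prompt = abv.get_prompt("movie-critic")
text = prompt.compile(criticLevel="expert", movie="Dune: Part Two")

# Fetch and compile a chat prompt (returns role/content messages)
chat = abv.get_prompt("movie-critic-chat", type="chat")
messages = chat.compile(criticLevel="expert", movie="Dune: Part Two")

# Optional parameters for version control
staging = abv.get_prompt("movie-critic", label="staging")  # fetch by label
pinned = abv.get_prompt("movie-critic", version=3)         # pin a specific version

# Access the raw template and stored configuration
print(prompt.prompt)  # raw template with {{variables}} intact
print(prompt.config)  # e.g. {"temperature": 0.7}
```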
JavaScript/TypeScript SDK
The flow mirrors the Python SDK: fetch and compile text prompts, fetch and compile chat prompts, pass optional parameters for version control, and access the raw prompt and config.
Linking Prompts to Observability Traces
Linking prompts to traces enables tracking metrics by prompt version: see which prompt generated each response, compare quality across versions, and measure the impact of prompt changes.
Python SDK
Link prompts using either decorators or context managers.
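A hedged sketch of both styles; the decorator, context-manager, and update helpers below are assumptions modeled on the “one line of code” description above, not confirmed ABV APIs:

```python
from abv import abv, observe  # assumed client instance and decorator

def call_llm(text):
    ...  # your model call

# Decorator style: wrap the function that performs the generation
@observe(as_type="generation")
def rate_movie(movie: str):
    prompt = abv.get_prompt("movie-critic")
    abv.update_current_generation(prompt=prompt)  # the one-line link
    return call_llm(prompt.compile(movie=movie))

# Context-manager style: open the generation span explicitly
def rate_movie_ctx(movie: str):
    prompt = abv.get_prompt("movie-critic")
    with abv.start_as_current_generation(name="rate-movie", prompt=prompt) as gen:
        result = call_llm(prompt.compile(movie=movie))
        gen.update(output=result)
        return result
```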
If a fallback prompt is used (when ABV is unavailable), no link will be created to preserve application reliability.
JavaScript/TypeScript SDK
Install additional dependencies, set up instrumentation (create an `instrumentation.ts` file), and import the instrumentation first in your application’s entry point (for example, `index.ts`). You can then link prompts via manual observations, a context-manager approach, or the observe wrapper.
If a fallback prompt is used, no link will be created.
Common Workflows
Testing a Prompt Change Before Production
Scenario: You’ve improved your prompt and want to test it in a staging environment before deploying to production.

Steps:
- Create a new version of your prompt (via UI, SDK, or API)
- Assign the `staging` label to the new version
- In your staging environment, fetch prompts with `label="staging"`
- Test thoroughly; review linked traces and metrics
- When satisfied, reassign the `production` label to the new version
- Production traffic immediately uses the new prompt; no code deployment required
Rollback: if issues arise, reassign `production` to the previous version instantly.

Product Manager Iterating Prompts Without Engineering
Scenario: Your product team wants to experiment with prompt phrasing to improve response quality, but every change currently requires engineering involvement.

Steps:
- Grant product managers access to the ABV dashboard
- Product team creates new prompt versions directly in the UI
- They assign the `staging` label to test versions in a non-production environment
- After validation (via linked traces and quality metrics), they reassign the `production` label
- Changes deploy instantly without engineering involvement
Migrating Hardcoded Prompts to ABV
Scenario: Your prompts are currently hardcoded in your application. You want to migrate to ABV for version control and faster iteration.

Steps:
- Extract prompts from code and create them in ABV (via UI or SDK)
- Assign the `production` label to the initial version
- Replace hardcoded prompts with `abv.get_prompt()` calls (see the sketch after this list)
- Deploy the code change that fetches from ABV instead of using hardcoded strings
- Future prompt updates happen without code deployment
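A minimal before/after sketch of the migration, under the same assumed client interface; the prompt name and `call_llm` stub are hypothetical:

```python
from abv import abv  # assumed shared client instance

def call_llm(text):
    ...  # your model call

# Before: prompt hardcoded in the application
SUPPORT_PROMPT = "You are a helpful support agent for {product}."

def answer_hardcoded(question: str, product: str) -> str:
    return call_llm(SUPPORT_PROMPT.format(product=product) + "\n" + question)

# After: prompt fetched from ABV at runtime (served from the client-side cache)
def answer_managed(question: str, product: str) -> str:
    prompt = abv.get_prompt("support-agent")   # hypothetical prompt name
    system = prompt.compile(product=product)   # substitute {{product}}
    return call_llm(system + "\n" + question)
```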
Comparing Prompt Performance with A/B Testing
Scenario: You have two prompt variants and want to determine which performs better in production.

Steps:
- Create both prompt versions in ABV
- Configure A/B testing to randomly assign users to version 1 or version 2 (see the sketch after this list)
- Link prompts to traces to track metrics by prompt version
- After sufficient data collection, compare quality scores, latency, and costs
- Promote the winning version to the `production` label
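A hedged sketch of stable traffic splitting between two prompt versions, assuming versions can be fetched by number as in the earlier example:

```python
import zlib

from abv import abv  # assumed shared client instance

def get_ab_prompt(user_id: str):
    # Stable 50/50 split: the same user always gets the same variant
    variant = 1 if zlib.crc32(user_id.encode()) % 2 == 0 else 2
    return abv.get_prompt("movie-critic", version=variant)
```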
Related Topics
- Version Control: Learn how to manage prompt versions and labels for safe deployments and rollbacks
- Link Prompts to Traces: Track metrics and performance for each prompt version through observability integration
- A/B Testing: Compare different prompt versions in production to identify the best performer
- Message Placeholders: Use dynamic placeholders in chat prompts for complex variable substitution
- Prompt Experiments: Test prompts systematically with datasets and automated evaluation
- Client-Side Caching: Understand how zero-latency caching works and configure cache behavior
- Prompt Playground: Test and iterate on prompts interactively in the ABV dashboard
- Guaranteed Availability: Configure fallback prompts to ensure application reliability when ABV is unavailable
Integration with Other Features
Prompt Management works seamlessly with ABV’s broader platform:

- Observability: Link prompts to traces to see which prompt version generated each response and track metrics over time
- Evaluations: Use Prompt Experiments to compare prompt performance on test datasets with automated scoring
- SDKs: Fetch prompts programmatically with Python SDK or JS/TS SDK for seamless integration
- Metrics: Track prompt performance over time in the Metrics Dashboard, comparing versions on quality, cost, and latency
- LLM Gateway: Route requests through the LLM Gateway with prompts managed centrally for consistent behavior across providers