How Prompt Config Works
Understanding config structure, versioning, and usage:

Define config when creating prompts
The config field accepts arbitrary JSON when creating or updating prompts. Config is freeform JSON - ABV doesn't enforce a schema, so store whatever your application needs.

Config is versioned with the prompt
When you create a new prompt version, the config is versioned alongside the prompt content.

Version 1: Prompt + Config {"model": "gpt-4o", "temperature": 0.7}
Version 2: Same Prompt + Updated Config {"model": "gpt-4o-mini", "temperature": 0.8}

Benefits include comparing prompts and configs side-by-side across versions, rolling back to a previous config by reassigning labels, A/B testing parameter variations alongside prompt variations, and an audit trail showing who changed which parameters when.

Fetch config with the prompt
When your application fetches a prompt, the config is included in the response. Config is cached with the prompt, so no additional network requests are needed.
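For example, fetching a prompt might look like the sketch below. This is a hypothetical shape for the ABV Python client (the Abv class and get_prompt method are assumptions, not confirmed API).

```python
# Hypothetical sketch - assumes an ABV Python client exposing get_prompt();
# adjust the import and method names to the actual SDK.
from abv import Abv  # assumed client class

abv = Abv()

# Fetch the production version of the prompt; the config rides along with it.
prompt = abv.get_prompt("support-chat", label="production")

print(prompt.prompt)  # the prompt template text
print(prompt.config)  # e.g. {"model": "gpt-4o", "temperature": 0.7}
```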
Update config without code changes
Product teams can modify config directly in the ABV UI:
- Navigate to the prompt in the ABV dashboard
- Create a new version or edit an existing draft
- Update the config JSON (change temperature, model, etc.)
- Assign the staging label for testing
- After validation, reassign the production label to the new version
- The application automatically uses the new config on the next cache refresh
Creating Prompts with Config
Python SDK Example:
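A minimal sketch, assuming an ABV Python client with a create_prompt method; the import path and exact signature are assumptions, not confirmed API.

```python
# Hypothetical sketch - the create_prompt signature is assumed.
from abv import Abv  # assumed client class

abv = Abv()

abv.create_prompt(
    name="support-chat",
    prompt="You are a helpful support agent. Answer the question: {{question}}",
    labels=["staging"],
    # Freeform JSON config, versioned alongside the prompt text.
    config={
        "model": "gpt-4o",
        "temperature": 0.7,
        "max_tokens": 500,
    },
)
```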
Using Config at Runtime
Python Example:
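A minimal sketch, assuming the same ABV client shape for get_prompt; the model call itself uses the standard openai Python SDK.

```python
# Hypothetical ABV client shape; the model call uses the standard openai SDK.
from abv import Abv  # assumed client class
from openai import OpenAI

abv = Abv()
openai_client = OpenAI()

prompt = abv.get_prompt("support-chat", label="production")
config = prompt.config or {}

response = openai_client.chat.completions.create(
    model=config.get("model", "gpt-4o"),
    temperature=config.get("temperature", 0.7),
    max_tokens=config.get("max_tokens", 500),
    messages=[
        {"role": "system", "content": prompt.prompt},
        {"role": "user", "content": "How do I reset my password?"},
    ],
)
print(response.choices[0].message.content)
```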
Common Config Patterns

Model Parameters
Store standard LLM model parameters in config for version-controlled parameter management. Common parameters and the usage pattern are sketched in the example below.
Benefits: Experiment with temperature without code changes, switch models (GPT-4o → GPT-4o-mini) via the UI, A/B test parameter combinations, roll back to previous parameter sets instantly.
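For illustration, a config holding standard parameters can be unpacked straight into the model call; the stored values here are examples, and in practice they come from the fetched prompt's config.

```python
# Example of parameters a prompt version's config might hold:
model_params = {"model": "gpt-4o", "temperature": 0.7, "max_tokens": 800, "top_p": 1.0}

# Usage pattern: unpack the versioned parameters into the model call.
from openai import OpenAI

response = OpenAI().chat.completions.create(
    messages=[{"role": "user", "content": "Summarize this support ticket."}],
    **model_params,  # in practice: **prompt.config fetched from ABV
)
```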
Tool/Function Definitions
Store function-calling tool definitions in config for versioned tool management. A config with tools and its usage are sketched below.
Benefits: Add new tools without code deployment, modify tool schemas, version tool definitions with prompts, A/B test different tool configurations.
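A sketch assuming the config stores an OpenAI-style tools array; the tool itself (get_order_status) is illustrative.

```python
# Example config carrying an OpenAI-style tool definition:
config = {
    "model": "gpt-4o",
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_order_status",
                "description": "Look up the status of an order by its ID.",
                "parameters": {
                    "type": "object",
                    "properties": {"order_id": {"type": "string"}},
                    "required": ["order_id"],
                },
            },
        }
    ],
}

# Usage: pass the versioned tool definitions straight through to the model.
from openai import OpenAI

response = OpenAI().chat.completions.create(
    model=config["model"],
    tools=config["tools"],
    messages=[{"role": "user", "content": "Where is order 12345?"}],
)
```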
Response Format/Schema
Define structured output schemas in config for consistent response parsing. A config with a JSON schema and its usage are sketched below.
Benefits: Enforce structured outputs without code changes, version schema evolution, switch between freeform and structured responses via config.
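A sketch assuming the config carries an OpenAI response_format block with a JSON schema; the schema fields are illustrative.

```python
# Example config with a structured-output schema:
config = {
    "model": "gpt-4o",
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "ticket_summary",
            "schema": {
                "type": "object",
                "properties": {
                    "summary": {"type": "string"},
                    "sentiment": {"type": "string", "enum": ["positive", "neutral", "negative"]},
                },
                "required": ["summary", "sentiment"],
                "additionalProperties": False,
            },
        },
    },
}

# Usage: the schema evolves with the prompt version, not the codebase.
import json
from openai import OpenAI

response = OpenAI().chat.completions.create(
    model=config["model"],
    response_format=config["response_format"],
    messages=[{"role": "user", "content": "Summarize: customer unhappy about shipping delays."}],
)
parsed = json.loads(response.choices[0].message.content)
```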
Custom Application Metadata
Store application-specific metadata for business logic or feature flags. A custom config and its usage are sketched below.
Benefits: Manage feature flags without code deployment, configure business rules per prompt version, store documentation (supported languages, limits).
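An illustrative custom config; the keys here (features, supported_languages, max_input_chars) are application-defined examples, not a required schema.

```python
# Example config mixing model parameters with application metadata:
config = {
    "model": "gpt-4o",
    "temperature": 0.3,
    # Application-defined metadata - ABV stores it as opaque JSON.
    "features": {"enable_citations": True, "max_followup_questions": 2},
    "supported_languages": ["en", "de", "fr"],
    "max_input_chars": 8000,
}

# Usage: drive business logic from the versioned config.
def validate_request(user_text: str, language: str) -> str:
    if language not in config["supported_languages"]:
        raise ValueError(f"Unsupported language: {language}")
    # Enforce the configured input limit before calling the model.
    return user_text[: config["max_input_chars"]]
```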
Model-Specific Configuration
Store provider-specific parameters for advanced model features. Example OpenAI-, Anthropic-, and Google-specific configs are sketched below.
Benefits: Switch LLM providers via config without code changes, use provider-specific features.
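Illustrative per-provider configs; the parameter names follow each provider's public API, but the specific values and model IDs are examples.

```python
# OpenAI-specific config
openai_config = {
    "provider": "openai",
    "model": "gpt-4o",
    "temperature": 0.7,
    "frequency_penalty": 0.2,
    "presence_penalty": 0.1,
    "seed": 42,
}

# Anthropic-specific config
anthropic_config = {
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "top_k": 40,
}

# Google-specific config
google_config = {
    "provider": "google",
    "model": "gemini-1.5-pro",
    "generation_config": {"temperature": 0.7, "top_p": 0.95, "max_output_tokens": 1024},
}

# The application branches on config["provider"] and passes the remaining
# fields through to the corresponding SDK call.
```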
Workflows
A/B Testing Model Parameters
Test whether increasing temperature improves response creativity.

Setup:
- Create variant A with temperature: 0.7 and assign the variant-a label
- Create variant B with temperature: 0.9 and assign the variant-b label
- The application randomly selects a variant and uses the config from the selected prompt (see the sketch below)

Analysis: Compare quality scores, latency, and user feedback by prompt version.
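A minimal sketch of the variant-selection step above, assuming the same hypothetical abv.get_prompt shape.

```python
# Hypothetical sketch - randomly pick a variant label and use its config.
import random
from abv import Abv  # assumed client class

abv = Abv()

label = random.choice(["variant-a", "variant-b"])
prompt = abv.get_prompt("creative-writer", label=label)
config = prompt.config or {}

# Record the label/version with each generation so quality scores, latency,
# and feedback can later be compared by prompt version.
print(label, config.get("temperature"))
```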
Model Migration Without Code Changes
Migrate from GPT-4o to GPT-4o-mini for cost savings. Create a new version with updated config, test it in staging, compare metrics, and promote it to production via label reassignment.
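For illustration, the only change between versions might be the model field; the create_prompt call reuses the assumed client shape from earlier.

```python
# Hypothetical sketch - create the next version with a cheaper model while the
# prompt text and application code stay untouched.
from abv import Abv  # assumed client class

abv = Abv()

abv.create_prompt(
    name="support-chat",
    prompt="You are a helpful support agent. Answer the question: {{question}}",
    labels=["staging"],  # validate here first
    config={"model": "gpt-4o-mini", "temperature": 0.7},  # previously "gpt-4o"
)
# After metrics look good, reassign the production label to this version.
```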
Progressive Tool Rollout
Add new tools to the assistant without breaking existing deployments. Create a new version with additional tools, ensure the function handlers support the new tools, test in staging, then promote to production.
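A sketch of a handler-side guard that keeps config and code in step; the dispatch table and tool names are illustrative.

```python
# Illustrative dispatch table - handlers must cover every tool the config exposes.
HANDLERS = {
    "get_order_status": lambda order_id: {"status": "shipped"},
    "cancel_order": lambda order_id: {"cancelled": True},  # handler added for the new tool
}

def check_config_tools(config: dict) -> None:
    """Fail fast in staging if a prompt version ships a tool without a handler."""
    for tool in config.get("tools", []):
        name = tool["function"]["name"]
        if name not in HANDLERS:
            raise RuntimeError(f"No handler registered for tool: {name}")
```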
Environment-Specific Configuration
Use different parameters in development vs. production. Development uses faster, cheaper models while production uses higher-quality models: a single codebase with environment-specific config managed in ABV.
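A sketch that resolves the prompt label from an environment variable; the label names and the APP_ENV variable are assumptions.

```python
# Hypothetical sketch - the environment decides which label (and config) is used.
import os
from abv import Abv  # assumed client class

label = "production" if os.getenv("APP_ENV") == "production" else "development"

prompt = Abv().get_prompt("support-chat", label=label)
config = prompt.config or {}
# development config might be {"model": "gpt-4o-mini", "temperature": 0.7};
# production config {"model": "gpt-4o", "temperature": 0.7}.
```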