# LLM Connections
LLM connections are used to call models in the ABV Playground or for LLM-as-a-judge evaluations.

## Setup

Navigate to your project settings > LLM Connections and click on "Add new LLM API key". Enter the name of the LLM connection and the API key for the model you want to use.

## Supported providers

The ABV platform currently supports the following LLM providers:

- OpenAI
- Azure OpenAI
- Anthropic
- Google AI Studio
- Google Vertex AI
- Amazon Bedrock

## Supported models

Currently, the Playground supports the following models by default. You may configure additional custom model names when adding your LLM API key in the ABV project settings, e.g. when using a custom model or proxy.

### Any model that supports the OpenAI API schema

The Playground and LLM-as-a-judge evaluations can be used with any framework that supports the OpenAI API schema, such as Groq, OpenRouter, Vercel AI Gateway, LiteLLM, Hugging Face, and more. Just replace the API base URL with the appropriate endpoint for the model you want to use and add the provider's API key for authentication.

### OpenAI / Azure OpenAI

o3, o3-2025-04-16, o4-mini, o4-mini-2025-04-16, gpt-4.1, gpt-4.1-2025-04-14, gpt-4.1-mini-2025-04-14, gpt-4.1-nano-2025-04-14, gpt-4o, gpt-4o-2024-08-06, gpt-4o-2024-05-13, gpt-4o-mini, gpt-4o-mini-2024-07-18, o3-mini, o3-mini-2025-01-31, o1-preview, o1-preview-2024-09-12, o1-mini, o1-mini-2024-09-12, gpt-4-turbo-preview, gpt-4-1106-preview, gpt-4-0613, gpt-4-0125-preview, gpt-4, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-1106, gpt-3.5-turbo-0613, gpt-3.5-turbo-0301, gpt-3.5-turbo-0125, gpt-3.5-turbo

### Anthropic

claude-3-7-sonnet-20250219, claude-3-5-sonnet-20241022, claude-3-5-sonnet-20240620, claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-5-haiku-20241022, claude-3-haiku-20240307, claude-2.1, claude-2.0, claude-instant-1.2

### Google Vertex AI

gemini-2.5-pro-exp-03-25, gemini-2.0-pro-exp-02-05, gemini-2.0-flash-001, gemini-2.0-flash-lite-preview-02-05, gemini-2.0-flash-exp, gemini-1.5-pro, gemini-1.5-flash, gemini-1.0-pro

You may also add additional model names that are supported by the Google Vertex AI platform and enabled in your GCP account through the `custom model names` section in the LLM API key creation form.

### Google AI Studio

gemini-2.5-pro-exp-03-25, gemini-2.0-pro-exp-02-05, gemini-2.0-flash-001, gemini-2.0-flash-lite-preview-02-05, gemini-2.0-flash-exp, gemini-1.5-pro, gemini-1.5-flash, gemini-1.0-pro

### Amazon Bedrock

All Amazon Bedrock models are supported. The required permission on AWS is `bedrock:InvokeModel`.

### Third-party providers

You may connect to third-party LLM providers if their API schema implements the schema of one of our supported provider adapters. For example, you may use the OpenAI adapter in ABV to connect to Mistral's OpenAI-compliant API.

## Advanced configurations

### Additional provider options

Provider options are not set up on the project settings > LLM Connections page, but either when selecting an LLM connection in the Playground or during LLM-as-a-judge evaluator setup.

LLM calls from a created LLM connection can be configured with a specific set of parameters, such as `temperature`, `top_p`, and `max_tokens`. However, many LLM providers accept additional parameters when invoking a model, including `reasoning_effort`, `service_tier`, and others, and these parameters often differ between providers. You can supply such additional configurations as a JSON object that is applied to all LLM invocations: in the model parameters settings, you will find a "Provider options" field at the bottom. This field allows you to enter specific key-value pairs accepted by your LLM provider's API endpoint. For example, to force minimal reasoning effort on an OpenAI GPT-5 invocation, set the provider options to `{"reasoning_effort": "minimal"}`.

Please see your provider's API reference for the additional fields that are supported:

- Anthropic Messages API reference: https://docs.anthropic.com/en/api/messages
- OpenAI Chat Completions API reference: https://platform.openai.com/docs/api-reference/chat/create

This feature is currently available for the following adapters: Anthropic, OpenAI, AWS (Amazon Bedrock).

### Connecting via proxies

You can use an LLM proxy to power LLM-as-a-judge or the Playground in ABV. Create an LLM API key in the project settings and set the base URL to resolve to your proxy's host. The proxy must accept the API format of one of our adapters and must support tool calling.

For OpenAI-compatible proxies, here is an example tool-calling request in OpenAI format that the proxy must handle to support LLM-as-a-judge in ABV:

```bash
curl -X POST 'https://<host set in project settings>/chat/completions' \
  -H 'accept: application/json' \
  -H 'content-type: application/json' \
  -H 'authorization: Bearer <api key entered in project settings>' \
  -H 'x-test-header-1: <custom header set in project settings>' \
  -H 'x-test-header-2: <custom header set in project settings>' \
  -d '{
    "model": "<model set in project settings>",
    "temperature": 0,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
    "max_tokens": 256,
    "n": 1,
    "stream": false,
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "extract",
          "parameters": {
            "type": "object",
            "properties": {
              "score": { "type": "string" },
              "reasoning": { "type": "string" }
            },
            "required": ["score", "reasoning"],
            "additionalProperties": false,
            "$schema": "http://json-schema.org/draft-07/schema#"
          }
        }
      }
    ],
    "tool_choice": { "type": "function", "function": { "name": "extract" } },
    "messages": [
      {
        "role": "user",
        "content": "Evaluate the correctness of the generation on a continuous scale from 0 to 1. A generation can be considered correct (score: 1) if it includes all the key facts from the ground truth and if every fact presented in the generation is factually supported by the ground truth or common sense.\n\nExample:\nQuery: Can eating carrots improve your vision?\nGeneration: Yes, eating carrots significantly improves your vision, especially at night. This is why people who eat lots of carrots never need glasses. Anyone who tells you otherwise is probably trying to sell you expensive eyewear or does not want you to benefit from this simple, natural remedy. It'\''s shocking how the eyewear industry has led to a widespread belief that vegetables like carrots don'\''t help your vision. People are so gullible to fall for these money-making schemes.\nGround truth: Well, yes and no. Carrots won'\''t improve your visual acuity if you have less than perfect vision. A diet of carrots won'\''t give a blind person 20/20 vision. But, the vitamins found in the vegetable can help promote overall eye health. Carrots contain beta-carotene, a substance that the body converts to vitamin A, an important nutrient for eye health. An extreme lack of vitamin A can cause blindness. Vitamin A can prevent the formation of cataracts and macular degeneration, the world'\''s leading cause of blindness. However, if your vision problems aren'\''t related to vitamin A, your vision won'\''t change no matter how many carrots you eat.\nScore: 0.1\nReasoning: While the generation mentions that carrots can improve vision, it fails to outline the reason for this phenomenon and the circumstances under which this is the case. The rest of the response contains misinformation and exaggerations regarding the benefits of eating carrots for vision improvement. It deviates significantly from the more accurate and nuanced explanation provided in the ground truth.\n\n\n\nInput:\nQuery: {{query}}\nGeneration: {{generation}}\nGround truth: {{ground_truth}}\n\n\nThink step by step."
      }
    ]
  }'
```
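The proxy's reply must likewise follow the OpenAI chat completions response format so that the judge can read the forced `extract` tool call. As an illustrative sketch only (the response body and the `parse_judge_response` helper below are hypothetical examples, not part of ABV), this shows the shape of a valid tool-call response and how it could be consumed:

```python
import json

# Hypothetical OpenAI-format response a compliant proxy might return
# for the forced "extract" tool call (all values are illustrative).
response_body = r"""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [
    {
      "index": 0,
      "finish_reason": "tool_calls",
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_1",
            "type": "function",
            "function": {
              "name": "extract",
              "arguments": "{\"score\": \"0.1\", \"reasoning\": \"Misses key facts from the ground truth.\"}"
            }
          }
        ]
      }
    }
  ]
}
"""

def parse_judge_response(body: str) -> dict:
    """Pull the judge's score and reasoning out of an OpenAI-format
    tool-call response. Fails if the forced tool call is missing."""
    message = json.loads(body)["choices"][0]["message"]
    call = message["tool_calls"][0]["function"]
    assert call["name"] == "extract", "proxy must honor the forced tool_choice"
    # "arguments" is itself a JSON-encoded string in the OpenAI format.
    return json.loads(call["arguments"])

result = parse_judge_response(response_body)
print(result["score"], "-", result["reasoning"])
```

The key point for proxy authors is that `tool_calls[0].function.arguments` must be a JSON-encoded string matching the `extract` schema from the request; responses that return plain text content instead of the forced tool call cannot be scored.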