Metrics overview
abv metrics derive actionable insights from observability (https://docs.abv.dev/) and evaluation (https://docs.abv.dev/evaluation-overview) traces. Metrics can be sliced and diced via the customizable dashboards (https://docs.abv.dev/custom-dashboards) and the Metrics API (https://docs.abv.dev/metrics-api).

Features

- Custom Dashboards (https://docs.abv.dev/custom-dashboards)
- Metrics API (https://docs.abv.dev/metrics-api)

Metrics & Dimensions

Metrics

- Quality is measured through user feedback, model-based scoring, human-in-the-loop scored samples, or custom scores via the SDKs/API (see Scores: https://docs.abv.dev/evaluation-overview). Quality is assessed over time as well as across prompt versions, LLMs, and users.
- Cost and latency are accurately measured and broken down by user, session, geography, feature, model, and prompt version.
- Volume is based on the ingested traces and the tokens used.

Dimensions

- Trace name: differentiate between use cases, features, etc. by adding a name field to your traces.
- User: track usage and cost by user by adding a userId to your traces (docs: https://docs.abv.dev/user-tracking), as shown in the sketch after this list.
- Tags: filter by use case, feature, etc. by adding tags (https://docs.abv.dev/tags) to your traces.
- Release and version numbers: track how changes to the LLM application affected your metrics.

For exact definitions, please refer to the Metrics API docs (https://docs.abv.dev/metrics-api).
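
The dimensions above are attached as fields on each ingested trace. The following is a minimal sketch of what such a payload could look like; the endpoint path, authentication header, and exact field names are assumptions for illustration, not taken from the API reference.

```python
import os

import requests

# Assumed host and auth scheme; check the API reference for the actual values.
ABV_HOST = os.environ.get("ABV_HOST", "https://api.abv.dev")
ABV_API_KEY = os.environ["ABV_API_KEY"]

# A single trace carrying the dimensions described above:
# name, userId, tags, release, and version.
trace = {
    "name": "chat-completion",         # trace name: distinguishes the use case / feature
    "userId": "user-1234",             # enables per-user usage and cost breakdowns
    "tags": ["beta", "rag-pipeline"],  # free-form tags for filtering
    "release": "2024-05-01",           # release identifier of the deployed application
    "version": "prompt-v3",            # version of the prompt or component under test
}

response = requests.post(
    f"{ABV_HOST}/api/public/traces",   # assumed path, see the docs for the real one
    json=trace,
    headers={"Authorization": f"Bearer {ABV_API_KEY}"},
    timeout=10,
)
response.raise_for_status()
```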
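
Once traces carry these fields, the Metrics API can aggregate over them. The query below sketches how cost could be grouped by model and user over a time window; the endpoint path and parameter names are illustrative assumptions, so consult the Metrics API docs for the exact query schema.

```python
import os

import requests

ABV_HOST = os.environ.get("ABV_HOST", "https://api.abv.dev")
ABV_API_KEY = os.environ["ABV_API_KEY"]

# Hypothetical query: total cost, grouped by model and userId, for one week.
# The parameter names ("metric", "groupBy", "fromTimestamp", "toTimestamp") are assumptions.
params = {
    "metric": "cost",
    "groupBy": "model,userId",
    "fromTimestamp": "2024-05-01T00:00:00Z",
    "toTimestamp": "2024-05-08T00:00:00Z",
}

response = requests.get(
    f"{ABV_HOST}/api/public/metrics",  # assumed path, see the Metrics API reference
    params=params,
    headers={"Authorization": f"Bearer {ABV_API_KEY}"},
    timeout=10,
)
response.raise_for_status()

# Print one aggregated row per (model, userId) pair.
for row in response.json().get("data", []):
    print(row)
```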