# Cookbook: ABV JS/TS SDK
This is a Jupyter notebook that can be run on Google Colab. JS/TS applications can be traced via the TypeScript SDK (docid: j4sdnlmdmnfmk99ootgn7).

In this notebook, we will walk you through a simple end-to-end example that uses the core features of the ABV JS/TS SDK and shows how to log any LLM call via the low-level SDK.

For this guide, we assume that you are already familiar with the ABV data model (traces, spans, generations, etc.). If not, please read the Observability & Tracing (docid: l z5ltur bd4f7v9mqkrc) introduction first.

## Set up environment

Get your ABV API key by signing up for ABV. You'll also need your OpenAI API key.

Note: this cookbook uses Deno.js for execution, which requires different syntax for importing packages and setting environment variables. For Node.js applications, the setup process is similar but uses standard npm packages and `process.env`.

```typescript
// ABV authentication key
Deno.env.set("ABV_API_KEY", "sk-abv-...");

// ABV host configuration
// For EU data region, set this to "https://eu.app.abv.dev"
Deno.env.set("ABV_HOST", "https://app.abv.dev");

// Set environment variables using Deno-specific syntax
Deno.env.set("OPENAI_API_KEY", "sk-proj-...");
```

With the environment variables set, we can now initialize the `AbvSpanProcessor`, which is passed to the main OpenTelemetry SDK that orchestrates tracing.

```typescript
// Import required dependencies
import "npm:dotenv/config";
import { NodeSDK } from "npm:@opentelemetry/sdk-node";
import { AbvSpanProcessor } from "npm:@abvdev/otel";

// Export the processor to be able to flush it later.
// This is important for ensuring all spans are sent to ABV.
export const abvSpanProcessor = new AbvSpanProcessor({
  apiKey: process.env.ABV_API_KEY!,
  baseUrl: process.env.ABV_HOST ?? "https://app.abv.dev", // default if not specified
  environment: process.env.NODE_ENV ?? "development", // default to development if not specified
});

// Initialize the OpenTelemetry SDK with our ABV processor
const sdk = new NodeSDK({
  spanProcessors: [abvSpanProcessor],
});

// Start the SDK to begin collecting telemetry.
// The warning about the crypto module is expected in Deno and doesn't affect
// basic tracing functionality: media upload features will be disabled, but
// all core tracing works normally.
sdk.start();
```

The `AbvClient` provides additional functionality beyond OpenTelemetry tracing, such as scoring, prompt management, and data retrieval. It automatically uses the same environment variables we set earlier.

```typescript
import { AbvClient } from "npm:@abvdev/client";

const abv = new AbvClient();
```

## Log LLM calls

You can use the SDK to log any LLM call, or use any of the integrations that are interoperable with it. In the following, we will demonstrate how to log LLM calls using the SDK and the Langchain, Vercel AI SDK, and OpenAI integrations.

### Option 1: Context manager

To simplify nesting and context management, you can use `startActiveObservation`. These functions take a callback and automatically manage the observation's lifecycle and the OpenTelemetry context: any observation created inside the callback is automatically nested under the active observation, and the observation is ended when the callback finishes. This is the recommended approach for most use cases, as it prevents context leakage and ensures observations are properly ended.

```typescript
// Import necessary functions from the tracing package
import {
  startActiveObservation,
  startObservation,
  updateActiveTrace,
  updateActiveObservation,
} from "npm:@abvdev/tracing";

// Start a new span with automatic context management
await startActiveObservation("context-manager", async (span) => {
  // Log the initial user query
  span.update({
    input: { query: "What is the capital of France?" },
  });

  // Create a new generation span that will automatically be a child of "context-manager"
  const generation = startObservation(
    "llm-call",
    {
      model: "gpt-4",
      input: [{ role: "user", content: "What is the capital of France?" }],
    },
    { asType: "generation" },
  );

  // LLM call logic would go here

  // Update the generation with token usage statistics
  generation.update({
    usageDetails: {
      input: 10, // number of input tokens
      output: 5, // number of output tokens
      cache_read_input_tokens: 2, // tokens read from cache
      some_other_token_count: 10, // custom token metric
      total: 17, // optional, automatically calculated if not provided
    },
  });

  // End the generation with the LLM response
  generation
    .update({
      output: { content: "The capital of France is Paris." },
    })
    .end();

  // Example user information
  const user = { id: "user-5678", name: "Jane Doe", sessionId: "123" };

  // Add an optional log level of type warning to the active span
  updateActiveObservation({ level: "warning", statusMessage: "This is a warning" });

  // Update the trace with user context
  updateActiveTrace({
    userId: user.id,
    sessionId: user.sessionId,
    metadata: { userName: user.name },
  });

  // Mark the span as complete with final output
  span.update({ output: "Successfully answered." });
});

// Ensure all spans are sent to ABV
await abvSpanProcessor.forceFlush();
```

*Public trace in the ABV UI*

### Option 2: Observe decorator

The `observe` wrapper is a powerful tool for tracing existing functions without modifying their internal logic. It acts as a decorator that automatically creates a span or generation around the function call. You can use the `updateActiveObservation` function to add attributes to the observation from within the wrapped function.

```typescript
import { observe, updateActiveObservation } from "npm:@abvdev/tracing";

// An existing function
async function fetchData(source: string) {
  // Attach usage details to the active observation
  updateActiveObservation(
    {
      usageDetails: {
        input: 10,
        output: 5,
      },
    },
    { asType: "generation" },
  );

  // Logic to fetch data
  return { data: `Some data from ${source}` };
}

// Wrap the function to trace it
const tracedFetchData = observe(fetchData, {
  name: "observe-wrapper",
  asType: "generation",
});

// Now, every time you call tracedFetchData, a span is created.
// Its input and output are automatically populated with the
// function's arguments and return value.
const result = await tracedFetchData("api");

await abvSpanProcessor.forceFlush();
```

*Public trace in the ABV UI*

### Option 3: Manual spans

This part shows how to log any LLM call by passing the model inputs and outputs via the ABV SDK. Steps:

1. Create a span to contain this section within the trace.
2. Create a generation and log the input and model name, as they are already known.
3. Call the LLM SDK and log the output.
4. End the generation and the span.

Teams typically wrap their LLM SDK calls in a helper function that manages tracing internally. This implementation occurs once and is then reused for all LLM calls.

```typescript
// Import the startObservation function for manual span creation
import { startObservation } from "npm:@abvdev/tracing";

// Create the root span for this operation
const span = startObservation("manual-observation", {
  input: { query: "What is the capital of France?" },
});

// Create a child span for a tool call (e.g., a weather API)
const toolCall = span.startObservation(
  "fetch-weather",
  { input: { city: "Paris" } },
  { asType: "tool" },
);

// Simulate an API call with a timeout
await new Promise((r) => setTimeout(r, 100));

// End the tool call with its output
toolCall.update({ output: { temperature: "15°C" } }).end();

// Create a generation span for the LLM call
const generation = span.startObservation(
  "llm-call",
  {
    model: "gpt-4",
    input: [{ role: "user", content: "What is the capital of France?" }],
    output: { content: "The capital of France is Paris." },
  },
  { asType: "generation" },
);

// Update the generation with token usage details
generation.update({
  usageDetails: {
    input: 10, // input token count
    output: 5, // output token count
    cache_read_input_tokens: 2, // cached tokens used
    some_other_token_count: 10, // custom metric
    total: 17, // total tokens (optional)
  },
});

// End the generation with the final output
generation
  .update({
    output: { content: "The capital of France is Paris." },
  })
  .end();

// End the root span with final status and session ID
span
  .update({ output: "Successfully answered user request.", sessionId: "123" })
  .end();

// Ensure all spans are flushed to ABV
await abvSpanProcessor.forceFlush();
```

*Public trace in the ABV UI*

## View the trace in ABV

After ingesting your spans, you can view them in your ABV dashboard.

*Example trace in the ABV UI*

## Learn more

- TypeScript SDK overview (docid: j4sdnlmdmnfmk99ootgn7)
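Option 3 mentions that teams typically wrap their LLM SDK calls in a helper function that manages tracing internally. The sketch below shows one way such a helper could look. It is a hypothetical illustration, not part of the ABV SDK: the `tracedLLMCall` name, the structural types, and the injected `start` parameter are invented here; in a real application you would pass the SDK's `startObservation` function and let your LLM client run inside `call`.

```typescript
// Minimal structural types mirroring the observation API used in this
// cookbook. Illustrative only; not the SDK's real type definitions.
type Observation = {
  update(fields: Record<string, unknown>): Observation;
  end(): void;
};

type StartObservation = (
  name: string,
  attrs: Record<string, unknown>,
  opts?: { asType?: string },
) => Observation;

// Hypothetical helper: opens a generation, logs the model and input up
// front, runs the LLM call, records the output, and always ends the
// observation, even if the call throws.
async function tracedLLMCall<T>(
  start: StartObservation,
  model: string,
  input: unknown,
  call: () => Promise<T>,
): Promise<T> {
  const generation = start("llm-call", { model, input }, { asType: "generation" });
  try {
    const output = await call();
    generation.update({ output });
    return output;
  } finally {
    // .end() runs on success and on error, so spans are never left open
    generation.end();
  }
}
```

With the SDK, a call site might then look like `await tracedLLMCall(startObservation, "gpt-4", messages, () => callYourLLM(messages))`, centralizing the tracing logic in one place.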