# Traces
Traces track the behaviour of each individual LLM call made inside your task.
## reportTrace
You can report a trace by calling `reportTrace` inside an `evalite` eval:
```ts
import { evalite, type Evalite } from "evalite";
import { reportTrace } from "evalite/traces";
import { Levenshtein } from "autoevals";

evalite("My Eval", {
  data: async () => {
    return [{ input: "Hello", expected: "Hello World!" }];
  },
  task: async (input) => {
    // Track the start time
    const start = performance.now();

    // Call our LLM
    const result = await myLLMCall();

    // Report the trace once it's finished
    reportTrace({
      start,
      end: performance.now(),
      output: result.output,
      input: [
        {
          role: "user",
          content: input,
        },
      ],
      usage: {
        completionTokens: result.completionTokens,
        promptTokens: result.promptTokens,
      },
    });

    // Return the output
    return result.output;
  },
  scorers: [Levenshtein],
});
```
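The example above assumes a `myLLMCall` helper that is not defined anywhere in this guide. A minimal, hypothetical stand-in (not part of evalite's API) might return the shape `reportTrace` expects — in a real eval this is where you would call your LLM provider:

```typescript
// Hypothetical helper assumed by the example above. A real
// implementation would call your LLM provider; this stub just
// returns a fixed result in the shape reportTrace expects.
async function myLLMCall(): Promise<{
  output: string;
  completionTokens: number;
  promptTokens: number;
}> {
  return {
    output: "Hello World!",
    completionTokens: 3,
    promptTokens: 5,
  };
}
```

The important part is the return shape: `output` plus the two token counts, so they can be forwarded to `reportTrace`'s `usage` field.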
## traceAISDKModel
If you’re using the Vercel AI SDK, you can automatically report traces by wrapping your model in the `traceAISDKModel` function:
```ts
import { traceAISDKModel } from "evalite/ai-sdk";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// All calls to this model will be recorded in evalite!
const tracedModel = traceAISDKModel(openai("gpt-4o-mini"));

const result = await generateText({
  model: tracedModel,
  system: `Answer the question concisely.`,
  prompt: `What is the capital of France?`,
});
```