Vercel
Integrate Vercel AI SDK with Future AGI. Set up @traceai/vercel for automatic tracing of AI-powered Next.js and Vercel applications.
1. Installation
First, install the TraceAI + Vercel packages (and the OpenTelemetry peer dependencies) with your preferred package manager:
npm:

npm install @traceai/vercel @vercel/otel \
  @opentelemetry/api @opentelemetry/sdk-trace-base \
  @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js \
  @ai-sdk/openai

yarn:

yarn add @traceai/vercel @vercel/otel \
  @opentelemetry/api @opentelemetry/sdk-trace-base \
  @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js \
  @ai-sdk/openai

pnpm:

pnpm add @traceai/vercel @vercel/otel \
  @opentelemetry/api @opentelemetry/sdk-trace-base \
  @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js \
  @ai-sdk/openai

Note: Vercel currently supports OpenTelemetry v1.x. Avoid installing @opentelemetry/* 2.x packages.
2. Set Environment Variables
Configure your Future AGI credentials (locally via .env, or in Vercel Project → Settings → Environment Variables).
FI_API_KEY=<YOUR_FI_API_KEY>
FI_SECRET_KEY=<YOUR_FI_SECRET_KEY>
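The exporter shown in step 3 falls back to empty strings when a key is unset, so a missing credential only surfaces as a silent export failure. A minimal fail-fast sketch you could run before registering tracing (requireEnv is a hypothetical helper, not part of @traceai/vercel or any other package here):

```typescript
// Hypothetical helper: throw at startup when a required environment
// variable is missing, instead of exporting spans with empty credentials.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage before registering tracing:
// const fiApiKey = requireEnv("FI_API_KEY");
// const fiSecretKey = requireEnv("FI_SECRET_KEY");
```

Calling this once at the top of your instrumentation file turns a misconfigured deployment into an immediate, descriptive error.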
3. Initialise Tracing
Create instrumentation.ts and import it once on the server (e.g. in _app.tsx or at the top of your first API route).
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore — module ships without types
import { registerOTel } from "@vercel/otel";
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";
import { FISimpleSpanProcessor, isFISpan } from "@traceai/vercel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc";
import { Metadata } from "@grpc/grpc-js";
// Optional: verbose console logs while testing
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);
export function register() {
  registerOTel({
    attributes: {
      project_name: "vercel-project",
      project_type: "observe",
    },
    spanProcessors: [
      new FISimpleSpanProcessor({
        exporter: (() => {
          const meta = new Metadata();
          meta.set("x-api-key", process.env.FI_API_KEY ?? "");
          meta.set("x-secret-key", process.env.FI_SECRET_KEY ?? "");
          return new OTLPTraceExporter({ url: "grpc://grpc.futureagi.com", metadata: meta });
        })(),
        // Export only TraceAI spans (remove this filter if you want everything)
        spanFilter: isFISpan,
      }),
    ],
  });
}
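If you rely on Next.js loading instrumentation.ts automatically rather than calling register() from your routes, note that on Next.js 13 and 14 the instrumentation hook sits behind an experimental flag (it is enabled by default from Next.js 15). A minimal next.config.js sketch for those older versions:

```javascript
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Required on Next.js < 15 for instrumentation.ts to load automatically.
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
```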
4. Instrument an API Route
Tracing is picked up automatically once register has run. Import it and call it at the top of each serverless function:
import type { NextApiRequest, NextApiResponse } from "next";
import { register as registerTracing } from "../../instrumentation";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  registerTracing(); // initialise OTEL + exporters

  const result = await generateText({
    model: openai("gpt-4o-mini"),
    prompt: "Write a short creative story about a time-traveling detective.",
    experimental_telemetry: { isEnabled: true }, // ⇢ creates spans for each call
    maxTokens: 300,
  });

  res.status(200).json({
    story: result.text?.trim() ?? "n/a",
  });
}
That’s it: deploy to Vercel and watch traces flow into Observe → Traces in real time 🎉