JavaScript SDK

The FutureAGI JavaScript/TypeScript SDK provides two primary classes: Annotation for logging annotations via a DataFrame-style interface, and AnnotationQueue for full queue lifecycle management.

Installation

npm install @future-agi/sdk

Annotation Class — Log Annotations

Initialize the client

import { Annotation } from '@future-agi/sdk';

const client = new Annotation({
  fiApiKey: 'YOUR_API_KEY',
  fiSecretKey: 'YOUR_SECRET_KEY',
});

Log annotations

Log annotations using DataFrame-style records. Each record is an object with column keys following the same naming convention as the Python SDK.

const response = await client.logAnnotations([
  {
    'context.span_id': 'span_abc123',
    'annotation.quality.text': 'Excellent response',
    'annotation.sentiment.label': 'positive',
    'annotation.accuracy.score': 9.0,
    'annotation.rating.rating': 5,
    'annotation.helpful.thumbs': true,
    'annotation.notes': 'Top quality',
  },
  {
    'context.span_id': 'span_def456',
    'annotation.quality.text': 'Needs improvement',
    'annotation.sentiment.label': 'negative',
    'annotation.accuracy.score': 3.5,
    'annotation.rating.rating': 2,
    'annotation.helpful.thumbs': false,
    'annotation.notes': 'Hallucinated facts',
  },
], { projectName: 'My Project' });

console.log(`Created: ${response.annotationsCreated}, Errors: ${response.errorsCount}`);

For the full column naming convention table, see the Python SDK — Column naming convention. The format is identical across both SDKs.
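
Because the column keys are plain strings, typos like `annotation.quality.txt` fail silently. One way to avoid hand-typing them is a small key-builder helper — a sketch, not part of the SDK (the `col` function and `AnnotationKind` union are assumptions for illustration):

```typescript
// Hypothetical helper (not part of the SDK): build a column key from a
// label name and annotation kind instead of hand-typing the string.
type AnnotationKind = 'text' | 'label' | 'score' | 'rating' | 'thumbs';

function col(label: string, kind: AnnotationKind): string {
  return `annotation.${label}.${kind}`;
}

// Computed property keys keep the record readable while the union type
// rejects misspelled kinds like 'txt' at compile time.
const record = {
  'context.span_id': 'span_abc123',
  [col('quality', 'text')]: 'Excellent response',
  [col('accuracy', 'score')]: 9.0,
};
```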

Get labels

const labels = await client.getLabels({ projectId: 'proj_123' });

labels.forEach(l => console.log(`${l.name} (${l.type}): ${l.id}`));

List projects

const projects = await client.listProjects({ projectType: 'observe' });

projects.forEach(p => console.log(`${p.name}: ${p.id}`));

AnnotationQueue Class — Full Queue Management

The AnnotationQueue class provides complete programmatic control over the annotation queue lifecycle: creating queues, adding items, assigning work, submitting annotations, and exporting results.

Initialize the client

import { AnnotationQueue } from '@future-agi/sdk';

const queues = new AnnotationQueue({
  fiApiKey: 'YOUR_API_KEY',
  fiSecretKey: 'YOUR_SECRET_KEY',
});

Create a queue

const queue = await queues.create({
  name: 'Review Queue',
  description: 'Quality review of traces',
  instructions: 'Rate response quality on all labels',
  assignmentStrategy: 'round_robin',
  annotationsRequired: 2,
  reservationTimeoutMinutes: 30,
  requiresReview: false,
});

Add items to a queue

const result = await queues.addItems(queue.id, [
  { sourceType: 'trace', sourceId: 'trace_abc' },
  { sourceType: 'observation_span', sourceId: 'span_def' },
  { sourceType: 'dataset_row', sourceId: 'row_ghi' },
]);

console.log(`Added: ${result.added}, Duplicates: ${result.duplicates}`);

Valid source types

| Source Type | Description |
| --- | --- |
| `trace` | An LLM trace |
| `observation_span` | A specific span in a trace |
| `trace_session` | A conversation session |
| `dataset_row` | A dataset row |
| `call_execution` | A simulation call |
| `prototype_run` | A prototype run |
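
When building item lists in TypeScript, the six source types above can be captured as a union so invalid values are rejected at compile time — a hypothetical helper, not part of the SDK (the `SourceType` and `QueueItem` names are assumptions):

```typescript
// Hypothetical types (not part of the SDK) mirroring the valid source
// types table; the compiler rejects anything outside the union.
type SourceType =
  | 'trace'
  | 'observation_span'
  | 'trace_session'
  | 'dataset_row'
  | 'call_execution'
  | 'prototype_run';

interface QueueItem {
  sourceType: SourceType;
  sourceId: string;
}

// Small factory so call sites stay terse and type-checked.
function item(sourceType: SourceType, sourceId: string): QueueItem {
  return { sourceType, sourceId };
}
```

An array built with `item('trace', 'trace_abc')` can then be passed straight to `addItems`.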

Submit annotations

await queues.submitAnnotations(queue.id, itemId, [
  { labelId: 'label_123', value: 'positive', scoreSource: 'human' },
  { labelId: 'label_456', value: 4.5, scoreSource: 'human' },
], { notes: 'High quality response' });

Create scores directly (without queue)

You can create scores against any source without going through a queue workflow.

const score = await queues.createScore({
  sourceType: 'trace',
  sourceId: 'trace_abc',
  labelId: 'label_123',
  value: { text: 'Good response' },
  scoreSource: 'human',
  notes: 'Quick feedback',
});

Bulk create scores

await queues.createScores({
  sourceType: 'trace',
  sourceId: 'trace_abc',
  scores: [
    { labelId: 'label_123', value: 'positive' },
    { labelId: 'label_456', value: 4.5 },
  ],
  notes: 'Batch annotation',
});

Queue lifecycle

// Activate a draft queue
await queues.activate(queue.id);

// Mark a queue as completed
await queues.completeQueue(queue.id);

// Add or remove labels from a queue
await queues.addLabel(queue.id, 'label_789');
await queues.removeLabel(queue.id, 'label_789');

// List items with optional status filter
const items = await queues.listItems(queue.id, { status: 'pending' });

// Assign items to a specific user
await queues.assignItems(queue.id, ['item_1', 'item_2'], 'user_123');

// Complete or skip items
await queues.completeItem(queue.id, 'item_1');
await queues.skipItem(queue.id, 'item_2');

Progress and analytics

const progress = await queues.getProgress(queue.id);
console.log(`${progress.completed}/${progress.total} (${progress.progressPct}%)`);

const analytics = await queues.getAnalytics(queue.id);

const agreement = await queues.getAgreement(queue.id);
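
For batch jobs it is often useful to poll progress until the queue drains. A minimal sketch — `fetchProgress` stands in for `() => queues.getProgress(queue.id)`, and the `Progress` shape is an assumption based only on the fields used above:

```typescript
// Sketch: poll a progress source until all items are completed or the
// attempt budget runs out. The Progress shape is assumed, not the SDK's.
interface Progress {
  completed: number;
  total: number;
}

async function waitForCompletion(
  fetchProgress: () => Promise<Progress>,
  intervalMs = 30_000,
  maxAttempts = 60,
): Promise<Progress> {
  let progress = await fetchProgress();
  for (let i = 0; i < maxAttempts && progress.completed < progress.total; i++) {
    // Sleep between polls to avoid hammering the API.
    await new Promise(resolve => setTimeout(resolve, intervalMs));
    progress = await fetchProgress();
  }
  return progress;
}
```

Callers should still check the returned progress, since the loop also exits when `maxAttempts` is exhausted.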

Export

const data = await queues.export(queue.id, {
  format: 'json',
  status: 'completed',
});

Complete Method Reference

AnnotationQueue methods

| Method | Description |
| --- | --- |
| `create(config)` | Create a new queue |
| `list(options)` | List queues |
| `get(queueId)` | Get queue details |
| `update(queueId, updates)` | Update queue configuration |
| `delete(queueId)` | Delete a queue |
| `activate(queueId)` | Set queue status to active |
| `completeQueue(queueId)` | Set queue status to completed |
| `addLabel(queueId, labelId)` | Add a label to a queue |
| `removeLabel(queueId, labelId)` | Remove a label from a queue |
| `addItems(queueId, items)` | Add source items to a queue |
| `listItems(queueId, options)` | List queue items with optional filters |
| `removeItems(queueId, itemIds)` | Remove items from a queue |
| `assignItems(queueId, itemIds, userId)` | Assign items to a user |
| `submitAnnotations(queueId, itemId, annotations, options?)` | Submit annotations for an item |
| `getAnnotations(queueId, itemId)` | Get annotations for an item |
| `completeItem(queueId, itemId)` | Mark an item as completed |
| `skipItem(queueId, itemId)` | Skip an item |
| `createScore(options)` | Create a single score (no queue required) |
| `createScores(options)` | Bulk create scores (no queue required) |
| `getScores(sourceType, sourceId)` | Get scores for a source |
| `getProgress(queueId)` | Get queue completion progress |
| `getAnalytics(queueId)` | Get queue analytics and metrics |
| `getAgreement(queueId)` | Get inter-annotator agreement metrics |
| `export(queueId, options)` | Export annotations as JSON or CSV |
| `exportToDataset(queueId, options)` | Export annotations to a FutureAGI dataset |

Best Practices

  • Use logAnnotations() for bulk SDK-based annotation — The DataFrame-style format is the fastest way to annotate many spans at once.
  • Use AnnotationQueue for programmatic queue management — Create, assign, and complete queues entirely from code.
  • Use createScore() / createScores() for direct score creation — Bypass the queue workflow when you need to attach scores to traces directly.
  • Always handle errors — Check for partial failures in bulk operations. Both logAnnotations and addItems can succeed for some records and fail for others.
  • Use TypeScript — All SDK methods are fully typed. TypeScript catches column name typos and invalid configurations at compile time.

Bulk operations (logAnnotations, addItems, createScores) may partially succeed. Always inspect the response for per-record errors before assuming all records were processed.
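
The partial-failure check can be sketched as a small guard. The `errors` array shape below is an assumption (only `annotationsCreated` and `errorsCount` appear in the examples above) — adapt the field names to the SDK's actual response types:

```typescript
// Sketch of a bulk-response guard. The `errors` field shape is assumed,
// not taken from the SDK's documented types.
interface BulkResponse {
  annotationsCreated: number;
  errorsCount: number;
  errors?: Array<{ index: number; message: string }>;
}

function assertAllSucceeded(response: BulkResponse): void {
  if (response.errorsCount > 0) {
    // Surface which records failed instead of silently continuing.
    const details = (response.errors ?? [])
      .map(e => `record ${e.index}: ${e.message}`)
      .join('; ');
    throw new Error(`${response.errorsCount} record(s) failed: ${details}`);
  }
}
```

Calling `assertAllSucceeded(response)` right after `logAnnotations` (or `addItems` / `createScores`) turns a silent partial failure into a loud one.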

Next steps

Python SDK

DataFrame-based annotation logging with the Python SDK.

Scores API

Query and manage annotation scores via the REST API.

Queues API

REST API reference for queue CRUD operations.