
What it is

An annotation queue is a managed campaign that groups items to annotate, assigns them to annotators, tracks progress, and enforces quality controls. Queues sit between labels (what to measure) and scores (the resulting data), providing the operational layer that turns annotation from an ad-hoc activity into a structured workflow.

Queue lifecycle

A queue moves through a defined set of statuses:
| Transition | Description |
| --- | --- |
| Draft → Active | Start accepting annotations. Annotators can begin work. |
| Active → Paused | Temporarily stop annotation. No new items can be picked up, but in-progress items are preserved. |
| Paused → Active | Resume annotation. |
| Active / Paused → Completed | All items are done, or you manually close the queue. |
| Completed → Active | Re-open the queue if new items are added or you need additional annotations. |
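The transitions above form a small state machine. As a minimal sketch (status names come from the table; the function name is illustrative, not part of any actual API):

```python
# Allowed queue-status transitions, mirroring the lifecycle table above.
ALLOWED_TRANSITIONS = {
    "Draft": {"Active"},
    "Active": {"Paused", "Completed"},
    "Paused": {"Active", "Completed"},
    "Completed": {"Active"},  # re-open when new items are added
}

def can_transition(current: str, target: str) -> bool:
    """Return True if a queue may move from `current` to `target`."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```

For example, a Draft queue can only become Active; it cannot jump straight to Completed.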

Item statuses

Each item in a queue has its own status:
| Status | Meaning |
| --- | --- |
| Pending | Waiting for an annotator to pick it up. |
| In Progress | An annotator has opened the item and is actively annotating. |
| Completed | All required annotations have been submitted. |
| Skipped | An annotator chose to skip this item. It remains available for others. |
| Pending Review | Annotations are done but the item is awaiting reviewer approval (when review workflow is enabled). |

Assignment strategies

Assignment strategies control how items are distributed to annotators when they click Start Annotating.
| Strategy | Behavior | Best for |
| --- | --- | --- |
| Manual | Annotators browse and pick items themselves from the queue list. | Small queues or exploratory annotation where annotators need context to choose. |
| Round Robin | Items are distributed cyclically across annotators in order. Each annotator gets the next available item in rotation. | Even distribution when annotators work at similar speeds. |
| Load Balanced | Items are distributed based on each annotator’s current workload. Annotators with fewer in-progress items get the next item. | Teams with varying availability or part-time annotators. |
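The two automatic strategies can be sketched as selection functions (a simplified model; the function names and data shapes are illustrative, not an actual API):

```python
from collections import deque

def pick_round_robin(rotation: deque, pending: list):
    """Round Robin: cycle through annotators in order;
    each gets the next pending item."""
    annotator = rotation[0]
    rotation.rotate(-1)  # move this annotator to the back of the rotation
    return annotator, pending.pop(0)

def pick_load_balanced(in_progress: dict, pending: list):
    """Load Balanced: the annotator with the fewest in-progress
    items receives the next pending item."""
    annotator = min(in_progress, key=in_progress.get)
    return annotator, pending.pop(0)
```

With Round Robin, two annotators alternate items regardless of speed; with Load Balanced, a part-time annotator with a lighter workload is served first.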

Reservation system

When an annotator opens an item, the system reserves it for a configurable timeout period. This prevents two annotators from working on the same item simultaneously.
  • Default timeout: 1 hour.
  • Configurable range: 15 minutes to 4 hours.
  • Expiry behavior: If the annotator does not submit or skip within the timeout, the reservation expires and the item returns to Pending status for another annotator to pick up.
Reservations are visible in the queue detail view so managers can monitor active work.
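The reservation mechanics can be modeled as follows (a minimal sketch using the documented default and range; the class and its methods are illustrative, not an actual API):

```python
import time

DEFAULT_TIMEOUT = 60 * 60  # default: 1 hour (configurable 15 min to 4 h)

class Reservation:
    """Holds an item for one annotator until submit, skip, or expiry."""

    def __init__(self, item_id, annotator, now=None, timeout=DEFAULT_TIMEOUT):
        self.item_id = item_id
        self.annotator = annotator
        self.expires_at = (now if now is not None else time.time()) + timeout

    def expired(self, now=None) -> bool:
        # On expiry the item returns to Pending for another annotator.
        return (now if now is not None else time.time()) >= self.expires_at
```

A reservation created with a 15-minute timeout is live at 14:59 and expired at 15:00, at which point the item is released back to Pending.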

Multi-annotator support

For tasks that benefit from agreement between multiple reviewers, set the Annotations Required field (1-10) when creating or editing a queue.
  • Each item must receive the configured number of complete annotations before it transitions to Completed.
  • Different annotators independently annotate the same item — they do not see each other’s responses.
  • The queue analytics tab shows inter-annotator agreement metrics once multiple annotators have scored the same items.
An item is considered fully annotated by a single annotator only when all labels attached to the queue have been scored. Partial submissions are saved but do not count toward the required annotation count.
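The completion rule above can be expressed compactly (a sketch, assuming scores are tracked per annotator as a set of scored labels; names are illustrative):

```python
def item_completed(scores: dict, queue_labels: set, required: int) -> bool:
    """An annotation counts only when every label attached to the queue
    is scored; the item completes once `required` annotators have
    each submitted a full annotation."""
    full_annotations = sum(
        1 for labels in scores.values() if queue_labels <= set(labels)
    )
    return full_annotations >= required
```

Here a partial submission (some labels missing) is stored but does not count toward the required annotation count.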

Review workflow

Enable Requires Review on a queue to add a review step after annotation:
1. Annotation. Annotators complete their work as usual. When all required annotations are submitted, the item moves to Pending Review instead of Completed.
2. Review. A designated reviewer opens the item, sees all submitted annotations, and either Approves (moves the item to Completed) or Rejects (sends it back to Pending for re-annotation).
This is useful for high-stakes labeling tasks where a senior reviewer must validate annotations before they become final.
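The review decision reduces to a single status transition (a sketch; the function and decision strings are illustrative, not an actual API):

```python
def apply_review(item_status: str, decision: str) -> str:
    """Approval finalizes the item; rejection sends it back to
    Pending for re-annotation."""
    if item_status != "Pending Review":
        raise ValueError("only items awaiting review can be reviewed")
    return "Completed" if decision == "approve" else "Pending"
```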

Guidelines

Each queue supports a Guidelines field — a markdown-formatted instruction document shown to annotators when they open the annotation workspace. Use guidelines to:
  • Define the annotation criteria and edge cases.
  • Provide examples of correct and incorrect annotations.
  • Specify when to skip an item.
  • Link to external reference material.
Well-written guidelines significantly improve annotation consistency, especially with larger teams.

Auto-completion

Items auto-complete when the following conditions are met:
  1. All labels attached to the queue have been scored for the item.
  2. The required number of annotators (set by Annotations Required) have each fully annotated the item.
  3. If Requires Review is enabled, the reviewer has approved the item.
No manual intervention is needed — the system tracks progress and transitions items automatically.
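The three conditions combine into one status decision (a minimal sketch of the logic described above; parameter names are illustrative):

```python
def next_status(all_labels_scored: bool, annotators_done: int,
                required: int, requires_review: bool,
                approved: bool) -> str:
    """Decide an item's next status from the auto-completion conditions."""
    if not (all_labels_scored and annotators_done >= required):
        return "In Progress"       # conditions 1 and 2 not yet met
    if requires_review and not approved:
        return "Pending Review"    # condition 3 pending
    return "Completed"
```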

Creating a queue

1. Open the Queues tab. Navigate to Annotations in the left sidebar and select the Queues tab. Click Create Queue.
2. Configure the queue. Fill in the queue settings, then click Create Queue:

| Field | Description |
| --- | --- |
| Name | A descriptive name for the campaign. |
| Labels | Select one or more labels that annotators will apply. |
| Assignment Strategy | Manual, Round Robin, or Load Balanced. |
| Annotators | Add team members who will annotate. |
| Annotations Required | Number of independent annotators per item (1-10). |
| Reservation Timeout | How long an item stays reserved for one annotator. |
| Requires Review | Whether completed items need reviewer approval. |
| Guidelines | Markdown instructions for annotators. |

3. Add items. Items can be added from Observe (traces, spans, sessions), Datasets (rows), Prototypes (runs), or Simulations (executions). Select items in their respective views and click Add to Queue.
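Taken together, the settings from step 2 have roughly this shape (a hypothetical configuration sketch only; the field names and values mirror the table, not an actual SDK or API):

```python
# Hypothetical queue configuration; every key name here is illustrative.
queue_config = {
    "name": "Support-reply quality, March batch",
    "labels": ["helpfulness", "tone"],
    "assignment_strategy": "round_robin",  # or "manual", "load_balanced"
    "annotators": ["alice@example.com", "bob@example.com"],
    "annotations_required": 2,             # 1-10 independent annotators per item
    "reservation_timeout_minutes": 60,     # configurable 15-240
    "requires_review": True,
    "guidelines": "## Criteria\nScore helpfulness 1-5; skip off-topic items.",
}

# Sanity checks matching the documented ranges.
assert 1 <= queue_config["annotations_required"] <= 10
assert 15 <= queue_config["reservation_timeout_minutes"] <= 240
```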

What you can do next

  • Annotation Labels: learn about the five label types you can attach to queues.
  • Scores: understand the data model behind every annotation.
  • Quickstart: walk through the full annotation flow end to end in 5 minutes.