
What you will do

In this walkthrough you will create an annotation label, set up a queue, add traces to it, and annotate your first item. The entire flow takes about 5 minutes.
1. Create an annotation label

Navigate to Annotations in the left sidebar, then open the Labels tab. Click Create Label.

[Screenshot: Labels page]

Fill in the form:
  • Name: Sentiment
  • Type: Categorical
  • Options: Positive, Negative, Neutral
  • Allow Notes: Enabled
Click Create to save.

[Screenshot: Create label]
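Conceptually, the label you just created is a small configuration object. A minimal sketch of that idea (the field names here are illustrative assumptions, not the product's actual schema):

```python
# Hypothetical representation of the Sentiment label created above.
# Field names are assumptions for illustration only.
sentiment_label = {
    "name": "Sentiment",
    "type": "categorical",
    "options": ["Positive", "Negative", "Neutral"],
    "allow_notes": True,
}

# A categorical label constrains each annotation to one of its options.
assert "Positive" in sentiment_label["options"]
```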
2. Create a queue

Switch to the Queues tab and click Create Queue.
  • Name: Review Queue
  • Labels: Select the Sentiment label you just created
  • Assignment Strategy: Round Robin
  • Annotators: Add yourself
  • Annotations Required: 1
Click Create to save the queue.

[Screenshot: Create queue]
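Round Robin distributes queue items evenly by cycling through the annotator list in order. A minimal sketch of the general idea (hypothetical, not the product's actual implementation):

```python
from itertools import cycle

def assign_round_robin(items, annotators):
    """Distribute items evenly by cycling through annotators in order."""
    rotation = cycle(annotators)
    return {item: next(rotation) for item in items}

assignments = assign_round_robin(
    ["item-1", "item-2", "item-3"],
    ["alice", "bob"],
)
# item-1 goes to alice, item-2 to bob, item-3 wraps back to alice
```

With Annotations Required set to 1, each item needs only a single annotator's submission to count as complete.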
3. Add items to the queue

Go to your Observe project and open the LLM Tracing view. Select one or more traces using the checkboxes, then click the Add to Queue button in the toolbar.

In the dialog, choose Review Queue and confirm. The selected traces are now queue items with a Pending status.
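One way to picture the lifecycle: each selected trace becomes a queue item that starts in Pending and later moves to a terminal state when annotated or skipped. A hypothetical sketch (type and field names are assumptions, not the product's data model):

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    """Possible queue-item states (illustrative)."""
    PENDING = "pending"      # added to the queue, not yet annotated
    COMPLETED = "completed"  # annotation submitted
    SKIPPED = "skipped"      # annotator skipped the item

@dataclass
class QueueItem:
    trace_id: str
    status: Status = Status.PENDING

# Adding traces to a queue creates one Pending item per trace.
items = [QueueItem(trace_id=t) for t in ["trace-a", "trace-b"]]
```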
4. Start annotating

Go back to Annotations > Queues and click Review Queue to open its detail page. Click Start Annotating.

The annotation workspace loads the first pending item. You will see:
  • The trace content on the left.
  • The annotation panel on the right with your Sentiment label.
Select an option (e.g. Positive), optionally add a note, and click Submit.

[Screenshot: Annotation workspace]

The workspace automatically advances to the next item. You can also click Skip to move past an item you cannot annotate.
5. Review progress

Click the Analytics tab on the queue detail page to see completion rates, annotator activity, and label distribution.

[Screenshot: Analytics]
Keyboard shortcuts speed up annotation significantly:
  • Ctrl+Enter (or Cmd+Enter) — Submit the current annotation
  • 1-9 — Select a categorical option by its position
  • S — Skip the current item
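The 1-9 shortcut selects a categorical option by its position in the list. A minimal sketch of that mapping, assuming the hypothetical option list from step 1:

```python
def option_for_key(key, options):
    """Map keys '1'-'9' to a categorical option by position (illustrative)."""
    if key.isdigit() and 1 <= int(key) <= min(9, len(options)):
        return options[int(key) - 1]
    return None  # key out of range or not a digit

options = ["Positive", "Negative", "Neutral"]
# Pressing "1" selects the first option, "Positive".
```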

What you can do next

Annotation Labels

Explore all five label types and their configuration options.

Queues & Workflow

Configure assignment strategies, multi-annotator requirements, and review workflows.

Scores

Understand how annotation data is stored and queried via the Score model.