# Adding Annotations to your Spans

Learn how to annotate your spans in bulk using the API
Annotations are a way to label your spans with additional information. They are useful for:

- Labeling your data with custom tags and criteria
- Adding custom events to your spans
- Creating a golden dataset for AI training
- Adding human feedback to your spans

Annotations are central to evaluating AI applications because they attach feedback directly to your trace data.
## How to annotate your spans
Annotations can be added to your spans through either the API or the UI. Before adding annotations, you must create an annotation label, which can only be done in the UI.
### 1. Create an annotation label
- Go to your project in Observe/Prototype.
- Click on any Trace or Span to open the Trace Details page.
- Click on the “Annotations” tab.
- Click on the “Create Annotation Label” button.
- Fill in the form with the following information:
  - Name: the name of the annotation label.
  - Description: the description of the annotation label.
  - Type: the type of the annotation label:
    - Text: free-text annotations.
    - Numeric: numeric annotations.
    - Categorical: categorical annotations.
    - Star: star-rating annotations.
    - Thumbs up/down: thumbs up/down annotations.
- Provide any type-specific configuration the label needs.
- Click on the “Create” button.
- You will be redirected to the Annotation Labels page, where the new label appears in the list.
- You can edit or delete a label using the “Edit” and “Delete” buttons.
### 2. Add annotations to your spans
In the UI, once an annotation label exists, you can annotate a span by clicking the “Annotate” button.
Once you have at least one annotation label, you can add, update, and retrieve annotations with the /tracer/bulk-annotation/ endpoint. Authenticate with your API key and Secret key; every request must also include the Content-Type: application/json header.
## Request payload
records is an array; each record targets a single span. Inside each record you can:

- Add new annotations and notes
- Update existing annotations (matched by annotation_label_id + annotator_id)
- Add notes (duplicates are silently ignored)
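Here is one plausible shape for the request body, built from the field names this page mentions (records, annotation_label_id, annotator_id, notes). The exact schema may differ, so treat it as a sketch and confirm the field names against the API reference:

```python
# Hypothetical body for /tracer/bulk-annotation/. Field names other than
# records, annotation_label_id, and annotator_id are assumptions.
payload = {
    "records": [
        {
            "span_id": "45635513961540ab",  # the span this record targets
            "annotations": [
                {
                    "annotation_label_id": "<label-id>",
                    # Updates are matched on annotation_label_id + annotator_id.
                    "annotator_id": "human_annotator_1",
                    "value": "Loved the answer",  # a Text label uses "value"
                }
            ],
            # Duplicate note texts are silently ignored.
            "notes": ["Reviewed manually"],
        }
    ]
}
```

The value key to use depends on the label type; see the table below.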
### Fetching your annotation-label ID

Before you can attach annotations, you need the internal annotation_label_id that corresponds to the label you created in the UI. You can retrieve it with the /tracer/get-annotation-labels/ endpoint:
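A minimal sketch using Python's requests library follows; the base URL, HTTP method, and auth header names here are assumptions, so substitute whatever your API reference specifies:

```python
import requests

BASE_URL = "https://api.your-platform.example"  # assumption: your API base URL

headers = {
    # Assumption: the API key and Secret key are sent as headers;
    # adjust the header names to your project's auth scheme.
    "X-Api-Key": "YOUR_API_KEY",
    "X-Secret-Key": "YOUR_SECRET_KEY",
    "Content-Type": "application/json",
}

# Assumption: the endpoint accepts GET.
resp = requests.get(f"{BASE_URL}/tracer/get-annotation-labels/", headers=headers)
resp.raise_for_status()
labels = resp.json()["result"]           # list of annotation labels
annotation_label_id = labels[0]["id"]    # first (most recently created) label
```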
The response contains a list of all labels in your project; each item includes id, name, type, and other metadata. In most scripts you only need one ID, so grabbing the first element (result[0]["id"]), typically the most recently created label, is a quick way to proceed.
Supported value keys per label type:

| Label Type | Field to Use | Example Value |
|---|---|---|
| Text | value | "Loved the answer" |
| Numeric | value_float | 4.2 |
| Categorical | value_str_list | ["option1", "option2"] |
| Star rating | value_float | 4.0 (1–5) |
| Thumbs up/down | value_bool | true or false |
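For illustration, here is how an annotation entry might look for each label type, using the value keys from the table above (the surrounding entry shape is the same assumption as in the payload sketch):

```python
# One annotation entry per label type; only the value key changes.
text_annotation = {"annotation_label_id": "<text-label-id>",
                   "annotator_id": "human_annotator_1",
                   "value": "Loved the answer"}

numeric_annotation = {"annotation_label_id": "<numeric-label-id>",
                      "annotator_id": "model_v1",
                      "value_float": 4.2}

categorical_annotation = {"annotation_label_id": "<categorical-label-id>",
                          "annotator_id": "human_annotator_1",
                          "value_str_list": ["option1", "option2"]}

star_annotation = {"annotation_label_id": "<star-label-id>",
                   "annotator_id": "human_annotator_1",
                   "value_float": 4.0}  # star ratings range from 1 to 5

thumbs_annotation = {"annotation_label_id": "<thumbs-label-id>",
                     "annotator_id": "human_annotator_1",
                     "value_bool": True}
```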
## Response

### Response object
Every call returns a top-level boolean status and a nested result object:
| Field | Type | Meaning |
|---|---|---|
| status | boolean | true if the request itself was processed (even if some records failed). |
| result.message | string | Human-readable summary. |
| result.annotationsCreated | number | How many annotations were created across all records. |
| result.notesCreated | number | How many notes were created across all records. |
| result.succeededCount | number | Number of records that were applied without errors. |
| result.errorsCount | number | Number of records that had at least one error. |
| result.errors | array | Per-error details (see below). |
### Error objects
Each element in result.errors contains:
| Field | Type | Example | Description |
|---|---|---|---|
| recordIndex | number | 1 | Position of the offending record in the records array (0-based). |
| spanId | string | "45635513961540ab" | The span that failed. |
| annotationError | string | "Annotation label “axdf” does not belong to span's project" | Error message for the annotation operation (optional). |
| noteError | string | "Duplicate note" | Error message for the note operation (optional). |
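Since status can be true even when individual records failed, always check result.errorsCount. A short sketch, assuming resp is the requests response from your bulk-annotation call:

```python
body = resp.json()  # resp: response from the /tracer/bulk-annotation/ request
if not body["status"]:
    raise RuntimeError(f"Bulk annotation request was not processed: {body}")

result = body["result"]
print(result["message"])
print(f"{result['annotationsCreated']} annotations and "
      f"{result['notesCreated']} notes created; "
      f"{result['succeededCount']} records succeeded")

# status == true only means the request was processed; inspect
# per-record errors explicitly.
if result["errorsCount"] > 0:
    for err in result["errors"]:
        print(f"record {err['recordIndex']} (span {err.get('spanId')}): "
              f"annotation error: {err.get('annotationError')}, "
              f"note error: {err.get('noteError')}")
```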
## End-to-end example
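Putting it all together, here is a sketch under the same assumptions as above (the base URL, auth header names, HTTP methods, and exact record schema are placeholders to adapt):

```python
import requests

BASE_URL = "https://api.your-platform.example"  # assumption: your API base URL
HEADERS = {
    # Assumption: adjust header names to your project's auth scheme.
    "X-Api-Key": "YOUR_API_KEY",
    "X-Secret-Key": "YOUR_SECRET_KEY",
    "Content-Type": "application/json",
}

# 1. Look up the annotation label created in the UI.
labels_resp = requests.get(
    f"{BASE_URL}/tracer/get-annotation-labels/", headers=HEADERS
)
labels_resp.raise_for_status()
label_id = labels_resp.json()["result"][0]["id"]

# 2. Annotate a span (record shape is an assumption; see "Request payload").
payload = {
    "records": [
        {
            "span_id": "45635513961540ab",
            "annotations": [
                {
                    "annotation_label_id": label_id,
                    "annotator_id": "human_annotator_1",
                    "value_float": 4.0,  # e.g., a star rating
                }
            ],
            "notes": ["Reviewed manually"],
        }
    ]
}
resp = requests.post(
    f"{BASE_URL}/tracer/bulk-annotation/", headers=HEADERS, json=payload
)
resp.raise_for_status()

# 3. Check per-record outcomes.
result = resp.json()["result"]
print(result["message"])
for err in result.get("errors", []):
    print(f"record {err['recordIndex']}: "
          f"{err.get('annotationError') or err.get('noteError')}")
```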
## Best practices
- Immutable labels – avoid changing the meaning of an existing label; create a new one instead.
- Consistent annotator IDs – use stable identifiers (“human_annotator_1”, “model_v1”, …).
- Batch updates – updating many spans? Group 100–500 records per request to minimize network overhead.
- Idempotency – sending the same note text twice in a record skips duplicates, keeping data clean.
- Monitor quotas – large annotation operations count toward your project’s API usage.
That’s it! You now have a complete workflow for creating labels and adding, updating, and retrieving annotations programmatically.