Importance of Annotations and Human-In-The-Loop (HITL) in Generative AI
Generative models don’t just classify or predict; they generate open-ended content. This makes output quality subjective and often dependent on human judgement. Annotations are therefore essential because they improve:
- Feedback Loop: Create a continuous learning system by feeding annotated responses back into training or fine-tuning pipelines.
- Customization: Adapt generic LLMs to user preferences and domain-specific conventions via annotated datasets.
- Quality Control: Catch failure modes like hallucinations, off-topic responses, or biases through manual review.
Common Use Cases for Annotations
| Use Case | Annotation Type | Description |
|---|---|---|
| Sentiment Analysis | Categorical | Label text as Positive, Negative, or Neutral to measure tone |
| Factuality Check | Boolean or Text | Validate whether the model output is grounded in the source |
| Toxicity Review | Categorical | Flag harmful, biased, or unsafe responses |
| Relevance Scoring | Numeric | Rate how well the response addresses the user query |
| Grammar/Style Edits | Text | Provide rewritten versions or highlight grammar issues |
| Prompt Comparison | Categorical or Numeric | Compare responses from different prompt variants |
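As a mental model, each annotation ties a dataset row to a label value and a reviewer. Here is a minimal sketch in Python; the field names are illustrative, not the platform's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    row_id: int     # which dataset row was reviewed
    label: str      # e.g. "Sentiment" or "Relevance"
    value: object   # "Positive", 4, or free-form text, per label type
    annotator: str  # workspace member who added it

a = Annotation(row_id=7, label="Sentiment", value="Positive", annotator="alice")
```

The use cases in the table above differ only in the type of `value` they store, which is why one annotation workflow covers all of them.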
Steps to Add Annotations
1. Select a Dataset
- Navigate to the Datasets section from the main dashboard.
- Click on the name of the dataset you want to annotate.
- If you don’t have a dataset yet, please create or upload one first.
2. Open the Annotation Interface
- Once inside your selected dataset view, click the Annotations tab or button (usually located near the top or side of the data table).
- This opens the main interface for managing annotation views and labels.
3. Create an Annotation View
An Annotation View defines what you want to annotate and how.
- Within the Annotations interface, click Create New View.
- Give your view a descriptive Name (e.g., “Sentiment Labels”, “Fact Check Ratings”).
4. Define Labels
Labels specify the type and possible values for your annotations. You’ll link a label to your view in the next step.
- If you don’t have a suitable label already, click Create New Label.
- Name: Give the label a clear name (e.g., “Sentiment”, “Accuracy Score”).
- Type: Choose the annotation type:
- Categorical: For predefined text categories (e.g., “Positive”, “Negative”, “Neutral”).
- Define the possible category names.
- Numeric: For scores or ratings on a scale (e.g., 1-5).
- Define the minimum and maximum values.
- Text: For free-form text feedback or corrections.
- Click Save to create the label.
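The three label types above map naturally onto a small validation rule for each. The following is a hypothetical sketch of how a label definition might be modeled and checked (class and field names are assumptions for illustration, not the platform's API):

```python
from dataclasses import dataclass
from typing import List, Optional, Union

@dataclass
class Label:
    """Illustrative label definition covering the three annotation types."""
    name: str
    kind: str                               # "categorical", "numeric", or "text"
    categories: Optional[List[str]] = None  # for categorical labels
    min_value: Optional[float] = None       # for numeric labels
    max_value: Optional[float] = None

    def validate(self, value: Union[str, float]) -> bool:
        """Check that an annotation value is legal for this label."""
        if self.kind == "categorical":
            return value in (self.categories or [])
        if self.kind == "numeric":
            return isinstance(value, (int, float)) and self.min_value <= value <= self.max_value
        return isinstance(value, str)       # free-form text

sentiment = Label("Sentiment", "categorical",
                  categories=["Positive", "Negative", "Neutral"])
relevance = Label("Relevance", "numeric", min_value=1, max_value=5)
```

Defining categories and numeric bounds up front is what lets the interface reject out-of-range annotations instead of storing them silently.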
Leveraging Auto-Annotation
For Categorical labels, Future AGI offers an optional Auto-Annotation feature designed to accelerate the labeling process.
How it Works: When enabled during label creation, the platform observes the annotations you manually apply. Based on these examples, it learns patterns and can automatically suggest labels for the remaining unannotated rows in your dataset.
Benefits:
- Speeds up annotation: Significantly reduces the time needed for large datasets by automating suggestions.
- Improves consistency: Helps maintain uniform labeling based on learned patterns from your initial annotations.
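The platform's actual learning mechanism is not documented here, but the idea of "suggest labels from manual examples" can be illustrated with a toy nearest-neighbor classifier over token overlap (everything below is a simplified assumption, not the real feature):

```python
def jaccard(a: set, b: set) -> float:
    """Token-overlap similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest_label(text: str, examples: list) -> str:
    """Return the label of the most similar manually annotated example."""
    tokens = set(text.lower().split())
    best = max(examples, key=lambda ex: jaccard(tokens, set(ex[0].lower().split())))
    return best[1]

# Manually annotated rows act as the training examples.
manual = [
    ("I love this product", "Positive"),
    ("Terrible experience, very slow", "Negative"),
]
print(suggest_label("love the fast product", manual))  # → Positive
```

The same dynamic applies in the real feature: the more (and more consistent) your initial manual annotations, the better the automatic suggestions for the remaining rows.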
5. Configure the Annotation View
Now, connect the fields and the label within the view you created in Step 3:
- Static Fields: Select the column(s) that provide context or input (e.g., the user query, the original document).
- Response Fields: Select the column(s) containing the model output or data you want to annotate.
- Label: Choose the Label you created or selected in Step 4.
- Preview: Review the setup to ensure it looks correct.
- Click Save to finalize the Annotation View.
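Conceptually, the view is just a binding of context columns, response columns, and a label. A hypothetical configuration sketch (key and column names are illustrative, not the platform's export format):

```python
annotation_view = {
    "name": "Sentiment Labels",
    "static_fields": ["user_query"],        # context shown to annotators
    "response_fields": ["model_response"],  # content being annotated
    "label": "Sentiment",                   # label chosen in Step 4
}

# A minimal sanity check mirroring what the Preview step verifies.
required = {"name", "static_fields", "response_fields", "label"}
assert required <= annotation_view.keys()
```

Reviewing this mapping in the Preview step catches the most common mistake: swapping the context columns and the columns you actually want annotated.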
6. Assign Annotators
- In the Annotation View settings, find the Annotators section.
- Add workspace members who should contribute annotations to this specific view.
7. Review and Edit Annotations
You can review and edit annotations added within a specific View:
- Select the Annotation View from the list.
- Navigate through the dataset rows in the annotation interface.
- Click on an existing annotation value to modify it.
- Changes are typically saved automatically, or click a Save button if available.