Using the Future AGI Annotation feature, you can create high-quality training and evaluation datasets. This enables teams to train better models, refine prompting strategies, and monitor responses effectively.

Importance of Annotations and Human-In-The-Loop (HITL) in Generative AI

Generative models don’t just classify or predict; they generate open-ended content. This makes output quality subjective and often dependent on human judgement. Annotations are therefore essential because they enable:

  • Feedback Loop: Create a continuous learning system by feeding annotated responses back into training or fine-tuning pipelines (see the sketch after this list).
  • Customization: Adapt generic LLMs to user preferences and domain specific conventions via annotated datasets.
  • Quality Control: Catch failure modes like hallucinations, off-topic responses, or biases through manual review.
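
As a minimal sketch of the feedback-loop idea, the snippet below converts annotated rows into a JSONL fine-tuning file. The row schema (query, response, sentiment) and the quality filter are illustrative assumptions, not the platform’s actual export format:

```python
import json

# Hypothetical annotated rows exported from a dataset (illustrative schema).
annotated_rows = [
    {"query": "Summarize the report.", "response": "The report covers...", "sentiment": "Positive"},
    {"query": "Translate to French.", "response": "Irrelevant output.", "sentiment": "Negative"},
]

# Keep only rows that human reviewers marked as good, then write them in a
# simple prompt/completion JSONL format for a fine-tuning pipeline.
with open("finetune.jsonl", "w") as f:
    for row in annotated_rows:
        if row["sentiment"] == "Positive":
            f.write(json.dumps({"prompt": row["query"], "completion": row["response"]}) + "\n")
```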

This is why Human-In-The-Loop (HITL) review is so important: human reviewers raise the standard of generative AI systems by providing critical evaluations and maintaining quality along dimensions such as accuracy, safety, and coherence.

Common Use Cases for Annotations

| Use Case | Annotation Type | Description |
| --- | --- | --- |
| Sentiment Analysis | Categorical | Label text as Positive, Negative, or Neutral to measure tone |
| Factuality Check | Boolean or Text | Validate whether the model output is grounded in the source |
| Toxicity Review | Categorical | Flag harmful, biased, or unsafe responses |
| Relevance Scoring | Numeric | Rate how well the response addresses the user query |
| Grammar/Style Edits | Text | Provide rewritten versions or highlight grammar issues |
| Prompt Comparison | Categorical or Numeric | Compare responses from different prompt variants |
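
To make the table concrete, here is a hypothetical record carrying several annotation types at once; the field names are illustrative, not Future AGI’s actual schema:

```python
# One dataset row with annotations of several types (hypothetical schema).
annotated_record = {
    "query": "What is the capital of Australia?",
    "response": "The capital of Australia is Canberra.",
    "annotations": {
        "sentiment": "Neutral",              # Categorical
        "factuality": True,                  # Boolean
        "relevance_score": 5,                # Numeric (scale 1-5)
        "style_edit": "No changes needed.",  # Text
    },
}
```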

Steps to Add Annotations

1. Select a Dataset

  • Navigate to the Datasets section from the main dashboard.
  • Click on the name of the dataset you want to annotate.
  • If you don’t have a dataset yet, please create or upload one first.

2. Open the Annotation Interface

  • Once inside your selected dataset view, click the Annotations tab or button (usually located near the top or side of the data table).
  • This opens the main interface for managing annotation views and labels.

3. Create an Annotation View

An Annotation View defines what you want to annotate and how.

  • Within the Annotations interface, click Create New View.
  • Give your view a descriptive Name (e.g., “Sentiment Labels”, “Fact Check Ratings”).

4. Define Labels

Labels specify the type and possible values for your annotations; a sketch of these label types as data follows the list below. You’ll link a label to your view in the next step.

  • If you don’t have a suitable label already, click Create New Label.
  • Name: Give the label a clear name (e.g., “Sentiment”, “Accuracy Score”).
  • Type: Choose the annotation type:
    • Categorical: For predefined text categories (e.g., “Positive”, “Negative”, “Neutral”).
      • Define the possible category names.
    • Numeric: For scores or ratings on a scale (e.g., 1-5).
      • Define the minimum and maximum values.
    • Text: For free-form text feedback or corrections.
  • Click Save to create the label.
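
As a rough sketch (this dataclass is illustrative, not the platform’s internal model), the three label types could be represented like this:

```python
from dataclasses import dataclass, field

@dataclass
class Label:
    name: str
    type: str                        # "categorical", "numeric", or "text"
    categories: list[str] = field(default_factory=list)  # categorical only
    min_value: float | None = None   # numeric only
    max_value: float | None = None   # numeric only

# The example labels from this step, expressed as data.
sentiment = Label("Sentiment", "categorical", categories=["Positive", "Negative", "Neutral"])
accuracy = Label("Accuracy Score", "numeric", min_value=1, max_value=5)
feedback = Label("Reviewer Notes", "text")
```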

Leveraging Auto-Annotation

For Categorical labels, Future AGI offers an optional Auto-Annotation feature designed to accelerate the labeling process.

How it Works: When enabled during label creation, the platform observes the annotations you manually apply. Based on these examples, it learns patterns and can automatically suggest labels for the remaining unannotated rows in your dataset.
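
Future AGI doesn’t document the exact model behind this feature, but a minimal sketch of the general idea, learning from the rows you labeled by hand and suggesting labels for the rest, might look like a simple text classifier:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Rows you labeled manually (illustrative data).
labeled_texts = ["Great answer, very helpful", "Completely wrong and off-topic", "Fine, nothing special"]
labels = ["Positive", "Negative", "Neutral"]

# Rows still waiting for labels.
unlabeled_texts = ["Helpful and accurate response", "Misses the question entirely"]

# Fit a simple nearest-neighbor classifier on the manual examples...
vectorizer = TfidfVectorizer()
clf = KNeighborsClassifier(n_neighbors=1).fit(vectorizer.fit_transform(labeled_texts), labels)

# ...and suggest labels for the remaining rows; reviewers accept or override each one.
suggestions = clf.predict(vectorizer.transform(unlabeled_texts))
print(list(zip(unlabeled_texts, suggestions)))
```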

Benefits:

  • Speeds up annotation: Significantly reduces the time needed for large datasets by automating suggestions.
  • Improves consistency: Helps maintain uniform labeling based on learned patterns from your initial annotations.

You can review, accept, or override any suggestions made by the Auto-Annotation feature, ensuring you always retain final control over the data quality.

5. Configure the Annotation View

Now, connect the fields and the label within the view you created in Step 3 (a sketch of the resulting configuration follows this list):

  • Static Fields: Select the column(s) that provide context or input (e.g., the user query, the original document).
  • Response Fields: Select the column(s) containing the model output or data you want to annotate.
  • Label: Choose the Label you created or selected in Step 4.
  • Preview: Review the setup to ensure it looks correct.
  • Click Save to finalize the Annotation View.
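
Conceptually, the saved view ties the fields and the label together. A hypothetical configuration (key names are illustrative, not the platform’s actual schema) might serialize like this:

```python
# Hypothetical serialized Annotation View (illustrative key names).
annotation_view = {
    "name": "Sentiment Labels",
    "static_fields": ["user_query"],        # context shown to annotators
    "response_fields": ["model_response"],  # content being annotated
    "label": "Sentiment",                   # label from Step 4
    "annotators": ["alice@example.com"],    # assigned in Step 6
}
```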

6. Assign Annotators

  • In the Annotation View settings, find the Annotators section.
  • Add workspace members who should contribute annotations to this specific view.

7. Review and Edit Annotations

You can review and edit annotations added within a specific View:

  • Select the Annotation View from the list.
  • Navigate through the dataset rows in the annotation interface.
  • Click on an existing annotation value to modify it.
  • Changes are typically saved automatically; if a Save button is present, click it to confirm.

Conclusion

Adding annotations is key to evaluating model performance, refining training data, and ensuring the reliability of your AI applications. By creating structured annotation views and leveraging features like auto-annotation, you can efficiently enhance your datasets within Future AGI.

For more information on dataset management, visit the Dataset Overview page.