Eval Definition
No Racial Bias
Evaluates whether text contains racial bias, stereotypes, or discriminatory content
Evaluation Using Interface
Input:
- Required Inputs:
- input: The text content to evaluate for racial bias.
Output:
- Result: Returns ‘Passed’ if no racial bias is detected, ‘Failed’ if racial bias is detected.
Evaluation Using Python SDK
Click here to learn how to set up evaluation using the Python SDK.
Input:
- Required Inputs:
- input (string): The text content to evaluate for racial bias.
Output:
- Result: Returns a list containing ‘Passed’ if no racial bias is detected, or ‘Failed’ if racial bias is detected.
- Reason: Provides a detailed explanation of why the text was deemed free from or containing racial bias.
Example Output:
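Since the exact SDK response is not reproduced here, the sketch below is a toy stand-in that only illustrates the documented output shape — a result list containing 'Passed' or 'Failed' plus a reason string. All function and key names are hypothetical, not the real SDK API; a real evaluator would use a trained classifier or LLM rather than this placeholder check.

```python
# Toy stand-in for the evaluator's response shape (all names here are
# hypothetical; consult the actual SDK reference for real identifiers).
def evaluate_no_racial_bias(text: str) -> dict:
    """Illustrates the documented output: a result list of
    'Passed'/'Failed' plus a human-readable reason."""
    # A real evaluator performs semantic analysis; this placeholder only
    # demonstrates the output structure, not genuine bias detection.
    flagged = "all <group> are" in text.lower()
    if flagged:
        return {
            "result": ["Failed"],
            "reason": "Text contains a sweeping generalization about a racial group.",
        }
    return {
        "result": ["Passed"],
        "reason": "No racial bias, stereotypes, or discriminatory content detected.",
    }

out = evaluate_no_racial_bias("The team shipped the release on time.")
print(out["result"], "-", out["reason"])
```

The two-field shape (result list plus reason) mirrors the Output description above, so downstream code can branch on `result` and log `reason` for review.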
What to Do If You Get Undesired Results
If the content is evaluated as containing racial bias (Failed) and you want to improve it:
- Remove any language that reinforces racial stereotypes
- Eliminate terms with racist origins or connotations
- Avoid assumptions about cultural practices, behaviors, or abilities based on race or ethnicity
- Ensure equal representation and avoid portraying one racial group as superior or more capable
- Use inclusive language that respects all racial and ethnic backgrounds
- Avoid generalizations about racial or ethnic groups
- Be mindful of context and historical sensitivities when discussing race-related topics
- Consider diverse perspectives and experiences
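Parts of the checklist above can be turned into a lightweight pre-screen run before re-submitting text to the evaluator. The sketch below is illustrative only: the marker phrases are assumptions standing in for a real revision workflow, not an authoritative lexicon, and matching none of them does not guarantee a Passed result.

```python
# Illustrative phrase markers tied to checklist items above; this short
# list is an assumption for demonstration, not an authoritative lexicon.
STEREOTYPE_MARKERS = (
    "all of them are",           # sweeping generalization about a group
    "people like that always",   # group-level behavioral assumption
    "naturally better at",       # innate-ability claim tied to a group
)

def pre_screen(text: str) -> list:
    """Return the checklist markers found in `text`, so a writer can
    revise those passages before re-running the No Racial Bias eval."""
    lowered = text.lower()
    return [marker for marker in STEREOTYPE_MARKERS if marker in lowered]

hits = pre_screen("People like that always cut corners.")
# A non-empty `hits` list signals the text needs revision first.
```

A pre-screen like this catches only surface-level phrasing; the eval itself remains the authority on whether the revised text passes.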
Comparing No Racial Bias with Similar Evals
- No Gender Bias: While No Racial Bias focuses specifically on race-related discrimination, No Gender Bias evaluates for gender-related stereotypes and prejudice.
- Cultural Sensitivity: No Racial Bias focuses on race-specific discrimination, whereas Cultural Sensitivity evaluates respect for diverse cultural backgrounds and practices more broadly.
- Bias Detection: No Racial Bias evaluates specifically for race-related prejudice, while Bias Detection may cover a broader range of biases including gender, age, and socioeconomic status.