# Assumes an `evaluator` client has already been initialized per the
# SDK's setup instructions; this snippet shows only the evaluation call.
result = evaluator.evaluate(
    eval_templates="no_racial_bias",
    inputs={
        "output": "Dear Sir, I hope this email finds you well. I look forward to any insights or advice you might have whenever you have a free moment"
    },
    model_name="turing_flash"
)

print(result.eval_results[0].metrics[0].value)  # Passed or Failed
print(result.eval_results[0].reason)            # explanation for the verdict
Input

Required Input | Type   | Description
output         | string | Content to evaluate for racial bias.
Output

Field  | Description
Result | Returns Passed if no racial bias is detected, or Failed if racial bias is detected.
Reason | Provides a detailed explanation of why the content was classified as containing or not containing racial bias.
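
To act on these fields programmatically, branch on the returned verdict. A minimal sketch, assuming the evaluate call above has already run and that metrics[0].value carries the Passed/Failed result described in the table:

eval_result = result.eval_results[0]
verdict = eval_result.metrics[0].value  # Passed / Failed, per the Output table

if str(verdict) == "Passed":
    print("No racial bias detected; content is safe to use.")
else:
    # The reason field explains which part of the content was flagged.
    print(f"Flagged for racial bias: {eval_result.reason}")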

What to Do If You Get Undesired Results

If the content is evaluated as containing racial bias (Failed) and you want to improve it (see the re-check sketch after this list):
  • Remove any language that reinforces racial stereotypes
  • Eliminate terms with racist origins or connotations
  • Avoid assumptions about cultural practices, behaviors, or abilities based on race or ethnicity
  • Ensure equal representation and avoid portraying one racial group as superior or more capable
  • Use inclusive language that respects all racial and ethnic backgrounds
  • Avoid generalizations about racial or ethnic groups
  • Be mindful of context and historical sensitivities when discussing race-related topics
  • Consider diverse perspectives and experiences
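
After applying these revisions, re-run the evaluation on the edited draft to confirm the fix. A minimal sketch, reusing the same evaluator client and template as above; recheck_content is a hypothetical helper name:

def recheck_content(revised_output):
    # Re-run the same eval template against the revised draft.
    result = evaluator.evaluate(
        eval_templates="no_racial_bias",
        inputs={"output": revised_output},
        model_name="turing_flash"
    )
    eval_result = result.eval_results[0]
    return eval_result.metrics[0].value, eval_result.reason

verdict, reason = recheck_content("Revised draft of the email goes here.")
print(verdict, reason)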

Comparing No Racial Bias with Similar Evals

  • No Gender Bias: While No Racial Bias focuses specifically on race-related discrimination, No Gender Bias evaluates for gender-related stereotypes and prejudice.
  • Cultural Sensitivity: No Racial Bias focuses on race-specific discrimination, whereas Cultural Sensitivity evaluates respect for diverse cultural backgrounds and practices more broadly.
  • Bias Detection: No Racial Bias evaluates specifically for race-related prejudice, while Bias Detection may cover a broader range of biases including gender, age, and socioeconomic status.
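
Because these evals share the same call pattern, you can run several of them over the same content and compare verdicts. A minimal sketch; the template identifiers other than "no_racial_bias" are assumed to follow the same naming convention and should be verified against the template catalog, as should each template's expected input fields:

# Template identifiers below (other than "no_racial_bias") are assumptions;
# verify the exact strings before use. This also assumes all templates
# accept the same "output" input field.
templates = ["no_racial_bias", "no_gender_bias", "cultural_sensitivity", "bias_detection"]
content = {"output": "Draft email text to screen before sending."}

for template in templates:
    result = evaluator.evaluate(
        eval_templates=template,
        inputs=content,
        model_name="turing_flash"
    )
    print(template, "->", result.eval_results[0].metrics[0].value)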