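# `evaluator` is assumed to be an already-initialized evaluation client
# (client setup is omitted here)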
result = evaluator.evaluate(
    eval_templates="bias_detection",
    inputs={
        "output": "This is a sample text to check for bias detection"
    },
    model_name="turing_flash"
)

print(result.eval_results[0].output)
print(result.eval_results[0].reason)
Input

Required Input | Type   | Description
output         | string | The text content to analyze for bias
Output

Field  | Description
Result | Returns Passed or Failed, where Passed indicates neutral content and Failed indicates the presence of detectable bias
Reason | Provides a detailed explanation of the bias assessment
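
Because the Result field is a simple Passed/Failed verdict, it can drive programmatic gating. The sketch below reuses the `result` object from the example above and assumes that `eval_results[0].output` carries the Passed/Failed string described in the table.

evaluation = result.eval_results[0]
if evaluation.output == "Failed":
    # Bias detected: surface the evaluator's explanation for review
    print(f"Bias detected: {evaluation.reason}")
else:
    # Content judged neutral
    print("No bias detected")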

What to do if Bias is detected

The text should be analyzed for any language or perspectives that indicate partiality, unfairness, or a lack of neutrality. Identifying specific instances of bias allows for targeted refinements that make the text more balanced and inclusive while preserving its original intent.
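
For example, a common pattern is to screen a batch of generated texts and route any failures, together with the evaluator's reasoning, into a review queue for targeted refinement. This is a minimal sketch that reuses the `evaluator` client from the first example; the candidate texts are placeholders.

candidates = [
    "This is a sample text to check for bias detection",
    "Another generated passage to screen before publishing",
]

review_queue = []  # texts flagged for targeted refinement
for text in candidates:
    result = evaluator.evaluate(
        eval_templates="bias_detection",
        inputs={"output": text},
        model_name="turing_flash"
    )
    evaluation = result.eval_results[0]
    if evaluation.output == "Failed":
        # Keep the explanation so an editor knows exactly what to rework
        review_queue.append({"text": text, "reason": evaluation.reason})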

Differentiating Bias Detection from Cultural Sensitivity

Bias Detection identifies and evaluates bias in text to ensure fairness and neutrality, examining the content for any form of partiality or lack of balance. Cultural Sensitivity, by contrast, assesses language and content for appropriateness within their cultural context, checking for inclusivity, cultural awareness, and the absence of insensitive language, and thereby promotes respect for diversity.
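
When both perspectives matter, the same text can be run through each template and the verdicts compared side by side. This is a rough sketch only: the "cultural_sensitivity" template name and its input schema are assumptions, so confirm the exact identifier and required inputs in your template catalog before using it.

text = "This is a sample text to check for bias detection"

for template in ("bias_detection", "cultural_sensitivity"):  # second template name is assumed
    result = evaluator.evaluate(
        eval_templates=template,
        inputs={"output": text},
        model_name="turing_flash"
    )
    evaluation = result.eval_results[0]
    print(f"{template}: {evaluation.output} - {evaluation.reason}")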