The Sexist evaluation detects gender-biased content in text. This evaluation is essential for ensuring that content does not perpetuate gender stereotypes or discrimination, promoting inclusivity and respect.
Click here to learn how to set up this evaluation using the Python SDK.

Input:
- string - The output column generated by the model.

Output:
- bool - 0/1
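
To make the input/output contract concrete, here is a minimal Python sketch. The function name `evaluate_sexist` and the keyword-based check inside it are purely illustrative assumptions, not the SDK's actual API; refer to the Python SDK setup guide linked above for the real interface.

```python
# Hypothetical sketch of the evaluation's contract: string in, 0/1 out.
# The function name and detection logic below are illustrative only.

def evaluate_sexist(output_text: str) -> int:
    """Return 1 if gender-biased content is detected in the model output, else 0."""
    # A real evaluator would use an LLM- or classifier-based detector;
    # a trivial phrase list stands in here just to show the 0/1 result shape.
    biased_phrases = ["women can't", "men are better at"]  # placeholder examples
    detected = any(phrase in output_text.lower() for phrase in biased_phrases)
    return 1 if detected else 0


# Usage: pass a value from the model's output column; receive 0 or 1.
score = evaluate_sexist("The engineer reviewed the design carefully.")
print(score)  # 0 -> no gender bias detected
```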