Protect Class
The Protect class provides a client for evaluating inputs and guarding against unwanted content (such as toxicity or prompt injection) using configurable metrics and rules. It builds on the Evaluator class and a set of built-in evaluation templates.
Initialization
Parameters:
- fi_api_key (Optional[str]): API key for authentication. If not provided, will be read from environment variables.
- fi_secret_key (Optional[str]): Secret key for authentication. If not provided, will be read from environment variables.
- fi_base_url (Optional[str]): Base URL for the API. If not provided, will be read from environment variables.
- evaluator (Optional[Evaluator]): An instance of the Evaluator class to use for evaluations. If not provided, a new one will be created.
Raises:
- InvalidAuthError: If the API key or secret key is missing.
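A minimal initialization sketch is shown below. The keys can be supplied either as constructor arguments or via environment variables; the `FI_API_KEY`/`FI_SECRET_KEY` variable names and the import path in the comments are assumptions, so adjust them to your installed package.

```python
import os

# Supply credentials via environment variables so they are picked up
# automatically when the constructor arguments are omitted.
# (Variable names are an assumption based on the fi_api_key/fi_secret_key
# parameter names; placeholder values shown.)
os.environ.setdefault("FI_API_KEY", "your-api-key")
os.environ.setdefault("FI_SECRET_KEY", "your-secret-key")

# Hypothetical import path -- adjust to your package layout:
# from fi.evals import Protect
#
# protect = Protect()  # reads the keys from the environment
# ...or pass them explicitly:
# protect = Protect(fi_api_key="your-api-key", fi_secret_key="your-secret-key")
```

If both the constructor arguments and the environment variables are missing, construction fails with InvalidAuthError.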
Instance Methods
protect
Evaluates input strings against a set of protection rules and returns messages for any failed checks.
Parameters:
- inputs (str): The input string to evaluate.
- protect_rules (List[Dict]): List of protection rule dictionaries. Each rule must contain:
  - metric (str): Name of the metric to evaluate (e.g., "Toxicity", "Tone", "Sexism").
  - contains (List[str]): Values to check for in the evaluation results.
  - type (str): Either "any" or "all", specifying the matching logic.
  - action (str): Message to return when the rule is triggered.
  - reason (bool, optional): Whether to include the evaluation reason in the message.
- action (str, optional): Default message to return when a rule is triggered. Defaults to "Response cannot be generated as the input fails the checks".
- reason (bool, optional): Whether to include the evaluation reason in the message. Defaults to False.
- timeout (int, optional): Timeout for evaluations in seconds. Defaults to 300.
Returns:
- List[str]: List of protection messages for failed rules, or ["All checks passed"] if no rule is triggered.
Raises:
- ValueError: If inputs or protect_rules do not match the required structure.
- TypeError: If inputs contains non-string objects.
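The sketch below illustrates the rule structure and the "any"/"all" matching semantics described above. The `rule_triggered` helper and the shape of the simulated evaluation results are local stand-ins for illustration, not the SDK's actual implementation; in real use you would call `protect.protect(inputs, protect_rules)` on an initialized client instead.

```python
from typing import Dict, List

# A protect_rules list in the documented shape.
protect_rules: List[Dict] = [
    {
        "metric": "Toxicity",
        "contains": ["Failed"],     # values to look for in the eval results
        "type": "any",              # trigger if ANY listed value is present
        "action": "Input flagged as toxic.",
        "reason": True,
    },
]

def rule_triggered(rule: Dict, eval_results: Dict[str, List[str]]) -> bool:
    """Illustrative matcher: apply the rule's "any"/"all" logic to the
    values produced for its metric (local stand-in, not the SDK's code)."""
    values = eval_results.get(rule["metric"], [])
    hits = [v in values for v in rule["contains"]]
    return any(hits) if rule["type"] == "any" else all(hits)

# Simulated evaluation output (hypothetical shape) for one input string.
results = {"Toxicity": ["Failed"]}

messages = [r["action"] for r in protect_rules if rule_triggered(r, results)]
if not messages:
    messages = ["All checks passed"]
```

With the simulated results above, the rule fires and `messages` holds the rule's action string; if no rule fires, the fallback `["All checks passed"]` is returned, mirroring the documented return value.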