Measures semantic similarity between the generated and reference text. Both texts are embedded, and the similarity between the resulting embedding vectors **u** and **v** is computed using one of the following methods:
See the SDK documentation to learn how to set up evaluation.

Input & Configuration:
| | Parameter | Type | Description |
|---|---|---|---|
| **Required Inputs** | `response` | `str` | Model-generated output to be evaluated. |
| | `expected_text` | `str` or `List[str]` | One or more reference texts for comparison. |
| **Optional Config** | `similarity_method` | `str` | Distance function used to compare embedding vectors. Options: `"cosine"` (default), `"euclidean"`, `"manhattan"`. |
| | `normalize` | `bool` | Whether to normalize embedding vectors before computing similarity. Default: `True`. |
| `similarity_method` | Description |
|---|---|
| `cosine` | Measures the cosine of the angle between the two vectors |
| `euclidean` | Computes the straight-line (L2) distance between the vectors |
| `manhattan` | Computes the L1 (absolute) distance between the vectors |
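The three methods above can be sketched directly over a pair of embedding vectors. This is an illustrative implementation, not the SDK's internal code; the function name `similarity` and the example vectors are assumptions.

```python
import numpy as np

def similarity(u, v, method="cosine", normalize=True):
    # Illustrative sketch of the similarity_method options; not the SDK's
    # actual implementation. Vectors must be nonzero when normalize=True.
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    if normalize:
        # Scale each vector to unit length before comparison.
        u = u / np.linalg.norm(u)
        v = v / np.linalg.norm(v)
    if method == "cosine":
        # Cosine of the angle between u and v; 1.0 means same direction.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    if method == "euclidean":
        # Straight-line (L2) distance; 0.0 means identical vectors.
        return float(np.linalg.norm(u - v))
    if method == "manhattan":
        # L1 (absolute) distance between the vectors.
        return float(np.sum(np.abs(u - v)))
    raise ValueError(f"unknown method: {method}")
```

Note that `cosine` is a similarity (higher means more alike), while `euclidean` and `manhattan` are distances (lower means more alike).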
| Output Field | Type | Description |
|---|---|---|
| `score` | `float` | Value between 0 and 1 representing semantic similarity. Higher values indicate stronger similarity. |
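Since `euclidean` and `manhattan` are distances (lower means more similar) while `score` is always a higher-is-better value in [0, 1], the raw values must be rescaled. The exact transform the SDK applies is not documented here; the sketch below shows one common convention, and the function name `to_score` is an assumption.

```python
def to_score(value, method="cosine"):
    # Hypothetical mapping of a raw similarity/distance to a [0, 1] score;
    # the SDK's actual transform may differ.
    if method == "cosine":
        # Cosine similarity lies in [-1, 1]; rescale linearly to [0, 1].
        return (value + 1.0) / 2.0
    # euclidean/manhattan are distances: invert so that a distance of 0
    # maps to a score of 1, and larger distances approach 0.
    return 1.0 / (1.0 + value)
```

Under this convention, identical vectors yield a score of 1.0 regardless of the method chosen.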