Toxicity Metric

What is Toxicity Metric?

The Toxicity Metric assesses whether language model outputs include language that may be offensive, harmful, or inappropriate. This evaluation is crucial for content moderation and ensuring safety in AI applications.

Stay updated with
the Giskard Newsletter