
ViolenceEvaluator Constructor

Definition

An IEvaluator that uses the Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of violent content.

public:
 ViolenceEvaluator();
public ViolenceEvaluator();
Public Sub New ()

Remarks

ViolenceEvaluator returns a NumericMetric with a value between 0 and 7, where lower is better: 0 indicates an excellent score and 7 indicates a poor score.

Note that ViolenceEvaluator can detect harmful content present within both image-based and text-based responses. Supported image formats include JPG/JPEG, PNG, and GIF. Other modalities such as audio and video are currently not supported.
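The sketch below shows how the constructor above might be used to score a response. It is a minimal, hedged example: the configuration type name (`ContentSafetyServiceConfiguration`), its constructor parameters, the `ToChatConfiguration()` helper, and the `ViolenceMetricName` property are assumptions drawn from the surrounding library's conventions; consult the package's API reference for the exact signatures, and note that running it requires a real Azure AI Foundry project and credentials.

```csharp
using Azure.Identity;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Assumed configuration type for connecting to the Azure AI Foundry
// Evaluation service; placeholder values must be replaced with your own.
var serviceConfig = new ContentSafetyServiceConfiguration(
    credential: new DefaultAzureCredential(),
    subscriptionId: "<subscription-id>",
    resourceGroupName: "<resource-group>",
    projectName: "<ai-foundry-project>");

// The parameterless constructor documented on this page.
IEvaluator evaluator = new ViolenceEvaluator();

var messages = new[]
{
    new ChatMessage(ChatRole.User, "Describe a peaceful scene.")
};
var response = new ChatResponse(
    new ChatMessage(ChatRole.Assistant, "A quiet meadow at dawn."));

// EvaluateAsync sends the conversation to the Evaluation service;
// ToChatConfiguration() is an assumed adapter to the evaluator's
// expected ChatConfiguration.
EvaluationResult result = await evaluator.EvaluateAsync(
    messages,
    response,
    chatConfiguration: serviceConfig.ToChatConfiguration());

// Per the Remarks above, the score ranges from 0 (excellent) to 7 (poor).
NumericMetric metric =
    result.Get<NumericMetric>(ViolenceEvaluator.ViolenceMetricName);
Console.WriteLine($"Violence score: {metric.Value}");
```

Because the evaluator delegates scoring to the remote service, no model call happens locally; the constructor itself takes no arguments, and all service details are supplied later through the chat configuration.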

Applies to