safeSearchDetection method

Future<SafeSearchAnnotation?> safeSearchDetection(
  JsonImage jsonImage, {
  int maxResults = 10,
})

SafeSearch Detection detects explicit content, such as adult or violent content, within an image. The feature evaluates five categories (adult, spoof, medical, violence, and racy) and returns the likelihood that each is present in the image. See the SafeSearchAnnotation page for details on these fields.
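For context, here is a minimal usage sketch. How the enclosing client (vision below) and the JsonImage are constructed depends on the rest of the package; JsonImage.fromFilePath is a hypothetical helper used only for illustration.

final safeSearch = await vision.safeSearchDetection(
  JsonImage.fromFilePath('sample.jpg'), // hypothetical constructor
  maxResults: 5,
);

if (safeSearch != null) {
  // The five likelihood fields named above.
  print('adult: ${safeSearch.adult}');
  print('racy: ${safeSearch.racy}');
  print('violence: ${safeSearch.violence}');
}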

Implementation

Future<SafeSearchAnnotation?> safeSearchDetection(
  JsonImage jsonImage, {
  int maxResults = 10,
}) async {
  // Run a generic annotation request with the safeSearchDetection type.
  final annotatedResponses = await detection(
    jsonImage,
    AnnotationType.safeSearchDetection,
    maxResults: maxResults,
  );

  // Extract the SafeSearchAnnotation from the response; null if absent.
  return annotatedResponses.safeSearchAnnotation;
}
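
The method delegates to the shared detection entry point, passing AnnotationType.safeSearchDetection, and then extracts the SafeSearchAnnotation from the annotated responses. The nullable return type reflects that the response may not include a SafeSearch annotation, so callers should check for null before reading the likelihood fields.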