detectYUV420 method

DetectResults detectYUV420({
  required Uint8List y,
  required Uint8List u,
  required Uint8List v,
  required int height,
  @Deprecated("width is automatically calculated from the length of the y.") int width = 0,
  required KannaRotateDeviceOrientationType deviceOrientationType,
  required int sensorOrientation,
  void onDecodeImage(Image image)?,
  void onYuv420sp2rgbImage(Image image)?,
})

Detect with YOLOX. Run it after initYolox.

When detecting from an image byte array, specify y, u, and v, which are the YUV420 planes of the image. height is the height of the image. The width of the image is calculated from the length of y.
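For example, when streaming frames with the camera package, the three planes of a CameraImage map directly onto y, u, and v. The sketch below is illustrative only: yolox stands for whatever object exposes initYolox and detectYUV420, and orientationType is assumed to have been derived beforehand (see the orientation notes below); neither name is part of this API.

import 'package:camera/camera.dart';
// plus the import of this package (omitted here).

// Sketch: feeding camera frames into detectYUV420.
void startDetection(
  CameraController controller,
  dynamic yolox, // placeholder for the object prepared by initYolox
  KannaRotateDeviceOrientationType orientationType,
) {
  controller.startImageStream((CameraImage image) {
    final results = yolox.detectYUV420(
      y: image.planes[0].bytes,
      u: image.planes[1].bytes,
      v: image.planes[2].bytes,
      height: image.height,
      deviceOrientationType: orientationType,
      sensorOrientation: controller.description.sensorOrientation,
    );
    // results.results holds the detections, results.image the rotated frame.
  });
}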

deviceOrientationType is the device orientation. It can be obtained from the CameraController of the camera package. https://github.com/flutter/plugins/blob/main/packages/camera/camera/lib/src/camera_controller.dart#L134

sensorOrientation is the orientation of the camera sensor. It can be obtained from the CameraDescription of the camera package (e.g. via CameraController.description). https://github.com/flutter/plugins/blob/main/packages/camera/camera_platform_interface/lib/src/types/camera_description.dart#L42
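As a sketch, both values can be read from a running CameraController of the camera package. How the camera package's DeviceOrientation maps onto KannaRotateDeviceOrientationType depends on this package's enum, so that conversion is left to the caller here.

import 'package:camera/camera.dart';
import 'package:flutter/services.dart';

// Sketch: where the two orientation inputs come from.
// Converting the DeviceOrientation below to a KannaRotateDeviceOrientationType
// is left to the caller, since the enum values are specific to this package.
void readOrientations(CameraController controller) {
  final DeviceOrientation deviceOrientation = controller.value.deviceOrientation;
  final int sensorOrientation = controller.description.sensorOrientation;
  print('device: $deviceOrientation, sensor: $sensorOrientation degrees');
}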

onDecodeImage and onYuv420sp2rgbImage are callbacks that receive the decoded images. Converting to a ui.Image object is a heavy operation and affects performance, so if a ui.Image is not needed it is recommended to pass null.
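As a sketch, supplying only onDecodeImage yields the rotated frame as a ui.Image (for example for a debug overlay), while leaving onYuv420sp2rgbImage null skips the second, unneeded conversion.

import 'dart:ui' as ui;

// Sketch: keep the latest rotated frame for a debug preview; the
// unrotated-frame callback is simply not passed, avoiding an extra conversion.
ui.Image? latestFrame;

void onDecodeImage(ui.Image image) {
  latestFrame = image; // e.g. draw it with RawImage or a CustomPainter
}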

Implementation

DetectResults detectYUV420({
  required Uint8List y,
  required Uint8List u,
  required Uint8List v,
  required int height,
  @Deprecated("width is automatically calculated from the length of the y.")
      int width = 0,
  required KannaRotateDeviceOrientationType deviceOrientationType,
  required int sensorOrientation,
  void Function(ui.Image image)? onDecodeImage,
  void Function(ui.Image image)? onYuv420sp2rgbImage,
}) {
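  // Merge the separate Y, U and V planes into a single YUV420sp buffer.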
  final yuv420sp = yuv420sp2Uint8List(
    y: y,
    u: u,
    v: v,
  );

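  // The deprecated width parameter is ignored; width is derived from the Y plane.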
  final width = y.length ~/ height;

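  // Convert the YUV420sp buffer to packed RGB pixels.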
  final pixels = yuv420sp2rgb(
    yuv420sp: yuv420sp,
    width: width,
    height: height,
  );
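  // Optionally decode the unrotated RGB frame into a ui.Image for the caller.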
  if (onYuv420sp2rgbImage != null) {
    final rgba = rgb2rgba(
      rgb: pixels,
      width: width,
      height: height,
    );

    ui.decodeImageFromPixels(
      rgba,
      width,
      height,
      ui.PixelFormat.rgba8888,
      onYuv420sp2rgbImage,
    );
  }

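  // Rotate the pixels upright for the current device and sensor orientation.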
  final rotated = kannaRotate(
    pixels: pixels,
    width: width,
    height: height,
    deviceOrientationType: deviceOrientationType,
    sensorOrientation: sensorOrientation,
  );

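  // Optionally decode the rotated frame into a ui.Image for the caller.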
  if (onDecodeImage != null) {
    final rgba = rgb2rgba(
      rgb: rotated.pixels ?? Uint8List(0),
      width: rotated.width,
      height: rotated.height,
    );

    ui.decodeImageFromPixels(
      rgba,
      rotated.width,
      rotated.height,
      ui.PixelFormat.rgba8888,
      onDecodeImage,
    );
  }

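  // Run detection on the rotated RGB pixels and return them with the results.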
  return DetectResults(
    results: detectPixels(
      pixels: rotated.pixels ?? Uint8List(0),
      width: rotated.width,
      height: rotated.height,
    ),
    image: rotated,
  );
}