apple_vision_face_detection 0.0.2

Platform: iOS, macOS

A Flutter plugin to use Apple Vision Face Detection to find faces in an image or live camera feed.

apple_vision_face_detection #


Apple Vision Face Detection is a Flutter plugin that enables Flutter apps to use Apple Vision Face Detection.

  • This plugin is not sponsored or maintained by Apple. The authors are developers who wanted to make a plugin similar to Google's ML Kit for macOS.

Requirements #

macOS

  • Minimum macOS Deployment Target: 10.13
  • Xcode 13 or newer
  • Swift 5
  • Only 64-bit architectures (x86_64 and arm64) are supported.

iOS

  • Minimum iOS Deployment Target: 12.0
  • Xcode 13 or newer
  • Swift 5
  • Only 64-bit architectures (x86_64 and arm64) are supported.

Getting Started #

You first need to import 'package:apple_vision_face_detection/apple_vision_face_detection.dart';
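
The code below is assumed to sit inside the State class of a StatefulWidget. Besides the package import above, whose path is an assumption based on the usual package:name/name.dart convention, the snippet needs roughly the following imports; InsertCamera and CameraSetup are camera helpers taken from the package's example app, so the commented-out path is only a placeholder to adjust.

  import 'dart:typed_data';                // Uint8List

  import 'package:flutter/material.dart';  // widgets, Size, Rect, Colors
  // Assumed path, following the standard package:name/name.dart convention.
  import 'package:apple_vision_face_detection/apple_vision_face_detection.dart';
  // InsertCamera and CameraSetup are camera helpers shipped with the package's
  // example app; copy them into your project and adjust this placeholder path.
  // import 'camera/camera_insert.dart';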

  final GlobalKey cameraKey = GlobalKey(debugLabel: "cameraKey");
  AppleVisionFaceDetectionController visionController = AppleVisionFaceDetectionController();
  InsertCamera camera = InsertCamera();
  Size imageSize = const Size(640, 640*9/16);
  String? deviceId;
  bool loading = true;

  // Bounding rectangles of the faces detected in the most recent frame.
  List<Rect>? faceData;
  late double deviceWidth;
  late double deviceHeight;

  @override
  void initState() {
    // Set up the cameras, then start the live feed and run face detection
    // on every frame that comes in.
    camera.setupCameras().then((value){
      setState(() {
        loading = false;
      });
      camera.startLiveFeed((InputImage i){
        if(i.metadata?.size != null){
          imageSize = i.metadata!.size;
        }
        if(mounted && i.bytes != null) {
          visionController.processImage(i.bytes!, imageSize).then((data){
            // Store the detected face rectangles and rebuild the overlay.
            setState(() {
              faceData = data;
            });
          });
        }
      });
    });
    super.initState();
  }
  @override
  void dispose() {
    camera.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    deviceWidth = MediaQuery.of(context).size.width;
    deviceHeight = MediaQuery.of(context).size.height;
    return Stack(
      children: <Widget>[
        SizedBox(
          width: imageSize.width,
          height: imageSize.height,
          // Show the camera preview once setup has finished; the face
          // rectangles are stacked on top of it.
          child: loading ? Container() : CameraSetup(camera: camera, size: imageSize)
        ),
      ] + showRects()
    );
  }

  // Builds one outlined box per detected face, positioned over the camera preview.
  List<Widget> showRects(){
    if(faceData == null || faceData!.isEmpty) return [];
    List<Widget> widgets = [];

    for(int i = 0; i < faceData!.length; i++){
      widgets.add(
        Positioned(
          top: faceData![i].top,
          left: faceData![i].left,
          child: Container(
            width: faceData![i].width*imageSize.width,
            height: faceData![i].height*imageSize.height,
            decoration: BoxDecoration(
              color: Colors.transparent,
              border: Border.all(width: 1, color: Colors.green),
              borderRadius: BorderRadius.circular(5)
            ),
          )
        )
      );
    }
    return widgets;
  }

  // Full-screen placeholder shown while the cameras are being set up.
  Widget loadingWidget(){
    return Container(
      width: deviceWidth,
      height: deviceHeight,
      color: Theme.of(context).canvasColor,
      alignment: Alignment.center,
      child: const CircularProgressIndicator(color: Colors.blue)
    );
  }
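
The same controller can also be used on a single still image rather than a live camera feed. The sketch below is only an illustration under the assumption that the bytes you pass are in a format the platform side accepts (the live-feed snippet above forwards the raw bytes from InputImage unchanged); how you obtain those bytes and the matching image size is up to you.

  Future<void> detectFacesInImage(Uint8List bytes, Size imageSize) async {
    final AppleVisionFaceDetectionController controller = AppleVisionFaceDetectionController();
    // processImage resolves to the detected face rectangles,
    // or to null/an empty list when no face is found.
    final List<Rect>? faces = await controller.processImage(bytes, imageSize);
    for (final Rect face in faces ?? <Rect>[]) {
      debugPrint('Face at (${face.left}, ${face.top}), size ${face.width} x ${face.height}');
    }
  }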

Example #

Find the example for this API here.

Contributing #

Contributions are welcome. If you run into a problem, look through the existing issues first; if you cannot find anything related, open a new issue. Create an issue before opening a pull request for non-trivial fixes; for trivial fixes, open a pull request directly.


Repository (GitHub)
View/report issues

Documentation

API reference

License

MIT (license)

Dependencies

apple_vision_commons, flutter
