google_mlkit_image_labeling 0.13.0

Google's ML Kit Image Labeling for Flutter #


A Flutter plugin to use Google's ML Kit Image Labeling to detect and extract information about entities in an image across a broad group of categories.

PLEASE READ THIS before continuing or posting a new issue:

  • Google's ML Kit was built only for mobile platforms: iOS and Android apps. Web and all other platforms are not supported; you can request support for those platforms from Google in their repo.

  • This plugin is not sponsored or maintained by Google. The authors are developers excited about Machine Learning who wanted to expose Google's native APIs to Flutter.

  • Google's ML Kit APIs are only developed natively for iOS and Android. This plugin uses Flutter Platform Channels as explained here.

    Messages are passed between the client (the app/plugin) and host (platform) using platform channels.

    Messages and responses are passed asynchronously to ensure that the user interface remains responsive. To read more about platform channels, go here.

    Because this plugin uses platform channels, no Machine Learning processing is done in Flutter/Dart. All calls are passed to the native platform using MethodChannel on Android and FlutterMethodChannel on iOS, and are executed using Google's native APIs. Think of this plugin as a bridge between your app and Google's native ML Kit APIs: it only passes each call to the native API, and the processing is done by Google's API. It is important to understand this concept when debugging errors in your ML model and/or app.

  • Since the plugin uses platform channels, you may encounter issues with the native API. Before submitting a new issue, identify its source. Run Google's native iOS and/or Android example apps and check whether the issue is reproducible with their native examples. If it is, report the issue to Google; the authors do not have access to the source code of the native APIs, so only Google can fix it. If their example apps work fine but you still have an issue using this plugin, look through our closed and open issues. If you cannot find anything that helps, open a new issue and provide enough details. Be patient; someone from the community will eventually help you.

Requirements #

iOS #

  • Minimum iOS Deployment Target: 15.5.0
  • Xcode 15.3.0 or newer
  • Swift 5
  • ML Kit does not support 32-bit architectures (i386 and armv7). ML Kit does support 64-bit architectures (x86_64 and arm64). Check this list to see if your device has the required device capabilities. More info here.

Since ML Kit does not support 32-bit architectures (i386 and armv7), you need to exclude armv7 architectures in Xcode in order to run flutter build ios or flutter build ipa. More info here.

Go to Project > Runner > Build Settings > Excluded Architectures > Any SDK > armv7

Your Podfile should look like this:

platform :ios, '15.5.0'  # or newer version

...

# add this line:
$iOSVersion = '15.5.0'  # or newer version

post_install do |installer|
  # add these lines:
  installer.pods_project.build_configurations.each do |config|
    config.build_settings["EXCLUDED_ARCHS[sdk=*]"] = "armv7"
    config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = $iOSVersion
  end

  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    # add these lines:
    target.build_configurations.each do |config|
      if Gem::Version.new($iOSVersion) > Gem::Version.new(config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'])
        config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = $iOSVersion
      end
    end

  end
end

Notice that the minimum IPHONEOS_DEPLOYMENT_TARGET is 15.5.0; you can set it to something newer, but not older.

Android #

  • minSdkVersion: 21
  • targetSdkVersion: 33
  • compileSdkVersion: 34

Usage #

Create an instance of InputImage #

Create an instance of InputImage as explained here.

final InputImage inputImage;
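As a minimal sketch, assuming the image already exists on disk (the file path below is hypothetical), an InputImage can be created from a file path:

```dart
import 'package:google_mlkit_commons/google_mlkit_commons.dart';

// Create an InputImage from a file on disk (hypothetical path).
// InputImage also has constructors for in-memory bytes and camera frames;
// see the google_mlkit_commons documentation for details.
final InputImage inputImage = InputImage.fromFilePath('/path/to/image.jpg');
```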

Create an instance of ImageLabeler #

final ImageLabelerOptions options = ImageLabelerOptions(confidenceThreshold: 0.5);
final imageLabeler = ImageLabeler(options: options);

Process image #

final List<ImageLabel> labels = await imageLabeler.processImage(inputImage);

for (ImageLabel label in labels) {
  final String text = label.label;
  final int index = label.index;
  final double confidence = label.confidence;
}

Release resources with close() #

imageLabeler.close();
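Putting the steps above together, a minimal end-to-end flow looks like this (the file path passed in is assumed to point at an existing image):

```dart
import 'package:google_mlkit_image_labeling/google_mlkit_image_labeling.dart';

Future<void> labelImage(String path) async {
  // 1. Create the input image from a file path.
  final inputImage = InputImage.fromFilePath(path);

  // 2. Create the labeler with the base model and a confidence threshold.
  final imageLabeler = ImageLabeler(
    options: ImageLabelerOptions(confidenceThreshold: 0.5),
  );

  // 3. Process the image and print each label with its confidence.
  final labels = await imageLabeler.processImage(inputImage);
  for (final label in labels) {
    print('${label.label} (${(label.confidence * 100).toStringAsFixed(1)}%)');
  }

  // 4. Release native resources when done.
  imageLabeler.close();
}
```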

Models #

Image Labeling can be used with either the Base Model or a Custom Model. The base model is the default model bundled in the SDK, and a custom model can either be bundled with the app as an asset or downloaded from Firebase.

Base model #

To use the base model:

final ImageLabelerOptions options = ImageLabelerOptions(confidenceThreshold: confidenceThreshold);
final imageLabeler = ImageLabeler(options: options);

Local custom model #

Before using a custom model, make sure you read and understand ML Kit's compatibility requirements for TensorFlow Lite models here. To learn how to create a custom model that is compatible with ML Kit, go here.

To use a local custom model add the tflite model to your pubspec.yaml:

assets:
  - assets/ml/

Add this method:

import 'dart:io';
import 'package:flutter/services.dart';
import 'package:path/path.dart';
import 'package:path_provider/path_provider.dart';

Future<String> getModelPath(String asset) async {
  final path = '${(await getApplicationSupportDirectory()).path}/$asset';
  await Directory(dirname(path)).create(recursive: true);
  final file = File(path);
  if (!await file.exists()) {
    final byteData = await rootBundle.load(asset);
    await file.writeAsBytes(byteData.buffer
            .asUint8List(byteData.offsetInBytes, byteData.lengthInBytes));
  }
  return file.path;
}

Create an instance of ImageLabeler:

final modelPath = await getModelPath('assets/ml/object_labeler.tflite');
final options = LocalLabelerOptions(
  confidenceThreshold: confidenceThreshold,
  modelPath: modelPath,
);
final imageLabeler = ImageLabeler(options: options);

Android Additional Setup

Add the following to your app's build.gradle file to ensure Gradle doesn't compress the model file when building the app:

android {
    // ...
    aaptOptions {
        noCompress "tflite"
        // or noCompress "lite"
    }
}

Firebase model #

Google's standalone ML Kit library does NOT have any direct dependency on Firebase. As designed by Google, you do NOT need to include Firebase in your project in order to use ML Kit. However, to use a remote model hosted in Firebase, you must set up Firebase in your project by following these steps:

iOS Additional Setup

Additionally, for iOS you have to update your app's Podfile.

First, include GoogleMLKit/LinkFirebase and Firebase in your Podfile:

platform :ios, '15.5.0'

...

# Enable firebase-hosted models #
pod 'GoogleMLKit/LinkFirebase'
pod 'Firebase'

Next, add the preprocessor flag to enable Firebase remote models at compile time. To do that, update the existing build_configurations loop in the post_install step with the following:

post_install do |installer|
  installer.pods_project.targets.each do |target|
    ... # Here are some configurations automatically generated by flutter

    target.build_configurations.each do |config|
      # Enable firebase-hosted ML models
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',
        'MLKIT_FIREBASE_MODELS=1',
      ]
    end
  end
end

Usage

To use a Firebase model:

final options = FirebaseLabelerOption(
  confidenceThreshold: confidenceThreshold,
  modelName: modelName,
);
final imageLabeler = ImageLabeler(options: options);

Managing Firebase models

Create an instance of model manager

final modelManager = FirebaseImageLabelerModelManager();

To check if model is downloaded

final bool response = await modelManager.isModelDownloaded(modelName);

To download a model

final bool response = await modelManager.downloadModel(modelName);

To delete a model

final bool response = await modelManager.deleteModel(modelName);
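A typical flow is to make sure the remote model is downloaded before creating a labeler that uses it. The sketch below combines the calls above (the model name is hypothetical; use the name you registered in the Firebase console):

```dart
import 'package:google_mlkit_image_labeling/google_mlkit_image_labeling.dart';

Future<ImageLabeler> createFirebaseLabeler() async {
  const modelName = 'my-remote-model'; // hypothetical Firebase-hosted model name
  final modelManager = FirebaseImageLabelerModelManager();

  // Download the model if it is not available on the device yet.
  if (!await modelManager.isModelDownloaded(modelName)) {
    await modelManager.downloadModel(modelName);
  }

  // Create the labeler backed by the downloaded Firebase model.
  final options = FirebaseLabelerOption(
    confidenceThreshold: 0.5,
    modelName: modelName,
  );
  return ImageLabeler(options: options);
}
```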

Example app #

Find the example app here.

Contributing #

Contributions are welcome. If you run into problems, look at existing issues; if you cannot find anything related to your problem, open a new issue. Create an issue before opening a pull request for non-trivial fixes. For trivial fixes, open a pull request directly.


Publisher

flutter-ml.dev (verified publisher)


Homepage
Repository (GitHub)
View/report issues

Documentation

API reference

License

MIT (license)

Dependencies

flutter, google_mlkit_commons
