tflite_flutter 0.3.0


TensorFlow Lite Flutter Plugin #

The TensorFlow Lite Flutter plugin provides a Dart API for accessing the TensorFlow Lite interpreter and performing inference. It binds to the TensorFlow Lite C API using dart:ffi.

(Most Important) Initial setup #

Add dynamic libraries to your app

  • Linux/Mac Users

    Place the script install.sh at the root of your project.

    Execute

    sh install.sh
    at the root of your project to automatically download and place the binaries in the appropriate folders.

    The binaries installed will not include support for GpuDelegateV2 and NnApiDelegate; however, InterpreterOptions().useNnApiForAndroid can still be used.

    Use install.sh -d instead if you wish to use GpuDelegateV2 and NnApiDelegate.

  • Windows users

    Place the script install.bat at the root of your project.

    Execute

    install.bat
    at the root of your project to automatically download and place the binaries in the appropriate folders.

    If you want to use delegate support, execute install.bat -d instead.

    These scripts install pre-built binaries based on the latest stable TensorFlow release.

Why do we need to do this?

tflite_flutter dynamically links to the TensorFlow Lite C API, which is supplied as libtensorflowlite_c.so on Android and TensorFlowLiteC.framework on iOS.

For Android, we need to manually download these binaries from the release assets and place the libtensorflowlite_c.so files in the <root>/android/app/src/main/jniLibs/ directory for each architecture (arm, arm64, x86, x86_64), as done in the example app.

No setup is needed for iOS as of now; TensorFlowLiteC.framework is embedded in the plugin itself.

How to build locally?

The pre-built binaries are updated with each stable TensorFlow release. However, you may want to use the latest unstable TF release or an older TF version; if you cannot find the required version in the release assets, build locally as follows.

Make sure you have the required version of Bazel installed. (Check TF_MIN_BAZEL_VERSION and TF_MAX_BAZEL_VERSION in configure.py.)

  • Android

Configure your workspace for Android builds as per these instructions.

For TensorFlow >= v2.2

    bazel build -c opt --cxxopt=--std=c++11 --config=android_arm //tensorflow/lite/c:tensorflowlite_c
    
    # similarly, for arm64 use --config=android_arm64

For TensorFlow <= v2.1

    bazel build -c opt --cxxopt=--std=c++11 --config=android_arm //tensorflow/lite/experimental/c:libtensorflowlite_c.so
    
    # similarly, for arm64 use --config=android_arm64
  • iOS

Refer to the instructions on the TensorFlow Lite website to build locally for iOS.

Note: You must use macOS to build for iOS.

Import #

import 'package:tflite_flutter/tflite_flutter.dart';

Usage instructions #

Creating the Interpreter #

An Interpreter can be created in three ways:

  • directly from asset (easiest)

    Place your_model.tflite in the assets directory. Make sure to include assets in pubspec.yaml.

      final interpreter = await Interpreter.fromAsset('your_model.tflite');
    
  • from buffer

      final buffer = await getBuffer('assets/your_model.tflite');
      final interpreter = Interpreter.fromBuffer(buffer);
    
      // Requires: import 'package:flutter/services.dart' show rootBundle;
      Future<Uint8List> getBuffer(String filePath) async {
        final rawAssetFile = await rootBundle.load(filePath);
        final rawBytes = rawAssetFile.buffer.asUint8List();
        return rawBytes;
      }
    
  • from file

     final dataFile = await getFile('assets/your_model.tflite');
     final interpreter = Interpreter.fromFile(dataFile);
    
     // Requires dart:io (File), 'package:flutter/services.dart' (rootBundle),
     // and the path_provider package (getTemporaryDirectory).
     Future<File> getFile(String fileName) async {
       final appDir = await getTemporaryDirectory();
       final appPath = appDir.path;
       final fileOnDevice = File('$appPath/$fileName');
       final rawAssetFile = await rootBundle.load(fileName);
       final rawBytes = rawAssetFile.buffer.asUint8List();
       await fileOnDevice.writeAsBytes(rawBytes, flush: true);
       return fileOnDevice;
     }
    
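The fromAsset constructor also accepts an InterpreterOptions argument (the delegate section below passes one the same way). A minimal sketch, assuming InterpreterOptions exposes a threads setter for the number of CPU threads used during inference:

  // Hypothetical helper: create an interpreter with custom options.
  // `threads` is an assumption about this version of InterpreterOptions.
  Future<Interpreter> createInterpreter() async {
    final options = InterpreterOptions()..threads = 2;
    return Interpreter.fromAsset('your_model.tflite', options: options);
  }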

Performing inference #

  • For single input and output

    Use void run(Object input, Object output). (A sketch for checking tensor shapes and types follows this list.)

      // For example, if the input tensor shape is [1,5] and type is float32
      var input = [[1.23, 6.54, 7.81, 3.21, 2.22]];
    
      // if the output tensor shape is [1,2] and type is float32
      var output = List(1*2).reshape([1,2]);
    
      // inference
      interpreter.run(input, output);
    
      // print the output
      print(output);
    
  • For multiple inputs and outputs

    Use void runForMultipleInputs(List<Object> inputs, Map<int, Object> outputs).

      var input0 = [1.23];  
      var input1 = [2.43];  
    
      // input: List<Object>
      var inputs = [input0, input1, input0, input1];  
    
      var output0 = List<double>(1);  
      var output1 = List<double>(1);
    
      // output: Map<int, Object>   
      var outputs = {0: output0, 1: output1};
    
      // inference  
      interpreter.runForMultipleInputs(inputs, outputs);
    
      // print outputs
      print(outputs);
    
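The tensor shapes and types in the comments above have to match your model. A minimal sketch for checking them at runtime, assuming Interpreter exposes getInputTensors()/getOutputTensors() and Tensor has shape and type getters:

  // Hypothetical helper: print every input/output tensor so the
  // input and output buffers can be sized correctly.
  void describeModel(Interpreter interpreter) {
    for (final tensor in interpreter.getInputTensors()) {
      print('input:  shape=${tensor.shape} type=${tensor.type}');
    }
    for (final tensor in interpreter.getOutputTensors()) {
      print('output: shape=${tensor.shape} type=${tensor.type}');
    }
  }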

Closing the interpreter #

interpreter.close();
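
The interpreter wraps native resources, so it is worth ensuring close() runs even if inference throws. A minimal end-to-end sketch using only the calls shown above (the [1,2] output shape is the same example value as before):

  Future<void> runOnce(List<List<double>> input) async {
    final interpreter = await Interpreter.fromAsset('your_model.tflite');
    try {
      // Example output buffer for a [1,2] float32 tensor, as above.
      final output = List(1 * 2).reshape([1, 2]);
      interpreter.run(input, output);
      print(output);
    } finally {
      interpreter.close(); // release native resources even on error
    }
  }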

Improve performance using delegate support #

Note: This feature is under testing and could be unstable with some builds and on some devices.

  • NNAPI delegate for Android

      var interpreterOptions = InterpreterOptions()..useNnApiForAndroid = true;
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
    
    

    or

      var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
    
    
  • GPU delegate for Android and iOS (a platform-selection sketch follows this list)

    • Android GpuDelegateV2
      final gpuDelegateV2 = GpuDelegateV2(
              options: GpuDelegateOptionsV2(
            false,
            TfLiteGpuInferenceUsage.fastSingleAnswer,
            TfLiteGpuInferencePriority.minLatency,
            TfLiteGpuInferencePriority.auto,
            TfLiteGpuInferencePriority.auto,
          ));
    
      var interpreterOptions = InterpreterOptions()..addDelegate(gpuDelegateV2);
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
    
    • iOS Metal Delegate (GpuDelegate)
      final gpuDelegate = GpuDelegate(
            options: GpuDelegateOptions(true, TFLGpuDelegateWaitType.active),
          );
      var interpreterOptions = InterpreterOptions()..addDelegate(gpuDelegate);
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
    
    
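Since the GPU delegates are platform-specific, a single code path can choose one at runtime. A minimal sketch combining the two snippets above with dart:io's Platform check (the option values are simply the ones used above, not recommendations):

  import 'dart:io' show Platform;

  InterpreterOptions buildGpuOptions() {
    final options = InterpreterOptions();
    if (Platform.isAndroid) {
      // Same GpuDelegateV2 configuration as the Android snippet above.
      options.addDelegate(GpuDelegateV2(
          options: GpuDelegateOptionsV2(
        false,
        TfLiteGpuInferenceUsage.fastSingleAnswer,
        TfLiteGpuInferencePriority.minLatency,
        TfLiteGpuInferencePriority.auto,
        TfLiteGpuInferencePriority.auto,
      )));
    } else if (Platform.isIOS) {
      // Same Metal delegate configuration as the iOS snippet above.
      options.addDelegate(GpuDelegate(
        options: GpuDelegateOptions(true, TFLGpuDelegateWaitType.active),
      ));
    }
    return options;
  }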

Refer to the tests to see more example code for each method.

Refer to the Text Classification Flutter Example App for a demo.

Credits #

  • Tian LIN, Jared Duke, Andrew Selle, YoungSeok Yoon, Shuangfeng Li from the TensorFlow Lite Team for their invaluable guidance.
  • Authors of dart-lang/tflite_native.

0.3.0 #

  • New features
    • multi-dimensional reshape with type
  • Bug fixes
    • fixed the flatten extension on List
    • fixed an error when passing a non-dynamic-typed list as interpreter output

0.2.0 #

  • Direct conversion support for more TfLiteTypes: int16, float16, int8, int64
  • Pre-built tf 2.2.0 stable binaries

0.1.3 #

  • update usage instructions

0.1.2 #

  • fixed analysis issues to improve score

0.1.1 #

  • fixed warnings
  • longer package description

0.1.0 #

  • TfLite dart API

example/lib/main.dart

import 'package:flutter/material.dart';
import 'package:tflite_flutter_plugin_example/classifier.dart';

void main() => runApp(MyApp());

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  TextEditingController _controller;
  Classifier _classifier;
  List<Widget> _children;
  @override
  void initState() {
    super.initState();
    _controller = TextEditingController();
    _classifier = Classifier();
    _children = [];
    _children.add(Container());
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          backgroundColor: Colors.orangeAccent,
          title: const Text('Text classification'),
        ),
        body: Container(
          padding: const EdgeInsets.all(4),
          child: Column(
            children: <Widget>[
              Expanded(
                  child: ListView.builder(
                itemCount: _children.length,
                itemBuilder: (_, index) {
                  return _children[index];
                },
              )),
              Container(
                padding: const EdgeInsets.all(8),
                decoration: BoxDecoration(
                    border: Border.all(color: Colors.orangeAccent)),
                child: Row(children: <Widget>[
                  Expanded(
                    child: TextField(
                      decoration: const InputDecoration(
                          hintText: 'Write some text here'),
                      controller: _controller,
                    ),
                  ),
                  FlatButton(
                    child: const Text('Classify'),
                    onPressed: () {
                      final text = _controller.text;
                      final prediction = _classifier.classify(text);
                      setState(() {
                        _children.add(Dismissible(
                          key: GlobalKey(),
                          onDismissed: (direction) {},
                          child: Card(
                            child: Container(
                              padding: const EdgeInsets.all(16),
                              color: prediction[1] > prediction[0]
                                  ? Colors.lightGreen
                                  : Colors.redAccent,
                              child: Column(
                                crossAxisAlignment: CrossAxisAlignment.start,
                                children: <Widget>[
                                  Text(
                                    "Input: $text",
                                    style: const TextStyle(fontSize: 16),
                                  ),
                                  Text("Output:"),
                                  Text("   Positive: ${prediction[1]}"),
                                  Text("   Negative: ${prediction[0]}"),
                                ],
                              ),
                            ),
                          ),
                        ));
                        _controller.clear();
                      });
                    },
                  ),
                ]),
              ),
            ],
          ),
        ),
      ),
    );
  }
}
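
The Classifier used above comes from example/lib/classifier.dart and is not reproduced on this page; it turns the text into a fixed-length list of word ids and then runs the interpreter. A much-simplified, hypothetical sketch of its interpreter-facing part (the model file name, input length, and _tokenize() are placeholders, not the real example code):

  import 'package:tflite_flutter/tflite_flutter.dart';

  class Classifier {
    Interpreter _interpreter;

    Classifier() {
      // In practice, wait for loading to finish before classifying.
      Interpreter.fromAsset('text_classification.tflite') // hypothetical file name
          .then((interpreter) => _interpreter = interpreter);
    }

    List<double> classify(String text) {
      final input = [_tokenize(text)];
      // Output shape [1, 2]: index 0 = negative, index 1 = positive,
      // matching how main.dart reads prediction[0]/prediction[1] above.
      final output = List(1 * 2).reshape([1, 2]);
      _interpreter.run(input, output);
      return List<double>.from(output[0]);
    }

    List<double> _tokenize(String text) {
      // Stub: the real example maps words to ids using a vocabulary file
      // and pads to the model's expected input length.
      return List<double>.filled(256, 0);
    }
  }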

Use this package as a library

1. Depend on it

Add this to your package's pubspec.yaml file:


dependencies:
  tflite_flutter: ^0.3.0

2. Install it

You can install packages from the command line:

with Flutter:


$ flutter pub get

Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.

3. Import it

Now in your Dart code, you can use:


import 'package:tflite_flutter/tflite_flutter.dart';
  
Scores

  • Popularity (how popular the package is relative to other packages): 71
  • Health (code health derived from static analysis): 100
  • Maintenance (reflects how tidy and up-to-date the package is): 100
  • Overall (weighted score of the above): 86

We analyzed this package on Jun 2, 2020, and provided a score, details, and suggestions below. Analysis was completed using:

  • Dart: 2.8.2
  • pana: 0.13.8-dev
  • Flutter: 1.17.1

Dependencies

Package          Constraint        Resolved/Available

Direct dependencies
Dart SDK         >=2.6.0 <3.0.0
ffi              ^0.1.3            0.1.3
flutter                            0.0.0
path             ^1.6.2            1.7.0
quiver           ^2.0.3            2.1.3

Transitive dependencies
collection                         1.14.12
matcher                            0.12.6
meta                               1.1.8
sky_engine                         0.0.99
stack_trace                        1.9.3
typed_data                         1.1.6
vector_math                        2.0.8

Dev dependencies
flutter_test
pedantic         ^1.0.0
test             ^1.6.4