
tflite_next 🚀 #


A Flutter plugin for accessing TensorFlow Lite native APIs. This plugin allows you to run TensorFlow Lite models in your Flutter applications for both Android and iOS.

Features ✨ #

  • Load TensorFlow Lite models (.tflite).
  • Run inference on input data.
  • Support for hardware acceleration (GPU, NNAPI where available).
  • Easy-to-use Dart API.

Getting Started 🏁 #

This project is a starting point for a Flutter plug-in package, a specialized package that includes platform-specific implementation code for Android and/or iOS.

For help getting started with Flutter development, view the online documentation, which offers tutorials, samples, guidance on mobile development, and a full API reference.

Installation 💻 #

  1. Add tflite_next to your pubspec.yaml file:

    dependencies:
      tflite_next: ^0.0.1
    
  2. Install the package by running:

    flutter pub get
    

Platform Specific Setup 🛠️ #

Android #

  1. Minimum SDK Version: Ensure your android/app/build.gradle sets minSdkVersion to at least 21 (or higher if required by the TensorFlow Lite version you use).

    android {
        defaultConfig {
            minSdkVersion 21
            // ...
        }
        // ...
    }
    
  2. Model Files: Place your .tflite model files in your Flutter project's assets folder. Update your pubspec.yaml accordingly:

    flutter:
      assets:
        - assets/your_model.tflite
        # Add other assets here
    
  3. (Optional) Proguard: If you are using Proguard, you might need to add rules to prevent stripping of TensorFlow Lite classes. Add the following to your android/app/proguard-rules.pro file:

    # Keep TensorFlow Lite classes
    -keep class org.tensorflow.lite.** { *; }
    

iOS #

  1. Minimum iOS Version: Ensure your ios/Podfile sets the platform to at least iOS 12.0 (or higher if required by the TensorFlow Lite version you use).

    platform :ios, '12.0' # Or your desired minimum version
    
  2. Model Files: Similar to Android, include your model files in the assets section of your pubspec.yaml. These will be bundled with your iOS app.

  3. Permissions: If your model requires camera access or other permissions, ensure you've added the necessary descriptions to your Info.plist file.
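
     For example, a model fed by the device camera needs a camera usage description. A minimal sketch of the relevant Info.plist entry (the key is the standard iOS camera permission key; the description string is only illustrative):

    <key>NSCameraUsageDescription</key>
    <string>The camera is used to capture frames for on-device model inference.</string>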

Usage 💡 #

Here's a basic example of how to load a model and run inference:

import 'package:flutter/services.dart';
import 'package:tflite_next/tflite_next.dart'; // Import the plugin

Future<void> loadModel() async {
  try {
    String? res = await TfliteNext.loadModel(
      model: "assets/your_model.tflite", // Your model path
      labels: "assets/your_labels.txt", // Your labels path (optional)
      numThreads: 1, // Number of threads for inference
      isAsset: true, // Indicates if the model is in assets
      useGpuDelegate: false // Whether to use GPU delegate
    );
    print("Model loaded: $res");
  } on PlatformException catch (e) {
    print('Failed to load model: ${e.message}');
  }
}

Future<void> runInference(List<double> inputData) async {
  try {
    var recognitions = await TfliteNext.runModelOnBinary(
      binary: inputData, // Your input data as a List<double> or Uint8List
      // For image classification, you might use:
      // path: "path/to/your/image.jpg", // Path to an image file
      // imageMean: 127.5,
      // imageStd: 127.5,
      // threshold: 0.1, // Confidence threshold
      // numResultsPerClass: 1, // Max results per class
    );
    print("Inference results: $recognitions");

    // Process the recognitions (List<dynamic> or similar, depending on model output)
    if (recognitions != null) {
      for (var recognition in recognitions) {
        print("Label: ${recognition['label']}, Confidence: ${recognition['confidence']}");
      }
    }
  } on PlatformException catch (e) {
    print('Failed to run inference: ${e.message}');
  }
}

// Don't forget to close the interpreter when done
Future<void> closeModel() async {
  try {
    String? res = await TfliteNext.close();
    print("Model closed: $res");
  } on PlatformException catch (e) {
    print('Failed to close model: ${e.message}');
  }
}

Important: The exact methods and parameters for runModelOnBinary (or similar methods like runModelOnFrame, runModelOnImage) will depend on your specific model's input and output requirements, and the plugin's API design. Adapt the inputData and result processing accordingly.
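
For instance, an image-classification model could be driven through runModelOnImage with the parameters shown in the comments above. This is a sketch only, using the same imports as the earlier example; the exact parameter names and sensible values (imageMean, imageStd, threshold) depend on your model and the plugin version you have installed:

// Hypothetical helper built on the plugin's runModelOnImage method.
Future<void> classifyImage(String imagePath) async {
  try {
    var recognitions = await TfliteNext.runModelOnImage(
      path: imagePath,        // Path to an image file on disk
      imageMean: 127.5,       // Normalization mean (model-specific)
      imageStd: 127.5,        // Normalization standard deviation (model-specific)
      threshold: 0.1,         // Drop results below this confidence
      numResultsPerClass: 1,  // Keep only the top result per class
    );
    for (var r in recognitions ?? []) {
      print("Label: ${r['label']}, Confidence: ${r['confidence']}");
    }
  } on PlatformException catch (e) {
    print('Failed to classify image: ${e.message}');
  }
}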

API Reference 📚 #

The main classes and methods exposed by the plugin:

  • TfliteNext.loadModel(...): Loads the TFLite model.
    • model: Path to the .tflite file.
    • labels: Path to the labels file (optional).
    • numThreads: Number of threads for inference.
    • isAsset: Boolean, true if model/labels are in Flutter assets.
    • useGpuDelegate: Boolean, attempts to use GPU delegate.
    • useNnapiDelegate: Boolean, attempts to use NNAPI delegate (Android).
  • TfliteNext.runModelOnBinary(...): Runs inference on raw byte data.
  • TfliteNext.runModelOnImage(...): Runs inference on an image path.
  • TfliteNext.close(): Closes the TFLite interpreter and releases resources.

Refer to the plugin's Dart API documentation for the full list of methods and parameters.
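
In a Flutter app, the interpreter is typically loaded when a screen appears and closed when it goes away. Below is a minimal sketch of that lifecycle, assuming the loadModel/close API listed above; the widget name and asset paths are placeholders:

import 'package:flutter/material.dart';
import 'package:tflite_next/tflite_next.dart';

class ClassifierPage extends StatefulWidget {
  const ClassifierPage({super.key});

  @override
  State<ClassifierPage> createState() => _ClassifierPageState();
}

class _ClassifierPageState extends State<ClassifierPage> {
  @override
  void initState() {
    super.initState();
    _loadModel();
  }

  Future<void> _loadModel() async {
    // Load the model once when the screen is created.
    await TfliteNext.loadModel(
      model: "assets/your_model.tflite",
      labels: "assets/your_labels.txt",
      isAsset: true,
    );
  }

  @override
  void dispose() {
    // Release the native interpreter when the screen is destroyed.
    TfliteNext.close();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return const Scaffold(
      body: Center(child: Text('Run inference here')),
    );
  }
}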

Example App 📱 #

Check out the example folder for a working application showcasing the plugin's capabilities.

Limitations and Known Issues ⚠️ #

  • Some model types or operators may not be supported, and performance can vary considerably across devices.
  • See the issue tracker for current bugs and known limitations.

Contributing 🤝 #

Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.

  1. Fork the repository.
  2. Create your feature branch (git checkout -b feature/AmazingFeature).
  3. Commit your changes (git commit -m 'Add some AmazingFeature').
  4. Push to the branch (git push origin feature/AmazingFeature).
  5. Open a Pull Request.

License 📄 #

This project is licensed under the MIT License - see the LICENSE file for details.


Made with ❤️ by DevTestify Labs
