# tflite_next 🚀
A Flutter plugin for accessing TensorFlow Lite native APIs. This plugin lets you run TensorFlow Lite models in your Flutter applications on both Android and iOS.
## Features ✨

- Load TensorFlow Lite models (`.tflite`).
- Run inference on input data.
- Support for hardware acceleration (GPU, and NNAPI where available).
- Easy-to-use Dart API.
## Getting Started 🏁
This project is a starting point for a Flutter plug-in package, a specialized package that includes platform-specific implementation code for Android and/or iOS.
For help getting started with Flutter development, view the [online documentation](https://docs.flutter.dev/), which offers tutorials, samples, guidance on mobile development, and a full API reference.
## Installation 💻

- Add `tflite_next` to your `pubspec.yaml` file:

  ```yaml
  dependencies:
    tflite_next: ^1.1.0
  ```

- Install the package by running:

  ```sh
  flutter pub get
  ```
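Then import the plugin wherever you use it (the usage example below does the same):

```dart
import 'package:tflite_next/tflite_next.dart';
```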
## Platform-Specific Setup 🛠️

### Android

- **Minimum SDK version:** Ensure your `android/app/build.gradle` has a `minSdkVersion` of at least 21 (or higher, as required by TensorFlow Lite):

  ```groovy
  android {
      defaultConfig {
          minSdkVersion 21
          // ...
      }
      // ...
  }
  ```

- **Model files:** Place your `.tflite` model files in your Flutter project's `assets` folder and update your `pubspec.yaml` accordingly:

  ```yaml
  flutter:
    assets:
      - assets/your_model.tflite
      # Add other assets here
  ```

- **ProGuard (optional):** If you are using ProGuard, you may need rules to prevent TensorFlow Lite classes from being stripped. Add the following to your `android/app/proguard-rules.pro` file:

  ```
  # Keep TensorFlow Lite classes
  -keep class org.tensorflow.lite.** { *; }
  ```
### iOS

- **Minimum iOS version:** Ensure your `ios/Podfile` sets a platform version of at least 12.0 (or higher, as required by TensorFlow Lite):

  ```ruby
  platform :ios, '12.0' # Or your desired minimum version
  ```

- **Model files:** As on Android, include your model files in the `assets` section of your `pubspec.yaml`; they will be bundled with your iOS app.

- **Permissions:** If your model requires camera access or other permissions, ensure you've added the necessary usage descriptions to your `Info.plist` file (see the sketch below).
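For example, a pipeline that captures camera frames needs a camera usage description. `NSCameraUsageDescription` is the standard iOS key; the description string below is only illustrative:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture frames for on-device inference.</string>
```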
## Usage 💡
Here's a basic example of how to load a model and run inference:
```dart
import 'package:flutter/services.dart';
import 'package:tflite_next/tflite_next.dart'; // Import the plugin

Future<void> loadModel() async {
  try {
    String? res = await TfliteNext.loadModel(
      model: "assets/your_model.tflite", // Your model path
      labels: "assets/your_labels.txt",  // Your labels path (optional)
      numThreads: 1,                     // Number of threads for inference
      isAsset: true,                     // True if the model lives in assets
      useGpuDelegate: false,             // Whether to use the GPU delegate
    );
    print("Model loaded: $res");
  } on PlatformException catch (e) {
    print('Failed to load model: ${e.message}');
  }
}

Future<void> runInference(List<double> inputData) async {
  try {
    var recognitions = await TfliteNext.runModelOnBinary(
      binary: inputData, // Your input data as a List<double> or Uint8List
      // For image classification, runModelOnImage takes parameters such as:
      // path: "path/to/your/image.jpg", // Path to an image file
      // imageMean: 127.5,
      // imageStd: 127.5,
      // threshold: 0.1,        // Confidence threshold
      // numResultsPerClass: 1, // Max results per class
    );
    print("Inference results: $recognitions");
    // Process the recognitions (a List<dynamic> or similar, depending on model output).
    if (recognitions != null) {
      for (var recognition in recognitions) {
        print("Label: ${recognition['label']}, Confidence: ${recognition['confidence']}");
      }
    }
  } on PlatformException catch (e) {
    print('Failed to run inference: ${e.message}');
  }
}

// Don't forget to close the interpreter when done.
Future<void> closeModel() async {
  try {
    String? res = await TfliteNext.close();
    print("Model closed: $res");
  } on PlatformException catch (e) {
    print('Failed to close model: ${e.message}');
  }
}
```
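In a real app, a natural place for this lifecycle is a `StatefulWidget`: load in `initState`, release in `dispose`. A minimal sketch (the `InferencePage` widget and its layout are illustrative, not part of the plugin):

```dart
import 'package:flutter/material.dart';
import 'package:tflite_next/tflite_next.dart';

class InferencePage extends StatefulWidget {
  const InferencePage({super.key});

  @override
  State<InferencePage> createState() => _InferencePageState();
}

class _InferencePageState extends State<InferencePage> {
  @override
  void initState() {
    super.initState();
    // Load the model once when the screen appears.
    TfliteNext.loadModel(model: "assets/your_model.tflite", isAsset: true);
  }

  @override
  void dispose() {
    // Release the interpreter together with the screen.
    TfliteNext.close();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => const Placeholder();
}
```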
**Important:** The exact methods and parameters for `runModelOnBinary` (and related methods such as `runModelOnFrame` or `runModelOnImage`) depend on your specific model's input and output requirements and on the plugin's API design. Adapt `inputData` and the result processing accordingly.
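For instance, an image-classification call might look like the sketch below, which combines `runModelOnImage` with the image parameters from the comments above; treat the exact signature as an assumption and verify it against the plugin's API docs:

```dart
import 'package:flutter/services.dart';
import 'package:tflite_next/tflite_next.dart';

Future<void> classifyImage(String imagePath) async {
  try {
    // Assumed signature: mirrors the commented parameters in the example above.
    var recognitions = await TfliteNext.runModelOnImage(
      path: imagePath,       // Path to an image file on disk
      imageMean: 127.5,      // Input normalization mean (model-specific)
      imageStd: 127.5,       // Input normalization std (model-specific)
      threshold: 0.1,        // Discard results below this confidence
      numResultsPerClass: 1, // Maximum results per class
    );
    print("Image results: $recognitions");
  } on PlatformException catch (e) {
    print('Failed to classify image: ${e.message}');
  }
}
```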
## API Reference 📚

(You should detail the main classes and methods of your plugin here. For example:)

- `TfliteNext.loadModel(...)`: Loads the TFLite model.
  - `model`: Path to the `.tflite` file.
  - `labels`: Path to the labels file (optional).
  - `numThreads`: Number of threads for inference.
  - `isAsset`: Boolean; true if the model/labels are in Flutter assets.
  - `useGpuDelegate`: Boolean; attempts to use the GPU delegate.
  - `useNnapiDelegate`: Boolean; attempts to use the NNAPI delegate (Android only; see the sketch below).
- `TfliteNext.runModelOnBinary(...)`: Runs inference on raw byte data.
- `TfliteNext.runModelOnImage(...)`: Runs inference on an image at a given path.
- `TfliteNext.close()`: Closes the TFLite interpreter and releases resources.

(Refer to your plugin's Dart API documentation for the full list of methods and parameters.)
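For hardware acceleration, pass the delegate flags at load time. A minimal sketch, assuming the flags behave as listed above:

```dart
import 'package:tflite_next/tflite_next.dart';

Future<void> loadWithAcceleration() async {
  String? res = await TfliteNext.loadModel(
    model: "assets/your_model.tflite",
    isAsset: true,
    numThreads: 2,          // Illustrative thread count
    useGpuDelegate: true,   // Attempt the GPU delegate
    useNnapiDelegate: true, // Attempt NNAPI (Android only)
  );
  print("Model loaded with delegates: $res");
}
```

Delegates are typically best-effort; if a device does not support one, plugins in this family usually fall back to CPU execution, but confirm that behavior for your build.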
## Example App 📱

Check out the `example` folder for a working application showcasing the plugin's capabilities.
## Limitations and Known Issues ⚠️
- List any known limitations (e.g., specific model types not supported, performance considerations on certain devices).
- Link to your issue tracker for current bugs.
## Contributing 🤝
Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
- Fork the repository.
- Create your feature branch (`git checkout -b feature/AmazingFeature`).
- Commit your changes (`git commit -m 'Add some AmazingFeature'`).
- Push to the branch (`git push origin feature/AmazingFeature`).
- Open a Pull Request.
## License 📄

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
Made with ❤️ by DevTestify Labs