realtime_image_labeler


A Flutter widget that provides a live camera preview for real-time object detection using a built-in SSD MobileNet TFLite model. It displays bounding boxes with labels and confidence scores for detected objects and offers a simple interface for integration.

Preview

Widget Preview

Features

  • Performs real-time object detection using the included SSD MobileNet TFLite model.
  • Draws bounding boxes and labels around detected objects.
  • Provides detection results via the onResult callback (includes label, confidence, location).
  • Optionally includes a customizable button to capture pictures (onTakePicture callback).
  • Handles camera initialization and lifecycle management internally.

Getting Started

  1. Add Dependency: Add this to your project's pubspec.yaml file:

    dependencies:
      realtime_image_labeler: ^0.0.1 # Replace with the latest published version
      camera_android: ^0.10.10
    
  2. Install: Run flutter pub get in your terminal.

Platform Setup (Permissions)

Camera access requires platform-specific configuration.

Android (android/app/src/main/AndroidManifest.xml)

Add the following permission (and the optional feature declarations) before the <application> tag:

<uses-permission android:name="android.permission.CAMERA" />
<!-- Optional, but recommended -->
<uses-feature android:name="android.hardware.camera" android:required="false" />
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />


iOS (ios/Runner/Info.plist)

Add the following key-string pair inside the main <dict> element:

<key>NSCameraUsageDescription</key>
<string>This app needs camera access to perform live object detection.</string>


Note: While this widget handles camera initialization, you might still want to request camera permission before navigating to the screen containing DetectorWidget, using a package like permission_handler, for a smoother user experience (a minimal sketch follows the usage example below).

Basic Usage

Import the package and integrate DetectorWidget into your screen:

import 'package:flutter/material.dart';
import 'package:realtime_image_labeler/realtime_image_labeler.dart'; // Import the package
import 'dart:io'; // Required for File operations if using onTakePicture
import 'package:camera/camera.dart'; // Required for XFile

class DetectionScreen extends StatelessWidget {
  const DetectionScreen({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    ScreenParams.screenSize = MediaQuery.of(context).size; // --> Important!

    return Scaffold(
      appBar: AppBar(title: const Text('Real-time Object Detection')),
      body: DetectorWidget(
        // Required callback for detection results
        onResult: (List<Recognition> results) {
          // Process the list of detected objects (Recognition)
          // Each 'Recognition' object contains label, confidence score,
          // and location (bounding box).
          if (results.isNotEmpty) {
            // Example: Print the primary detected object's label and confidence
            final firstResult = results.first;
            print(
                "Detected: ${firstResult.label} (${(firstResult.score * 100).toStringAsFixed(0)}%)");
          }
          // Consider updating your UI based on the results here
        },

        // Optional callback when a picture is taken
        onTakePicture: (XFile file) {
          print('Picture captured: ${file.path}');

          // Example: Navigate to a new screen to display the picture
          // Navigator.push(
          //   context,
          //   MaterialPageRoute(
          //     builder: (context) => DisplayPictureScreen(imagePath: file.path),
          //   ),
          // );

          // Or display in a dialog:
          showDialog(
            context: context,
            builder: (context) => AlertDialog(
              title: Text("Picture Taken"),
              content: Image.file(File(file.path)), // Requires 'dart:io'
              actions: [
                TextButton(
                    onPressed: Navigator.of(context).pop, child: Text("OK"))
              ],
            ),
          );
        },

        // --- Optional Styling ---
        // iconSize: 70,
        // icon: Icons.camera_alt,
        // backgroundColor: Colors.blue,
        // foregroundColor: Colors.white,
        // showCameraButton: true, // Defaults to true if onTakePicture is provided
      ),
    );
  }
}

/// Structure of the Recognition object (as typically used in TFLite examples)
/// You might need to adjust based on your exact internal 'Recognition' class.
/*
class Recognition {
  final int id;        // Usually an index
  final String label;  // Detected object name
  final double score;  // Confidence score (0.0 to 1.0)
  final Rect location; // Bounding box (relative coordinates)

  Recognition(this.id, this.label, this.score, this.location);
}
*/

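As noted in the Platform Setup section, you can request camera permission yourself before pushing DetectionScreen. The sketch below shows one way to do that with the permission_handler package (add it to your own pubspec.yaml); openDetectionScreen is a hypothetical helper name, not part of this package.

import 'package:flutter/material.dart';
import 'package:permission_handler/permission_handler.dart';

// Hypothetical helper: request camera permission, then open DetectionScreen
// (defined in the usage example above) only if the user grants it.
Future<void> openDetectionScreen(BuildContext context) async {
  final status = await Permission.camera.request();
  if (!context.mounted) return;

  if (status.isGranted) {
    Navigator.push(
      context,
      MaterialPageRoute(builder: (context) => const DetectionScreen()),
    );
  } else {
    // Permission denied: explain why the feature is unavailable.
    ScaffoldMessenger.of(context).showSnackBar(
      const SnackBar(
          content: Text('Camera permission is required for live detection.')),
    );
  }
}

This keeps the permission prompt tied to an explicit user action instead of appearing only once the camera screen builds.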


DetectorWidget Parameters

  • onResult (required, Function(List<Recognition>)): Callback invoked frequently with the list of detected objects. Each Recognition object provides details like label, confidence score, and location.
  • onTakePicture (optional, Function(XFile)?): Callback invoked when the user taps the capture button. Returns an XFile object (from the camera package). If this is null, the camera button will not be shown by default.
  • icon (optional, IconData): Icon used for the capture button. Defaults to Icons.camera.
  • backgroundColor (optional, Color): Background color of the capture button circle. Defaults to Colors.white.
  • foregroundColor (optional, Color): Color of the icon on the capture button. Defaults to Colors.black.
  • iconSize (optional, double): Size of the capture button icon. Defaults to 100.0.
  • showCameraButton (optional, bool): Explicitly controls the visibility of the camera button. Defaults to true if onTakePicture is provided, otherwise false. If set to true but onTakePicture is null, the button will appear but do nothing when tapped.
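
Putting these together, here is a sketch of the body: argument from the example above with the optional parameters filled in. The styling values and the 0.5 confidence threshold are illustrative choices, not package defaults:

body: DetectorWidget(
  onResult: (List<Recognition> results) {
    // Keep only reasonably confident detections (0.5 is an arbitrary cut-off).
    for (final r in results.where((r) => r.score > 0.5)) {
      debugPrint('${r.label}: ${(r.score * 100).toStringAsFixed(0)}%');
    }
  },
  onTakePicture: (XFile file) => debugPrint('Picture saved at ${file.path}'),
  // Optional capture-button styling.
  icon: Icons.camera_alt,
  iconSize: 70,
  backgroundColor: Colors.blue,
  foregroundColor: Colors.white,
  showCameraButton: true,
),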

Model Used

This package uses a pre-trained SSD MobileNet v1 model (ssd_mobilenet.tflite) and corresponding labels (labelmap.txt) bundled as assets to perform object detection. These assets are included within the package.

Acknowledgement

The core detection logic and widget structure are inspired by and adapt concepts from the official TensorFlow Lite Flutter example:

https://github.com/tensorflow/flutter-tflite/blob/main/example/live_object_detection_ssd_mobilenet

Additional Information

Contributions are welcome!

License

This package is licensed under the MIT License.

ptBR

A Flutter widget that provides a live camera preview for real-time object detection using a built-in SSD MobileNet TFLite model. It displays bounding boxes with labels and confidence scores for detected objects and offers a simple interface for integration.

Features

  • Performs real-time object detection using the included SSD MobileNet TFLite model.
  • Draws bounding boxes and labels around detected objects.
  • Provides detection results via the onResult callback (includes label, confidence, location).
  • Optionally includes a customizable button to capture pictures (onTakePicture callback).
  • Handles camera initialization and lifecycle management internally.