
Flutter plugin for FaceTagr biometric face verification with real-time face detection, liveness checking, and secure API authentication.

FaceTagr Flutter SDK #

FaceTagr is a Flutter plugin that enables biometric face verification using the FaceTagr backend.

Features #

✔ Face detection
✔ Secure API authentication
✔ Real-time face verification
✔ Enterprise branding customization

📚 Contents #

  • Installation
  • Quick Start
  • Setup
  • Permissions
  • Initialization
  • Live Preview
  • Testing the Integration
  • Events
  • Flow Diagram
  • Troubleshooting

📦 Installation #

dependencies:
  facetagr: ^0.0.21

Usage #

import 'package:facetagr/facetagr.dart';

final faceTagr = Facetagr();

await Facetagr.initializeAndAwait(...);

🚀 Quick Start (2 Minutes) #

  1. Add package to pubspec.yaml
  2. Run flutter pub get
  3. Run facetagr_init command
  4. Call initializeAndAwait()
  5. Open FaceTagrLivePreview

🚀 Minimal Integration Example #

await Facetagr.initializeAndAwait(...);

Navigator.push(
 context,
 MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);

Platform Support: Android ✔ | iOS ✔ | Web ❌ #

Add the SDK to your pubspec.yaml:

dependencies:
  facetagr: ^0.0.21
  camera: ^0.10.5+9
  wakelock_plus: 1.3.3
  uuid: ^4.5.1

🛠️ Install FaceTagr Tools CLI (Dev Dependency) #

The FaceTagr Tools CLI is required only during development to initialize and configure your FaceTagr environment.

Step 1 — Add to pubspec.yaml

Open your pubspec.yaml file and add the following under dev_dependencies:

dev_dependencies:
  facetagr_tools: ^0.0.21

Step 2 — Install Dependencies

Run the following command in your terminal:

flutter pub get

This will install the FaceTagr Tools CLI locally for your project.

📥 Download Required SDK Images #

The FaceTagr SDK uses two images for the camera UI and guidance screens.

Please download the following files:

Logo image

👉 https://notiontag.facetagr.com/images/sdk/logo.png

Help/Instruction image

👉 https://notiontag.facetagr.com/images/sdk/help.png

📁 Step 1 — Add Images to Your Flutter Project

Create the following folder inside your Flutter project:

/assets/facetagr/

Place the downloaded images inside:

Expected folder structure:

project_root/
 ├─ assets/
 │   └─ facetagr/
 │       ├─ logo.png
 │       └─ help.png

⚙️ Step 2 — Register Assets in pubspec.yaml

Open pubspec.yaml and add:

flutter:
  assets:
    - assets/facetagr/logo.png
    - assets/facetagr/help.png

⚙️ Setup — FaceTagr Tools CLI Initialization #

Before running your application, you must initialize the FaceTagr environment. This step downloads required models and prepares the SDK configuration.

🚀 Step 1 — Run Initialization Command

Execute the following command in your project terminal:

dart run facetagr_tools:facetagr_init \
  --clientID <clientID> \
  --clientKey <clientKey> \
  --apiURL <apiURL> \
  --path <path>

🔑 Required Parameters

| Parameter | Description |
| --- | --- |
| --clientID | Your FaceTagr Client ID, provided by FaceTagr |
| --clientKey | Secure client key used to generate the authentication hash |
| --apiURL | Base URL of your FaceTagr backend API |
| --path | Local pub cache directory where FaceTagr models will be installed |

📁 Step 2 — Provide Pub Cache Path

The --path parameter must point to your Flutter Pub cache folder.

Windows:

C:\Users\<USERNAME>\AppData\Local\Pub\Cache\hosted\pub.dev

Mac/Linux:

/Users/<USERNAME>/.pub-cache/hosted/pub.dev
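If you script the initialization step, the hosted pub cache directory can also be derived programmatically. A minimal Dart sketch, assuming the default pub cache locations and honoring the standard `PUB_CACHE` environment override (the function name is ours, not part of the SDK):

```dart
import 'dart:io';

// Returns the hosted pub.dev cache directory, honoring PUB_CACHE if set.
// Defaults: %LOCALAPPDATA%\Pub\Cache on Windows, ~/.pub-cache elsewhere.
String pubCacheHostedDir() {
  final env = Platform.environment;
  final sep = Platform.pathSeparator;
  final base = env['PUB_CACHE'] ??
      (Platform.isWindows
          ? '${env['LOCALAPPDATA']}${sep}Pub${sep}Cache'
          : '${env['HOME']}$sep.pub-cache');
  return '$base${sep}hosted${sep}pub.dev';
}
```

The returned string can be passed directly as the `--path` argument.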

✅ What This Command Does

•	Downloads FaceTagr model files
•	Configures SDK environment
•	Prepares local runtime dependencies

📱 Required Permissions for FaceTagr (Android & iOS) #

The FaceTagr Flutter package requires certain platform permissions to access the device camera and perform secure face recognition.

Include the following configurations in AndroidManifest.xml and iOS Info.plist.

🟩 Android — Required Permissions #

Add the following inside:

android/app/src/main/AndroidManifest.xml

(Place the `<uses-permission>` entries outside the `<application>` tag.)

✅ Required AndroidManifest.xml Section for FaceTagr


<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourapp">

    <!-- Permissions -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />

    <!-- Required Hardware -->
    <uses-feature android:name="android.hardware.camera" android:required="false" />
    <uses-feature android:name="android.hardware.camera.front" android:required="false" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />

    <application
        android:label="yourapp"
        android:icon="@mipmap/ic_launcher"
        android:requestLegacyExternalStorage="true"
        android:usesCleartextTraffic="true">

        <!-- (Other Flutter auto-generated metadata remains unchanged) -->

    </application>

</manifest>

🟦 iOS — Required Permissions #

Add the following inside:

ios/Runner/Info.plist (place the keys inside the main `<dict>` element)

✅ Required iOS Info.plist Section for FaceTagr

<key>NSCameraUsageDescription</key>
<string>This app requires camera access to use the FaceTagr SDK.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app may require microphone access to support features in the FaceTagr SDK.</string>

📥 Import Package #

import 'package:facetagr/facetagr.dart';
import 'package:camera/camera.dart';
import 'dart:convert';
import 'package:crypto/crypto.dart';
import 'package:uuid/uuid.dart';

🛠️ Initialization #

FaceTagr provides two ways to initialize the SDK. Choose the one that fits your app flow.

Option 1 — Awaited initialization (initializeAndAwait) #

Use this when you want a clean async call and then navigate on success.

class _HomePageState extends State<HomePage> {
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
  }

  @override
  void dispose() {
    super.dispose();
  }

  String fn_get_hash(String clientID, String utctime, String requestID, String clientKey) {
    String input = clientID + utctime + requestID + clientKey;
    var bytes = utf8.encode(input);
    var hash = sha512.convert(bytes);
    return hash.toString();
  }
  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    String hashcode = fn_get_hash(clientID, utcTime, requestID, clientKey);
    Facetagr.initializeAndAwait(
      apiURL: apiURL,
      clientID: clientID,
      externalID: externalID,
      hashcode: hashcode,
      utcTime: utcTime,
      requestID: requestID,

      // Optional – enables user face registration flow  
      // Default value is "false".
      allowUserRegistration: false,
    ).then((message) {
      if (message['StatusCode'] == "1001") {
        if (!mounted) return;
        setState(() => _isProcessing = true);
        Navigator.of(context).push(
          MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
        ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
      } else {
        if (!mounted) return;
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(message['StatusMessage'])),
        );
        setState(() => _isProcessing = false);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Center(
        child: Padding(
          padding: const EdgeInsets.all(20),
          child: Column(
            crossAxisAlignment: CrossAxisAlignment.center,
            mainAxisAlignment: MainAxisAlignment.center,

            children: [
              ElevatedButton.icon(
                icon: const Icon(Icons.face),
                label: const Text('Start'),
                onPressed: () {
                  _facetagr_initialize();
                },
              ),
              const SizedBox(height: 30),
              if (_isProcessing) const CircularProgressIndicator(),
            ],
          ),
        ),
      ),
    );
  }
}

✔ Direct result handling (no streams)

Option 2 — Event-based initialization (Stream Listener) #

Use when your app listens for init events globally.

class _HomePageState extends State<HomePage> {
  StreamSubscription<String>? _initSub;
  final Facetagr _faceTagr = Facetagr();
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
    _facetagr_initialize();
    _listenToBroadcast();
  }

  void _listenToBroadcast() {
    _initSub = Facetagr.initStream.listen((message) {
      final decoded = jsonDecode(message);
      if (!mounted) return;
      if (decoded['StatusCode'] == "1001") {
        // Initialization succeeded; navigate to FaceTagrLivePreview from here.
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(decoded['StatusMessage'])),
        );
      } else {
        // Initialization failed; surface the error to the user.
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(decoded['StatusMessage'])),
        );
      }
      setState(() => _isProcessing = false);
    });
  }

  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    // Generate the SHA-512 hash as shown in the Hash Logic section.
    String hashcode = "hashcode";
    _faceTagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
  }
  @override
  void dispose() {
    _initSub?.cancel();
    super.dispose();
  }
}

✔ Automatically receives events

✔ Matches native FaceTagr behavior

🔑 Hash Logic #

FaceTagr uses a SHA-512 hash for request signing.


import 'dart:convert';
import 'package:crypto/crypto.dart';

String fn_get_hash(String clientID, String utcTime, String requestID, String clientKey) {
  String input = clientID + utcTime + requestID + clientKey;
  var bytes = utf8.encode(input);
  var hash = sha512.convert(bytes);
  return hash.toString();
}

Example Flow

String requestID = const Uuid().v4();
String utcTime   = DateTime.now().toUtc().toString();
String hash      = fn_get_hash(clientID, utcTime, requestID, clientKey);

_faceTagr.init(apiURL, clientID, externalID, hash, utcTime, requestID);

🔒 Best practice: Generate the hash server-side (so the clientKey never sits inside the app).
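A minimal sketch of that server-side step, assuming a Dart backend and reusing the same `package:crypto` dependency; `buildHashResponse` and its JSON shape are our illustration, not a FaceTagr API:

```dart
import 'dart:convert';
import 'package:crypto/crypto.dart';

// Hypothetical server-side handler body: the clientKey is read from server
// configuration, so it never ships inside the mobile app.
String buildHashResponse(
    String clientID, String utcTime, String requestID, String clientKey) {
  final hash = sha512
      .convert(utf8.encode(clientID + utcTime + requestID + clientKey))
      .toString();
  return jsonEncode({'hash': hash});
}
```

The app would then POST `clientID`, `utcTime`, and `requestID` to such an endpoint and pass the returned hash into `init` or `initializeAndAwait`.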

🎧 Listening to Events #

• Initialization → Facetagr.initStream.listen(...)

• Face Match → Facetagr.faceMatchStream.listen(...)

Events are returned as JSON:


{
  "StatusCode": "1001",
  "StatusMessage": "Success"
}
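Since both streams emit this JSON as a raw string, a small hypothetical helper (our naming, not part of the SDK) can centralize the decoding that the examples below repeat inline:

```dart
import 'dart:convert';

/// Decoded FaceTagr event; tryParse returns null for non-JSON messages.
class FaceTagrEvent {
  final int statusCode;
  final String statusMessage;
  const FaceTagrEvent(this.statusCode, this.statusMessage);

  static FaceTagrEvent? tryParse(String message) {
    try {
      final decoded = jsonDecode(message) as Map<String, dynamic>;
      return FaceTagrEvent(
        int.tryParse(decoded['StatusCode'].toString()) ?? -1,
        decoded['StatusMessage']?.toString() ?? '',
      );
    } on FormatException {
      return null; // message was not JSON
    }
  }
}
```

For example, `FaceTagrEvent.tryParse(message)?.statusCode` replaces the repeated `jsonDecode` plus `int.tryParse` pattern.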

🖼️ Live Preview Widget #

Navigator.push(
  context,
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);

The widget provides: #

• Front camera stream

• Face bounding box overlays

• Spinner while matching

• Dialogs on success/failure

📷 FaceTagr Camera Widget (FaceTagrLivePreview.dart) #

Create a new file named FaceTagrLivePreview.dart in your Flutter app and add the widget code below. This widget provides the camera preview and integrates with the FaceTagr SDK.

import 'dart:async';
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:facetagr/facetagr.dart';
import 'package:wakelock_plus/wakelock_plus.dart';
import 'main.dart'; // provides HomePage, shown after successful verification

const bool kDetectorMirrorsFront = false;
const bool kDetectorReturnsPreviewOrientedBoxes = true;

class FaceTagrLivePreview extends StatefulWidget {
  const FaceTagrLivePreview({Key? key}) : super(key: key);

  @override
  State<FaceTagrLivePreview> createState() => _FaceTagrLivePreviewState();
}

class _FaceTagrLivePreviewState extends State<FaceTagrLivePreview> {
  StreamSubscription<String>? _matchSub;
  CameraController? _controller;
  bool _isInitializing = false;
  bool _isDetecting = false;
  bool _isStreamPaused = false;
  Rect? _faceBox;
  String _status = "Initializing camera...";
  bool _showSpinner = false;
  bool _showWhiteScreen = false;
  String _deviceType = "";

  @override
  void initState() {
    super.initState();
    _deviceType = Platform.isIOS ? "ios" : "android";
    WakelockPlus.enable();
    _listenToBroadcast();
    initializeCamera();
  }

  Future<void> _listenToBroadcast() async {
    _matchSub = Facetagr.faceMatchStream.listen((message) {
      try {
        final decoded = jsonDecode(message);
        final int statusCode =
            int.tryParse(decoded['StatusCode'].toString()) ?? -1;
        final String statusMessage = decoded['StatusMessage'];
        if (!mounted) return;
        if (statusCode < 5000) {
          _showPopup(statusCode, statusMessage);
        } else {
          setState(() {
            _status = statusMessage;
            _faceBox = null;
          });
        }
      } catch (_) {
        // not JSON; ignore
      }
    });
  }

  void _showPopup(int statusCode, String message) {
    showDialog(
      context: context,
      barrierDismissible: false, // prevent closing by tapping outside
      builder: (BuildContext context) {
        return AlertDialog(
          shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.circular(12),
          ),
          backgroundColor: Colors.white,
          title: const Text(
            "FaceTagr",
            style: TextStyle(color: Colors.blue),
          ),
          content: Text(
            message,
            style: const TextStyle(color: Colors.lightBlue),
          ),
          actions: [
            TextButton(
              child: const Text("OK", style: TextStyle(color: Colors.blue)),
              onPressed: () {
                Navigator.of(context).pop();
                if (statusCode == 1001) {
                  Navigator.of(context).pushAndRemoveUntil(
                    MaterialPageRoute(builder: (_) => const HomePage()),
                        (route) => false,
                  );
                } else {
                  setState(() {
                    _showSpinner = false;
                  });
                  _isStreamPaused = false;
                }
              },
            ),
          ],
        );
      },
    );
  }

  Future<void> initializeCamera() async {
    if (_isInitializing) return;
    _isInitializing = true;

    try {
      final cameras = await availableCameras();
      if (cameras.isEmpty) {
        if (mounted) setState(() => _status = "No cameras found");
        return;
      }

      final camera = cameras.firstWhere(
            (c) => c.lensDirection == CameraLensDirection.front,
        orElse: () => cameras.first,
      );

      _controller = CameraController(
        camera,
        ResolutionPreset.medium,
        imageFormatGroup: _deviceType == "android"
            ? ImageFormatGroup.nv21
            : ImageFormatGroup.bgra8888,
        enableAudio: false,
      );

      await _controller!.initialize();
      if (!mounted) return;

      if (!_controller!.value.isInitialized) {
        if (mounted) setState(() => _status = "Camera failed to initialize");
        return;
      }

      int lastFrameTime = 0;

      await _controller!.startImageStream((CameraImage image) async {
        if (!mounted) return;

        if (_isStreamPaused) return;

        final int currentTime = DateTime.now().millisecondsSinceEpoch;
        if (_isDetecting || (currentTime - lastFrameTime < 500)) return;

        lastFrameTime = currentTime;

        _isDetecting = true;
        try {
          final int width = image.width;
          final int height = image.height;
          Map<String, dynamic>? result;
          if (_deviceType == "android") {
            final yuv = _concatenatePlanes(image.planes);
            result = await Facetagr.detectFace(yuv, width, height, 8);
          } else if (_deviceType == "ios") {
            final yuv = _bgraToYUV420(image); // BGRA to YUV420
            result = await Facetagr.detectFace(yuv, width, height, 1);
          }

          if (!mounted) return;

          if (result is Map && result?["status"] != null) {
            final int status = result?["status"];
            final String msg = (result?["message"] ?? "").toString();
            final double left = (result?['x1'] ?? 0).toDouble();
            final double top = (result?['y1'] ?? 0).toDouble();
            final double w = (result?['width'] ?? 0).toDouble();
            final double h = (result?['height'] ?? 0).toDouble();

            if (status == 1003 || status == 4001) {
              setState(() {
                _status = "";
                _showWhiteScreen = true;
                _showSpinner = false;
                _faceBox = null;
              });
              return;
            }

            setState(() {
              _showWhiteScreen = false;
            });

            if (status == 1001 || status == 1002) {
              setState(() {
                _showSpinner = true;
                _status = "";
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });

              _isStreamPaused = true;

            } else if (status == 1000) {
              setState(() {
                _status = msg;
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });
            }
          } else {
            setState(() {
              _status = "Error";
              _faceBox = null;
            });
          }
        } catch (e) {
          if (mounted) {
            setState(() {
              _status = "Error: $e";
              _faceBox = null;
            });
          }
        } finally {
          _isDetecting = false;
        }
      });
    } catch (e) {
      if (!mounted) return;
      setState(() => _status = "Camera error: $e");
    } finally {
      _isInitializing = false;
      _isDetecting = false;
    }
  }

  // Put this inside your State class (_FaceTagrLivePreviewState)
  Uint8List _concatenatePlanes(List<Plane> planes) {
    final WriteBuffer allBytes = WriteBuffer();
    for (Plane plane in planes) {
      allBytes.putUint8List(plane.bytes);
    }
    return allBytes.done().buffer.asUint8List();
  }

  Uint8List _bgraToYUV420(CameraImage image) {
    final int width = image.width;
    final int height = image.height;
    final int frameSize = width * height;
    final Uint8List yuv = Uint8List(frameSize + (frameSize ~/ 2));
    final Uint8List bgra = image.planes[0].bytes;

    int yIndex = 0;
    int uvIndex = frameSize;

    for (int j = 0; j < height; j++) {
      for (int i = 0; i < width; i++) {
        final int index = (j * width + i) * 4;

        final int b = bgra[index];
        final int g = bgra[index + 1];
        final int r = bgra[index + 2];

        final y = (((66 * r + 129 * g + 25 * b + 128) >> 8) + 16).clamp(0, 255);
        final u =
        (((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128).clamp(0, 255);
        final v =
        (((112 * r - 94 * g - 18 * b + 128) >> 8) + 128).clamp(0, 255);

        yuv[yIndex++] = y;

        if (j % 2 == 0 && i % 2 == 0) {
          yuv[uvIndex++] = v;
          yuv[uvIndex++] = u;
        }
      }
    }

    return yuv;
  }

  @override
  void dispose() {
    WakelockPlus.disable();
    _matchSub?.cancel();
    final controller = _controller;
    _controller = null;
    try {
      if (controller != null) {
        if (controller.value.isStreamingImages) {
          unawaited(controller.stopImageStream());
        }
        controller.dispose();
      }
    } catch (e, st) {
      if (kDebugMode) {
        debugPrint('Error disposing camera controller: $e');
        debugPrint('$st');
      }
    }
    super.dispose();
  }

  Widget _buildCompactInstruction(IconData icon, String text) {
    return Expanded(
      child: Column(
        mainAxisSize: MainAxisSize.min,
        children: [
          Icon(icon, color: Colors.black54, size: 24),
          const SizedBox(height: 6),
          Text(
            text,
            textAlign: TextAlign.center,
            style: const TextStyle(
              color: Colors.black54,
              fontSize: 16,
              fontWeight: FontWeight.w600,
            ),
          ),
        ],
      ),
    );
  }

  @override
  Widget build(BuildContext context) {
    final controller = _controller;
    if (controller == null || !controller.value.isInitialized) {
      return const Scaffold(body: Center(child: CircularProgressIndicator()));
    }
    final Widget basePreview = CameraPreview(controller);
    Widget previewWidget = basePreview;

    final Size sensorSize = controller.value.previewSize!;
    final Size screenSize = MediaQuery.of(context).size;

    final bool isPortrait = screenSize.height > screenSize.width;
    final Size orientedSensor =
    isPortrait ? Size(sensorSize.height, sensorSize.width) : sensorSize;

    final int overlayRotation = kDetectorReturnsPreviewOrientedBoxes
        ? 0
        : controller.description.sensorOrientation;

    final Size imageSpaceSize = kDetectorReturnsPreviewOrientedBoxes
        ? orientedSensor
        : Size(sensorSize.width, sensorSize.height);

    bool overlayMirror = true;
    if (_deviceType == "ios") {
      overlayMirror = false;
    }

    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Stack(
        children: [
          Positioned.fill(
            child: FittedBox(
              fit: BoxFit.cover,
              child: SizedBox(
                width: orientedSensor.width,
                height: orientedSensor.height,
                child: Stack(
                  fit: StackFit.passthrough,
                  children: [
                    previewWidget,
                    if (_faceBox != null)
                      CustomPaint(
                        size: orientedSensor,
                        painter: FaceBoxPainter(
                          faceBoxImageSpace: _faceBox!,
                          imageSize: imageSpaceSize,
                          mirrorHorizontally: overlayMirror,
                          rotationDegrees: overlayRotation,
                          label: "",
                        ),
                      ),
                  ],
                ),
              ),
            ),
          ),
          if (_showSpinner)
            Positioned.fill(
              child: Container(
                color: Colors.blue,
                child: const Center(
                  child: CircularProgressIndicator(),
                ),
              ),
            ),
          if (_showWhiteScreen)
            Positioned.fill(
              child: Container(
                color: Colors.white,
                padding: const EdgeInsets.all(32.0),
                child: Stack(
                  children: [
                    Positioned.fill(
                      child: Column(
                        mainAxisAlignment: MainAxisAlignment.center,
                        children: [
                          const Text(
                            "Identity Verification",
                            style: TextStyle(
                              color: Colors.blue,
                              fontSize: 24,
                              fontWeight: FontWeight.bold,
                            ),
                          ),
                          const Spacer(flex: 1),
                          Image.asset(
                            'assets/facetagr/help.png',
                            width: 300,
                            fit: BoxFit.fitWidth,
                          ),
                          const SizedBox(height: 20),
                          const Text(
                            "Please look at the camera",
                            style: TextStyle(
                              color: Colors.black,
                              fontSize: 20,
                              fontWeight: FontWeight.normal,
                            ),
                          ),
                          const SizedBox(height: 20),
                          Row(
                            mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                            crossAxisAlignment: CrossAxisAlignment.start,
                            children: [
                              _buildCompactInstruction(
                                  Icons.face, "Keep face straight"),
                              _buildCompactInstruction(
                                  Icons.visibility_off, "Remove mask or glasses"),
                              _buildCompactInstruction(
                                  Icons.light_mode, "Ensure good lighting"),
                            ],
                          ),
                          const SizedBox(height: 20),
                          Container(
                            width: 100,
                            height: 4,
                            decoration: BoxDecoration(
                              color: Colors.blue.withOpacity(0.2),
                              borderRadius: BorderRadius.circular(2),
                            ),
                          ),
                          const Spacer(flex: 2),
                        ],
                      ),
                    ),
                    Align(
                      alignment: Alignment.bottomRight,
                      child: Image.asset(
                        'assets/facetagr/logo.png',
                        width: 150,
                      ),
                    ),
                  ],
                ),
              ),
            ),
          if (_status != "" && !_showWhiteScreen)
            Positioned(
              left: 16,
              right: 16,
              bottom: 16,
              child: Container(
                padding:
                const EdgeInsets.symmetric(horizontal: 12, vertical: 8),
                decoration: BoxDecoration(
                  color: Colors.white,
                  borderRadius: BorderRadius.circular(8),
                ),
                child:
                Text(_status, style: const TextStyle(color: Colors.blue)),
              ),
            ),
        ],
      ),
    );
  }
}

class FaceBoxPainter extends CustomPainter {
  final Rect faceBoxImageSpace;
  final Size imageSize;
  final bool mirrorHorizontally;
  final int rotationDegrees;
  final String? label;

  FaceBoxPainter({
    required this.faceBoxImageSpace,
    required this.imageSize,
    required this.mirrorHorizontally,
    required this.rotationDegrees,
    this.label,
  });

  @override
  void paint(Canvas canvas, Size size) {
    final _Rotated r =
    _rotateRect(faceBoxImageSpace, imageSize, rotationDegrees);
    final double sx = size.width / r.rotatedImageSize.width;
    final double sy = size.height / r.rotatedImageSize.height;

    Rect box = Rect.fromLTWH(
      r.rect.left * sx,
      r.rect.top * sy,
      r.rect.width * sx,
      r.rect.height * sy,
    );

    if (mirrorHorizontally) {
      box = Rect.fromLTWH(
          size.width - (box.left + box.width), box.top, box.width, box.height);
    }

    _drawCornerTicks(canvas, box,
        color: Colors.green, length: 28, thickness: 2);

    if ((label ?? '').isNotEmpty) {
      _drawLabel(canvas, size, box, label!);
    }
  }

  void _drawCornerTicks(Canvas canvas, Rect box,
      {required Color color, double length = 22, double thickness = 3}) {
    final Paint p = Paint()
      ..color = color
      ..strokeWidth = thickness
      ..strokeCap = StrokeCap.round;

    final tl = box.topLeft;
    final tr = box.topRight;
    final bl = box.bottomLeft;
    final br = box.bottomRight;

    canvas.drawLine(tl, tl + Offset(length, 0), p);
    canvas.drawLine(tl, tl + Offset(0, length), p);
    canvas.drawLine(tr, tr + Offset(-length, 0), p);
    canvas.drawLine(tr, tr + Offset(0, length), p);
    canvas.drawLine(bl, bl + Offset(length, 0), p);
    canvas.drawLine(bl, bl + Offset(0, -length), p);
    canvas.drawLine(br, br + Offset(-length, 0), p);
    canvas.drawLine(br, br + Offset(0, -length), p);
  }

  void _drawLabel(Canvas canvas, Size screenSize, Rect box, String text) {
    final TextPainter tp = TextPainter(
      text: TextSpan(
        text: text,
        style: const TextStyle(
          color: Colors.white,
          fontSize: 14,
          fontWeight: FontWeight.bold,
        ),
      ),
      textDirection: TextDirection.ltr,
    )..layout(maxWidth: screenSize.width * 0.8);

    Offset textOffset = Offset(box.left, box.top - tp.height - 6);
    if (textOffset.dy < 0) {
      textOffset = Offset(box.left, box.bottom + 6);
    }

    tp.paint(canvas, textOffset);
  }

  @override
  bool shouldRepaint(covariant FaceBoxPainter old) =>
      old.faceBoxImageSpace != faceBoxImageSpace ||
          old.imageSize != imageSize ||
          old.mirrorHorizontally != mirrorHorizontally ||
          old.rotationDegrees != rotationDegrees ||
          old.label != label;
}

class _Rotated {
  final Rect rect;
  final Size rotatedImageSize;
  _Rotated(this.rect, this.rotatedImageSize);
}

_Rotated _rotateRect(Rect r, Size img, int deg) {
  switch (deg % 360) {
    case 0:
      return _Rotated(r, img);
    case 90:
      return _Rotated(
        Rect.fromLTWH(
          img.height - (r.top + r.height),
          r.left,
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    case 180:
      return _Rotated(
        Rect.fromLTWH(
          img.width - (r.left + r.width),
          img.height - (r.top + r.height),
          r.width,
          r.height,
        ),
        img,
      );
    case 270:
      return _Rotated(
        Rect.fromLTWH(
          r.top,
          img.width - (r.left + r.width),
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    default:
      return _Rotated(r, img);
  }
}

📷 Open FaceTagr Camera #

Navigator.of(context).push(
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
).then((_) => mounted ? setState(() => _isProcessing = false) : null);

This launches the built-in FaceTagrLivePreview widget with face recognition.

🧪 Testing the Integration #

After completing setup, verify your integration:

  1. Run the Flutter application.
  2. Tap the Start button.
  3. Camera preview opens.
  4. Face detection begins automatically.
  5. Successful verification returns:
{
  "StatusCode": "1001",
  "StatusMessage": "Face verified successfully."
}

🔐 Logout #

await faceTagr.fnLogout();

This clears local tokens and resets the session.

📡 Event Payload Reference #

✅ Initialization Events (both init and initializeAndAwait) #

These events are sent after calling:

Facetagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
Facetagr.initializeAndAwait(
      apiURL: apiURL,
      clientID: clientID,
      externalID: externalID,
      hashcode: hashcode,
      utcTime: utcTime,
      requestID: requestID,
    ).then((message) {
      if (message['StatusCode'] == "1001") {
        if (!mounted) return;
        setState(() => _isProcessing = true);
        Navigator.of(context).push(
          MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
        ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
      } else {
        if (!mounted) return;
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(message['StatusMessage'])),
        );
        setState(() => _isProcessing = false);
      }
    });

Possible Payloads

| StatusCode | StatusMessage |
| --- | --- |
| 1001 | Connected successfully. |
| 4001 | Mandatory inputs can not be empty. Please try again with valid values. |
| 4002 | Input JSON is not valid. |
| 4003 | Given ClientID is not valid. |
| 4004 | Authentication failed. |
| 4005 | Given ExternalID is not valid. |
| 4006 | Licensing limits exceeded. |
| 4007 | Failed to connect. |
| 5001 | Oops! Something went wrong! Please try again! |
| 5002 | Unable to connect to the server. Please try again later. |
| 5003 | Internal server error. Please try again. |
| 5004 | Oops! Something went wrong! Please try again! |

🎯 Face Match Events (faceMatchStream) #

These events are sent every time the camera detects a face and FaceTagr completes the verification.

Possible Payloads

| StatusCode | StatusMessage |
| --- | --- |
| 1001 | Face verified successfully. |
| 1002 | Face is not matching. |
| 4001 | No face found. |
| 4002 | Face size is less than the minimum required size. |
| 4003 | Face should be facing the camera straight. |
| 4004 | Face is blurred and/or not clear. |
| 4005 | Face is not live. Spoofing detected. |
| 4006 | Image format error. |
| 4007 | Input JSON is not valid. |
| 4008 | Given ClientID is not valid. |
| 4009 | Authentication failed. |
| 4010 | Given ExternalID is not valid. |
| 5001 | Oops! Something went wrong! Please try again! |
| 5002 | Unable to connect to the server. Please try again later. |
| 5003 | Face verification failed. Server error. |
| 5004 | Oops! Something went wrong! Please try again! |
| 5005 | Oops! Something went wrong! Please try again! |
| 5006 | Oops! Something went wrong! Please try again! |
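A minimal listener for these payloads might look like the sketch below. It assumes `faceMatchStream` is exposed statically on the `Facetagr` class and yields `Map` events shaped like the payloads above — confirm the exact member and element type against the SDK's API reference before relying on it:

```dart
import 'dart:async';

import 'package:facetagr/facetagr.dart';

StreamSubscription? _faceMatchSub;

void listenForFaceMatches() {
  // Assumption: faceMatchStream emits Map payloads such as
  // {"StatusCode": "1001", "StatusMessage": "Face verified successfully."}.
  _faceMatchSub = Facetagr.faceMatchStream.listen((event) {
    switch (event['StatusCode']) {
      case '1001':
        print('Verified: ${event['StatusMessage']}');
        break;
      case '1002':
        print('Face is not matching.');
        break;
      default:
        print('Error ${event['StatusCode']}: ${event['StatusMessage']}');
    }
  });
}

void stopListening() {
  // Cancel the subscription when the screen is disposed.
  _faceMatchSub?.cancel();
  _faceMatchSub = null;
}
```

Remember to call `stopListening()` (e.g. from your widget's `dispose()`) so events are not delivered after the camera screen is closed.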

🧩 Event Payload Reference (Registration Mode – allowUserRegistration = true) #

When allowUserRegistration: true is passed during initialization, the FaceTagr SDK may return the following Face Registration Events:

✅ Face Registration Events

| StatusCode | StatusMessage |
| --- | --- |
| 201 | JSON format error. Given input not in the correct format. |
| 200 | Authentication failed. |
| 300 | This person already exists in your database: [ExternalID] |
| 301 | Collection name does not exist. |
| 302 | DisplayName is mandatory. |
| 303 | ExternalID is mandatory. |
| 304 | ExternalID exists. ExternalID should be unique. |
| 310 | The captured photo is not correct. Please try again !! |
| 401 | Face registration failed. No face found. |
| 402 | Face registration failed. Face box size is less than minimum required size. |
| 403 | Face registration failed. Face is blurred and/or not clear. |
| 405 | Face registration failed. Face should be facing the camera straight. |
| 501 | Face registration failed. Server error. |
| 503 | Oops something went wrong. Please try again !! |
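Enabling registration mode could then look like the following sketch. Treating `allowUserRegistration` as a named parameter of `initializeAndAwait` is an assumption based on the text above; the positional arguments mirror the earlier initialization snippet:

```dart
import 'package:facetagr/facetagr.dart';

Future<void> initWithRegistration() async {
  // allowUserRegistration as a named parameter is assumed here;
  // verify the actual signature in the SDK's API reference.
  final message = await Facetagr.initializeAndAwait(
    apiURL,
    clientID,
    externalID,
    hashcode,
    utcTime,
    requestID,
    allowUserRegistration: true,
  );
  print('${message['StatusCode']}: ${message['StatusMessage']}');
}
```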

🔄 Flow Diagram #

sequenceDiagram
    participant App
    participant FaceTagr SDK
    participant Backend API

    App->>FaceTagr SDK: init(apiURL, clientID, externalID, hash, time, reqID)
    FaceTagr SDK->>Backend API: Validate credentials
    Backend API-->>FaceTagr SDK: Auth success
    FaceTagr SDK-->>App: Init success (1001)

    App->>FaceTagr SDK: Open Camera
    FaceTagr SDK->>Backend API: Stream frames
    Backend API-->>FaceTagr SDK: Match success (1001)
    FaceTagr SDK-->>App: FaceMatch event

✅ Quick Recap #

  1. Add facetagr to dependencies.
  2. Add facetagr_tools to dev_dependencies.
  3. Run the initialization command.
  4. Generate the SHA-512 hash.
  5. Call init() with credentials.
  6. Open the camera (FaceTagrLivePreview).
  7. Listen for faceMatchStream events.
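Step 4 (hash generation) can be sketched with the package:crypto package. The concatenation order of the inputs below is an assumption for illustration only — the exact fields and ordering the server expects must come from your FaceTagr credentials documentation:

```dart
import 'dart:convert';

import 'package:crypto/crypto.dart';

/// Illustrative only: the hash input (clientID + externalID + utcTime)
/// is assumed -- confirm the required composition with FaceTagr support.
String generateHashcode(String clientID, String externalID, String utcTime) {
  final bytes = utf8.encode('$clientID$externalID$utcTime');
  return sha512.convert(bytes).toString(); // lowercase hex digest
}
```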

🔥 Troubleshooting #

If you encounter issues during integration, check the following common scenarios:

📷 Camera not opening #

✔ Ensure camera permission is added:

Android → AndroidManifest.xml
iOS → Info.plist

✔ Verify runtime permissions are granted.


🔐 Authentication Failed (4004 / 4009) #

✔ Verify clientID and externalID values.

✔ Ensure hashcode is generated correctly.

✔ Confirm utcTime and requestID values are valid.


👤 No Face Detected #

✔ Ensure good lighting conditions.

✔ Face must be clearly visible and facing camera.

✔ Avoid masks, glasses, or heavy shadows.


📡 Unable to connect to server (5002) #

✔ Check API URL.

✔ Verify internet connectivity.


🎥 Camera Preview Black Screen #

✔ Ensure device supports front camera.

✔ Restart app after granting permissions.


If issues persist, contact:

📧 support@facetagr.com

License #

This package is part of the FaceTagr ecosystem.

© 2026 FaceTagr. All rights reserved.

💬 Support #

For integration support, please contact:

📧 support@facetagr.com

🌐 https://www.facetagr.com
