facetagr 0.0.19
At FaceTagr, we are pioneers in advanced face recognition technology, delivering solutions that are accurate, reliable, and scalable. Our NIST-tested algorithms, with 99.91% accuracy, ensure that our [...]
FaceTagr Flutter Package #
The FaceTagr Flutter package allows third-party teams to integrate face recognition capabilities into their applications. This package provides two primary functions: initialization (init) and face matching (fnFaceMatch).
Contents #
- Installation
- Quick Start
- Setup
- Permissions
- Initialization
- Live Preview
- Testing the Integration
- Events
- Flow Diagram
- Troubleshooting
Installation #
Quick Start (2 Minutes) #
- Add package to pubspec.yaml
- Run flutter pub get
- Run facetagr_init command
- Call initializeAndAwait()
- Open FaceTagrLivePreview
Minimal Integration Example #
await Facetagr.initializeAndAwait(...);
Navigator.push(
  context,
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);
Platform Support: Android ✅ | iOS ✅ | Web ❌ (Web is not supported) #
Add the SDK to your pubspec.yaml:
dependencies:
  facetagr: ^0.0.19
  camera: ^0.10.5+9
  wakelock_plus: 1.3.3
  uuid: ^4.5.1
  crypto: ^3.0.3 # used by the SHA-512 hash helper shown below
Install FaceTagr Tools CLI (Dev Dependency) #
The FaceTagr Tools CLI is required only during development to initialize and configure your FaceTagr environment.
Step 1 – Add to pubspec.yaml
Open your pubspec.yaml file and add the following under dev_dependencies:
dev_dependencies:
  facetagr_tools: ^0.0.19
Step 2 – Install Dependencies
Run the following command in your terminal:
flutter pub get
This will install the FaceTagr Tools CLI locally for your project.
Download Required SDK Images #
The FaceTagr SDK uses two images for the camera UI and guidance screens.
Please download the following files:
- Logo image: https://notiontag.facetagr.com/images/sdk/logo.png
- Help/Instruction image: https://notiontag.facetagr.com/images/sdk/help.png
Step 1 – Add Images to Your Flutter Project
Create the following folder inside your Flutter project:
/assets/facetagr/
Place the downloaded images inside:
Expected folder structure:
project_root/
└── assets/
    └── facetagr/
        ├── logo.png
        └── help.png
Step 2 – Register Assets in pubspec.yaml
Open pubspec.yaml and add:
flutter:
  assets:
    - assets/facetagr/logo.png
    - assets/facetagr/help.png
Setup – FaceTagr Tools CLI Initialization #
Before running your application, you must initialize the FaceTagr environment. This step downloads required models and prepares the SDK configuration.
Step 1 – Run Initialization Command
Execute the following command in your project terminal:
dart run facetagr_tools:facetagr_init \
  --clientID <clientID> \
  --clientKey <clientKey> \
  --apiURL <apiURL> \
  --path <path>
Required Parameters
| Parameter | Description |
|---|---|
| --clientID | Your FaceTagr Client ID, provided by FaceTagr |
| --clientKey | Secure client key used to generate the authentication hash |
| --apiURL | Base URL of your FaceTagr backend API |
| --path | Local Pub cache directory where the FaceTagr models will be installed |
Step 2 – Provide Pub Cache Path
The --path parameter must point to your Flutter Pub cache folder.
Windows:
C:\Users\<USERNAME>\AppData\Local\Pub\Cache\hosted\pub.dev
Mac/Linux:
/Users/<USERNAME>/.pub-cache/hosted/pub.dev
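If you are unsure which directory to pass, the default hosted Pub cache location can be resolved programmatically. This is an illustrative sketch only — `defaultPubCacheHosted` is a hypothetical helper, not part of the SDK — and it assumes the standard Pub defaults plus the `PUB_CACHE` override:

```dart
import 'dart:io';

// Hypothetical helper: returns the default hosted Pub cache directory,
// which is what the --path parameter expects.
String defaultPubCacheHosted() {
  // The PUB_CACHE environment variable overrides the default location.
  final override = Platform.environment['PUB_CACHE'];
  if (override != null) {
    return Platform.isWindows
        ? '$override\\hosted\\pub.dev'
        : '$override/hosted/pub.dev';
  }
  if (Platform.isWindows) {
    final localAppData = Platform.environment['LOCALAPPDATA'];
    return '$localAppData\\Pub\\Cache\\hosted\\pub.dev';
  }
  final home = Platform.environment['HOME'];
  return '$home/.pub-cache/hosted/pub.dev';
}
```

Printing the result of this helper and passing it as `--path` avoids typos in the hand-written path.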
What This Command Does
- Downloads the FaceTagr model files
- Configures the SDK environment
- Prepares local runtime dependencies
Required Permissions for FaceTagr (Android & iOS) #
The FaceTagr Flutter package requires certain platform permissions to access the device camera and perform secure face recognition.
Include the following configurations in AndroidManifest.xml and iOS Info.plist.
Android – Required Permissions #
Add the following inside:
android/app/src/main/AndroidManifest.xml
(Place the permissions outside the <application> element.)
Required AndroidManifest.xml Section for FaceTagr
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourapp">

    <!-- Permissions -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />

    <!-- Required Hardware -->
    <uses-feature android:name="android.hardware.camera" android:required="false" />
    <uses-feature android:name="android.hardware.camera.front" android:required="false" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />

    <application
        android:label="yourapp"
        android:icon="@mipmap/ic_launcher"
        android:requestLegacyExternalStorage="true"
        android:usesCleartextTraffic="true">
        <!-- (Other Flutter auto-generated metadata remains unchanged) -->
    </application>
</manifest>
iOS – Required Permissions #
Add the following inside:
ios/Runner/Info.plist (place the keys inside the top-level <dict> element)
Required iOS Info.plist Section for FaceTagr
<key>NSCameraUsageDescription</key>
<string>This app requires camera access to use the FaceTagr SDK.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app may require microphone access to support features in the FaceTagr SDK.</string>
Import Package #
import 'dart:async';
import 'dart:convert';
import 'package:facetagr/facetagr.dart';
import 'package:camera/camera.dart';
import 'package:crypto/crypto.dart';
import 'package:uuid/uuid.dart';
Initialization #
FaceTagr provides two ways to initialize the SDK. Choose one method, depending on your app flow.
Option 1 – Awaitable initialization (Recommended) #
Use when you want a clean async call and then navigate.
class _HomePageState extends State<HomePage> {
  bool _isProcessing = false;

  String fn_get_hash(String clientID, String utcTime, String requestID, String clientKey) {
    String input = clientID + utcTime + requestID + clientKey;
    var bytes = utf8.encode(input);
    var hash = sha512.convert(bytes);
    return hash.toString();
  }

  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    // Build the SHA-512 request hash from the values above.
    String hashcode = fn_get_hash(clientID, utcTime, requestID, clientKey);

    Facetagr.initializeAndAwait(
      apiURL: apiURL,
      clientID: clientID,
      externalID: externalID,
      hashcode: hashcode,
      utcTime: utcTime,
      requestID: requestID,
      // Optional – enables the user face registration flow. Defaults to false.
      allowUserRegistration: false,
    ).then((message) {
      if (message['StatusCode'] == "1001") {
        if (!mounted) return;
        setState(() => _isProcessing = true);
        Navigator.of(context).push(
          MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
        ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
      } else {
        if (!mounted) return;
        ScaffoldMessenger.of(context).showSnackBar(
          SnackBar(content: Text(message['StatusMessage'])),
        );
        setState(() => _isProcessing = false);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Center(
        child: Padding(
          padding: const EdgeInsets.all(20),
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              ElevatedButton.icon(
                icon: const Icon(Icons.face),
                label: const Text('Start'),
                onPressed: _facetagr_initialize,
              ),
              const SizedBox(height: 30),
              if (_isProcessing) const CircularProgressIndicator(),
            ],
          ),
        ),
      ),
    );
  }
}
✅ Direct result handling (no streams)
Option 2 – Event-based initialization (Stream Listener) #
Use when your app listens for init events globally.
class _HomePageState extends State<HomePage> {
  StreamSubscription<String>? _initSub;
  final Facetagr _faceTagr = Facetagr();
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
    _listenToBroadcast(); // subscribe first so no init event is missed
    _facetagr_initialize();
  }

  void _listenToBroadcast() {
    _initSub = Facetagr.initStream.listen((message) {
      final decoded = jsonDecode(message);
      if (!mounted) return;
      // "1001" means the SDK connected successfully; any other StatusCode
      // is an error. Either way, surface the StatusMessage to the user.
      ScaffoldMessenger.of(context).showSnackBar(
        SnackBar(content: Text(decoded['StatusMessage'])),
      );
      setState(() => _isProcessing = false);
    });
  }

  void _facetagr_initialize() {
    String clientKey = "clientKey";
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    // Build the SHA-512 request hash (see the Hash Logic section).
    String hashcode = fn_get_hash(clientID, utcTime, requestID, clientKey);
    _faceTagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
  }

  @override
  void dispose() {
    _initSub?.cancel();
    super.dispose();
  }
}
✅ Automatically receives events
✅ Matches native FaceTagr behavior
Hash Logic #
FaceTagr uses a SHA-512 hash for request signing.
import 'dart:convert';
import 'package:crypto/crypto.dart';

String fn_get_hash(String clientID, String utcTime, String requestID, String clientKey) {
  String input = clientID + utcTime + requestID + clientKey;
  var bytes = utf8.encode(input);
  var hash = sha512.convert(bytes);
  return hash.toString();
}
Example Flow
String requestID = const Uuid().v4();
String utcTime = DateTime.now().toUtc().toString();
String hash = fn_get_hash(clientID, utcTime, requestID, clientKey);
_faceTagr.init(apiURL, clientID, externalID, hash, utcTime, requestID);
Best practice: generate the hash server-side, so the clientKey never ships inside the app.
Listening to Events #
- Initialization – Facetagr.initStream.listen(...)
- Face Match – Facetagr.faceMatchStream.listen(...)
Events are returned as JSON:
{
  "StatusCode": "1001",
  "StatusMessage": "Success"
}
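Both streams deliver payloads in this same shape, so decoding can be centralized. A minimal sketch — `parseFaceTagrEvent` is an illustrative helper, not an SDK API:

```dart
import 'dart:convert';

// Illustrative helper (not part of the SDK): decodes an event payload
// from initStream / faceMatchStream into its status code and message.
({String code, String message}) parseFaceTagrEvent(String raw) {
  final decoded = jsonDecode(raw) as Map<String, dynamic>;
  return (
    code: decoded['StatusCode']?.toString() ?? '',
    message: decoded['StatusMessage']?.toString() ?? '',
  );
}
```

A listener can then branch on `parseFaceTagrEvent(message).code` instead of repeating `jsonDecode` and null checks in every subscriber.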
Live Preview Widget #
Navigator.push(
  context,
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);
The widget provides: #
- Front camera stream
- Face bounding box overlays
- Spinner while matching
- Dialogs on success/failure
FaceTagr Camera Widget (FaceTagrLivePreview.dart) #
Create a new file named FaceTagrLivePreview.dart in your Flutter app and add the widget code below. This widget provides the camera preview and integrates with the FaceTagr SDK.
import 'dart:async';
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:facetagr/facetagr.dart';
import 'package:wakelock_plus/wakelock_plus.dart';
import 'main.dart';
const bool kDetectorMirrorsFront = false;
const bool kDetectorReturnsPreviewOrientedBoxes = true;
class FaceTagrLivePreview extends StatefulWidget {
const FaceTagrLivePreview({Key? key}) : super(key: key);
@override
State<FaceTagrLivePreview> createState() => _FaceTagrLivePreviewState();
}
class _FaceTagrLivePreviewState extends State<FaceTagrLivePreview> {
StreamSubscription<String>? _matchSub;
CameraController? _controller;
bool _isDetecting = false;
Rect? _faceBox;
String _status = "Initializing camera...";
bool _showSpinner = false;
bool _showWhiteScreen = false;
String _deviceType = "";
@override
void initState() {
super.initState();
_deviceType = Platform.isIOS ? "ios" : "android";
WakelockPlus.enable();
_listenToBroadcast();
initializeCamera();
}
Future<void> _listenToBroadcast() async {
_matchSub = Facetagr.faceMatchStream.listen((message) {
try {
final decoded = jsonDecode(message);
final int statusCode =
int.tryParse(decoded['StatusCode'].toString()) ?? -1;
final String statusMessage = decoded['StatusMessage'];
if (!mounted) return;
if (statusCode < 5000) {
_showPopup(statusCode, statusMessage);
} else {
setState(() {
_status = statusMessage;
_faceBox = null;
});
}
} catch (_) {
// not JSON; ignore
}
});
}
void _showPopup(int statusCode, String message) {
showDialog(
context: context,
barrierDismissible: false, // prevent closing by tapping outside
builder: (BuildContext context) {
return AlertDialog(
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(12),
),
backgroundColor: Colors.white,
title: const Text(
"FaceTagr",
style: TextStyle(color: Colors.blue),
),
content: Text(
message,
style: const TextStyle(color: Colors.lightBlue),
),
actions: [
TextButton(
child: const Text("OK", style: TextStyle(color: Colors.blue)),
onPressed: () {
Navigator.of(context).pop();
if (statusCode == 1001) {
Navigator.of(context).pushAndRemoveUntil(
MaterialPageRoute(builder: (_) => const HomePage()),
(route) => false,
);
} else {
setState(() {
_showSpinner = false;
});
initializeCamera();
}
},
),
],
);
},
);
}
Future<void> initializeCamera() async {
try {
final cameras = await availableCameras();
final camera = cameras.firstWhere(
(c) => c.lensDirection == CameraLensDirection.front,
orElse: () => cameras.first,
);
if (_deviceType == "android") {
_controller = CameraController(
camera,
ResolutionPreset.medium,
imageFormatGroup: ImageFormatGroup.nv21,
enableAudio: false,
);
} else {
// iOS delivers BGRA frames (converted by _bgraToYUV420 below),
// so request bgra8888 rather than nv21.
_controller = CameraController(
camera,
ResolutionPreset.medium,
imageFormatGroup: ImageFormatGroup.bgra8888,
enableAudio: false,
);
}
await _controller!.initialize();
if (!mounted) return;
int lastFrameTime = 0;
await _controller!.startImageStream((CameraImage image) async {
if (!mounted) return;
final int currentTime = DateTime.now().millisecondsSinceEpoch;
if (_isDetecting || (currentTime - lastFrameTime < 500)) return;
lastFrameTime = currentTime;
_isDetecting = true;
try {
final int width = image.width;
final int height = image.height;
Map<String, dynamic>? result;
if (_deviceType == "android") {
final yuv = _concatenatePlanes(image.planes);
result = await Facetagr.detectFace(yuv, width, height, 8);
} else if (_deviceType == "ios") {
final yuv = _bgraToYUV420(image); // BGRA to YUV420
result = await Facetagr.detectFace(yuv, width, height, 1);
}
if (!mounted) return;
if (result is Map && result?["status"] != null) {
final int status = result?["status"];
final String msg = (result?["message"] ?? "").toString();
final double left = (result?['x1'] ?? 0).toDouble();
final double top = (result?['y1'] ?? 0).toDouble();
final double w = (result?['width'] ?? 0).toDouble();
final double h = (result?['height'] ?? 0).toDouble();
if (status == 1003 || status == 4001) {
setState(() {
_status = "";
_showWhiteScreen = true;
_showSpinner = false;
_faceBox = null;
});
return;
}
setState(() {
_showWhiteScreen = false;
});
if (status == 1001 || status == 1002) {
_showSpinner = true;
setState(() {
_status = "";
_faceBox = Rect.fromLTWH(left, top, w, h);
});
await _controller?.stopImageStream();
} else if (status == 1000) {
setState(() {
_status = msg;
_faceBox = Rect.fromLTWH(left, top, w, h);
});
}
} else {
setState(() {
_status = "Error";
_faceBox = null;
});
}
} catch (e) {
if (mounted) {
setState(() {
_status = "Error: $e";
_faceBox = null;
});
}
} finally {
_isDetecting = false;
}
});
} catch (e) {
if (!mounted) return;
setState(() => _status = "Camera error: $e");
} finally {
_isDetecting = false;
}
}
// Put this inside your State class (_FaceTagrLivePreviewState)
Uint8List _concatenatePlanes(List<Plane> planes) {
final WriteBuffer allBytes = WriteBuffer();
for (Plane plane in planes) {
allBytes.putUint8List(plane.bytes);
}
return allBytes.done().buffer.asUint8List();
}
Uint8List _bgraToYUV420(CameraImage image) {
final int width = image.width;
final int height = image.height;
final int frameSize = width * height;
final Uint8List yuv = Uint8List(frameSize + (frameSize ~/ 2));
final Uint8List bgra = image.planes[0].bytes;
int yIndex = 0;
int uvIndex = frameSize;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++) {
final int index = (j * width + i) * 4;
final int b = bgra[index];
final int g = bgra[index + 1];
final int r = bgra[index + 2];
final y = (((66 * r + 129 * g + 25 * b + 128) >> 8) + 16).clamp(0, 255);
final u =
(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128).clamp(0, 255);
final v =
(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128).clamp(0, 255);
yuv[yIndex++] = y;
if (j % 2 == 0 && i % 2 == 0) {
yuv[uvIndex++] = v;
yuv[uvIndex++] = u;
}
}
}
return yuv;
}
@override
void dispose() {
WakelockPlus.disable();
_controller?.dispose();
_matchSub?.cancel();
super.dispose();
}
Widget _buildCompactInstruction(IconData icon, String text) {
return Expanded(
child: Column(
mainAxisSize: MainAxisSize.min,
children: [
Icon(icon, color: Colors.black54, size: 24),
const SizedBox(height: 6),
Text(
text,
textAlign: TextAlign.center,
style: const TextStyle(
color: Colors.black54,
fontSize: 16,
fontWeight: FontWeight.w600,
),
),
],
),
);
}
@override
Widget build(BuildContext context) {
final controller = _controller;
if (controller == null || !controller.value.isInitialized) {
return const Scaffold(body: Center(child: CircularProgressIndicator()));
}
final Widget basePreview = CameraPreview(controller);
Widget previewWidget = basePreview;
final Size sensorSize = controller.value.previewSize!;
final Size screenSize = MediaQuery.of(context).size;
final bool isPortrait = screenSize.height > screenSize.width;
final Size orientedSensor =
isPortrait ? Size(sensorSize.height, sensorSize.width) : sensorSize;
final int overlayRotation = kDetectorReturnsPreviewOrientedBoxes
? 0
: controller.description.sensorOrientation;
final Size imageSpaceSize = kDetectorReturnsPreviewOrientedBoxes
? orientedSensor
: Size(sensorSize.width, sensorSize.height);
bool overlayMirror = true;
if (_deviceType == "ios") {
overlayMirror = false;
}
return Scaffold(
appBar: AppBar(title: const Text('FaceTagr')),
body: Stack(
children: [
Positioned.fill(
child: FittedBox(
fit: BoxFit.cover,
child: SizedBox(
width: orientedSensor.width,
height: orientedSensor.height,
child: Stack(
fit: StackFit.passthrough,
children: [
previewWidget,
if (_faceBox != null)
CustomPaint(
size: orientedSensor,
painter: FaceBoxPainter(
faceBoxImageSpace: _faceBox!,
imageSize: imageSpaceSize,
mirrorHorizontally: overlayMirror,
rotationDegrees: overlayRotation,
label: "",
),
),
],
),
),
),
),
if (_showSpinner)
Positioned.fill(
child: Container(
color: Colors.blue,
child: const Center(
child: CircularProgressIndicator(),
),
),
),
if (_showWhiteScreen)
Positioned.fill(
child: Container(
color: Colors.white,
padding: const EdgeInsets.all(32.0),
child: Stack(
children: [
Positioned.fill(
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
const Text(
"Identity Verification",
style: TextStyle(
color: Colors.blue,
fontSize: 24,
fontWeight: FontWeight.bold,
),
),
const Spacer(flex: 1),
Image.asset(
'assets/facetagr/help.png',
width: 300,
fit: BoxFit.fitWidth,
),
const SizedBox(height: 20),
const Text(
"Please look at the camera",
style: TextStyle(
color: Colors.black,
fontSize: 20,
fontWeight: FontWeight.normal,
),
),
const SizedBox(height: 20),
Row(
mainAxisAlignment: MainAxisAlignment.spaceEvenly,
crossAxisAlignment: CrossAxisAlignment.start,
children: [
_buildCompactInstruction(
Icons.face, "Keep face straight"),
_buildCompactInstruction(
Icons.visibility_off, "Remove mask or glasses"),
_buildCompactInstruction(
Icons.light_mode, "Ensure good lighting"),
],
),
const SizedBox(height: 20),
Container(
width: 100,
height: 4,
decoration: BoxDecoration(
color: Colors.blue.withOpacity(0.2),
borderRadius: BorderRadius.circular(2),
),
),
const Spacer(flex: 2),
],
),
),
Align(
alignment: Alignment.bottomRight,
child: Image.asset(
'assets/facetagr/logo.png',
width: 150,
),
),
],
),
),
),
if (_status != "" && !_showWhiteScreen)
Positioned(
left: 16,
right: 16,
bottom: 16,
child: Container(
padding:
const EdgeInsets.symmetric(horizontal: 12, vertical: 8),
decoration: BoxDecoration(
color: Colors.white,
borderRadius: BorderRadius.circular(8),
),
child:
Text(_status, style: const TextStyle(color: Colors.blue)),
),
),
],
),
);
}
}
class FaceBoxPainter extends CustomPainter {
final Rect faceBoxImageSpace;
final Size imageSize;
final bool mirrorHorizontally;
final int rotationDegrees;
final String? label;
FaceBoxPainter({
required this.faceBoxImageSpace,
required this.imageSize,
required this.mirrorHorizontally,
required this.rotationDegrees,
this.label,
});
@override
void paint(Canvas canvas, Size size) {
final _Rotated r =
_rotateRect(faceBoxImageSpace, imageSize, rotationDegrees);
final double sx = size.width / r.rotatedImageSize.width;
final double sy = size.height / r.rotatedImageSize.height;
Rect box = Rect.fromLTWH(
r.rect.left * sx,
r.rect.top * sy,
r.rect.width * sx,
r.rect.height * sy,
);
if (mirrorHorizontally) {
box = Rect.fromLTWH(
size.width - (box.left + box.width), box.top, box.width, box.height);
}
box = Rect.fromLTWH(
box.left,
box.top,
box.width,
box.height,
);
_drawCornerTicks(canvas, box,
color: Colors.green, length: 28, thickness: 2);
if ((label ?? '').isNotEmpty) {
_drawLabel(canvas, size, box, label!);
}
}
void _drawCornerTicks(Canvas canvas, Rect box,
{required Color color, double length = 22, double thickness = 3}) {
final Paint p = Paint()
..color = color
..strokeWidth = thickness
..strokeCap = StrokeCap.round;
final tl = box.topLeft;
final tr = box.topRight;
final bl = box.bottomLeft;
final br = box.bottomRight;
canvas.drawLine(tl, tl + Offset(length, 0), p);
canvas.drawLine(tl, tl + Offset(0, length), p);
canvas.drawLine(tr, tr + Offset(-length, 0), p);
canvas.drawLine(tr, tr + Offset(0, length), p);
canvas.drawLine(bl, bl + Offset(length, 0), p);
canvas.drawLine(bl, bl + Offset(0, -length), p);
canvas.drawLine(br, br + Offset(-length, 0), p);
canvas.drawLine(br, br + Offset(0, -length), p);
}
void _drawLabel(Canvas canvas, Size screenSize, Rect box, String text) {
final TextPainter tp = TextPainter(
text: TextSpan(
text: text,
style: const TextStyle(
color: Colors.white,
fontSize: 14,
fontWeight: FontWeight.bold,
),
),
textDirection: TextDirection.ltr,
)..layout(maxWidth: screenSize.width * 0.8);
Offset textOffset = Offset(box.left, box.top - tp.height - 6);
if (textOffset.dy < 0) {
textOffset = Offset(box.left, box.bottom + 6);
}
tp.paint(canvas, textOffset);
}
@override
bool shouldRepaint(covariant FaceBoxPainter old) =>
old.faceBoxImageSpace != faceBoxImageSpace ||
old.imageSize != imageSize ||
old.mirrorHorizontally != mirrorHorizontally ||
old.rotationDegrees != rotationDegrees ||
old.label != label;
}
class _Rotated {
final Rect rect;
final Size rotatedImageSize;
_Rotated(this.rect, this.rotatedImageSize);
}
_Rotated _rotateRect(Rect r, Size img, int deg) {
switch (deg % 360) {
case 0:
return _Rotated(r, img);
case 90:
return _Rotated(
Rect.fromLTWH(
img.height - (r.top + r.height),
r.left,
r.height,
r.width,
),
Size(img.height, img.width),
);
case 180:
return _Rotated(
Rect.fromLTWH(
img.width - (r.left + r.width),
img.height - (r.top + r.height),
r.width,
r.height,
),
img,
);
case 270:
return _Rotated(
Rect.fromLTWH(
r.top,
img.width - (r.left + r.width),
r.height,
r.width,
),
Size(img.height, img.width),
);
default:
return _Rotated(r, img);
}
}
Open FaceTagr Camera #
Navigator.of(context).push(
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
).then((_) => mounted ? setState(() => _isProcessing = false) : null);
This launches the built-in FaceTagrLivePreview widget with face recognition.
Testing the Integration #
After completing setup, verify your integration:
- Run the Flutter application.
- Tap the Start button.
- Camera preview opens.
- Face detection begins automatically.
- Successful verification returns:
{
  "StatusCode": "1001",
  "StatusMessage": "Face verified successfully."
}
Logout #
await faceTagr.fnLogout();
This clears local tokens and resets the session.
Event Payload Reference #
Initialization Events (both init and initializeAndAwait) #
These events are sent after calling:
Facetagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
Facetagr.initializeAndAwait(
  apiURL: apiURL,
  clientID: clientID,
  externalID: externalID,
  hashcode: hashcode,
  utcTime: utcTime,
  requestID: requestID,
).then((message) {
  if (message['StatusCode'] == "1001") {
    if (!mounted) return;
    setState(() => _isProcessing = true);
    Navigator.of(context).push(
      MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
    ).then((_) => mounted ? setState(() => _isProcessing = false) : null);
  } else {
    if (!mounted) return;
    ScaffoldMessenger.of(context).showSnackBar(
      SnackBar(content: Text(message['StatusMessage'])),
    );
    setState(() => _isProcessing = false);
  }
});
Possible Payloads
| StatusCode | StatusMessage |
|---|---|
| 1001 | Connected successfully. |
| 4001 | Mandatory inputs can not be empty. Please try again with valid values. |
| 4002 | Input JSON is not valid. |
| 4003 | Given ClientID is not valid. |
| 4004 | Authentication failed. |
| 4005 | Given ExternalID is not valid. |
| 4006 | Licensing limits exceeded. |
| 4007 | Failed to connect. |
| 5001 | Oops! Something went wrong! Please try again! |
| 5002 | Unable to connect to the server. Please try again later. |
| 5003 | Internal server error. Please try again. |
| 5004 | Oops! Something went wrong! Please try again! |
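One practical way to consume this table is to separate codes worth retrying from those that need a configuration fix. The grouping below (5001–5004 as transient server/connectivity errors, everything else as permanent) is an assumption drawn from the message texts, and `isInitRetryable` is a hypothetical helper, not an SDK function:

```dart
// Hypothetical helper based on the initialization payload table above:
// 1001 = success; 4001–4007 = bad input or credentials (fix, don't retry);
// 5001–5004 = transient server or connectivity problems (a retry may help).
bool isInitRetryable(String statusCode) {
  final code = int.tryParse(statusCode) ?? -1;
  return code >= 5001 && code <= 5004;
}
```

An app could, for instance, show a "Try again" button only when `isInitRetryable(message['StatusCode'])` is true.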
Face Match Events (faceMatchStream) #
These events are sent every time the camera detects a face and FaceTagr completes the verification.
Possible Payloads
| StatusCode | StatusMessage |
|---|---|
| 1001 | Face verified successfully. |
| 1002 | Face is not matching. |
| 4001 | No face found. |
| 4002 | Face size is less than the minimum required size. |
| 4003 | Face should be facing the camera straight. |
| 4004 | Face is blurred and/or not clear. |
| 4005 | Face is not live. Spoofing detected. |
| 4006 | Image format error. |
| 4007 | Input JSON is not valid. |
| 4008 | Given ClientID is not valid. |
| 4009 | Authentication failed. |
| 4010 | Given ExternalID is not valid. |
| 5001 | Oops! Something went wrong! Please try again! |
| 5002 | Unable to connect to the server. Please try again later. |
| 5003 | Face verification failed. Server error. |
| 5004 | Oops! Something went wrong! Please try again! |
| 5005 | Oops! Something went wrong! Please try again! |
| 5006 | Oops! Something went wrong! Please try again! |
Event Payload Reference (Registration Mode – allowUserRegistration = true) #
When allowUserRegistration: true is passed in initialization, the FaceTagr SDK may return the following Face Registration Events:
Face Registration Events
| StatusCode | StatusMessage |
|---|---|
| 201 | JSON format error. Given input not in the correct format. |
| 200 | Authentication failed. |
| 300 | This person already exists in your database: [ExternalID] |
| 301 | Collection name does not exist. |
| 302 | DisplayName is mandatory. |
| 303 | ExternalID is mandatory. |
| 304 | ExternalID exists. ExternalID should be unique. |
| 310 | The captured photo is not correct. Please try again !! |
| 401 | Face registration failed. No face found. |
| 402 | Face registration failed. Face box size is less than minimum required size. |
| 403 | Face registration failed. Face is blurred and/or not clear. |
| 405 | Face registration failed. Face should be facing the camera straight. |
| 501 | Face registration failed. Server error. |
| 503 | Oops something went wrong. Please try again !! |
Flow Diagram #
sequenceDiagram
  participant App
  participant SDK as FaceTagr SDK
  participant API as Backend API
  App->>SDK: init(apiURL, clientID, externalID, hash, time, reqID)
  SDK->>API: Validate credentials
  API-->>SDK: Auth success
  SDK-->>App: Init success (1001)
  App->>SDK: Open Camera
  SDK->>API: Stream frames
  API-->>SDK: Match success (1001)
  SDK-->>App: FaceMatch event
Quick Recap #
- Add facetagr to dependencies
- Add facetagr_tools to dev_dependencies
- Run the initialization command
- Generate the SHA-512 hash
- Call init() with credentials
- Open the camera (FaceTagrLivePreview)
- Listen to faceMatchStream
Troubleshooting #
If you encounter issues during integration, check the following common scenarios:
Camera not opening #
- Ensure the camera permission is added (Android: AndroidManifest.xml, iOS: Info.plist).
- Verify runtime permissions are granted.
Authentication Failed (4004 / 4009) #
- Verify the clientID and externalID values.
- Ensure the hashcode is generated correctly (input order: clientID + utcTime + requestID + clientKey).
- Confirm the utcTime and requestID values are valid.
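When chasing a 4004/4009, it can help to sanity-check the init inputs locally before suspecting the server. A debugging sketch — not an SDK API; the format checks simply mirror the `Uuid().v4()` and `DateTime.now().toUtc().toString()` values used earlier in this document:

```dart
// Debugging sketch: rough format checks for the requestID and utcTime
// values passed to init(). A failure here points to a malformed input
// rather than a server-side problem.
final RegExp uuidV4 = RegExp(
    r'^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$');

bool looksLikeValidInitInput(String requestID, String utcTime) {
  final isUuid = uuidV4.hasMatch(requestID.toLowerCase());
  // DateTime.now().toUtc().toString() produces e.g. "2024-01-01 12:00:00.000Z".
  final isUtc = utcTime.endsWith('Z') && DateTime.tryParse(utcTime) != null;
  return isUuid && isUtc;
}
```

If this returns false, regenerate the values exactly as shown in the initialization examples rather than hand-constructing them.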
No Face Detected #
- Ensure good lighting conditions.
- The face must be clearly visible and facing the camera.
- Avoid masks, glasses, or heavy shadows.
Unable to connect to server (5002) #
- Check the API URL.
- Verify internet connectivity.
Camera Preview Black Screen #
- Ensure the device has a front camera.
- Restart the app after granting permissions.
If issues persist, contact support@facetagr.com.
License #
This package is part of the FaceTagr ecosystem.
© 2026 NotionTag Technologies Pvt Ltd. All rights reserved.
Support #
For integration support, please contact support@facetagr.com.