flutter_face_liveness 3.0.0
Production-ready Flutter SDK for face detection, liveness verification, and anti-spoof protection using ML Kit and TensorFlow Lite.
flutter_face_liveness #

Production-ready AI-powered Flutter SDK for real-time face liveness detection, anti-spoof protection, and persistent face identity — powered by Google ML Kit + TensorFlow Lite. All processing runs entirely on-device with zero network calls (except the one-time FaceNet model download when Face ID is enabled).
Table of Contents #
- Features
- Use Cases
- Getting Started
- Quick Start
- Face Identity (Face ID)
- LivenessConfig Reference
- Liveness Actions
- LivenessResult Fields
- LivenessController API
- TFLite Integration
- Architecture
- Performance
- Security
- Example App
- Changelog
Features #
| Category | Feature |
|---|---|
| Liveness | 7 challenge actions — blink, turn left/right, look up/down, smile, open mouth |
| Face ID | Same face → always same ID, across sessions, restarts, and days. Powered by FaceNet TFLite (auto-downloaded, one-time ~23 MB) |
| New/Returning | isFaceIdNew flag tells you instantly if it's a first-time or returning face |
| Anti-Spoof | 9-signal composite engine — eye variance, geometry, pose, micro-motion, quality, tracking, brightness variance, motion jitter |
| Video Replay Detection | MiniFASNet-V2 TFLite model detects pre-recorded video replay attacks — enableVideoReplayDetection: true auto-downloads model |
| Frame Quality | BT.601 platform-correct brightness (Android NV21 + iOS BGRA8888), blur, overexposure — with 6-frame debounce |
| Replay Guard | FNV-1a frame hashing detects looped / static-image attacks |
| Session Security | Cryptographically unique session IDs via Random.secure() |
| Action Randomisation | Fisher-Yates shuffle prevents predictable replay attacks |
| Isolate ML | YUV→NV21 conversion, quality analysis, and face embedding — all in background isolates |
| TFLite Anti-Spoof | Bundled FaceAntiSpoofing model — enableTFLite: true auto-downloads & runs in a background isolate. Custom model supported via tfliteModelPath / tfliteModelUrl |
| Theming | Dark / light / system mode via LivenessConfig.themeMode |
| Debug Overlay | Real-time Euler angles, eye/smile probabilities, brightness, blur on screen |
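The action randomisation row above names a Fisher-Yates shuffle seeded from `Random.secure()`. As a plain-Dart illustration (not the package's internal code), the shuffle looks like this:

```dart
import 'dart:math';

/// Fisher-Yates shuffle: each permutation is equally likely, and with a
/// cryptographic RNG the per-session action order is unpredictable.
List<T> fisherYatesShuffle<T>(List<T> items, [Random? rng]) {
  final r = rng ?? Random.secure();
  final out = List<T>.of(items);
  for (var i = out.length - 1; i > 0; i--) {
    final j = r.nextInt(i + 1); // uniform over 0..i
    final tmp = out[i];
    out[i] = out[j];
    out[j] = tmp;
  }
  return out;
}
```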
Use Cases #
KYC (Know Your Customer) #
Financial onboarding, account opening, and identity verification flows require proof that a real human is present — not a printed photo or screen replay.
```dart
FlutterFaceLiveness(
  actions: [LivenessAction.blink, LivenessAction.turnLeft, LivenessAction.turnRight],
  config: LivenessConfig(
    enableAntiSpoof: true,
    enableFaceId: true,
    randomizeActions: true,
  ),
  onSuccess: (result) {
    final faceId = result.faceId;       // "FID-3A9F2B1C4E8D…"
    final isNew = result.isFaceIdNew;   // true = first time, false = returning
    final sessionId = result.sessionId; // "LV-018F3A2B9C4E-D7E31F08"
    final score = result.confidenceScore;
    // Send faceId + sessionId to your backend for audit trail
  },
  onFailed: (reason) => showError(reason),
)
```
Banking / Fintech #
Transaction authorisation, step-up authentication, and high-risk operation confirmation. Face ID ensures the authorising person is the account holder across every session.
```dart
FlutterFaceLiveness(
  actions: [LivenessAction.blink, LivenessAction.turnLeft, LivenessAction.smile],
  config: LivenessConfig(
    enableFaceId: true,
    faceIdSimilarityThreshold: 0.72, // stricter for banking
    enableAntiSpoof: true,
    sessionTimeoutMs: 30000, // 30-second window
  ),
  onSuccess: (result) {
    if (result.isFaceIdNew == false && result.faceId == storedFaceId) {
      authoriseTransaction(); // returning, known face
    } else {
      flagForReview(); // new or unexpected face
    }
  },
  onFailed: (reason) => showError(reason),
)
```
Attendance Systems #
Employee / student attendance where the same person must be recognised across multiple daily check-ins.
```dart
// Enrolment (first check-in): isFaceIdNew == true → store faceId in database
// Daily check-in: isFaceIdNew == false → mark present
FlutterFaceLiveness(
  actions: [LivenessAction.blink], // quick single-action check
  config: LivenessConfig(
    enableFaceId: true,
    enableAntiSpoof: true,
  ),
  onSuccess: (result) {
    if (result.isFaceIdNew == true) {
      db.enrolEmployee(result.faceId!);
    } else {
      db.markAttendance(result.faceId!, DateTime.now());
    }
  },
  onFailed: (reason) => showError(reason),
)
```
Authentication / Biometric Login #
Replace or augment PIN/password with a liveness-verified face. The persistent Face ID acts as the biometric credential stored on-device.
```dart
FlutterFaceLiveness(
  actions: [LivenessAction.blink, LivenessAction.turnLeft],
  config: LivenessConfig(
    enableFaceId: true,
    faceIdSimilarityThreshold: 0.70,
    enableBrightnessCheck: true,
  ),
  onSuccess: (result) {
    final enrolled = prefs.getString('enrolled_face_id');
    if (result.isFaceIdNew == false && result.faceId == enrolled) {
      unlockApp();
    } else if (enrolled == null && result.isFaceIdNew == true) {
      prefs.setString('enrolled_face_id', result.faceId!);
      showEnrolmentSuccess();
    } else {
      showError('Face not recognised — please contact support');
    }
  },
  onFailed: (reason) => showError(reason),
)
```
Enterprise Security #
Multi-factor authentication, access control, and audit logging for enterprise applications.
```dart
onSuccess: (result) {
  auditLog.record(
    faceId: result.faceId,
    isNewFace: result.isFaceIdNew,
    sessionId: result.sessionId,
    timestamp: DateTime.now(),
    actions: result.completedActions.map((a) => a.name).toList(),
    score: result.confidenceScore,
  );
}
```
Getting Started #
1. Add the dependency #
```yaml
dependencies:
  flutter_face_liveness: ^3.0.0
```
2. Platform permissions #
Android — android/app/src/main/AndroidManifest.xml
```xml
<uses-permission android:name="android.permission.CAMERA" />
<!-- Required only when enableFaceId: true -->
<uses-permission android:name="android.permission.INTERNET" />
```
iOS — ios/Runner/Info.plist
```xml
<key>NSCameraUsageDescription</key>
<string>Camera is required for face liveness verification.</string>
```
3. Minimum SDK versions #
| Platform | Minimum | Notes |
|---|---|---|
| Android | API 26 (Android 8.0) | Required by TFLite Flutter 0.12+ |
| iOS | iOS 13.0 | |
| Dart | 3.0.0 | |
| Flutter | 3.10.0 | |
Android — android/app/build.gradle:
```groovy
defaultConfig {
    minSdk 26
}
```
4. Fix tflite_flutter for Dart 3.4+ #
tflite_flutter 0.10.4 (pub.dev) uses UnmodifiableUint8ListView which was removed in Dart 3.4. Add a dependency_overrides block to pull the fixed version from git:
```yaml
# pubspec.yaml
dependency_overrides:
  tflite_flutter:
    git:
      url: https://github.com/tensorflow/flutter-tflite.git
      ref: main
```
This resolves to tflite_flutter 0.12.1 automatically. No other changes required.
5. Internet permission note (Face ID only) #
The FaceNet model (~23 MB) is downloaded once on first launch with enableFaceId: true and cached permanently in the app's documents directory. All subsequent launches use the local cache — no network required.
Quick Start #
```dart
import 'package:flutter/material.dart';
import 'package:flutter_face_liveness/flutter_face_liveness.dart';

class VerificationPage extends StatelessWidget {
  const VerificationPage({super.key});

  @override
  Widget build(BuildContext context) {
    return FlutterFaceLiveness(
      actions: [
        LivenessAction.blink,
        LivenessAction.turnLeft,
        LivenessAction.turnRight,
      ],
      config: LivenessConfig(
        randomizeActions: true,
        enableAntiSpoof: true,
      ),
      onSuccess: (LivenessResult result) {
        print('Verified!');
        print('Session ID : ${result.sessionId}');
        print('Confidence : ${(result.confidenceScore * 100).toStringAsFixed(1)}%');
        print('Duration   : ${result.sessionDurationMs}ms');
        print('Anti-spoof : ${result.spoofDetected ? "FAILED" : "PASSED"}');
      },
      onFailed: (String reason) {
        print('Failed: $reason');
      },
    );
  }
}
```
Face Identity (Face ID) #
Key guarantee

A Face ID (`FID-XXXX`) is permanently tied to one physical person's face. No matter how many times the same person is detected — different sessions, different days, different lighting, after app restarts — they will always receive the exact same Face ID. A new ID is only generated the very first time a completely unknown face is seen.

```
User scans face on Day 1   → FID-3A9F2B1C4E8D7F62   isFaceIdNew: true
User scans face on Day 7   → FID-3A9F2B1C4E8D7F62   isFaceIdNew: false  ← same ID
User scans face on Day 30  → FID-3A9F2B1C4E8D7F62   isFaceIdNew: false  ← same ID
Different person scans     → FID-A817C3F0B24E9D51   isFaceIdNew: true   ← new ID
```
How it works #
- First detection — FaceNet extracts a 128-dimensional embedding from the verified face crop. A unique ID is generated (`FID-3A9F2B1C4E8D…`) and persisted in `SharedPreferences`. `isFaceIdNew` is `true`.
- Every subsequent detection — The new embedding is compared against all stored embeddings using cosine similarity. If the best match scores ≥ `faceIdSimilarityThreshold` (default `0.65`), the same `FID-XXXX` is returned. `isFaceIdNew` is `false`.
- Adapts over time — On every confirmed match the stored template is blended: `stored = 0.75 × stored + 0.25 × new` (then re-normalised to unit length). The Face ID stays accurate even as lighting, hairstyle, or camera angle changes session to session.
- Survives everything — Face IDs persist across app restarts, app updates, phone restarts, and re-installs (stored in `SharedPreferences`; only cleared via `clearFaceIdentities()`).
Enable it #
```dart
FlutterFaceLiveness(
  actions: [LivenessAction.blink, LivenessAction.turnLeft],
  config: LivenessConfig(
    enableFaceId: true,
    faceIdSimilarityThreshold: 0.65, // default — good for most apps
  ),
  onSuccess: (result) {
    final faceId = result.faceId!;     // "FID-3A9F2B1C4E8D7F62A091"
    final isNew = result.isFaceIdNew!; // true = registered, false = matched
    if (isNew) {
      print('New face registered: $faceId');
    } else {
      print('Welcome back! Recognised as: $faceId');
    }
  },
  onFailed: (reason) => print('Failed: $reason'),
)
```
First-run download progress #
On first launch with enableFaceId: true, the built-in loading screen shows the download percentage automatically. No code required.
To observe progress from outside the widget:
```dart
final controller = LivenessController(
  actions: [...],
  config: LivenessConfig(enableFaceId: true),
  onSuccess: ...,
  onFailed: ...,
);

// Rebuild when this changes (it's a ChangeNotifier getter):
// double? faceIdModelDownloadProgress → 0.0–1.0 during download, null otherwise
```
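Since the controller is a `ChangeNotifier`, a plain listener also works; a minimal sketch (the listener wiring is illustrative, the getter name is as documented above):

```dart
controller.addListener(() {
  final dl = controller.faceIdModelDownloadProgress;
  if (dl != null) {
    debugPrint('FaceNet download: ${(dl * 100).toStringAsFixed(0)}%');
  }
});
await controller.initialize();
```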
Managing stored faces #
```dart
// Via LivenessController (recommended)
await controller.clearFaceIdentities(); // delete all on logout

// Via FaceIdentityService directly (advanced)
final service = FaceIdentityService(similarityThreshold: 0.65);
await service.initialize(
  onModelDownloadProgress: (p) => print('${(p * 100).toInt()}%'),
);
List<String> ids = service.registeredFaceIds; // all IDs on this device
await service.removeFace('FID-3A9F2B…');      // remove one specific face
await service.clearAllFaces();                // remove all faces
service.dispose();
```
Cosine similarity thresholds guide #
| Threshold | Behaviour |
|---|---|
| `0.50` | Very lenient — may match different people in similar conditions |
| `0.65` | Default — good balance for normal use (different lighting, slight angle) |
| `0.72` | Stricter — recommended for banking / high-security apps |
| `0.80` | Very strict — may produce new IDs for same person in different lighting |
LivenessConfig Reference #
```dart
LivenessConfig({
  // Session
  int sessionTimeoutMs = 60000,
  bool randomizeActions = true,

  // Camera
  ResolutionPreset cameraResolution = ResolutionPreset.high,
  int targetFps = 20,

  // Anti-spoof
  bool enableAntiSpoof = true,
  double antiSpoofThreshold = 0.45,

  // Frame quality (brightness debounce: 6 consecutive bad frames required)
  bool enableBrightnessCheck = true,
  double brightnessMin = 0.12, // below = genuinely dark room
  double brightnessMax = 0.92, // above = direct sunlight / overexposed
  bool enableBlurDetection = true,
  double blurThreshold = 80.0,
  bool enableDuplicateFrameDetection = true,
  int duplicateFrameWindowSize = 8,

  // Face geometry
  double faceTooFarRatio = 0.015,
  double faceTooCloseRatio = 0.70,

  // Face Identity
  bool enableFaceId = false,
  double faceIdSimilarityThreshold = 0.65,

  // TFLite anti-spoof (bundled model — zero config)
  bool enableTFLite = false,
  String? tfliteModelPath, // override: asset key or absolute path
  String? tfliteModelUrl,  // override: custom download URL
  int? tfliteInputSize,    // override: null = auto (256 for bundled model)
  double tfliteDeepfakeThreshold = 0.40, // score below this → deepfakeDetected: true

  // Video replay detection (MiniFASNet-V2 — zero config)
  bool enableVideoReplayDetection = false,
  String? videoReplayModelPath,
  String? videoReplayModelUrl,
  int? videoReplayInputSize,
  double videoReplayThreshold = 0.50, // score below this → videoReplayDetected: true

  // UI
  ThemeMode themeMode = ThemeMode.dark,
  bool showDebugOverlay = false,
})
```
Full parameter table #
| Parameter | Type | Default | Description |
|---|---|---|---|
| `sessionTimeoutMs` | `int` | `60000` | Auto-fail after this many ms |
| `randomizeActions` | `bool` | `true` | Fisher-Yates shuffle per session — prevents replay attacks |
| `cameraResolution` | `ResolutionPreset` | `high` | `medium` reduces CPU on low-end devices |
| `targetFps` | `int` | `20` | Frame processing rate (1–30 fps) |
| `enableAntiSpoof` | `bool` | `true` | 7-signal composite anti-spoof heuristic |
| `antiSpoofThreshold` | `double` | `0.45` | Minimum composite score to pass (0.0–1.0) |
| `enableBrightnessCheck` | `bool` | `true` | Block frames that are too dark or overexposed |
| `brightnessMin` | `double` | `0.12` | Y-luminance below this = genuinely dark room. Uses BT.601 on iOS (BGRA) and Y-plane on Android (NV21). Triggers only after 6 consecutive dark frames to absorb camera auto-exposure settling |
| `brightnessMax` | `double` | `0.92` | Y-luminance above this = overexposed / direct sun. Same 6-frame debounce applies |
| `enableBlurDetection` | `bool` | `true` | Block blurry frames |
| `blurThreshold` | `double` | `80.0` | Y-plane variance; below this = blurry |
| `enableDuplicateFrameDetection` | `bool` | `true` | FNV-1a hash sliding-window replay detection |
| `duplicateFrameWindowSize` | `int` | `8` | Sliding window size for duplicate streak |
| `faceTooFarRatio` | `double` | `0.015` | Face bounding-box area ratio below which = "too far" |
| `faceTooCloseRatio` | `double` | `0.70` | Face bounding-box area ratio above which = "too close" |
| `enableFaceId` | `bool` | `false` | Persistent face identity. FaceNet model auto-downloaded on first run |
| `faceIdSimilarityThreshold` | `double` | `0.65` | Cosine similarity cutoff. Same face across lighting/angle typically scores 0.65–0.85 |
| `enableTFLite` | `bool` | `false` | Enable TFLite anti-spoof. Bundled model auto-downloaded on first use — no extra config needed |
| `tfliteModelPath` | `String?` | `null` | Override: Flutter asset key or absolute path to a custom `.tflite` model |
| `tfliteModelUrl` | `String?` | `null` | Override: HTTPS URL for a custom model download. When `null` the bundled model URL is used |
| `tfliteInputSize` | `int?` | `null` | Override: model input size (square, px). When `null` auto-resolved to 256 for the bundled model |
| `tfliteDeepfakeThreshold` | `double` | `0.40` | TFLite real-score below this sets `deepfakeDetected: true` |
| `enableVideoReplayDetection` | `bool` | `false` | Enable MiniFASNet-V2 video-replay model. Auto-downloads on first use |
| `videoReplayModelPath` | `String?` | `null` | Override: local path for video-replay model |
| `videoReplayModelUrl` | `String?` | `null` | Override: download URL for video-replay model |
| `videoReplayInputSize` | `int?` | `null` | Override: input size for video-replay model (default 80) |
| `videoReplayThreshold` | `double` | `0.50` | Real-score below this sets `videoReplayDetected: true` |
| `themeMode` | `ThemeMode` | `dark` | `ThemeMode.system` follows device theme |
| `showDebugOverlay` | `bool` | `false` | Euler angles, eye/smile probabilities, brightness, blur |
Liveness Actions #
| Action | Enum | How it triggers |
|---|---|---|
| Blink | `LivenessAction.blink` | Both eyes' open probabilities drop below 0.50 — fires instantly on close, no wait for re-open |
| Turn Left | `LivenessAction.turnLeft` | Yaw angle > +15° held for ≥ 80 ms |
| Turn Right | `LivenessAction.turnRight` | Yaw angle < −15° held for ≥ 80 ms |
| Look Up | `LivenessAction.lookUp` | Pitch angle > +15° held for ≥ 80 ms |
| Look Down | `LivenessAction.lookDown` | Pitch angle < −15° held for ≥ 80 ms |
| Smile | `LivenessAction.smile` | Smile probability > 0.80 |
| Open Mouth | `LivenessAction.openMouth` | Bounding-box height grows > 8% with smile probability < 0.30 |
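The trigger rules above can be restated against the raw ML Kit signals (ML Kit exposes `leftEyeOpenProbability`, `rightEyeOpenProbability`, and `headEulerAngleY` on each detected face). The sketch below is illustrative, not the package's internal detector:

```dart
/// Blink: both eyes closed; fires on close, no re-open wait.
bool blinkTriggered(double leftEyeOpen, double rightEyeOpen) =>
    leftEyeOpen < 0.50 && rightEyeOpen < 0.50;

/// Turn Left: yaw must stay above +15° for at least 80 ms.
class TurnLeftDetector {
  DateTime? _heldSince;

  bool update(double yawDegrees, DateTime now) {
    if (yawDegrees > 15.0) {
      _heldSince ??= now;
      return now.difference(_heldSince!).inMilliseconds >= 80;
    }
    _heldSince = null; // pose broke, reset the hold timer
    return false;
  }
}
```

The hold-for-80-ms requirement filters out single-frame pose jitter without adding noticeable friction for a real user.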
Recommended action combinations #
```dart
// Quick check (low friction)
actions: [LivenessAction.blink]

// Standard (recommended for most apps)
actions: [LivenessAction.blink, LivenessAction.turnLeft, LivenessAction.turnRight]

// High-security KYC
actions: [LivenessAction.blink, LivenessAction.turnLeft,
          LivenessAction.turnRight, LivenessAction.smile]

// Full challenge
actions: [LivenessAction.blink, LivenessAction.turnLeft, LivenessAction.turnRight,
          LivenessAction.lookUp, LivenessAction.openMouth]
```
LivenessResult Fields #
```dart
class LivenessResult {
  final bool isSuccess;
  final List<LivenessAction> completedActions;
  final double confidenceScore;   // 0.0–1.0 composite anti-spoof score
  final bool isRealHuman;         // true when anti-spoof passes
  final bool spoofDetected;       // true if heuristic signals triggered
  final bool deepfakeDetected;    // true if TFLite score < tfliteDeepfakeThreshold
  final double? tfliteScore;      // raw TFLite real-face probability (when enabled)
  final double? videoReplayScore; // raw MiniFASNet real-face probability (when enabled)
  final bool videoReplayDetected; // true if video replay attack flagged
  final String? failureReason;    // human-readable reason on failure
  final int? sessionDurationMs;   // total session time in ms

  // Session ID format: "LV-{12-char-timestamp-hex}-{8-char-secure-random-hex}"
  // e.g. "LV-018F3A2B9C4E-D7E31F08" — generated via Random.secure()
  final String? sessionId;

  // Face ID format: "FID-{24 uppercase hex chars}"
  // e.g. "FID-3A9F2B1C4E8D7F62A091B3C5" — non-null only when enableFaceId: true
  final String? faceId;

  // true  → this face was seen for the FIRST TIME — new ID was created
  // false → this face was RECOGNISED — existing ID returned
  // null  → Face ID is disabled (enableFaceId: false)
  final bool? isFaceIdNew;
}
```
Handling the result #
```dart
onSuccess: (LivenessResult result) {
  // 1. Confidence score
  final pct = (result.confidenceScore * 100).toStringAsFixed(1);
  print('Anti-spoof confidence: $pct%');

  // 2. Face ID — new vs returning user
  if (result.faceId != null) {
    if (result.isFaceIdNew == true) {
      // First time this face is seen on this device
      print('New face registered: ${result.faceId}');
      myBackend.registerUser(faceId: result.faceId!);
    } else {
      // Recognised — same ID as before
      print('Welcome back: ${result.faceId}');
      myBackend.loginUser(faceId: result.faceId!);
    }
  }

  // 3. Session ID — send to backend for audit trail
  myBackend.logSession(
    sessionId: result.sessionId!,
    faceId: result.faceId,
    isNewFace: result.isFaceIdNew,
    score: result.confidenceScore,
    durationMs: result.sessionDurationMs,
    actions: result.completedActions.map((a) => a.name).toList(),
  );
},
```
LivenessController API #
For advanced use cases where you need to drive liveness from code rather than using the FlutterFaceLiveness widget directly:

```dart
final controller = LivenessController(
  actions: [LivenessAction.blink, LivenessAction.turnLeft],
  config: LivenessConfig(enableFaceId: true),
  onSuccess: (result) { ... },
  onFailed: (reason) { ... },
);
await controller.initialize();
```
Public getters #
| Getter | Type | Description |
|---|---|---|
| `isInitialized` | `bool` | True after camera + models are ready |
| `status` | `DetectionStatus` | Current detection state (see below) |
| `currentAction` | `LivenessAction?` | The action the user must perform now |
| `completedActions` | `List<LivenessAction>` | Actions already completed this session |
| `remainingActions` | `List<LivenessAction>` | Actions still to complete |
| `completedCount` | `int` | Number of completed actions |
| `totalActions` | `int` | Total actions in this session |
| `progress` | `double` | 0.0–1.0 completion progress |
| `isComplete` | `bool` | True after all actions are done |
| `sessionId` | `String?` | Current session ID |
| `currentFace` | `FaceData?` | Most recent detected face data |
| `lastQuality` | `FrameQuality?` | Most recent frame quality metrics |
| `faceIdModelDownloadProgress` | `double?` | 0.0–1.0 during model download, `null` otherwise |
| `error` | `String?` | Non-null if initialization failed |
| `cameraController` | `CameraController?` | Underlying camera controller |
DetectionStatus values #
| Status | Meaning |
|---|---|
| `initializing` | Camera / models loading |
| `noFace` | No face detected in frame |
| `multipleFaces` | More than one face visible |
| `faceTooFar` | Move closer to camera |
| `faceTooClose` | Move further from camera |
| `faceNotCentered` | Centre your face in the oval |
| `lowLight` | Too dark — triggered after 6 consecutive dark frames |
| `overExposed` | Too bright / direct light — same 6-frame debounce |
| `blurry` | Camera out of focus |
| `fakeDetected` | Anti-spoof or duplicate-frame check triggered |
| `actionInProgress` | Performing a liveness challenge |
| `completed` | All actions done — `onSuccess` will fire |
| `failed` | Session timed out or manually failed |
Methods #
```dart
await controller.initialize();          // start camera + load models
await controller.reset();               // restart session, keep camera running
await controller.clearFaceIdentities(); // delete all stored face embeddings
await controller.dispose();             // release all resources
```
TFLite Integration (Optional) #
Two TFLite models are available, both auto-downloaded on first use and cached permanently. All inference runs in a background isolate — the camera and blink detection are never blocked.
Anti-Spoof (FaceAntiSpoofing, 3.9 MB) #
```dart
config: LivenessConfig(
  enableTFLite: true,            // auto-downloads on first launch
  tfliteDeepfakeThreshold: 0.40, // score below this → deepfakeDetected: true
),
onSuccess: (result) {
  print('TFLite score : ${result.tfliteScore}');      // 0.0–1.0 real-face probability
  print('Deepfake     : ${result.deepfakeDetected}'); // true when score < 0.40
},
```
Video Replay Detection (MiniFASNet-V2, 1.7 MB) #
Detects pre-recorded video replay attacks — someone pointing a phone showing a video of a real person.
```dart
config: LivenessConfig(
  enableVideoReplayDetection: true, // auto-downloads MiniFASNet-V2 on first launch
  videoReplayThreshold: 0.50,       // score below this → videoReplayDetected: true
),
onSuccess: (result) {
  print('Replay score  : ${result.videoReplayScore}');    // 0.0–1.0
  print('Replay attack : ${result.videoReplayDetected}'); // true = video replay
},
```
Custom model #
Bring your own deepfake / PAD model if you need a different architecture:
```dart
config: LivenessConfig(
  enableTFLite: true,
  tfliteModelUrl: 'https://your-cdn.com/custom_model.tflite', // auto-download
  // OR
  tfliteModelPath: 'assets/custom_model.tflite', // bundled asset
  tfliteInputSize: 128, // must match your model's input size
),
```
Expected model contract #
| Property | Requirement |
|---|---|
| Input shape | [1, H, W, 3] float32 |
| Input range | 0.0–1.0 (RGB, normalised) |
| Output (standard) | [1, 2] → [real_probability, spoof_probability] |
| Output (dual-tensor) | clss_pred [1, N] + leaf_node_mask [1, N] — FaceAntiSpoofing style automatically detected |
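For the standard `[1, 2]` contract, reducing the `[real_probability, spoof_probability]` pair (or raw logits) to a single real-face score can be sketched as a softmax; the helper name is illustrative, not package API:

```dart
import 'dart:math';

/// Softmax over a two-element [real, spoof] output; the returned
/// real-face probability is what a threshold like
/// tfliteDeepfakeThreshold would then be compared against.
double realProbability(List<double> output) {
  final exps = output.map(exp).toList();
  final sum = exps.reduce((a, b) => a + b);
  return exps[0] / sum; // index 0 = real_probability per the contract above
}
```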
Internet permission (first launch only) #
The bundled model (~3.9 MB) is downloaded once and cached permanently. Add the internet permission the same way as Face ID:
```xml
<!-- AndroidManifest.xml -->
<uses-permission android:name="android.permission.INTERNET" />
```
iOS does not require an extra permission for network access.
Performance #
| Metric | Value |
|---|---|
| Per-frame latency (mid-range Android) | 40–60 ms |
| Per-frame latency (iPhone 12+) | 20–35 ms |
| Frame processing rate (default) | 20 fps |
| FaceNet inference (first call after load) | ~80 ms |
| FaceNet inference (warm, subsequent) | ~30–50 ms |
| Memory footprint (base, no Face ID) | ~45 MB |
| Memory footprint (with Face ID loaded) | ~90 MB |
| FaceNet model download (one-time) | ~23 MB |
Threading model:
| Work | Thread |
|---|---|
| ML Kit face detection | Main isolate (platform channel requirement) |
| YUV → NV21 + brightness/blur/hash | Background isolate (`compute()`) |
| Face crop + resize + normalise | Background isolate (`compute()`) |
| FaceNet embedding inference | Background isolate (`compute()`) |
| TFLite preprocessing + `invoke()` | Persistent background isolate (never blocks camera) |
| UI rendering | Main thread — never blocked |
Tuning tips:
- Lower `targetFps` to `15` on low-end devices to reduce CPU load
- Use `ResolutionPreset.medium` if 60 fps UI rendering is dropping frames
- Set `enableFaceId: false` if you don't need identity — saves ~45 MB RAM and skips all FaceNet work
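Applied together, a low-end-device profile might look like this (illustrative values taken from the tips above):

```dart
config: LivenessConfig(
  targetFps: 15,                                  // lighter CPU load
  cameraResolution: ResolutionPreset.medium,      // smaller frames
  enableFaceId: false,                            // skip FaceNet: ~45 MB less RAM
),
```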
Security #
| Threat | Mitigation |
|---|---|
| Printed photo | Eye variance + face geometry signals in AntiSpoofEngine |
| Screen replay (looped video) | FNV-1a frame hash sliding-window in FrameHasher |
| Static image held to camera | Duplicate frame detection + micro-motion signal |
| Pre-recorded live video | MiniFASNet-V2 TFLite model (enableVideoReplayDetection: true) + brightness variance + motion jitter heuristics |
| Deepfake / synthetic face | FaceAntiSpoofing TFLite model (enableTFLite: true) |
| Predictable action sequence | Fisher-Yates shuffle per session |
| Session replay attack | sessionId generated with Random.secure() — cryptographically unique |
| Identity spoofing (different person) | FaceNet cosine similarity ≥ threshold; isFaceIdNew signals mismatches |
| Low-quality frames | BT.601 brightness with 6-frame debounce + blur check block all liveness evaluation |
Note: This package provides strong on-device liveness verification. For high-assurance KYC (banking, government), pair `sessionId` and `faceId` with a server-side signature verification step.
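The FNV-1a hashing named in the table can be sketched as follows (64-bit variant; an illustration, not the package's exact implementation):

```dart
/// 64-bit FNV-1a over raw frame bytes. Identical frames hash identically,
/// so a streak of equal hashes inside the sliding window flags a
/// looped-video or static-image replay.
int fnv1a64(List<int> bytes) {
  var hash = 0xcbf29ce484222325; // FNV-64 offset basis
  for (final b in bytes) {
    hash ^= b & 0xff;
    hash *= 0x100000001b3; // FNV-64 prime (wraps to 64 bits on the Dart VM)
  }
  return hash;
}
```

FNV-1a is fast enough to run per frame but is not collision-resistant; it catches exact repeats, which is why the heuristic and TFLite signals still cover re-encoded replays.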
Example App #
The example/ directory contains a full demo app showcasing every feature.
Home screen #
Four challenge presets:
- Standard Verification — Blink · Turn Left · Turn Right
- Extended Challenge — Blink · Look Up · Look Down · Smile
- Full Challenge — Blink · Turn Left · Turn Right · Open Mouth
- With Face ID — Blink · Turn Left with persistent identity
Registered Faces card — appears after the first Face ID scan. Shows all stored FID-XXXX IDs with tap-to-copy. "Clear all" resets the device's face database.
Result screen #
After each successful verification:
| Field | What it shows |
|---|---|
| Confidence Score | Anti-spoof composite % |
| Completed Actions | Which actions were performed |
| Anti-Spoof | Passed / Spoof Detected |
| Face ID card | "New Face Registered" (blue, first time) or "Face Recognised — Welcome Back!" (green, returning) with the FID-XXXX — tap to copy |
| Session ID | Unique audit ID |
| Duration | Session time in seconds |
Run it #
```shell
cd example
flutter run
```
Testing Face ID persistence:
1. Tap "With Face ID" → complete the check → see `FID-XXXX` with the "New Face Registered" banner
2. Copy the Face ID (tap the row)
3. Tap Back → run "With Face ID" again
4. The result shows the exact same `FID-XXXX` with the "Face Recognised — Welcome Back!" banner
5. Close the app completely → reopen → scan again → same ID still returned
This is the core guarantee: one face, one ID, forever.
License #
MIT — see LICENSE
Author #
Developed by Sanjay Sharma
GitHub: sanjaysharmajw/flutter_face_liveness
Issues: github.com/sanjaysharmajw/flutter_face_liveness/issues