# nsfw_detect
Enterprise-grade, on-device NSFW/nudity detection for iOS and Android photo libraries. Native CoreML inference on iOS (Apple Neural Engine), TensorFlow Lite on Android — progressive result streaming, body part detection, and ready-to-use UI widgets.
All inference runs on-device. No images or scan results ever leave the device.
## Features
- On-device ML — CoreML + Vision + Apple Neural Engine (iOS), TensorFlow Lite (Android)
- Photo library scanning — images, videos, Live Photos
- Progressive streaming — results arrive as each asset is classified, not in a batch
- Video frame sampling — uniform temporal sampling with hard-threshold fast-exit
- Body part detection — optional fine-grained YOLO-based detection with bounding boxes
- Pluggable models — ships with OpenNSFW2, swap in Falconsai, AdamCodd, or your own
- Ready-to-use widgets — `NsfwGalleryView`, `NsfwResultBadge`, `NsfwScanProgressBar`
- Headless API — use `NsfwDetector.instance` directly without any UI widgets
## Requirements
| | Minimum |
|---|---|
| iOS | 16.0+ |
| Android | API 24 (Android 7.0+) |
| Flutter | 3.22+ |
| Dart | 3.4+ |
| Xcode | 15+ |
## Installation

```yaml
dependencies:
  nsfw_detect: ^1.1.0
```
### iOS setup

Add to your app's `Info.plist`:

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs access to your photo library.</string>
```

Ensure your Podfile targets iOS 16 or higher:

```ruby
platform :ios, '16.0'
```
### Android setup

Add to `android/app/src/main/AndroidManifest.xml`:

```xml
<!-- API 33+ -->
<uses-permission android:name="android.permission.READ_MEDIA_IMAGES" />
<uses-permission android:name="android.permission.READ_MEDIA_VIDEO" />

<!-- API < 33 -->
<uses-permission
    android:name="android.permission.READ_EXTERNAL_STORAGE"
    android:maxSdkVersion="32" />
```
## Quick Start

```dart
import 'package:nsfw_detect/nsfw_detect.dart';

// 1. Request permission
final status = await NsfwDetector.instance.requestPermission();
if (status != PhotoLibraryPermissionStatus.authorized &&
    status != PhotoLibraryPermissionStatus.limited) {
  return; // handle denial
}

// 2. Configure and start scan
final session = await NsfwDetector.instance.startScan(
  const ScanConfiguration(
    confidenceThreshold: 0.7,
    includeVideos: true,
    maxVideoFrames: 8,
    concurrency: 4,
  ),
);

// 3. Stream results as they arrive
session.results.listen((ScanResult result) {
  if (result.isNsfw) {
    print('NSFW: ${result.item.localIdentifier} '
        '${result.topCategory.displayName} '
        '(${(result.topConfidence * 100).toStringAsFixed(1)}%)');
  }
});

// 4. Track progress
session.progress.listen((ScanProgress p) {
  print('${p.scannedCount} / ${p.totalCount}');
});

// 5. Await completion
final ScanSummary summary = await session.done;
print('Done — ${summary.nsfwCount} NSFW of ${summary.totalScanned} '
    'in ${summary.elapsed.inSeconds}s');
```
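If permission is denied, the plugin cannot open system settings for you. One common approach is to route the user there with the separate `permission_handler` package (an assumption; it is not part of `nsfw_detect`):

```dart
import 'package:nsfw_detect/nsfw_detect.dart';
// openAppSettings() comes from permission_handler, not from nsfw_detect.
import 'package:permission_handler/permission_handler.dart';

/// Sketch: if the user declined, send them to the app's settings page.
Future<void> handleDenied(PhotoLibraryPermissionStatus status) async {
  if (status != PhotoLibraryPermissionStatus.authorized &&
      status != PhotoLibraryPermissionStatus.limited) {
    await openAppSettings();
  }
}
```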
### Cancel a scan

```dart
await session.cancel();
```
### Scan a single asset

```dart
final ScanResult result = await NsfwDetector.instance.scanAsset(
  'CC95F08C-88C3-4012-9D6D-64A413D254B3/L0/001',
  confidenceThreshold: 0.8,
);
print(result.topCategory.displayName); // "safe", "nudity", etc.
```
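Single-asset scans can fail (missing asset, iCloud-only original, unloaded model). This README does not specify the error surface, so the sketch below defensively assumes failures arrive as a `PlatformException`:

```dart
import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';
import 'package:nsfw_detect/nsfw_detect.dart';

Future<ScanResult?> tryScanAsset(String localIdentifier) async {
  try {
    return await NsfwDetector.instance.scanAsset(
      localIdentifier,
      confidenceThreshold: 0.8,
    );
  } on PlatformException catch (e) {
    // Error codes are plugin-specific; log and treat as unscannable.
    debugPrint('scanAsset failed for $localIdentifier: ${e.code}');
    return null;
  }
}
```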
## Widgets
### NsfwGalleryView
Drop-in gallery that handles permissions, scanning, and live display:
```dart
NsfwGalleryView(
  initialConfig: const ScanConfiguration(confidenceThreshold: 0.7),
  theme: const NsfwGalleryTheme(
    nsfwColor: Colors.red,
    badgeOpacity: 0.88,
  ),
  crossAxisCount: 3,
  badgeStyle: BadgeStyle.compact,
  blurNsfwTiles: true,
  onResultTap: (result) => Navigator.push(
    context,
    MaterialPageRoute(builder: (_) => MyDetailScreen(result: result)),
  ),
  onScanComplete: (summary) {
    ScaffoldMessenger.of(context).showSnackBar(
      SnackBar(content: Text('${summary.nsfwCount} NSFW items found')),
    );
  },
)
```
#### Custom thumbnails

Provide a thumbnail widget per item — useful with packages like
`photo_manager_image_provider`:

```dart
NsfwGalleryView(
  thumbnailBuilder: (context, item) {
    final entity = AssetEntity(
      id: item.localIdentifier,
      typeInt: item.type == MediaType.video ? 2 : 1,
      width: item.width ?? 300,
      height: item.height ?? 300,
    );
    return AssetEntityImage(
      entity,
      isOriginal: false,
      thumbnailSize: const ThumbnailSize.square(300),
      fit: BoxFit.cover,
    );
  },
)
```
#### Custom tile rendering

Override the full tile while keeping all scan logic:

```dart
NsfwGalleryView(
  tileBuilder: (context, item, result, defaultTile) {
    return Stack(
      children: [
        defaultTile,
        if (result?.isNsfw == true)
          Positioned.fill(
            child: Container(
              color: Colors.red.withValues(alpha: 0.4),
            ),
          ),
      ],
    );
  },
)
```
### NsfwResultBadge

Standalone badge for any `ScanResult` — pass `null` for a scanning animation:

```dart
NsfwResultBadge(
  result: scanResult,
  style: BadgeStyle.detailed, // compact | detailed | iconOnly | minimal
  theme: NsfwGalleryTheme.defaults,
)
```
### NsfwScanProgressBar

```dart
NsfwScanProgressBar(
  progressStream: session.progress,
  style: ProgressBarStyle.linear, // linear | compact | textOnly
  showItemCount: true,
)
```
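As a usage sketch, here is a minimal screen that starts a scan and feeds the session's `progress` stream into the bar. Only names shown above are used; the screen itself is illustrative:

```dart
import 'package:flutter/material.dart';
import 'package:nsfw_detect/nsfw_detect.dart';

class ScanScreen extends StatefulWidget {
  const ScanScreen({super.key});

  @override
  State<ScanScreen> createState() => _ScanScreenState();
}

class _ScanScreenState extends State<ScanScreen> {
  ScanSession? _session;

  Future<void> _start() async {
    final session = await NsfwDetector.instance.startScan(
      const ScanConfiguration(confidenceThreshold: 0.7),
    );
    setState(() => _session = session);
  }

  @override
  void dispose() {
    _session?.cancel(); // stop any in-flight scan when leaving the screen
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Column(
      children: [
        ElevatedButton(onPressed: _start, child: const Text('Scan library')),
        if (_session != null)
          NsfwScanProgressBar(
            progressStream: _session!.progress,
            style: ProgressBarStyle.linear,
            showItemCount: true,
          ),
      ],
    );
  }
}
```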
## Theming

```dart
const NsfwGalleryTheme(
  safeColor: Color(0xFF4CAF50),
  suggestiveColor: Color(0xFFFF9800),
  nsfwColor: Color(0xFFF44336),
  explicitColor: Color(0xFF9C27B0),
  pendingColor: Color(0xFF9E9E9E),
  badgeOpacity: 0.85,
  tileBorderRadius: BorderRadius.all(Radius.circular(8)),
  scaffoldBackgroundColor: Colors.black,
)
```
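`NsfwGalleryTheme` is plain data, so you can also derive it from your app's `ColorScheme` instead of hard-coding colors. A sketch using only the fields documented above, assuming all constructor parameters are optional (the earlier two-field example suggests they are):

```dart
import 'package:flutter/material.dart';
import 'package:nsfw_detect/nsfw_detect.dart';

/// Derive the gallery theme from the ambient Material ColorScheme.
NsfwGalleryTheme galleryThemeOf(BuildContext context) {
  final scheme = Theme.of(context).colorScheme;
  return NsfwGalleryTheme(
    nsfwColor: scheme.error, // reuse Material's error color for flagged tiles
    pendingColor: scheme.outline,
    badgeOpacity: 0.85,
    scaffoldBackgroundColor: scheme.surface,
  );
}
```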
## Models
The plugin ships with OpenNSFW2 (CoreML, ~11 MB, bundled — no download needed).
### List available models

```dart
final List<ModelDescriptor> models = await NsfwDetector.instance.availableModels();
for (final m in models) {
  print('${m.id}: ${m.displayName} — available: ${m.isAvailable}');
}
```
### Preload a model

```dart
await NsfwDetector.instance.preloadModel(ModelIds.openNsfw2);
```
### Download an additional model

```dart
final bool ok = await NsfwDetector.instance.downloadModel(
  ModelIds.falconsai,
  url: 'https://your-cdn.example.com/falconsai_nsfw.zip',
);
```
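Putting the model calls together: a sketch that downloads whatever is not yet on-device and warms up the bundled default. The `modelUrls` map and its URLs are hypothetical; you host model files yourself:

```dart
import 'package:nsfw_detect/nsfw_detect.dart';

// Hypothetical: map model ID strings to archives hosted on your own CDN.
const modelUrls = <String, String>{
  'falconsai_nsfw': 'https://your-cdn.example.com/falconsai_nsfw.zip',
  'adamcodd_nsfw': 'https://your-cdn.example.com/adamcodd_nsfw.zip',
};

Future<void> ensureModels() async {
  final models = await NsfwDetector.instance.availableModels();
  for (final m in models) {
    final url = modelUrls[m.id];
    if (!m.isAvailable && url != null) {
      // Assumes ModelIds constants are plain ID strings, as the table
      // below suggests, so m.id is accepted here.
      final ok = await NsfwDetector.instance.downloadModel(m.id, url: url);
      if (!ok) print('Download failed for ${m.id}');
    }
  }
  // Warm up the bundled default before the first scan.
  await NsfwDetector.instance.preloadModel(ModelIds.openNsfw2);
}
```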
### Model IDs

| Constant | ID string | Notes |
|---|---|---|
| `ModelIds.openNsfw2` | `opennsfw2_coreml` | Bundled, no download |
| `ModelIds.falconsai` | `falconsai_nsfw` | Requires download |
| `ModelIds.adamcodd` | `adamcodd_nsfw` | Requires download |
## Classification Categories

| Category | isNsfw | Description |
|---|---|---|
| `safe` | false | No concerning content |
| `suggestive` | false | Revealing but not explicit |
| `nudity` | true | Nudity detected |
| `explicitNudity` | true | Explicit sexual content |
| `unknown` | false | Classification failed / unrecognized output |
```dart
// Top result
print(result.topCategory.displayName);
print(result.topConfidence);

// Per-category confidence
final double conf = result.confidenceFor(NsfwCategory.nudity);

// All labels sorted by confidence
for (final label in result.labels) {
  print('${label.category.displayName}: ${label.confidence}');
}
```
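A common pattern is mapping categories to a UI policy. The sketch below uses only the enum values from the table; whether to blur `unknown` results is your call, and it is done defensively here:

```dart
import 'package:nsfw_detect/nsfw_detect.dart';

/// Whether a tile should be blurred. `unknown` is blurred defensively,
/// even though its isNsfw flag is false.
bool shouldBlur(NsfwCategory category) {
  switch (category) {
    case NsfwCategory.nudity:
    case NsfwCategory.explicitNudity:
    case NsfwCategory.unknown:
      return true;
    case NsfwCategory.safe:
    case NsfwCategory.suggestive:
      return false;
  }
}
```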
## Body Part Detection

When available, `ScanResult` includes fine-grained `BodyPartDetection` entries:

```dart
if (result.hasDetections) {
  final summary = result.detectionSummary!;
  print('Severity: ${summary.severity.name}'); // safe | suggestive | nudity | explicit
  print('Explicit detections: ${summary.explicitDetections.length}');
  for (final d in result.detections!) {
    print('${d.displayName} — ${(d.confidence * 100).toStringAsFixed(1)}% '
        '(${d.severity.name})');
  }
}
```
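The feature list mentions bounding boxes, but this README does not show the field. Assuming each `BodyPartDetection` exposes a hypothetical `boundingBox` Rect normalized to 0..1, an overlay painter could look like this:

```dart
import 'package:flutter/material.dart';
import 'package:nsfw_detect/nsfw_detect.dart';

// Hypothetical: assumes BodyPartDetection has a `boundingBox` Rect with
// coordinates normalized to 0..1. Verify against the actual API.
class DetectionOverlayPainter extends CustomPainter {
  DetectionOverlayPainter(this.detections);
  final List<BodyPartDetection> detections;

  @override
  void paint(Canvas canvas, Size size) {
    final paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 2
      ..color = Colors.red;
    for (final d in detections) {
      final r = d.boundingBox; // hypothetical field
      canvas.drawRect(
        Rect.fromLTWH(r.left * size.width, r.top * size.height,
            r.width * size.width, r.height * size.height),
        paint,
      );
    }
  }

  @override
  bool shouldRepaint(DetectionOverlayPainter old) =>
      old.detections != detections;
}
```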
## Video Scanning
| Clip length | Sampling strategy |
|---|---|
| < 3 s | Frame every 0.5 s |
| ≥ 3 s | Uniform temporal sampling, always includes near-start and near-end |
| Any | Hard-threshold fast-exit: score > 0.9 on any frame → immediately flagged |
Center-weighted aggregation reduces false positives from title cards or transitions.
```dart
ScanConfiguration(
  maxVideoFrames: 12,       // max frames to sample, default: 8
  videoFrameInterval: 1.5,  // seconds between samples, default: 2.0
  includeVideos: true,
  includeLivePhotos: true,
)
```
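Aggregation happens in native code; purely for intuition, here is a Dart sketch of what center-weighted scoring can mean: frames near the clip's middle count more than frames near its edges. The weighting formula is illustrative, not the plugin's exact implementation:

```dart
import 'dart:math' as math;

/// Illustrative only: the documented hard-threshold fast-exit, plus a
/// center-weighted mean so title cards and end transitions count less.
double aggregateFrameScores(List<double> frameScores) {
  if (frameScores.isEmpty) return 0;
  // Fast-exit: any frame above 0.9 flags the whole clip.
  if (frameScores.any((s) => s > 0.9)) return frameScores.reduce(math.max);
  var weighted = 0.0, totalWeight = 0.0;
  for (var i = 0; i < frameScores.length; i++) {
    final t = frameScores.length == 1
        ? 0.5
        : i / (frameScores.length - 1);    // position in [0, 1]
    final w = math.sin(t * math.pi) + 0.1; // peaks mid-clip, never zero
    weighted += frameScores[i] * w;
    totalWeight += w;
  }
  return weighted / totalWeight;
}
```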
## Performance

On iOS, images are submitted to CoreML in batches using `MLModel.predictions(from:)`.
This reduces Apple Neural Engine and GPU setup overhead from once per image to once per
batch, resulting in 1.5–3× faster throughput on large photo libraries compared to
per-image inference.

The batch size matches `ScanConfiguration.concurrency` (default: 4). No code changes
are needed to benefit from this.

If you encounter device-specific issues, set `disableBatchPrediction: true` in
`ScanConfiguration` to revert to the previous per-image path:

```dart
ScanConfiguration(disableBatchPrediction: true)
```
## Architecture

```
Flutter app
     │
Dart API ──────── NsfwDetector · ScanSession · ScanResult · ScanSummary
     │
Dart widgets ──── NsfwGalleryView · NsfwResultBadge · NsfwScanProgressBar
     │
Platform layer ── NsfwPlatformInterface (abstract)
     │                └── NsfwMethodChannel
     │
iOS native ──────── CoreML + Vision · VideoFrameSampler · ModelRegistry
Android native ──── TensorFlow Lite · MediaStore · ScanSessionTask
```

Channels:

- `nsfw_detect_ios/methods` — commands: start, cancel, permissions, model management
- `nsfw_detect_ios/scan_events` — streaming results + progress (EventChannel)
## Testing

```sh
# Unit tests (32 tests)
flutter test

# Integration tests (requires a real device with photos)
cd example && flutter test integration_test/
```
## Privacy
- All ML inference runs on-device — CoreML / TensorFlow Lite, no network calls
- No telemetry, analytics, or automatic data transmission of any kind
- Scan results are never persisted by the plugin — that is the app's responsibility
- Photo access uses the minimum required permission scope
## License
MIT © 2024 — see LICENSE