# nsfw_detector_flutter 2.0.0
A Flutter package to detect NSFW (Not Safe For Work / nude / adult) content and classify safe content without requiring bundled assets. Easily integrate NSFW detection into your app.
## Changelog
### 2.0.0 - 2026-04-06
#### Breaking Changes
- The `NsfwResult` constructor now uses named required parameters: `NsfwResult(isNsfw: ..., score: ..., safeScore: ...)`.
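Migrating from the 1.x positional constructor is mechanical; a minimal sketch, assuming the field types (`bool`/`double`) implied by the names, which this changelog does not spell out:

```dart
// Before (1.x): positional arguments, e.g. NsfwResult(true, 0.93).
// After (2.0.0): named required parameters, including the new safeScore.
final result = NsfwResult(
  isNsfw: true,
  score: 0.93,     // NSFW-class probability
  safeScore: 0.07, // safe-class probability, new in 2.0.0
);
```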
#### Added
- `NsfwResult.safeScore` — exposes the model's safe-class probability alongside `score`.
- `NsfwClassification` enum (`safe`, `questionable`, `nsfw`) and `NsfwResult.classification` getter.
- `NsfwResult.toJson()`, `NsfwResult.fromJson()`, `NsfwResult.copyWith()`.
- `NsfwDetector.initialize()` / `NsfwDetector.instance` / `NsfwDetector.disposeInstance()` singleton API.
- `NsfwDetector.detectBytesInBackground()` — runs detection in a background isolate via `compute()`.
- `NsfwDetector.detectNSFWFromXFile(XFile)` — direct support for `image_picker`/`camera` output.
- `NsfwDetector.detectNSFWFromUrl(Uri)` — detects NSFW content from a network image URL (10 s timeout).
- `NsfwDetector.detectBatch(List<Uint8List>)` — scans multiple images in one call.
- `NsfwDetector.load(useGpu: true)` — optional GPU delegate with automatic CPU fallback.
- `NsfwDetectorException` — structured exception wrapping all detector errors with `cause` and `stackTrace`.
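The new singleton and `XFile` APIs above might fit together as follows. This is a hedged sketch: the return type (`Future<NsfwResult?>`), the exact `initialize()` signature, and the `XFile` re-export are assumptions not confirmed by this changelog.

```dart
import 'package:image_picker/image_picker.dart';
import 'package:nsfw_detector_flutter/nsfw_detector_flutter.dart';

Future<void> scanPickedImage(XFile picked) async {
  // One-time setup; later calls reuse the shared instance.
  await NsfwDetector.initialize();
  final detector = NsfwDetector.instance;

  // Direct support for image_picker/camera output (new in 2.0.0).
  final NsfwResult? result = await detector.detectNSFWFromXFile(picked);
  if (result == null) return;

  // classification buckets the raw scores into safe/questionable/nsfw.
  switch (result.classification) {
    case NsfwClassification.safe:
      print('safe (safeScore: ${result.safeScore})');
    case NsfwClassification.questionable:
      print('needs review (score: ${result.score})');
    case NsfwClassification.nsfw:
      print('blocked (score: ${result.score})');
  }
}
```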
#### Fixed
- `detectNSFWFromFile` was silently returning `null` for non-JPEG formats (was using `decodeJpg`; now uses `decodeImage`).
- `detectNSFWFromFile` was blocking the UI thread with synchronous file I/O (now uses `await readAsBytes()`).
- Calling detect methods after `close()` now throws a `StateError` instead of crashing natively.
- Concurrent inference on the same instance now throws a `StateError` instead of producing undefined behavior.
- The `threshold` parameter now validates its range (rejects NaN, infinity, and values outside 0.0–1.0).
- Empty `Uint8List` input to `detectNSFWFromBytes` now throws an `ArgumentError` instead of returning `null`.
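Because 2.0.0 throws instead of returning `null` or crashing, callers may want a defensive wrapper. A sketch under the same assumptions as above (`safeDetect` is a hypothetical helper, not part of the package):

```dart
import 'dart:typed_data';

import 'package:nsfw_detector_flutter/nsfw_detector_flutter.dart';

Future<NsfwResult?> safeDetect(NsfwDetector detector, Uint8List bytes) async {
  if (bytes.isEmpty) {
    // 2.0.0 throws ArgumentError for empty input instead of returning null.
    return null;
  }
  try {
    return await detector.detectNSFWFromBytes(bytes);
  } on StateError {
    // Thrown when the detector was already close()d or is mid-inference.
    return null;
  } on NsfwDetectorException catch (e) {
    // Structured wrapper: inspect e.cause / e.stackTrace for the root error.
    print('detection failed: ${e.cause}');
    return null;
  }
}
```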
#### Changed
- Flutter SDK constraint tightened to `>=3.10.0` (was `>=1.17.0`).
### 1.0.2 - 2024-07-06
#### Updated
- Updated documentation to include new parameters for NSFW classification.