# Sensitive Content Analysis
Provide a safer experience in your app by detecting and alerting users to nudity in images and videos before displaying them onscreen.

A Dart plugin for interacting with Apple's SensitiveContentAnalysis framework.
**Minimum requirements**

- iOS/iPadOS >= 17.0
- macOS >= 14.0
## Install
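Add the package to your `pubspec.yaml`:

```yaml
dependencies:
  sensitive_content_analysis: ^1.1.0-rc.2
```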
Add the app entitlement:

The OS requires the `com.apple.developer.sensitivecontentanalysis.client` entitlement in your app's code signature to use SensitiveContentAnalysis; without it, calls to the framework fail to return positive results. You can add this entitlement by enabling the Sensitive Content Analysis capability in Xcode, which adds the following keys to your app's `.entitlements` file:
```xml
<key>com.apple.developer.sensitivecontentanalysis.client</key>
<array>
	<string>analysis</string>
</array>
```
## Usage
### Check Policy
```dart
final sca = SensitiveContentAnalysis();

// Check the device's Sensitive Content Analysis policy.
int? policy = await sca.checkPolicy();
if (policy != null) {
  return policy;
}
```
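The meaning of the returned integer is not spelled out here; if the plugin passes through Apple's `SCSensitivityAnalysisPolicy` raw value (an assumption, not confirmed by this README), branching on it could look like the sketch below, where `describePolicy` is a hypothetical helper:

```dart
// Assumption: `policy` is SCSensitivityAnalysisPolicy.rawValue
// (0 = disabled, 1 = simpleInterventions, 2 = descriptiveInterventions).
String describePolicy(int policy) {
  switch (policy) {
    case 0:
      return 'Analysis disabled: "Sensitive Content Warning" is off in Settings.';
    case 1:
      return 'Simple interventions: Sensitive Content Warning is enabled.';
    case 2:
      return 'Descriptive interventions: Communication Safety is enabled.';
    default:
      return 'Unknown policy: $policy';
  }
}
```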
### Analyze File Image
```dart
import 'dart:typed_data';

import 'package:flutter/foundation.dart';
import 'package:image_picker/image_picker.dart';
import 'package:sensitive_content_analysis/sensitive_content_analysis.dart';

Future<bool?> analyzeFileImage() async {
  try {
    final sca = SensitiveContentAnalysis();
    final ImagePicker picker = ImagePicker();

    // Pick an image.
    final XFile? image = await picker.pickImage(source: ImageSource.gallery);
    if (image != null) {
      final Uint8List imageData = await image.readAsBytes();

      // Analyze the image for sensitive content.
      final bool? isSensitive = await sca.analyzeImage(imageData);
      if (isSensitive != null) {
        return isSensitive;
      } else {
        debugPrint('Enable "Sensitive Content Warning" in Settings -> Privacy & Security.');
      }
    }
    return null;
  } catch (e) {
    return null;
  }
}
```
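What you do with the result is app-specific; as a sketch, you might gate rendering on it (the `buildSafeImage` helper below is ours, not part of the plugin):

```dart
import 'dart:typed_data';

import 'package:flutter/material.dart';

// Hypothetical helper: show a warning instead of an image that was
// flagged as sensitive; render it normally otherwise.
Widget buildSafeImage(Uint8List bytes, bool? isSensitive) {
  if (isSensitive == true) {
    return const Icon(Icons.warning_amber_rounded, size: 48);
  }
  return Image.memory(bytes);
}
```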
### Analyze Network Image

**Install the test profile:** for testing purposes, Apple provides a test profile that makes the framework flag a special QR-code image, so you can verify detection without having to install actual NSFW content.
```dart
import 'package:flutter/foundation.dart';
import 'package:sensitive_content_analysis/sensitive_content_analysis.dart';

Future<bool?> analyzeNetworkImage() async {
  // Apple's sample QR-code image, flagged when the test profile is installed.
  const String analyzeUrl =
      "https://docs-assets.developer.apple.com/published/517e263450/rendered2x-1685188934.png";
  try {
    final sca = SensitiveContentAnalysis();

    // Analyze the network image for sensitive content.
    final bool? isSensitive = await sca.analyzeNetworkImage(url: analyzeUrl);
    if (isSensitive != null) {
      return isSensitive;
    } else {
      debugPrint('Enable "Sensitive Content Warning" in Settings -> Privacy & Security.');
      return null;
    }
  } catch (e) {
    return null;
  }
}
```
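You can also combine `checkPolicy` with analysis to skip work when the feature is off; a minimal sketch, assuming a null or zero policy means analysis is unavailable (an assumption, see Check Policy above):

```dart
Future<bool?> analyzeIfEnabled(String url) async {
  final sca = SensitiveContentAnalysis();
  // Assumption: null or 0 means Sensitive Content Analysis is disabled.
  final int? policy = await sca.checkPolicy();
  if (policy == null || policy == 0) {
    return null;
  }
  return sca.analyzeNetworkImage(url: url);
}
```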
### Analyze Network Video
```dart
import 'dart:io';

import 'package:dio/dio.dart';
import 'package:flutter/foundation.dart';
import 'package:path/path.dart' as p;
import 'package:path_provider/path_provider.dart';
import 'package:sensitive_content_analysis/sensitive_content_analysis.dart';

Future<void> analyzeNetworkVideo() async {
  try {
    final sca = SensitiveContentAnalysis();
    final Dio dio = Dio();
    final Directory tempDir = await getTemporaryDirectory();

    // Download Apple's sample video to a temporary file.
    const url = "https://developer.apple.com/sample-code/web/qr-sca.mov";
    final videoName = p.basename(url);
    final file = File("${tempDir.path}/$videoName");
    final response = await dio.download(url, file.path);
    if (response.statusCode == 200) {
      // Analyze the downloaded video for sensitive content.
      final bool? isSensitive = await sca.analyzeVideo(url: file.path);
      debugPrint("SENSITIVE: $isSensitive");
      await file.delete();
    }
  } catch (e) {
    debugPrint(e.toString());
  }
}
```
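Note that `analyzeVideo` expects a local file path, which is why the sample video is downloaded to a temporary file first and deleted once analysis completes.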
### Analyze Local Video
```dart
import 'package:file_selector/file_selector.dart';
import 'package:flutter/foundation.dart';
import 'package:sensitive_content_analysis/sensitive_content_analysis.dart';

Future<void> analyzeLocalVideo() async {
  try {
    final sca = SensitiveContentAnalysis();

    // Let the user pick a local video file.
    const XTypeGroup typeGroup = XTypeGroup(
      label: 'video',
      extensions: <String>['mp4', 'mkv', 'avi', 'mov'],
    );
    final XFile? selectedFile =
        await openFile(acceptedTypeGroups: <XTypeGroup>[typeGroup]);
    if (selectedFile != null) {
      // Analyze the selected video for sensitive content.
      final bool? isSensitive = await sca.analyzeVideo(url: selectedFile.path);
      debugPrint("SENSITIVE: $isSensitive");
    }
  } catch (e) {
    debugPrint(e.toString());
  }
}
```
## Caveats
Unlike other ML models, the SensitiveContentAnalysis framework:

- Does not return a list of probabilities.
- Does not allow additional training.
- Is not open source.
- Only works on Apple devices (iOS 17.0+, iPadOS 17.0+, macOS 14.0+, Mac Catalyst 17.0+).
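Since the framework is Apple-only, you may want to guard calls from shared code; a minimal sketch using `dart:io` (the getter name is ours, and it does not check OS versions, which must still meet the minimums above):

```dart
import 'dart:io' show Platform;

// Coarse platform guard; the OS version (iOS 17.0+ / macOS 14.0+) must
// still be checked separately.
bool get canUseSensitiveContentAnalysis => Platform.isIOS || Platform.isMacOS;
```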
Notice: This package was initially created for in-house use; as such, development is first and foremost aligned with our internal requirements.