flutter_meta_wearables_dat #
A Flutter plugin that provides a bridge to Meta's Wearables Device Access Toolkit (DAT), enabling integration with Meta AI Glasses for iOS and Android.
Publishing disclaimer #
The Meta Wearables Device Access Toolkit is currently in developer preview. During this phase:
- You can use the SDK to build, prototype, and test your app.
- You can distribute to testers within your organization or team (e.g. via the beta testing platform in the Meta Wearables Developer Center).
- Publishing to the general public is limited: only select partners can publish their DAT integrations to public app stores. Most apps using DAT cannot be published publicly yet.
Meta is running the preview to test, learn, and refine the toolkit; broader publishing (general availability) is planned for 2026. For full details, see Introducing the Meta Wearables Device Access Toolkit and the Meta Wearables FAQ.
Compatible devices #
- Ray-Ban Meta (Gen 1 & 2)
- Meta Ray-Ban Display
- Oakley Meta HSTN
- Oakley Meta Vanguard
Setup #
Glasses setup (Developer mode) #
To set up your glasses for development, you must enable Developer mode in the Meta AI app. See Enable developer mode in the Meta AI app for instructions.
iOS Configuration #
Minimum deployment target: iOS 17.0
Add the following to your Info.plist:
```xml
<key>NSBluetoothAlwaysUsageDescription</key>
<string>Needed to connect to Meta AI Glasses</string>

<key>LSApplicationQueriesSchemes</key>
<array>
  <string>fb-viewapp</string>
</array>

<key>UISupportedExternalAccessoryProtocols</key>
<array>
  <string>com.meta.ar.wearable</string>
</array>

<key>UIBackgroundModes</key>
<array>
  <string>bluetooth-peripheral</string>
  <string>external-accessory</string>
</array>

<!-- Deep link callback from Meta AI app - scheme must match AppLinkURLScheme below -->
<key>CFBundleURLTypes</key>
<array>
  <dict>
    <key>CFBundleURLSchemes</key>
    <array>
      <string>myexampleapp</string>
    </array>
  </dict>
</array>

<!-- Meta Wearables Device Access Toolkit Setup -->
<key>MWDAT</key>
<dict>
  <key>AppLinkURLScheme</key>
  <!-- Must match CFBundleURLSchemes above so Meta AI redirects to a URL this app handles -->
  <string>myexampleapp://</string>
  <key>MetaAppID</key>
  <!-- Without Developer Mode, use the ID from the app registered in Wearables Developer Center -->
  <string>YOUR_APP_ID</string>
  <key>ClientToken</key>
  <!-- Without Developer Mode, use the ClientToken from Wearables Developer Center -->
  <string>YOUR_CLIENT_TOKEN</string>
  <key>TeamID</key>
  <!-- Your Apple Developer Team ID - set this in Xcode under Signing & Capabilities -->
  <string>$(DEVELOPMENT_TEAM)</string>
  <key>Analytics</key>
  <dict>
    <key>OptOut</key>
    <true/>
  </dict>
</dict>
```
Android Configuration #
1. AndroidManifest.xml
Add the following to your app's AndroidManifest.xml:
```xml
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<uses-permission android:name="android.permission.INTERNET" />

<application>
  <!-- Required: Your application ID from Wearables Developer Center -->
  <!-- Use "0" for Developer Mode -->
  <meta-data
    android:name="com.meta.wearable.mwdat.APPLICATION_ID"
    android:value="0" />

  <!-- Optional: Disable analytics -->
  <meta-data
    android:name="com.meta.wearable.mwdat.ANALYTICS_OPT_OUT"
    android:value="true" />

  <!-- Deep link callback from Meta AI app -->
  <activity android:name=".MainActivity" android:launchMode="singleTop">
    <intent-filter>
      <action android:name="android.intent.action.VIEW" />
      <category android:name="android.intent.category.BROWSABLE" />
      <category android:name="android.intent.category.DEFAULT" />
      <data android:scheme="myexampleapp" />
    </intent-filter>
  </activity>
</application>
```
2. Repository Configuration
Add the GitHub Packages repository to your settings.gradle.kts. First, add the necessary imports at the top of the file:
```kotlin
import java.util.Properties
import kotlin.io.path.div
import kotlin.io.path.exists
import kotlin.io.path.inputStream
```
Then add the repository configuration:
```kotlin
val localProperties =
    Properties().apply {
        val localPropertiesPath = rootDir.toPath() / "local.properties"
        if (localPropertiesPath.exists()) {
            load(localPropertiesPath.inputStream())
        }
    }

dependencyResolutionManagement {
    // Flutter's Gradle plugin adds a maven repo at the project level.
    repositoriesMode.set(RepositoriesMode.PREFER_SETTINGS)
    repositories {
        google()
        mavenCentral()
        maven {
            url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
            credentials {
                username = "" // not needed
                password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
            }
        }
    }
}
```
Note: We use `PREFER_SETTINGS` instead of `FAIL_ON_PROJECT_REPOS` because Flutter's Gradle plugin needs to add repositories at the project level.
Set a GitHub token with `read:packages` scope via either:
- Environment variable: `GITHUB_TOKEN`
- Or in `local.properties`: `github_token=your_token_here`
3. MainActivity configuration
Wearables permission requests use `Wearables.RequestPermissionContract`, which requires the hosting Android `Activity` to be a `ComponentActivity`. In a Flutter app this means you must extend `FlutterFragmentActivity` (which itself extends `FragmentActivity` → `ComponentActivity`), not `FlutterActivity`.
Update your android/app/src/main/kotlin/.../MainActivity.kt (or .java) to:
```kotlin
package com.yourcompany.yourapp

import io.flutter.embedding.android.FlutterFragmentActivity

class MainActivity : FlutterFragmentActivity()
```
If you keep using `FlutterActivity`, the DAT permission sheet will not be able to register an `ActivityResultLauncher`, and camera permission requests will fail.
Meta Wearables Developer Center #
Add and configure your app in the Meta Wearables Developer Center to obtain your `MetaAppID` and complete the setup.
Integration Lifecycle #
The plugin follows Meta's integration lifecycle as documented in the Meta Wearables Developer Documentation:
0. Android Permissions (Android only) #
- Call `MetaWearablesDat.requestAndroidPermissions()` before any other DAT calls
- This requests the Bluetooth and Internet runtime permissions required by the DAT SDK
- Important: On Android, the DAT SDK is not initialized until these permissions are granted. This is critical for device discovery to work correctly.
- No-op on iOS
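The step above can be sketched as follows (a sketch, not the definitive setup: the import path is assumed from the package name, and `MyApp` is a placeholder for your root widget):

```dart
import 'package:flutter/material.dart';
// Assumed import path, derived from the package name.
import 'package:flutter_meta_wearables_dat/flutter_meta_wearables_dat.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();

  // Android: shows the Bluetooth/Internet runtime permission prompts and
  // initializes the DAT SDK. iOS: no-op, so the call is safe everywhere.
  await MetaWearablesDat.requestAndroidPermissions();

  runApp(const MyApp()); // MyApp is your app's root widget (placeholder).
}
```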
1. Registration (One-time) #
- User taps a call-to-action in your app (e.g., "Connect my glasses")
- Call `MetaWearablesDat.startRegistration()` to open the Meta AI app
- User confirms the connection in the Meta AI app
- Meta AI returns to your app via deep link
- Handle the callback URL with `MetaWearablesDat.handleUrl(url)` to complete registration
- Monitor registration state via `MetaWearablesDat.registrationStateStream()`
- Monitor active device availability via `MetaWearablesDat.activeDeviceStream()`
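Put together, the registration flow might be wired up like the sketch below. How the deep-link URL reaches your Dart code is app-specific; `onIncomingLink` is a hypothetical stand-in for whatever deep-link handling you use (e.g. the `app_links` package), and the import path is assumed from the package name:

```dart
import 'dart:async';
// Assumed import path, derived from the package name.
import 'package:flutter_meta_wearables_dat/flutter_meta_wearables_dat.dart';

class GlassesRegistration {
  StreamSubscription? _regSub;
  StreamSubscription? _deviceSub;

  Future<void> connectGlasses() async {
    // Opens the Meta AI app, where the user confirms the connection.
    await MetaWearablesDat.startRegistration();
  }

  // Call this with the deep-link URL (myexampleapp://...) that the
  // Meta AI app redirects back to after the user confirms.
  Future<void> onIncomingLink(String url) async {
    await MetaWearablesDat.handleUrl(url);
  }

  void observe() {
    _regSub = MetaWearablesDat.registrationStateStream().listen((state) {
      // React to registration state changes here.
    });
    _deviceSub = MetaWearablesDat.activeDeviceStream().listen((device) {
      // React to the active device becoming (un)available here.
    });
  }

  void dispose() {
    _regSub?.cancel();
    _deviceSub?.cancel();
  }
}
```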
2. Permissions (First-time camera access) #
- When your app first attempts to access the camera, request permission
- Call `MetaWearablesDat.requestCameraPermission()` to show the Meta AI permission bottom sheet
- User can allow always, allow once, or deny
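A minimal sketch of requesting camera permission before streaming (the return value of `requestCameraPermission()` is not specified above, so this sketch simply awaits the call; the import path is assumed from the package name):

```dart
// Assumed import path, derived from the package name.
import 'package:flutter_meta_wearables_dat/flutter_meta_wearables_dat.dart';

Future<void> ensureCameraAccess() async {
  // Presents the Meta AI permission bottom sheet; the user can allow
  // always, allow once, or deny. Await it before starting a session.
  await MetaWearablesDat.requestCameraPermission();
}
```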
3. Session (After registration and permissions) #
- Once registered and permissions are granted, start a streaming session
- Call `MetaWearablesDat.startStreamSession(deviceUUID)` — returns a `textureId`
- Render the live video feed using Flutter's `Texture` widget with the returned ID
- Call `MetaWearablesDat.stopStreamSession(deviceUUID)` to end the session
```dart
// Start streaming — returns a texture ID for zero-copy rendering
final textureId = await MetaWearablesDat.startStreamSession(
  deviceUUID,
  fps: 30,
  streamQuality: StreamQuality.high,
);

// Render the live video feed
Texture(textureId: textureId);

// Stop streaming when done
await MetaWearablesDat.stopStreamSession(deviceUUID);
```
Video frames are pushed directly from native (CVPixelBuffer on iOS, SurfaceTexture on Android) to the Flutter engine — no JPEG encoding, no byte copying, no Dart-side decoding.
Accessing raw frame bytes #
For use cases that need pixel-level access — OCR, on-device ML inference, computer vision — use `captureStreamFrame`. This rasterizes the Flutter texture on the Dart side and returns raw RGBA bytes:
```dart
import 'dart:async';

// After startStreamSession returns a textureId...
Timer? _frameTimer;

void startFrameProcessing(int textureId) {
  _frameTimer = Timer.periodic(const Duration(milliseconds: 400), (_) async {
    final frame = await MetaWearablesDat.captureStreamFrame(textureId);
    if (frame == null) return;

    // frame.bytes is raw RGBA — feed directly to ML Kit, Vision, etc.
    // frame.width  → 720
    // frame.height → 1280
    await runOcr(frame.bytes, frame.width, frame.height);
  });
}

void stopFrameProcessing() => _frameTimer?.cancel();
```
Parameters:
- `textureId` — the ID returned by `startStreamSession` (required)
- `width`/`height` — capture resolution, defaults to `720×1280` (glasses native resolution)
- `format` — `FrameFormat.rawRgba` (default), `FrameFormat.rawStraightRgba`, or `FrameFormat.png`
Memory note: Raw RGBA at 720×1280 is ~3.7 MB per frame. Capture on demand (every 200–500 ms is typical for OCR/ML) rather than every rendered frame.
Note: See the example app for a complete implementation.
Troubleshooting #
If you run into issues, try these steps first:
- Update Meta AI app — Make sure you have the latest version of the Meta AI app installed on your phone.
- Update Glasses in Meta AI app — In the Meta AI app, check for and install any available firmware updates for your glasses.
- Verify installation — Ensure you have followed all installation steps above, including configuration in your code and in the Meta Wearables Developer Center.
- Restart your glasses — If the glasses don't connect or the stream doesn't start, try restarting them:
- Slide the power switch to off.
- Press and hold the capture button, then slide the power switch back on.
- Release the capture button when the LED turns red (don't wait for the LED to turn white).
Common issues:
- Registration deep link not returning — If registration opens the Meta AI app but the callback does not return to your app, verify that your URL scheme matches the one registered in the Meta Wearables Developer Center. On iOS, ensure `CFBundleURLSchemes` in `Info.plist` (and `AppLinkURLScheme` in the `MWDAT` dict) use the same scheme. On Android, ensure the `data android:scheme` in your activity's intent-filter matches that scheme.
Still having issues? — Open a GitHub issue with all the details you can provide. This helps us pinpoint the problem and assist you more efficiently.
Example app #
The example app is a clone of Meta's sample Camera Access native app.
Here's a demo showing what the DAT integration looks like:
Contributing #
Contributions are welcome! Feel free to open issues for bugs or feature requests, and pull requests for improvements.
License #
MIT License — see LICENSE for details.