LiveKit Flutter SDK
Use this SDK to add realtime video, audio and data features to your Flutter app. By connecting to LiveKit Cloud or a self-hosted server, you can quickly build applications such as multi-modal AI, live streaming, or video calls with just a few lines of code.
This package is published to pub.dev as livekit_client.
Docs
Docs and guides are available at https://docs.livekit.io
Supported platforms
LiveKit client SDK for Flutter is designed to work across all platforms supported by Flutter:
- Android
- iOS
- Web
- macOS
- Windows
- Linux
Example app
We built a multi-user conferencing app as an example in the example/ folder. LiveKit is interoperable across platforms: you can join the same room using any of our supported realtime SDKs.
Online demo: https://livekit.github.io/client-sdk-flutter/
Installation
Add this package to your pubspec.yaml:
dependencies:
  livekit_client: <version>
iOS
Camera and microphone usage need to be declared in your Info.plist file.
<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>
</dict>
Your application can keep a voice call running when it is switched to the background if background mode is enabled. Select the app target in Xcode, click the Capabilities tab, enable Background Modes, and check Audio, AirPlay, and Picture in Picture.
Your Info.plist should have the following entries:
<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>
</dict>
Notes
Since Xcode 14 no longer supports 32-bit builds, and our latest version is based on libwebrtc m104+, the iOS framework no longer supports 32-bit builds. We strongly recommend upgrading to Flutter 3.3.0+. If you are using Flutter 3.0.0 or below, there is a high chance that your Flutter app cannot be compiled correctly due to the missing i386 and arm 32-bit frameworks (#132, #172).
You can modify your {projects_dir}/ios/Podfile to fix this issue:
post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line
    end
  end
end
For iOS, the minimum supported deployment target is 12.1. You will need to add the following to your Podfile:
platform :ios, '12.1'
You may need to delete Podfile.lock and re-run pod install after updating the deployment target.
Android
We require a set of permissions that need to be declared in your AndroidManifest.xml. These are required by Flutter WebRTC, which we depend on.
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.your.package">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
  ...
</manifest>
To use Bluetooth headsets correctly on Android devices, add the permission_handler package to your project and call the following code after your app launches for the first time:
import 'package:permission_handler/permission_handler.dart';

Future<void> _checkPermissions() async {
  var status = await Permission.bluetooth.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Permission disabled');
  }
  status = await Permission.bluetoothConnect.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Connect Permission disabled');
  }
}

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await _checkPermissions();
  runApp(MyApp());
}
Audio Modes
By default, we use the communication audio mode on Android, which works best for two-way voice communication.
If your app is oriented toward media playback and does not need the device's microphone, you can use the media audio mode, which provides better audio quality.
import 'package:flutter_webrtc/flutter_webrtc.dart' as webrtc;

Future<void> _initializeAndroidAudioSettings() async {
  await webrtc.WebRTC.initialize(options: {
    'androidAudioConfiguration': webrtc.AndroidAudioConfiguration.media.toMap()
  });
  webrtc.Helper.setAndroidAudioConfiguration(
      webrtc.AndroidAudioConfiguration.media);
}

void main() async {
  await _initializeAndroidAudioSettings();
  runApp(const MyApp());
}
Note: audio routing will then be controlled by the system and cannot be changed manually with functions like Hardware.selectAudioOutput.
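Under the default communication mode, manual routing remains available. A minimal sketch, assuming the Hardware API exposed by livekit_client:

import 'package:livekit_client/livekit_client.dart';

// A sketch assuming the Hardware API from livekit_client; this only works
// in the default `communication` audio mode (see the note above).
Future<void> routeToFirstOutput() async {
  // enumerate available audio outputs (earpiece, speaker, headset, ...)
  final outputs = await Hardware.instance.audioOutputs();
  if (outputs.isNotEmpty) {
    await Hardware.instance.selectAudioOutput(outputs.first);
  }
}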
Desktop support
To enable Flutter desktop development, please follow the instructions here.
On Windows, VS 2019 is needed (the link in the Flutter docs will download VS 2022).
Usage
Connecting to a room, publish video & audio
final roomOptions = RoomOptions(
  adaptiveStream: true,
  dynacast: true,
  // ... your room options
);

final room = Room();
// you can use `prepareConnection` to speed up the connection
await room.prepareConnection(url, token);
await room.connect(url, token, roomOptions: roomOptions);
try {
  // video will fail when running in the iOS simulator
  await room.localParticipant.setCameraEnabled(true);
} catch (error) {
  print('Could not publish video, error: $error');
}
await room.localParticipant.setMicrophoneEnabled(true);
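When you are done with a room, disconnect and release its resources; a brief sketch:

// leave the room and release native resources
await room.disconnect();
await room.dispose();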
Screen sharing
Screen sharing is supported across all platforms. You can enable it with:
room.localParticipant.setScreenShareEnabled(true);
Android
On Android, you will have to use a media projection foreground service.
In our example, we use the flutter_background package to handle this. In the app's AndroidManifest.xml file, declare the service with the appropriate types and permissions as follows:
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Required permissions for screen share -->
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
  <application>
    ...
    <service
        android:name="de.julianassmann.flutter_background.IsolateHolderService"
        android:enabled="true"
        android:exported="false"
        android:foregroundServiceType="mediaProjection" />
  </application>
</manifest>
Before starting the background service and enabling screen share, you must call Helper.requestCapturePermission() from flutter_webrtc, and only proceed if it returns true. Refer to our example implementation for details.
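Putting the pieces together, the Android flow might look like the sketch below; the helper name and notification strings are illustrative:

import 'package:flutter_background/flutter_background.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart' as webrtc;
import 'package:livekit_client/livekit_client.dart';

// Illustrative helper; the notification strings are placeholders.
Future<void> startScreenShare(Room room) async {
  // ask Android for the media projection capture permission
  final hasPermission = await webrtc.Helper.requestCapturePermission();
  if (!hasPermission) return;

  // start the foreground service required for media projection
  const androidConfig = FlutterBackgroundAndroidConfig(
    notificationTitle: 'Screen sharing',
    notificationText: 'Your screen is being shared',
  );
  await FlutterBackground.initialize(androidConfig: androidConfig);
  await FlutterBackground.enableBackgroundExecution();

  // publish the screen share track
  await room.localParticipant.setScreenShareEnabled(true);
}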
iOS
On iOS, a broadcast extension is needed in order to capture screen content from other apps. See the setup guide for instructions.
Desktop (Windows/macOS)
On desktop, you can use ScreenSelectDialog to select the window or screen you want to share.
try {
  final source = await showDialog<DesktopCapturerSource>(
    context: context,
    builder: (context) => ScreenSelectDialog(),
  );
  if (source == null) {
    print('cancelled screenshare');
    return;
  }
  print('DesktopCapturerSource: ${source.id}');
  var track = await LocalVideoTrack.createScreenShareTrack(
    ScreenShareCaptureOptions(
      sourceId: source.id,
      maxFrameRate: 15.0,
    ),
  );
  await room.localParticipant.publishVideoTrack(track);
} catch (e) {
  print('could not publish screen sharing: $e');
}
End to End Encryption
LiveKit supports end-to-end encryption for audio/video data sent over the network.
Native platforms support E2EE without any additional settings, but for Flutter Web you need the following steps to create the e2ee.worker.dart.js file:
# for the example app
dart compile js web/e2ee.worker.dart -o example/web/e2ee.worker.dart.js -m

# for your own project
export YOUR_PROJECT_DIR=your_project_dir
git clone https://github.com/livekit/client-sdk-flutter.git
cd client-sdk-flutter && flutter pub get
dart compile js web/e2ee.worker.dart -o ${YOUR_PROJECT_DIR}/web/e2ee.worker.dart.js -m
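With the worker file in place, E2EE is enabled through room options. A rough sketch, assuming a shared passphrase distributed out of band:

import 'package:livekit_client/livekit_client.dart';

// Rough sketch: the passphrase is a placeholder and must be shared with
// participants out of band.
Future<Room> connectWithE2EE(String url, String token) async {
  final keyProvider = await BaseKeyProvider.create();
  await keyProvider.setKey('my-shared-passphrase');
  final room = Room(
    roomOptions: RoomOptions(
      e2eeOptions: E2EEOptions(keyProvider: keyProvider),
    ),
  );
  await room.connect(url, token);
  return room;
}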
Advanced track manipulation
The setCameraEnabled/setMicrophoneEnabled helpers are wrappers around the Track API.
You can also manually create and publish tracks:
var localVideo = await LocalVideoTrack.createCameraTrack();
await room.localParticipant.publishVideoTrack(localVideo);
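The same applies to audio; a brief sketch of creating and publishing a microphone track manually:

// create and publish a microphone track manually
var localAudio = await LocalAudioTrack.create();
await room.localParticipant.publishAudioTrack(localAudio);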
Rendering video
Each track can be rendered separately with the provided VideoTrackRenderer widget.
VideoTrack? track;

@override
Widget build(BuildContext context) {
  if (track != null) {
    return VideoTrackRenderer(track!);
  } else {
    return Container(
      color: Colors.grey,
    );
  }
}
Audio handling
Audio tracks are played automatically as long as you are subscribed to them.
Handling changes
LiveKit client makes it simple to build declarative UI that reacts to state changes. It notifies changes in two ways:
- ChangeNotifier - generic notification of changes. This is useful when you are building reactive UI and only care about changes that may impact rendering.
- EventsListener<Event> - a listener pattern to listen to specific events (see events.dart).
This example will show you how to use both to react to room events.
class RoomWidget extends StatefulWidget {
  final Room room;

  RoomWidget(this.room);

  @override
  State<StatefulWidget> createState() {
    return _RoomState();
  }
}

class _RoomState extends State<RoomWidget> {
  late final EventsListener<RoomEvent> _listener = widget.room.createListener();

  @override
  void initState() {
    super.initState();
    // used for generic change updates
    widget.room.addListener(_onChange);

    // used for specific events
    _listener
      ..on<RoomDisconnectedEvent>((_) {
        // handle disconnect
      })
      ..on<ParticipantConnectedEvent>((e) {
        print("participant joined: ${e.participant.identity}");
      });
  }

  @override
  void dispose() {
    // be sure to dispose the listener to stop receiving further updates
    _listener.dispose();
    widget.room.removeListener(_onChange);
    super.dispose();
  }

  void _onChange() {
    // perform computations and then call setState
    // setState will trigger a build
    setState(() {
      // your updates here
    });
  }

  @override
  Widget build(BuildContext context) {
    // your build function
  }
}
You can do the same when rendering participants. Reacting to changes makes it possible to handle tracks being published or unpublished, or to re-order participants in your UI.
class VideoView extends StatefulWidget {
  final Participant participant;

  VideoView(this.participant);

  @override
  State<StatefulWidget> createState() {
    return _VideoViewState();
  }
}

class _VideoViewState extends State<VideoView> {
  TrackPublication? videoPub;

  @override
  void initState() {
    super.initState();
    widget.participant.addListener(_onParticipantChanged);
    // trigger initial change
    _onParticipantChanged();
  }

  @override
  void dispose() {
    widget.participant.removeListener(_onParticipantChanged);
    super.dispose();
  }

  @override
  void didUpdateWidget(covariant VideoView oldWidget) {
    oldWidget.participant.removeListener(_onParticipantChanged);
    widget.participant.addListener(_onParticipantChanged);
    _onParticipantChanged();
    super.didUpdateWidget(oldWidget);
  }

  void _onParticipantChanged() {
    var subscribedVideos = widget.participant.videoTracks.values.where((pub) {
      return pub.kind == TrackType.VIDEO &&
          !pub.isScreenShare &&
          pub.subscribed;
    });
    setState(() {
      if (subscribedVideos.isNotEmpty) {
        var videoPub = subscribedVideos.first;
        // when muted, show a placeholder
        if (!videoPub.muted) {
          this.videoPub = videoPub;
          return;
        }
      }
      this.videoPub = null;
    });
  }

  @override
  Widget build(BuildContext context) {
    var videoPub = this.videoPub;
    if (videoPub != null) {
      return VideoTrackRenderer(videoPub.track as VideoTrack);
    } else {
      return Container(
        color: Colors.grey,
      );
    }
  }
}
Mute, unmute local tracks
On LocalTrackPublications, you can control whether the track is muted by setting its muted property. Changing the mute status will generate an onTrackMuted or onTrackUnmuted delegate call for the local participant. Other participants will receive the status change as well.
// mute track
trackPub.muted = true;
// unmute track
trackPub.muted = false;
Subscriber controls
When subscribing to remote tracks, the client has precise control over the status of its subscriptions. You can subscribe or unsubscribe from a track, change its quality, or disable the track temporarily.
These controls are accessible on the RemoteTrackPublication object.
For more info, see Subscribing to tracks.
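For illustration, a sketch of these controls, assuming a RemoteTrackPublication held in remotePub:

// `remotePub` is an assumed RemoteTrackPublication of a remote video track
// stop receiving the track entirely
await remotePub.unsubscribe();
// ...or stay subscribed but temporarily pause incoming data
await remotePub.disable();
// request a lower simulcast layer to save bandwidth
await remotePub.setVideoQuality(VideoQuality.LOW);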
Getting help / Contributing
Please join us on Slack to get help from our devs and community members. We welcome your contributions (PRs), and details can be discussed there.
License
Apache License 2.0
Thanks
A huge thank you to flutter-webrtc for making it possible to use WebRTC in Flutter.
| LiveKit Ecosystem | |
|---|---|
| Realtime SDKs | React Components · Browser · Swift Components · iOS/macOS/visionOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta) |
| Server APIs | Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community) |
| Agents Frameworks | Python · Playground |
| Services | LiveKit server · Egress · Ingress · SIP |
| Resources | Docs · Example apps · Cloud · Self-hosting · CLI |
Libraries
- livekit_client - Flutter client SDK for LiveKit.
- livekit_client_web