flutter_mpv 1.2.10

A cross-platform video player & audio player for Flutter & Dart with advanced performance controls.

flutter_mpv #

A cross-platform video player & audio player for Flutter & Dart with advanced performance controls.

This is a fork of flutter_mpv with additional video performance configuration options.

Installation #

flutter_mpv is split into multiple packages to improve modularity & reduce bundle size.

For apps that need video playback:

dependencies:
  flutter_mpv: ^1.2.9 # Primary package.
  flutter_mpv_video: ^2.0.3 # For video rendering.
  flutter_mpv_libs_video: ^1.0.10 # Native video dependencies.

For apps that need audio playback:

dependencies:
  flutter_mpv: ^1.2.9 # Primary package.
  flutter_mpv_libs_audio: ^1.0.10 # Native audio dependencies.


Platforms #

Platform Notes
Android Android 5.0 or above.
iOS iOS 9 or above.
macOS macOS 10.9 or above.
Windows Windows 7 or above.
GNU/Linux Any modern GNU/Linux distribution.
Web Any modern web browser.

Features #

  • ✅ Video playback
  • ✅ Audio playback
  • ✅ Cross platform
  • ✅ Wide format/codec support
  • ✅ Hardware/GPU acceleration
  • ✅ Playlist support with next/previous/jump/shuffle
  • ✅ Volume/Rate/Pitch change
  • ✅ Video/Audio/Subtitle track selection
  • ✅ External audio/subtitle track selection
  • ✅ HTTP headers
  • ✅ Video controls
  • ✅ Subtitle styling
  • ✅ Screenshot

TL;DR #

A quick usage example.

import 'package:flutter/material.dart';

// Make sure to add the following packages to pubspec.yaml:
// * flutter_mpv
// * flutter_mpv_video
// * flutter_mpv_libs_video
import 'package:flutter_mpv/flutter_mpv.dart';                      // Provides [Player], [Media], [Playlist] etc.
import 'package:flutter_mpv_video/flutter_mpv_video.dart';          // Provides [VideoController] & [Video] etc.

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  // Necessary initialization for flutter_mpv.
  FlutterMpv.ensureInitialized();
  runApp(
    const MaterialApp(
      home: MyScreen(),
    ),
  );
}

class MyScreen extends StatefulWidget {
  const MyScreen({Key? key}) : super(key: key);
  @override
  State<MyScreen> createState() => MyScreenState();
}

class MyScreenState extends State<MyScreen> {
  // Create a [Player] to control playback.
  late final player = Player();
  // Create a [VideoController] to handle video output from [Player].
  late final controller = VideoController(player);

  @override
  void initState() {
    super.initState();
    // Play a [Media] or [Playlist].
    player.open(Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'));
  }

  @override
  void dispose() {
    player.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Center(
      child: SizedBox(
        width: MediaQuery.of(context).size.width,
        height: MediaQuery.of(context).size.width * 9.0 / 16.0,
        // Use [Video] widget to display video output.
        child: Video(controller: controller),
      ),
    );
  }
}

Note: Depending on your use case, you may need to add the required platform permissions to your project.

Guide #

A usage guide for flutter_mpv.

Tip: Use Ctrl + F to quickly search for things.


Initialization #

FlutterMpv.ensureInitialized must be called before using the package:

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  // Make sure to add the required packages to pubspec.yaml:
  // * See Installation section
  FlutterMpv.ensureInitialized();
  runApp(const MyApp());
}

The method also accepts optional arguments to customize global behavior. To handle initialization errors, wrap the call in try/catch.
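A minimal sketch of guarded initialization, assuming MyApp is your root widget:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_mpv/flutter_mpv.dart';

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  try {
    // Necessary initialization for flutter_mpv.
    FlutterMpv.ensureInitialized();
  } catch (exception, stacktrace) {
    // Initialization failed; log & decide whether to continue without playback.
    debugPrint('flutter_mpv initialization failed: $exception');
    debugPrint('$stacktrace');
  }
  runApp(const MyApp());
}
```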

Note: For backward compatibility, the older initialization entry point still works but is deprecated. Use FlutterMpv.ensureInitialized() instead.

Create a Player #

A Player instance is used to start & control the playback of a media source e.g. URL or file.

final Player player = Player();

Additional options may be provided using the configuration argument in the constructor. In most situations, you will not need this.

final Player player = Player(
  configuration: PlayerConfiguration(
    // Supply your options:
    title: 'My awesome flutter_mpv application',
    ready: () {
      print('The initialization is complete.');
    },
  ),
);

Dispose a Player #

It is extremely important to release the allocated resources back to the system:

await player.dispose();

Open a Media or Playlist #

A Playable can either be a Media or a Playlist.

  • Media: Single playback source (file or URL).
  • Playlist: Queue of playback sources (file or URL).

Use the Player.open method to load & start playback.

Media

final playable = Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4');
await player.open(playable);

Playlist

final playable = Playlist(
  [
    Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373709-603a7a89-2105-4e1b-a5a5-a6c3567c9a59.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373716-76da0a4e-225a-44e4-9ee7-3e9006dbc3e3.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373718-86ce5e1d-d195-45d5-baa6-ef94041d0b90.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373720-14d69157-1a56-4a78-a2f4-d7a134d7c3e9.mp4'),
  ],
);
await player.open(playable);

Notes:

  1. By default, this will automatically start playing the playable. This may be disabled as follows:
await player.open(
  playable,
  play: false,
);
  2. By default, the playlist will start at index 0. This may be changed as follows:
final playable = Playlist(
  [
    Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373709-603a7a89-2105-4e1b-a5a5-a6c3567c9a59.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373716-76da0a4e-225a-44e4-9ee7-3e9006dbc3e3.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373718-86ce5e1d-d195-45d5-baa6-ef94041d0b90.mp4'),
    Media('https://user-images.githubusercontent.com/28951144/229373720-14d69157-1a56-4a78-a2f4-d7a134d7c3e9.mp4'),
  ],
  // Declare the starting position.
  index: 0,
);
await player.open(playable);

Play, pause or play/pause #

The 3 methods are:

await player.play();
await player.pause();
await player.playOrPause();

Stop #

The stop method may be used to stop the playback of currently opened Media or Playlist.

await player.stop();

Unlike dispose, it does not release allocated resources back to the system, & the Player stays usable.
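Since the Player stays usable, the same instance can simply open another source afterwards (the URL below is a placeholder):

```dart
await player.stop();
// The same Player instance remains usable:
await player.open(Media('https://example.com/another-video.mp4'));
```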

Seek #

Supply the final position to Player.seek method as Duration:

await player.seek(
  const Duration(
    minutes: 6,
    seconds: 9,
  ),
);

Loop or repeat #

Three PlaylistModes are available:

  • PlaylistMode.none: End playback once end of the playlist is reached.
  • PlaylistMode.single: Indefinitely loop over the currently playing file in the playlist.
  • PlaylistMode.loop: Loop over the playlist & restart it from beginning once end is reached.
await player.setPlaylistMode(PlaylistMode.single);

Set volume, rate or pitch #

Set the volume

This controls the loudness of audio output. The maximum volume is 100.0.

await player.setVolume(50.0);

Set the rate

This controls the playback speed.

await player.setRate(1.5);

Set the pitch

This controls the pitch of the audio output.

await player.setPitch(1.2);

Note: This requires the pitch argument to be true in PlayerConfiguration.
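Per the note, pitch support must be enabled when constructing the Player; a sketch:

```dart
final Player player = Player(
  configuration: PlayerConfiguration(
    // Required for [Player.setPitch] to work.
    pitch: true,
  ),
);

await player.setPitch(1.2);
```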

Handle playback events #

You can access or subscribe to Player's state changes.

Event handling is an extremely important part of media playback. It is used to show changes in the UI, handle errors, detect the occurrence of play/pause, end-of-file, position updates etc.

  • Player.stream.*: Provides access to Player's state as Stream(s).
  • Player.state.*: Provides access to Player's state directly (for instantaneous access).

A typical example will be:

player.stream.playing.listen(
  (bool playing) {
    if (playing) {
      // Playing.
    } else {
      // Paused.
    }
  },
);
player.stream.position.listen(
  (Duration position) {
    setState(() {
      // Update UI.
    });
  },
);

The following states are available as events:

Type Name Description
Stream<Playlist> playlist Currently opened media sources.
Stream<bool> playing Whether playing or not.
Stream<bool> completed Whether end of currently playing media source has been reached.
Stream<Duration> position Current playback position.
Stream<Duration> duration Current playback duration.
Stream<double> volume Current volume.
Stream<double> rate Current playback rate.
Stream<double> pitch Current pitch.
Stream<bool> buffering Whether buffering or not.
Stream<Duration> buffer Current buffer position. This indicates how much of the stream has been decoded & cached by the demuxer.
Stream<PlaylistMode> playlistMode Current playlist mode.
Stream<bool> shuffle Whether playlist is shuffled or not.
Stream<AudioParams> audioParams Audio parameters of the currently playing media source e.g. sample rate, channels, etc.
Stream<VideoParams> videoParams Video parameters of the currently playing media source e.g. width, height, rotation etc.
Stream<double?> audioBitrate Audio bitrate of the currently playing media source.
Stream<AudioDevice> audioDevice Currently selected audio device.
Stream<List<AudioDevice>> audioDevices Currently available audio devices.
Stream<Track> track Currently selected video, audio and subtitle track.
Stream<Tracks> tracks Currently available video, audio and subtitle tracks.
Stream<int> width Currently playing video's width.
Stream<int> height Currently playing video's height.
Stream<int> subtitle Currently displayed subtitle.
Stream<PlayerLog> log Internal logs.
Stream<String> error Error messages. This may be used to handle & display errors to the user.
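For example, the error stream from the table above can be subscribed to for logging or user-facing messages; a minimal sketch (debugPrint comes from Flutter's foundation library):

```dart
import 'package:flutter/foundation.dart';

player.stream.error.listen((String message) {
  // Log, or surface the error to the user e.g. via a dialog or snackbar.
  debugPrint('Playback error: $message');
});
```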

Shuffle the queue #

You may need to shuffle the Playlist you opened in the Player, like some music players do.

await player.setShuffle(true);

Note: This option is reset upon the next Player.open call.
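Since the option resets upon Player.open, re-apply it after each open call when persistent shuffling is desired:

```dart
await player.open(playable);
// Re-apply shuffle, since [Player.open] resets it.
await player.setShuffle(true);
```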

Use HTTP headers #

Declare the httpHeaders argument in Media constructor. It takes the HTTP headers as Map<String, String>.

final playable = Media(
  'https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4',
  httpHeaders: {
    'Foo': 'Bar',
    'Accept': '*/*',
    'Range': 'bytes=0-',
  },
);

Use extras to store additional data with Media #

The extras argument may be utilized to store additional data with a Media in form of Map<String, dynamic>.

final playable = Media(
  'https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4',
  extras: {
    'track': '9',
    'year': '2012',
    'title': 'Courtesy Call',
    'artist': 'Thousand Foot Krutch',
    'album': 'The End Is Where We Begin',
  },
);

Modify Player's queue #

You can add or remove (etc.) a Media in an already playing Playlist:

Add

Add a new Media to the back of the queue:

await player.add(Media('https://user-images.githubusercontent.com/28951144/229373695-22f88f13-d18f-4288-9bf1-c3e078d83722.mp4'));

Remove

Remove any item from the queue:

await player.remove(0);

Move

Move any item in the queue from one position to another:

await player.move(6, 9);

Go to next, previous or any other position in queue #

Skip to the next queue item

await player.next();

Skip to the previous queue item

await player.previous();

Skip to any other queue item

await player.jump(5);

Select video, audio or subtitle track #

A media source may contain multiple video, audio or subtitle tracks e.g. for multiple languages. Available video, audio or subtitle tracks are notified through Player's state. See "Handle playback events" section for related information.

By default, the video, audio & subtitle tracks are selected automatically, i.e. VideoTrack.auto(), AudioTrack.auto() & SubtitleTrack.auto().

Automatic selection

await player.setVideoTrack(VideoTrack.auto());

await player.setAudioTrack(AudioTrack.auto());

await player.setSubtitleTrack(SubtitleTrack.auto());

Disable track

This may be used to essentially disable video output, disable audio output or stop rendering of subtitles etc.

await player.setVideoTrack(VideoTrack.no());

await player.setAudioTrack(AudioTrack.no());

await player.setSubtitleTrack(SubtitleTrack.no());

Select custom track

  • Retrieve currently available tracks:
List<VideoTrack> videos = player.state.tracks.video;
List<AudioTrack> audios = player.state.tracks.audio;
List<SubtitleTrack> subtitles = player.state.tracks.subtitle;

// Get notified as [Stream]:
player.stream.tracks.listen((event) {
  List<VideoTrack> videos = event.video;
  List<AudioTrack> audios = event.audio;
  List<SubtitleTrack> subtitles = event.subtitle;
});
  • Select the track:
await player.setVideoTrack(videos[0]);
await player.setAudioTrack(audios[1]);
await player.setSubtitleTrack(subtitles[2]);
  • Get notified about currently selected track:
VideoTrack video = player.state.track.video;
AudioTrack audio = player.state.track.audio;
SubtitleTrack subtitle = player.state.track.subtitle;

// Get notified as [Stream]:
player.stream.track.listen((event) {
  VideoTrack video = event.video;
  AudioTrack audio = event.audio;
  SubtitleTrack subtitle = event.subtitle;
});

Select audio device #

Available audio devices are notified through Player's state. See "Handle playback events" section for related information.

By default, the audio device is selected automatically, i.e. AudioDevice.auto().

Default selection

await player.setAudioDevice(AudioDevice.auto());

Disable audio output

await player.setAudioDevice(AudioDevice.no());

Select custom audio device

  • Retrieve currently available audio devices:
List<AudioDevice> devices = player.state.audioDevices;

// Get notified as [Stream]:
player.stream.audioDevices.listen((event) {
  List<AudioDevice> devices = event;
});
  • Select the audio device:
await player.setAudioDevice(devices[1]);
  • Get notified about currently selected audio device:
AudioDevice device = player.state.audioDevice;

// Get notified as [Stream]:
player.stream.audioDevice.listen((event) {
  AudioDevice device = event;
});

Display the video #

The TL;DR example above should give you a better idea.

For displaying the video inside Flutter UI, you must:

  • Create VideoController
    • Pass the Player you already have.
  • Create Video widget
    • Pass the VideoController you already have.

In code:

class _MyScreenState extends State<MyScreen> {
  late final Player player = Player();
  late final VideoController controller = VideoController(player);

  @override
  void dispose() {
    player.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Video(
        controller: controller,
      ),
    );
  }
}

Video playback uses hardware acceleration, i.e. the GPU, by default.

Additional options may be provided using the configuration argument in the constructor. In most situations, you will not need this.

final VideoController controller = VideoController(
  player,
  configuration: const VideoControllerConfiguration(
    // Supply your options:
    enableHardwareAcceleration: true,      // default: true
    width: 640,                            // default: null
    height: 480,                           // default: null
    // The in-code comments are the best place to learn more about these options.
    // See: flutter_mpv_video package source code
  ),
);

Capture screenshot #

The screenshot method takes a snapshot of the current video frame & returns the encoded image bytes as Uint8List.

final Uint8List? screenshot = await player.screenshot();

Additionally, the format argument may be specified to change the encoding format. The following formats are supported:

  • image/jpeg: Returns a JPEG encoded image.
  • image/png: Returns a PNG encoded image.
  • null: Returns BGRA pixel buffer.
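For instance, a PNG-encoded screenshot can be requested and written to disk; a sketch (the file path is a placeholder):

```dart
import 'dart:io';

final Uint8List? screenshot = await player.screenshot(format: 'image/png');
if (screenshot != null) {
  // Persist the PNG-encoded bytes.
  await File('/tmp/screenshot.png').writeAsBytes(screenshot);
}
```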

Customize subtitles #

SubtitleViewConfiguration can be passed to the Video widget to customize the subtitles:

Notably, TextStyle, TextAlign & EdgeInsetsGeometry can be provided.

Video(
  controller: controller,
  subtitleViewConfiguration: const SubtitleViewConfiguration(
    style: TextStyle(
      height: 1.4,
      fontSize: 24.0,
      letterSpacing: 0.0,
      wordSpacing: 0.0,
      color: Color(0xffffffff),
      fontWeight: FontWeight.normal,
      backgroundColor: Color(0xaa000000),
    ),
    textAlign: TextAlign.center,
    padding: EdgeInsets.all(24.0),
  ),
);

https://user-images.githubusercontent.com/28951144/253067794-73b5ca5d-e90d-4892-bc09-2a80f05c9f0b.mp4

Load external subtitle track #

The SubtitleTrack.uri constructor can be used to load an external subtitle track from a URI, e.g. SRT, WebVTT etc.:

await player.setSubtitleTrack(
  SubtitleTrack.uri(
    'https://www.iandevlin.com/html5test/webvtt/upc-video-subtitles-en.vtt',
    title: 'English',
    language: 'en',
  ),
);

The SubtitleTrack.data constructor can be used to load an external subtitle track from in-memory data, e.g. SRT, WebVTT etc.:

player.setSubtitleTrack(
  SubtitleTrack.data(
    '''WEBVTT FILE

1
00:00:03.500 --> 00:00:05.000 D:vertical A:start
Everyone wants the most from life

2
00:00:06.000 --> 00:00:09.000 A:start
Like internet experiences that are rich <b>and</b> entertaining

3
00:00:11.000 --> 00:00:14.000 A:end
Phone conversations where people truly <c.highlight>connect</c>

4
00:00:14.500 --> 00:00:18.000
Your favourite TV programmes ready to watch at the touch of a button

5
00:00:19.000 --> 00:00:24.000
Which is why we are bringing TV, internet and phone together in <c.highlight>one</c> super package

6
00:00:24.500 --> 00:00:26.000
<c.highlight>One</c> simple way to get everything

7
00:00:26.500 --> 00:00:27.500 L:12%
UPC

8
00:00:28.000 --> 00:00:30.000 L:75%
Simply for <u>everyone</u>
''',
    title: 'English',
    language: 'en',
  ),
);

Load external audio track #

The AudioTrack.uri constructor can be used to load an external audio track from a URI:

await player.setAudioTrack(
  AudioTrack.uri(
    'https://www.iandevlin.com/html5test/webvtt/v/upc-tobymanley.mp4',
    title: 'English',
    language: 'en',
  ),
);

Video controls #

flutter_mpv provides highly-customizable pre-built video controls for usage.

Apart from theming, layout can be customized, position of buttons can be modified, custom buttons can be created etc. Necessary features like fullscreen, keyboard shortcuts & swipe-based controls are also supported by default.

(Screenshots: MaterialDesktopVideoControls & MaterialVideoControls)
  • Video widget provides controls argument to display & customize video controls.
  • By default, AdaptiveVideoControls are used.

Types

Type Description
AdaptiveVideoControls Selects MaterialVideoControls, CupertinoVideoControls etc. based on platform.
MaterialVideoControls Material Design video controls.
MaterialDesktopVideoControls Material Design video controls for desktop.
CupertinoVideoControls iOS-style video controls.
NoVideoControls Disable video controls i.e. only render video output.
Custom Provide custom builder for video controls.

Select existing video controls

Modify the controls argument. For advanced theming of existing video controls, see theming & modifying video controls section.

Scaffold(
  body: Video(
    controller: controller,
    // Select [MaterialVideoControls].
    controls: MaterialVideoControls,
  ),
);
Scaffold(
  body: Video(
    controller: controller,
    // Select [CupertinoVideoControls].
    controls: CupertinoVideoControls,
  ),
);

Build custom video controls

Pass a custom builder of type Widget Function(VideoState state) as the controls argument.

Scaffold(
  body: Video(
    controller: controller,
    // Provide custom builder for controls.
    controls: (state) {
      return Center(
        child: IconButton(
          onPressed: () {
            state.widget.controller.player.playOrPause();
          },
          icon: StreamBuilder(
            stream: state.widget.controller.player.stream.playing,
            builder: (context, playing) => Icon(
              playing.data == true ? Icons.pause : Icons.play_arrow,
            ),
          ),
          // It's not necessary to use [StreamBuilder] or to use [Player] & [VideoController] from [state].
          // [StreamSubscription]s can be made inside [initState] of this widget.
        ),
      );
    },
  ),
);

Use & modify video controls

AdaptiveVideoControls
MaterialVideoControls
  • Material Design video controls.
  • Theming:
    • Use MaterialVideoControlsTheme widget.
    • Video widget(s) in the child tree will follow the specified theme:
// Wrap [Video] widget with [MaterialVideoControlsTheme].
MaterialVideoControlsTheme(
  normal: MaterialVideoControlsThemeData(
    // Modify theme options:
    buttonBarButtonSize: 24.0,
    buttonBarButtonColor: Colors.white,
    // Modify top button bar:
    topButtonBar: [
      const Spacer(),
      MaterialCustomButton(
        onPressed: () {
          debugPrint('Custom "Settings" button pressed.');
        },
        icon: const Icon(Icons.settings),
      ),
    ],
  ),
  fullscreen: const MaterialVideoControlsThemeData(
    // Modify theme options:
    displaySeekBar: false,
    automaticallyImplySkipNextButton: false,
    automaticallyImplySkipPreviousButton: false,
  ),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
  • Related widgets (may be used in primaryButtonBar, topButtonBar & bottomButtonBar):
    • MaterialPlayOrPauseButton
    • MaterialSkipNextButton
    • MaterialSkipPreviousButton
    • MaterialFullscreenButton
    • MaterialCustomButton
    • MaterialPositionIndicator
MaterialDesktopVideoControls
  • Material Design video controls for desktop.
  • Theming:
    • Use MaterialDesktopVideoControlsTheme widget.
    • Video widget(s) in the child tree will follow the specified theme:
// Wrap [Video] widget with [MaterialDesktopVideoControlsTheme].
MaterialDesktopVideoControlsTheme(
  normal: MaterialDesktopVideoControlsThemeData(
    // Modify theme options:
    seekBarThumbColor: Colors.blue,
    seekBarPositionColor: Colors.blue,
    toggleFullscreenOnDoublePress: false,
    // Modify top button bar:
    topButtonBar: [
      const Spacer(),
      MaterialDesktopCustomButton(
        onPressed: () {
          debugPrint('Custom "Settings" button pressed.');
        },
        icon: const Icon(Icons.settings),
      ),
    ],
    // Modify bottom button bar:
    bottomButtonBar: const [
      Spacer(),
      MaterialDesktopPlayOrPauseButton(),
      Spacer(),
    ],
  ),
  fullscreen: const MaterialDesktopVideoControlsThemeData(),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
  • Related widgets (may be used in primaryButtonBar, topButtonBar & bottomButtonBar):
    • MaterialDesktopPlayOrPauseButton
    • MaterialDesktopSkipNextButton
    • MaterialDesktopSkipPreviousButton
    • MaterialDesktopFullscreenButton
    • MaterialDesktopCustomButton
    • MaterialDesktopVolumeButton
    • MaterialDesktopPositionIndicator
  • Keyboard shortcuts may be modified using keyboardShortcuts argument. Default ones are listed below:
Shortcut Action
Media Play Button Play
Media Pause Button Pause
Media Play/Pause Button Play/Pause
Media Next Track Button Skip Next
Media Previous Track Button Skip Previous
Space Play/Pause
J Seek 10s Behind
L Seek 10s Ahead
Arrow Left Seek 2s Behind
Arrow Right Seek 2s Ahead
Arrow Up Increase Volume 5%
Arrow Down Decrease Volume 5%
F Enter/Exit Fullscreen
Escape Exit Fullscreen
CupertinoVideoControls
  • iOS-style video controls.
  • Theming:
    • Use CupertinoVideoControlsTheme widget.
    • Video widget(s) in the child tree will follow the specified theme:
// Wrap [Video] widget with [CupertinoVideoControlsTheme].
CupertinoVideoControlsTheme(
  normal: const CupertinoVideoControlsThemeData(
    // W.I.P.
  ),
  fullscreen: const CupertinoVideoControlsThemeData(
    // W.I.P.
  ),
  child: Scaffold(
    body: Video(
      controller: controller,
    ),
  ),
);
NoVideoControls
  • Disable video controls i.e. only render video output.
  • Theming:
    • No theming applicable.

Advanced Settings #

flutter_mpv provides extensive control over video decoding and rendering performance through the VideoPerformanceConfiguration class. This allows you to fine-tune playback for specific use cases, device capabilities, or quality requirements.

For most use cases, using predefined presets is the easiest and recommended approach. Presets are optimized configurations tested for common scenarios:

import 'package:flutter/material.dart';
import 'package:flutter_mpv/flutter_mpv.dart';
import 'package:flutter_mpv_video/flutter_mpv_video.dart';

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  FlutterMpv.ensureInitialized();
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    // Use a preset for common scenarios
    final player = Player(
      configuration: PlayerConfiguration(
        videoPerformance: VideoPerformancePresets.balanced, // Best for most cases
        bufferSize: 64 * 1024 * 1024,
      ),
    );

    final controller = VideoController(player);
    // ... rest of your code
  }
}

Available Presets #

Preset Best For Performance Quality Battery Impact
VideoPerformancePresets.powerSaver Older devices, battery saving, thermal throttling ⭐⭐⭐⭐⭐ ⭐⭐ Low
VideoPerformancePresets.balanced General purpose, most apps ⭐⭐⭐⭐ ⭐⭐⭐ Medium
VideoPerformancePresets.instantSeeking Local files, scrubbing, preview timelines ⭐⭐⭐⭐⭐ ⭐⭐ Medium
VideoPerformancePresets.quality High-end devices, sharper local playback ⭐⭐⭐ ⭐⭐⭐⭐⭐ High
VideoPerformancePresets.smoothMotion Sports, action, 24fps movie smoothing ⭐⭐ ⭐⭐⭐⭐ Very High
VideoPerformancePresets.streaming Online video streaming ⭐⭐⭐⭐ ⭐⭐⭐ Medium
VideoPerformancePresets.softwareDecoding Debugging, compatibility issues ⭐⭐ ⭐⭐ High

Video Performance Configuration #

For advanced users who need fine-grained control, VideoPerformanceConfiguration provides detailed options for video decoding, rendering, and performance tuning.

Basic Usage

final player = Player(
  configuration: PlayerConfiguration(
    videoPerformance: VideoPerformanceConfiguration(
      // Decoding
      hardwareDecoding: 'auto',
      frameDropping: 'decoder',
      fastDecoding: 'no',

      // Synchronization
      videoSync: 'audio',
      hrSeek: 'yes',
      fastSeek: 'no',

      // Scaling
      scaler: 'bicubic',
      downScaler: 'bicubic',
      interpolation: false,

      // Rendering
      deinterlacing: 'auto',
      gpuApi: 'auto',

      // Cache
      optimizeForLocalFiles: true,
      cacheSecs: 60,
    ),
    bufferSize: 64 * 1024 * 1024,
  ),
);

Configuration Options Reference

🔓 Decoding Settings

hardwareDecoding (String?)

Controls the hardware acceleration method for video decoding.

Value Description When to Use
'auto' Try hardware, fallback to software Recommended for most cases
'auto-copy' Auto with surface upload When using custom video rendering
'yes' Force hardware decoding When you're sure HW is available
'no' Force software decoding Debugging, compatibility issues
'mediacodec' Android MediaCodec Android-specific optimization
'videotoolbox' iOS/macOS VideoToolbox Apple devices
'd3d11va' Windows Direct3D 11 Windows devices
'vaapi' Linux VAAPI Linux devices

Default: null (auto-detect)

Performance Impact: Hardware decoding can reduce CPU usage by 50-80% and improve battery life.

// Example: Force software decoding for debugging
VideoPerformanceConfiguration(
  hardwareDecoding: 'no',
)

decoderThreads (int?)

Number of threads used for video decoding.

  • Range: 1-16
  • Default: null (auto-detect based on CPU cores)

Guidelines:

  • 1-2: Low-end devices, audio-only playback
  • 2-4: Most devices, standard HD playback (recommended)
  • 4-8: High-end devices, 4K playback
  • 8+: Professional workstations, 8K playback
// Example: Limit threads for battery saving
VideoPerformanceConfiguration(
  decoderThreads: 2,
)

frameDropping (String?)

Controls when frames can be dropped to maintain audio/video synchronization.

Value Description Use Case
'no' Never drop frames Quality-critical, may stutter
'decoder' Drop during decoding only Default, balanced
'vo' Drop during video output only Rarely used
'decoder+vo' Drop in both stages Smooth playback on slow devices

Default: 'decoder'

// Example: Maximum quality (may stutter on slow devices)
VideoPerformanceConfiguration(
  frameDropping: 'no',
)

// Example: Maximum smoothness (may drop frames)
VideoPerformanceConfiguration(
  frameDropping: 'decoder+vo',
)

fastDecoding (String?)

Enables faster, lower-quality decoding for software decoders.

  • 'yes': Faster decoding, lower quality
  • 'no': Standard quality (default)

Use Case: Enable on low-end devices when experiencing playback issues.

VideoPerformanceConfiguration(
  fastDecoding: 'yes', // For low-end devices
)

decoderOptions (String?)

Advanced FFmpeg decoder options as comma-separated key/value pairs.

Common Options:

  • 'threads=4': Set thread count
  • 'flags=low_delay': Low delay mode
  • 'skip_loop_filter=all': Skip loop filter (faster, lower quality)
// Example: Custom FFmpeg options
VideoPerformanceConfiguration(
  decoderOptions: 'threads=4,flags=low_delay',
)

hwdecCodecs (String?)

Restricts hardware decoding to specific codecs.

Options:

  • 'all': All codecs (default)
  • 'h264,hevc': H.264 and HEVC/H.265 only
  • 'h264': H.264 only
  • 'vp9': VP9 only
// Example: Only use HW decoding for H.264 and HEVC
VideoPerformanceConfiguration(
  hwdecCodecs: 'h264,hevc',
)

🔄 Synchronization Settings

videoSync (String?)

Controls how video frames are synchronized to audio.

Value Description Quality Performance Best For
'audio' Sync to audio clock Good ⭐⭐⭐⭐⭐ Default, most compatible
'display' Sync to display refresh Better ⭐⭐⭐⭐ Standard displays
'display-resample' Resample to display rate Best ⭐⭐⭐ High-end devices, smooth motion
'display-vdrop' Display with frame drop Good ⭐⭐⭐⭐ When frames need dropping
'mem-sync' Memory-based sync Good ⭐⭐⭐⭐ Specialized use cases

Default: 'audio'

// Example: Smoothest playback for high-end devices
VideoPerformanceConfiguration(
  videoSync: 'display-resample',
  interpolation: true, // Works best with interpolation
)

hrSeek (String?)

Enables high-resolution seeking for precise position control.

Value Description
'no' Fast, less accurate seeking
'yes' Default, precise seeking
'absolute' Absolute timestamp seeking

Default: 'yes'

Tip: Set to 'no' for faster seeking in long videos where frame-perfect accuracy isn't needed.

VideoPerformanceConfiguration(
  hrSeek: 'yes', // Default, precise
)

hrSeekFramedrop (String?)

Allows frame dropping during high-resolution seeking.

  • 'yes': Faster seeking (default)
  • 'no': Slower but smoother seeking

Default: 'yes'

// Enable for much faster seeking (recommended)
VideoPerformanceConfiguration(
  hrSeekFramedrop: 'yes',
)

fastSeek (String?)

Enables keyframe-based seeking for dramatically faster seek operations.

  • 'yes': Jump to nearest keyframe (very fast)
  • 'no': Exact frame seeking (default, slower)

Default: 'no'

Performance: Can improve seeking speed by 10x or more, but may not land on exact frame.

// Example: Enable for apps with frequent seeking
VideoPerformanceConfiguration(
  fastSeek: 'yes',
  hrSeek: 'yes',
  hrSeekFramedrop: 'yes',
)

📐 Scaling Settings

scaler (String?)

Video scaling algorithm used for upscaling (making video larger).

Algorithm       Quality     Speed    Best For
'bilinear'      Low         ⭐⭐⭐⭐⭐    Low-end devices, fastest
'bicubic'       Medium      ⭐⭐⭐⭐     Default, balanced
'lanczos'       High        ⭐⭐⭐      High-quality playback
'spline36'      Very High   ⭐⭐       Professional quality
'ewa_lanczos'   Highest     ⭐        Best quality, slowest

Default: 'bicubic'

// Example: High quality upscaling
VideoPerformanceConfiguration(
  scaler: 'lanczos',
)

// Example: Maximum performance
VideoPerformanceConfiguration(
  scaler: 'bilinear',
)

downScaler (String?)

Video scaling algorithm used for downscaling (making video smaller).

Same options as scaler.

Default: 'bicubic'

// Example: Match upscaling algorithm
VideoPerformanceConfiguration(
  scaler: 'lanczos',
  downScaler: 'lanczos',
)

interpolation (bool)

Enables frame interpolation to match video framerate to display refresh rate.

  • true: Creates intermediate frames for smoother motion
  • false: Default, no interpolation

Requirements:

  • Requires videoSync: 'display-resample' or similar
  • Increases CPU/GPU usage significantly
  • May cause artifacts in fast-motion scenes

Default: false

// Example: Enable for ultra-smooth motion
VideoPerformanceConfiguration(
  interpolation: true,
  videoSync: 'display-resample',
)

🎨 Rendering Settings

deinterlacing (String?)

Controls how interlaced video content is handled.

Value    Description
'no'     Disable deinterlacing
'yes'    Always deinterlace
'auto'   Deinterlace only when needed (default)

Default: 'auto'

Note: Most modern video is progressive (not interlaced), so 'auto' is recommended.

VideoPerformanceConfiguration(
  deinterlacing: 'auto', // Default, recommended
)

gpuApi (String?)

Forces a specific graphics API for rendering.

Value      Platform         Description
'auto'     All              Auto-detect (default)
'opengl'   All              Most compatible
'vulkan'   Modern devices   Best performance on supported devices
'd3d11'    Windows          Direct3D 11

Default: 'auto'

// Example: Force Vulkan for better performance
VideoPerformanceConfiguration(
  gpuApi: 'vulkan',
)

openglPbo (String?)

Enables OpenGL Pixel Buffer Objects for faster texture uploads.

  • 'yes': Default, improved performance
  • 'no': Disable (debugging only)

Default: 'yes'

VideoPerformanceConfiguration(
  openglPbo: 'yes', // Default, recommended
)

softwareDecodingDirectRendering (String?)

Enables direct rendering for software decoding.

  • 'yes': Default, better performance
  • 'no': Disable (debugging only)

Default: 'yes'

VideoPerformanceConfiguration(
  softwareDecodingDirectRendering: 'yes', // Default
)

videoLatencyHacks (String?)

Enables various optimizations to reduce video latency.

  • 'yes': Lower latency, may reduce quality
  • 'no': Default, standard quality

Default: 'no'

Use Case: Enable for live streaming or video calls where latency matters.

VideoPerformanceConfiguration(
  videoLatencyHacks: 'yes', // For low-latency scenarios
)

💾 Cache Settings

optimizeForLocalFiles (bool)

Optimizes playback for local file access vs. network streaming.

When true (default):

  • ✅ Disables network readahead
  • ✅ Increases back buffer for fast seeking
  • ✅ Reduces initial buffering time
  • ✅ Optimizes cache for local storage

When false:

  • Better for network/streaming content
  • Larger network buffers

Default: true

// For local video player apps
VideoPerformanceConfiguration(
  optimizeForLocalFiles: true, // Default
)

// For streaming apps
VideoPerformanceConfiguration(
  optimizeForLocalFiles: false,
)

cacheSecs (int)

Duration of content to cache in seconds.

  • Range: 10-300 seconds
  • Default: 60

Guidelines:

  • 10-30: Short videos, low-memory devices
  • 60: Default, balanced
  • 120-300: Long videos, good network, frequent seeking

// Example: Increase cache for better seeking
VideoPerformanceConfiguration(
  cacheSecs: 120,
)

Complete Configuration Examples #

Example 1: High-Quality Local Video Player

final player = Player(
  configuration: PlayerConfiguration(
    videoPerformance: VideoPerformanceConfiguration(
      // Best quality decoding
      hardwareDecoding: 'auto',
      frameDropping: 'no',

      // Smooth synchronization
      videoSync: 'display-resample',
      hrSeek: 'yes',
      fastSeek: 'no',

      // High-quality scaling
      scaler: 'lanczos',
      downScaler: 'lanczos',

      // Optimized for local files
      optimizeForLocalFiles: true,
      cacheSecs: 120,
    ),
    bufferSize: 128 * 1024 * 1024, // Larger buffer
  ),
);

Example 2: Streaming App (Low Latency)

final player = Player(
  configuration: PlayerConfiguration(
    videoPerformance: VideoPerformanceConfiguration(
      // Balanced decoding
      hardwareDecoding: 'auto',
      frameDropping: 'decoder',

      // Fast seeking for scrubbing
      hrSeek: 'yes',
      hrSeekFramedrop: 'yes',
      fastSeek: 'yes',

      // Optimized for streaming
      optimizeForLocalFiles: false,
      cacheSecs: 30,

      // Lower latency
      videoLatencyHacks: 'yes',
    ),
    bufferSize: 32 * 1024 * 1024, // Smaller buffer for lower latency
  ),
);

Example 3: Low-End Device Optimization

final player = Player(
  configuration: PlayerConfiguration(
    videoPerformance: VideoPerformanceConfiguration(
      // Performance-focused
      hardwareDecoding: 'auto',
      frameDropping: 'decoder+vo',
      fastDecoding: 'yes',
      decoderThreads: 2,

      // Fast scaling
      scaler: 'bilinear',
      downScaler: 'bilinear',

      // Standard sync
      videoSync: 'audio',

      // Minimal cache
      cacheSecs: 30,
    ),
    bufferSize: 16 * 1024 * 1024,
  ),
);

Persisting Settings #

You can persist advanced settings using SharedPreferences or similar storage solutions. Here's a complete example:

import 'package:shared_preferences/shared_preferences.dart';
import 'package:flutter_mpv/flutter_mpv.dart';

class AppSettings {
  static late final SharedPreferences prefs;

  static Future<void> init() async {
    prefs = await SharedPreferences.getInstance();
  }

  // Hardware decoding setting
  static String get hwdec => prefs.getString('hwdec') ?? 'auto';
  static set hwdec(String value) => prefs.setString('hwdec', value);

  // Scaler setting
  static String get scaler => prefs.getString('scaler') ?? 'bicubic';
  static set scaler(String value) => prefs.setString('scaler', value);

  // Video sync setting
  static String get videoSync => prefs.getString('video_sync') ?? 'audio';
  static set videoSync(String value) => prefs.setString('video_sync', value);

  // Cache duration
  static int get cacheSecs => prefs.getInt('cache_secs') ?? 60;
  static set cacheSecs(int value) => prefs.setInt('cache_secs', value);

  // Reset to defaults
  static Future<void> reset() async {
    await prefs.clear();
  }
}

// Usage in your app initialization:
void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await AppSettings.init();
  FlutterMpv.ensureInitialized();
  runApp(const MyApp());
}

// Then use in Player configuration:
final player = Player(
  configuration: PlayerConfiguration(
    videoPerformance: VideoPerformanceConfiguration(
      hardwareDecoding: AppSettings.hwdec,
      scaler: AppSettings.scaler,
      videoSync: AppSettings.videoSync,
      cacheSecs: AppSettings.cacheSecs,
    ),
    bufferSize: 64 * 1024 * 1024,
  ),
);

Troubleshooting Guide #

Common Issues and Solutions

Issue: Video stuttering or frame drops

// Solution: Allow more frame dropping
VideoPerformanceConfiguration(
  frameDropping: 'decoder+vo',
  fastDecoding: 'yes',
)

Issue: High battery consumption

// Solution: Use more efficient settings
VideoPerformanceConfiguration(
  hardwareDecoding: 'auto', // Ensure HW decoding is enabled
  scaler: 'bilinear',
  decoderThreads: 2,
  interpolation: false,
)

Issue: Seeking is too slow

// Solution: Enable fast seeking
VideoPerformanceConfiguration(
  fastSeek: 'yes',
  hrSeekFramedrop: 'yes',
  cacheSecs: 120, // Increase cache for better seeking
)

Issue: Poor video quality

// Solution: Prioritize quality over performance
VideoPerformanceConfiguration(
  frameDropping: 'no',
  scaler: 'lanczos',
  downScaler: 'lanczos',
  videoSync: 'display-resample',
)

Issue: Compatibility problems on old devices

// Solution: Use software decoding
VideoPerformanceConfiguration(
  hardwareDecoding: 'no',
  fastDecoding: 'yes',
  scaler: 'bilinear',
)

Performance Tips #

  1. Start with presets: Use VideoPerformancePresets.balanced as a baseline
  2. Test on target devices: Performance varies significantly across devices
  3. Monitor battery impact: High-quality settings drain battery faster
  4. Consider your use case: Streaming vs. local playback need different optimizations
  5. Provide user settings: Allow users to adjust quality vs. performance
  6. Profile before optimizing: Use Flutter DevTools to identify bottlenecks
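Following tip 1, a baseline sketch (assuming VideoPerformancePresets.balanced evaluates to a VideoPerformanceConfiguration, as its name suggests):

```dart
final player = Player(
  configuration: PlayerConfiguration(
    // Start from the balanced preset, then adjust individual
    // options for your target devices.
    videoPerformance: VideoPerformancePresets.balanced,
  ),
);
```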

Next steps #

This guide follows a tutorial-like structure & covers nearly all features that flutter_mpv offers. However, it is not complete by any means. You are free to improve this page & add more documentation, which newcomers may find helpful.

Goals #

flutter_mpv is a library for Flutter & Dart which provides video & audio playback.

  • Strong: Supports most video & audio codecs.
  • Performant:
    • Handles multiple FHD videos flawlessly.
    • Rendering is GPU-powered (hardware accelerated).
    • 4K / 8K 60 FPS is supported.
    • Advanced video performance configuration for fine-tuned control.
  • Stable: Implementation is well-tested & used across a number of intensive media-playback apps.
  • Feature Proof: A simple usage API while offering a large number of features to target a multitude of apps.
  • Modular: Project is split into a number of packages for reducing bundle size.
  • Cross Platform: Implementation works on all platforms supported by Flutter & Dart:
    • Android
    • iOS
    • macOS
    • Windows
    • GNU/Linux
    • Web
  • Flexible Architecture:
    • The major part of the implementation (80%+) is 100% Dart (FFI) & shared across platforms.
      • Makes the behavior of the library consistent & more predictable across platforms.
      • Makes development & implementation of new features easier & faster.
      • Avoids separate maintenance of native implementation for each platform.
    • Only video embedding code is platform-specific & part of separate package.

See the project's architecture & implementation details for further information.

The project aims to meet the demands of the community, which includes:

  1. Holding accountability.
  2. Ensuring timely maintenance.

Supported Formats #

A wide variety of formats & codecs are supported. The complete list may be found below:

3dostr          3DO STR
4xm             4X Technologies
aa              Audible AA format files
aac             raw ADTS AAC (Advanced Audio Coding)
aax             CRI AAX
ac3             raw AC-3
ace             tri-Ace Audio Container
acm             Interplay ACM
act             ACT Voice file format
adf             Artworx Data Format
adp             ADP
ads             Sony PS2 ADS
adx             CRI ADX
aea             MD STUDIO audio
afc             AFC
aiff            Audio IFF
aix             CRI AIX
alaw            PCM A-law
alias_pix       Alias/Wavefront PIX image
alp             LEGO Racers ALP
amr             3GPP AMR
amrnb           raw AMR-NB
amrwb           raw AMR-WB
anm             Deluxe Paint Animation
apac            raw APAC
apc             CRYO APC
ape             Monkey's Audio
apm             Ubisoft Rayman 2 APM
apng            Animated Portable Network Graphics
aptx            raw aptX
aptx_hd         raw aptX HD
aqtitle         AQTitle subtitles
argo_asf        Argonaut Games ASF
argo_brp        Argonaut Games BRP
argo_cvg        Argonaut Games CVG
asf             ASF (Advanced / Active Streaming Format)
asf_o           ASF (Advanced / Active Streaming Format)
ass             SSA (SubStation Alpha) subtitle
ast             AST (Audio Stream)
au              Sun AU
av1             AV1 Annex B
avi             AVI (Audio Video Interleaved)
avr             AVR (Audio Visual Research)
avs             Argonaut Games Creature Shock
avs2            raw AVS2-P2/IEEE1857.4
avs3            raw AVS3-P2/IEEE1857.10
bethsoftvid     Bethesda Softworks VID
bfi             Brute Force & Ignorance
bfstm           BFSTM (Binary Cafe Stream)
bin             Binary text
bink            Bink
binka           Bink Audio
bit             G.729 BIT file format
bitpacked       Bitpacked
bmp_pipe        piped bmp sequence
bmv             Discworld II BMV
boa             Black Ops Audio
bonk            raw Bonk
brender_pix     BRender PIX image
brstm           BRSTM (Binary Revolution Stream)
c93             Interplay C93
caf             Apple CAF (Core Audio Format)
cavsvideo       raw Chinese AVS (Audio Video Standard)
cdg             CD Graphics
cdxl            Commodore CDXL video
cine            Phantom Cine
codec2          codec2 .c2 demuxer
codec2raw       raw codec2 demuxer
concat          Virtual concatenation script
cri_pipe        piped cri sequence
dash            Dynamic Adaptive Streaming over HTTP
data            raw data
daud            D-Cinema audio
dcstr           Sega DC STR
dds_pipe        piped dds sequence
derf            Xilam DERF
dfa             Chronomaster DFA
dfpwm           raw DFPWM1a
dhav            Video DAV
dirac           raw Dirac
dnxhd           raw DNxHD (SMPTE VC-3)
dpx_pipe        piped dpx sequence
dsf             DSD Stream File (DSF)
dshow           DirectShow capture
dsicin          Delphine Software International CIN
dss             Digital Speech Standard (DSS)
dts             raw DTS
dtshd           raw DTS-HD
dv              DV (Digital Video)
dvbsub          raw dvbsub
dvbtxt          dvbtxt
dxa             DXA
ea              Electronic Arts Multimedia
ea_cdata        Electronic Arts cdata
eac3            raw E-AC-3
epaf            Ensoniq Paris Audio File
exr_pipe        piped exr sequence
f32be           PCM 32-bit floating-point big-endian
f32le           PCM 32-bit floating-point little-endian
f64be           PCM 64-bit floating-point big-endian
f64le           PCM 64-bit floating-point little-endian
ffmetadata      FFmpeg metadata in text
film_cpk        Sega FILM / CPK
filmstrip       Adobe Filmstrip
fits            Flexible Image Transport System
flac            raw FLAC
flic            FLI/FLC/FLX animation
flv             FLV (Flash Video)
frm             Megalux Frame
fsb             FMOD Sample Bank
fwse            Capcom's MT Framework sound
g722            raw G.722
g723_1          G.723.1
g726            raw big-endian G.726 ("left aligned")
g726le          raw little-endian G.726 ("right aligned")
g729            G.729 raw format demuxer
gdigrab         GDI API Windows frame grabber
gdv             Gremlin Digital Video
gem_pipe        piped gem sequence
genh            GENeric Header
gif             CompuServe Graphics Interchange Format (GIF)
gif_pipe        piped gif sequence
gsm             raw GSM
gxf             GXF (General eXchange Format)
h261            raw H.261
h263            raw H.263
h264            raw H.264 video
hca             CRI HCA
hcom            Macintosh HCOM
hdr_pipe        piped hdr sequence
hevc            raw HEVC video
hls             Apple HTTP Live Streaming
hnm             Cryo HNM v4
ico             Microsoft Windows ICO
idcin           id Cinematic
idf             iCE Draw File
iff             IFF (Interchange File Format)
ifv             IFV CCTV DVR
ilbc            iLBC storage
image2          image2 sequence
image2pipe      piped image2 sequence
imf             IMF (Interoperable Master Format)
ingenient       raw Ingenient MJPEG
ipmovie         Interplay MVE
ipu             raw IPU Video
ircam           Berkeley/IRCAM/CARL Sound Format
iss             Funcom ISS
iv8             IndigoVision 8000 video
ivf             On2 IVF
ivr             IVR (Internet Video Recording)
j2k_pipe        piped j2k sequence
jacosub         JACOsub subtitle format
jpeg_pipe       piped jpeg sequence
jpegls_pipe     piped jpegls sequence
jpegxl_pipe     piped jpegxl sequence
jv              Bitmap Brothers JV
kux             KUX (YouKu)
kvag            Simon & Schuster Interactive VAG
laf             LAF (Limitless Audio Format)
lavfi           Libavfilter virtual input device
live_flv        live RTMP FLV (Flash Video)
lmlm4           raw lmlm4
loas            LOAS AudioSyncStream
lrc             LRC lyrics
luodat          Video CCTV DAT
lvf             LVF
lxf             VR native stream (LXF)
m4v             raw MPEG-4 video
matroska,webm   Matroska / WebM
mca             MCA Audio Format
mcc             MacCaption
mgsts           Metal Gear Solid: The Twin Snakes
microdvd        MicroDVD subtitle format
mjpeg           raw MJPEG video
mjpeg_2000      raw MJPEG 2000 video
mlp             raw MLP
mlv             Magic Lantern Video (MLV)
mm              American Laser Games MM
mmf             Yamaha SMAF
mods            MobiClip MODS
moflex          MobiClip MOFLEX
mov,mp4,m4a,3gp,3g2,mj2 QuickTime / MOV
mp3             MP2/3 (MPEG audio layer 2/3)
mpc             Musepack
mpc8            Musepack SV8
mpeg            MPEG-PS (MPEG-2 Program Stream)
mpegts          MPEG-TS (MPEG-2 Transport Stream)
mpegtsraw       raw MPEG-TS (MPEG-2 Transport Stream)
mpegvideo       raw MPEG video
mpjpeg          MIME multipart JPEG
mpl2            MPL2 subtitles
mpsub           MPlayer subtitles
msf             Sony PS3 MSF
msnwctcp        MSN TCP Webcam stream
msp             Microsoft Paint (MSP)
mtaf            Konami PS2 MTAF
mtv             MTV
mulaw           PCM mu-law
musx            Eurocom MUSX
mv              Silicon Graphics Movie
mvi             Motion Pixels MVI
mxf             MXF (Material eXchange Format)
mxg             MxPEG clip
nc              NC camera feed
nistsphere      NIST SPeech HEader REsources
nsp             Computerized Speech Lab NSP
nsv             Nullsoft Streaming Video
nut             NUT
nuv             NuppelVideo
obu             AV1 low overhead OBU
ogg             Ogg
oma             Sony OpenMG audio
paf             Amazing Studio Packed Animation File
pam_pipe        piped pam sequence
pbm_pipe        piped pbm sequence
pcx_pipe        piped pcx sequence
pfm_pipe        piped pfm sequence
pgm_pipe        piped pgm sequence
pgmyuv_pipe     piped pgmyuv sequence
pgx_pipe        piped pgx sequence
phm_pipe        piped phm sequence
photocd_pipe    piped photocd sequence
pictor_pipe     piped pictor sequence
pjs             PJS (Phoenix Japanimation Society) subtitles
pmp             Playstation Portable PMP
png_pipe        piped png sequence
pp_bnk          Pro Pinball Series Soundbank
ppm_pipe        piped ppm sequence
psd_pipe        piped psd sequence
psxstr          Sony Playstation STR
pva             TechnoTrend PVA
pvf             PVF (Portable Voice Format)
qcp             QCP
qdraw_pipe      piped qdraw sequence
qoi_pipe        piped qoi sequence
r3d             REDCODE R3D
rawvideo        raw video
realtext        RealText subtitle format
redspark        RedSpark
rka             RKA (RK Audio)
rl2             RL2
rm              RealMedia
roq             id RoQ
rpl             RPL / ARMovie
rsd             GameCube RSD
rso             Lego Mindstorms RSO
rtp             RTP input
rtsp            RTSP input
s16be           PCM signed 16-bit big-endian
s16le           PCM signed 16-bit little-endian
s24be           PCM signed 24-bit big-endian
s24le           PCM signed 24-bit little-endian
s32be           PCM signed 32-bit big-endian
s32le           PCM signed 32-bit little-endian
s337m           SMPTE 337M
s8              PCM signed 8-bit
sami            SAMI subtitle format
sap             SAP input
sbc             raw SBC (low-complexity subband codec)
sbg             SBaGen binaural beats script
scc             Scenarist Closed Captions
scd             Square Enix SCD
sdns            Xbox SDNS
sdp             SDP
sdr2            SDR2
sds             MIDI Sample Dump Standard
sdx             Sample Dump eXchange
ser             SER (Simple uncompressed video format for astronomical capturing)
sga             Digital Pictures SGA
sgi_pipe        piped sgi sequence
shn             raw Shorten
siff            Beam Software SIFF
simbiosis_imx   Simbiosis Interactive IMX
sln             Asterisk raw pcm
smjpeg          Loki SDL MJPEG
smk             Smacker
smush           LucasArts Smush
sol             Sierra SOL
sox             SoX native
spdif           IEC 61937 (compressed data in S/PDIF)
srt             SubRip subtitle
stl             Spruce subtitle format
subviewer       SubViewer subtitle format
subviewer1      SubViewer v1 subtitle format
sunrast_pipe    piped sunrast sequence
sup             raw HDMV Presentation Graphic Stream subtitles
svag            Konami PS2 SVAG
svg_pipe        piped svg sequence
svs             Square SVS
swf             SWF (ShockWave Flash)
tak             raw TAK
tedcaptions     TED Talks captions
thp             THP
tiertexseq      Tiertex Limited SEQ
tiff_pipe       piped tiff sequence
tmv             8088flex TMV
truehd          raw TrueHD
tta             TTA (True Audio)
tty             Tele-typewriter
txd             Renderware TeXture Dictionary
ty              TiVo TY Stream
u16be           PCM unsigned 16-bit big-endian
u16le           PCM unsigned 16-bit little-endian
u24be           PCM unsigned 24-bit big-endian
u24le           PCM unsigned 24-bit little-endian
u32be           PCM unsigned 32-bit big-endian
u32le           PCM unsigned 32-bit little-endian
u8              PCM unsigned 8-bit
v210            Uncompressed 4:2:2 10-bit
v210x           Uncompressed 4:2:2 10-bit
vag             Sony PS2 VAG
vbn_pipe        piped vbn sequence
vc1             raw VC-1
vc1test         VC-1 test bitstream
vfwcap          VfW video capture
vidc            PCM Archimedes VIDC
vividas         Vividas VIV
vivo            Vivo
vmd             Sierra VMD
vobsub          VobSub subtitle format
voc             Creative Voice
vpk             Sony PS2 VPK
vplayer         VPlayer subtitles
vqf             Nippon Telegraph and Telephone Corporation (NTT) TwinVQ
w64             Sony Wave64
wady            Marble WADY
wav             WAV / WAVE (Waveform Audio)
wavarc          Waveform Archiver
wc3movie        Wing Commander III movie
webm_dash_manifest WebM DASH Manifest
webp_pipe       piped webp sequence
webvtt          WebVTT subtitle
wsaud           Westwood Studios audio
wsd             Wideband Single-bit Data (WSD)
wsvqa           Westwood Studios VQA
wtv             Windows Television (WTV)
wv              WavPack
wve             Psion 3 audio
xa              Maxis XA
xbin            eXtended BINary text (XBIN)
xbm_pipe        piped xbm sequence
xmd             Konami XMD
xmv             Microsoft XMV
xpm_pipe        piped xpm sequence
xvag            Sony PS3 XVAG
xwd_pipe        piped xwd sequence
xwma            Microsoft xWMA
yop             Psygnosis YOP
yuv4mpegpipe    YUV4MPEG pipe

Notes:

  • The list contains the supported formats (& not containers).
    • A video/audio format may be present in a number of containers.
    • e.g. an MP4 file generally contains an H.264 video stream.
  • On the web, format support depends upon the web browser.
    • It happens to be extremely limited as compared to native platforms.
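Regardless of container, playback uses the same Player & Media API shown earlier — a minimal sketch (the file path is hypothetical):

```dart
import 'package:flutter_mpv/flutter_mpv.dart';

Future<void> playFile() async {
  final player = Player();
  // The container (.mkv) & the codec inside it (e.g. H.264)
  // are detected automatically.
  await player.open(Media('file:///home/user/video.mkv'));
}
```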

Permissions #

You may need to declare & request internet access or file-system permissions depending upon the platform.

Android #

Edit android/app/src/main/AndroidManifest.xml to add the following permissions inside <manifest> tag:

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.example.app">
    <application
      ...
      >
    </application>
    <!--
      Internet access permissions.
      -->
    <uses-permission android:name="android.permission.INTERNET" />
    <!--
      Media access permissions.
      Android 13 or higher.
      https://developer.android.com/about/versions/13/behavior-changes-13#granular-media-permissions
      -->
    <uses-permission android:name="android.permission.READ_MEDIA_AUDIO" />
    <uses-permission android:name="android.permission.READ_MEDIA_VIDEO" />
    <!--
      Storage access permissions.
      Android 12 or lower.
      -->
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
</manifest>

Use package:permission_handler to request access at runtime:

if (/* Android 13 or higher. */) {
  // Video permissions.
  if (await Permission.videos.isDenied || await Permission.videos.isPermanentlyDenied) {
    final state = await Permission.videos.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
  // Audio permissions.
  if (await Permission.audio.isDenied || await Permission.audio.isPermanentlyDenied) {
    final state = await Permission.audio.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
} else {
  if (await Permission.storage.isDenied || await Permission.storage.isPermanentlyDenied) {
    final state = await Permission.storage.request();
    if (!state.isGranted) {
      await SystemNavigator.pop();
    }
  }
}
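One way to implement the /* Android 13 or higher. */ check above is package:device_info_plus (an assumption — any SDK-version lookup works; Android 13 corresponds to API level 33):

```dart
import 'package:device_info_plus/device_info_plus.dart';

Future<bool> isAndroid13OrHigher() async {
  final info = await DeviceInfoPlugin().androidInfo;
  // Android 13 == API level 33.
  return info.version.sdkInt >= 33;
}
```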

iOS #

Edit ios/Runner/Info-Release.plist, ios/Runner/Info-Profile.plist, ios/Runner/Info-Debug.plist:

Enable internet access

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>

Windows #

N/A

macOS #

Edit macos/Runner/Release.entitlements & macos/Runner/DebugProfile.entitlements:

Enable internet access

<key>com.apple.security.network.client</key>
<true/>

Disable sand-box to access files

<key>com.apple.security.app-sandbox</key>
<false/>

GNU/Linux #

N/A

Web #

N/A

Notes #

Android #

N/A

iOS #

N/A

Windows #

N/A

macOS #

During the build phase, the following warnings are not critical and cannot be silenced:

#import "Headers/flutter_mpv_video-Swift.h"
        ^
/path/to/flutter_mpv/flutter_mpv_test/build/macos/Build/Products/Debug/flutter_mpv_video/flutter_mpv_video.framework/Headers/flutter_mpv_video-Swift.h:270:31: warning: 'objc_ownership' only applies to Objective-C object or block pointer types; type here is 'CVPixelBufferRef' (aka 'struct __CVBuffer *')
- (CVPixelBufferRef _Nullable __unsafe_unretained)copyPixelBuffer SWIFT_WARN_UNUSED_RESULT;
# 1 "<command line>" 1
 ^
<command line>:20:9: warning: 'POD_CONFIGURATION_DEBUG' macro redefined
#define POD_CONFIGURATION_DEBUG 1 DEBUG=1
        ^
#define POD_CONFIGURATION_DEBUG 1
        ^

GNU/Linux #

Install libmpv

System shared libraries from distribution-specific, user-installed packages are used by default. This is how GNU/Linux works. You can install these as follows:

Ubuntu/Debian
sudo apt install libmpv-dev mpv
Packaging

There are other ways to bundle these within your app package, e.g. within a Snap or Flatpak.

Utilize mimalloc

You should consider replacing the default memory allocator with mimalloc to avoid memory leaks.

This is as simple as adding one line to linux/CMakeLists.txt:

target_link_libraries(${BINARY_NAME} PRIVATE ${MIMALLOC_LIB})

In case you prefer dynamic linking of mimalloc, you can additionally add the following line to your linux/CMakeLists.txt:

# use dynamically linked mimalloc
set(MIMALLOC_USE_STATIC_LIBS OFF)

In this case, please ensure that you install libmimalloc-dev at compile time and libmimalloc2.0 as a runtime dependency.

Ubuntu/Debian

sudo apt install libmimalloc-dev libmimalloc2.0

Web #

On the web, libmpv is not used. Video & audio playback is handled by embedding HTML <video> element. The format support depends upon the web browser. It happens to be extremely limited as compared to native platforms.

Architecture #

package:flutter_mpv #

Click on the zoom button on top-right or pinch inside.

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  Player *-- PlatformPlayer
  PlatformPlayer <|-- NativePlayer
  PlatformPlayer <|-- WebPlayer
  PlatformPlayer *-- PlayerState
  PlatformPlayer *-- PlayerStream
  PlatformPlayer o-- PlayerConfiguration

  NativePlayer <.. NativeLibrary
  NativePlayer <.. Initializer

  Playable <.. Media
  Playable <.. Playlist

  class Initializer {
    +create(path: String, callback: Function, options: Map<String, String>): Future<Pointer<mpv_handle>>
    +dispose(handle: Pointer<mpv_handle>)
  }

  class Playable {
  }

  class AudioDevice {
  }

  class Media {
    +String uri
    +dynamic extras
  }

  class Playlist {
    +List<Media> medias
    +int index
  }

  class PlayerStream {
    +Stream<Playlist> playlist
    +Stream<bool> playing
    +Stream<bool> completed
    +Stream<Duration> position
    +Stream<Duration> duration
    +Stream<Duration> buffer
    +Stream<double> volume
    +Stream<double> rate
    +Stream<double> pitch
    +Stream<bool> buffering
    +Stream<AudioParams> audioParams
    +Stream<VideoParams> videoParams
    +Stream<double?> audioBitrate
    +Stream<AudioDevice> audioDevice
    +Stream<List<AudioDevice>> audioDevices
    +Stream<Track> track
    +Stream<Tracks> tracks
    +Stream<int> width
    +Stream<int> height
    +Stream<List<String>> subtitle
    +Stream<PlayerLog> log
    +Stream<String> error
  }

  class PlayerState {
    +Playlist playlist
    +bool playing
    +bool completed
    +Duration position
    +Duration duration
    +Duration buffer
    +double volume
    +double rate
    +double pitch
    +bool buffering
    +AudioParams audioParams
    +VideoParams videoParams
    +double? audioBitrate
    +AudioDevice audioDevice
    +List<AudioDevice> audioDevices
    +Track track
    +Tracks tracks
    +int width
    +int height
    +List<String> subtitle
  }

  class Player {
    +PlatformPlayer? platform

    +«get» PlayerState state
    +«get» PlayerStream stream

    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List
  }

  class PlatformPlayer {
    +PlayerState state
    +PlayerStream stream
    +PlayerConfiguration configuration

    +dispose()*
    +open(playable: Playable)*
    +play()*
    +stop()*
    +pause()*
    +playOrPause()*
    +add(media: Media)*
    +remove(index: int)*
    +next()*
    +previous()*
    +jump(index: int)*
    +move(from: int, to: int)*
    +seek(duration: Duration)*
    +setPlaylistMode(playlistMode: PlaylistMode)*
    +setVolume(volume: double)*
    +setRate(rate: double)*
    +setPitch(pitch: double)*
    +setShuffle(shuffle: bool)*
    +setAudioDevice(device: AudioDevice)*
    +setVideoTrack(track: VideoTrack)*
    +setAudioTrack(track: AudioTrack)*
    +setSubtitleTrack(track: SubtitleTrack)*
    +screenshot(): Uint8List*

    +«get» handle: Future<int>*

    #StreamController<Playlist> playlistController
    #StreamController<bool> playingController
    #StreamController<bool> completedController
    #StreamController<Duration> positionController
    #StreamController<Duration> durationController
    #StreamController<Duration> bufferController
    #StreamController<double> volumeController
    #StreamController<double> rateController
    #StreamController<double> pitchController
    #StreamController<bool> bufferingController
    #StreamController<PlayerLog> logController
    #StreamController<PlayerError> errorController
    #StreamController<AudioParams> audioParamsController
    #StreamController<double?> audioBitrateController
    #StreamController<AudioDevice> audioDeviceController
    #StreamController<List<AudioDevice>> audioDevicesController
    #StreamController<Track> trackController
    #StreamController<Tracks> tracksController
    #StreamController<int> widthController
    #StreamController<int> heightController
  }

  class NativePlayer {
    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List

    +«get» handle: Future<int>
  }

  class WebPlayer {
    +dispose()
    +open(playable: Playable)
    +play()
    +stop()
    +pause()
    +playOrPause()
    +add(media: Media)
    +remove(index: int)
    +next()
    +previous()
    +jump(index: int)
    +move(from: int, to: int)
    +seek(duration: Duration)
    +setPlaylistMode(playlistMode: PlaylistMode)
    +setVolume(volume: double)
    +setRate(rate: double)
    +setPitch(pitch: double)
    +setShuffle(shuffle: bool)
    +setAudioDevice(device: AudioDevice)
    +setVideoTrack(track: VideoTrack)
    +setAudioTrack(track: AudioTrack)
    +setSubtitleTrack(track: SubtitleTrack)
    +screenshot(): Uint8List

    +«get» handle: Future<int>
  }

  class NativeLibrary {
    +find()$ String?
  }

package:flutter_mpv_video #

Click on the zoom button on top-right or pinch inside.

Android

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  FlutterMpvVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Create VideoOutput(s) to send back id & wid for render. Dispose to release.
  VideoOutput <.. FlutterMpvAndroidHelper: Create & dispose JNI global object reference to android.view.Surface (for --wid)

  class FlutterMpvVideoPlugin {
    -MethodChannel channel
    -VideoOutputManager videoOutputManager
  }

  class VideoOutputManager {
    -HashMap<Long, VideoOutput> videoOutputs
    -TextureRegistry textureRegistryReference
    -Object lock

    +create(handle: long, textureUpdateCallback: TextureUpdateCallback)
    +dispose(handle: long)
    +setSurfaceSize(handle: long, width: int, height: int): long
  }

  class VideoOutput {
    $Method newGlobalObjectRef
    $Method deleteGlobalObjectRef
    $Handler handler

    -long id
    -long wid

    -TextureUpdateCallback textureUpdateCallback
    -TextureRegistry.SurfaceProducer surfaceProducer

    -long handle
    -MethodChannel channelReference
    -TextureRegistry textureRegistryReference

    +dispose()
    +setSurfaceSize(width: int, height: int)
    +setSurfaceSize(width: int, height: int, force: boolean)
    -setSurfaceTextureSize(width: int, height: int)
    +onSurfaceCreated()
    +onSurfaceDestroyed()

    $newGlobalObjectRef(object: Object): long
    $deleteGlobalObjectRef(ref: long)
  }

  class FlutterMpvAndroidHelper {
    +newGlobalObjectRef(obj: Object): long
    +deleteGlobalObjectRef(ref: long)
    +setApplicationContext(context: Context)
    +copyAssetToExternalFilesDir(assetName: String): String
  }

iOS

TODO: documentation.

macOS

TODO: documentation.

Windows

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  FlutterMpvVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes PluginRegistrarWindows as reference
  VideoOutputManager "1" *-- "1" ThreadPool
  VideoOutput "*" o-- "1" ThreadPool: Post creation, resize & render etc. tasks involving EGL to ensure synchronous EGL/ANGLE usage across multiple VideoOutput(s)
  VideoOutput "1" *-- "1" ANGLESurfaceManager: Only for H/W accelerated rendering

  class FlutterMpvVideoPlugin {
    -flutter::PluginRegistrarWindows registrar_
    -std::unique_ptr<MethodChannel> channel_
    -std::unique_ptr<VideoOutputManager> video_output_manager_
    -HandleMethodCall(method_call, result);
  }

  class ThreadPool {
    +Post(function: std::function)
  }

  class VideoOutputManager {
    +Create(handle: int, width: optional<int>, height: optional<int>, texture_update_callback: std::function)
    +Dispose(handle: int)

    -std::mutex mutex_
    -std::unique_ptr<ThreadPool> thread_pool_
    -flutter::PluginRegistrarWindows registrar_
    -std::unordered_map<int64_t, std::unique_ptr<VideoOutput>> video_outputs_
  }

  class VideoOutput {
    +«get» texture_id: int64_t
    +«get» width: int64_t
    +«get» height: int64_t
    -mpv_handle* handle_
    -mpv_render_context* render_context_
    -std::optional<int64_t> width_
    -std::optional<int64_t> height_
    -bool enable_hardware_acceleration_
    -int64_t texture_id_
    -flutter::PluginRegistrarWindows registrar_
    -ThreadPool* thread_pool_ref_
    -bool destroyed_
    -std::mutex textures_mutex_
    -std::unordered_map<int64_t, std::unique_ptr<flutter::TextureVariant>> texture_variants_
    -std::unique_ptr<ANGLESurfaceManager> surface_manager_ HW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopGpuSurfaceDescriptor>> textures_ HW
    -std::unique_ptr<uint8_t[]> pixel_buffer_ SW
    -std::unordered_map<int64_t, std::unique_ptr<FlutterDesktopPixelBuffer>> pixel_buffer_textures_ SW
    -std::function texture_update_callback_

    +SetTextureUpdateCallback(callback: std::function<void(int64_t, int64_t, int64_t)>)
    +SetSize(width: std::optional<int64_t>, height: std::optional<int64_t>)
    -NotifyRender()
    -Render()
    -CheckAndResize()
    -Resize(required_width: int64_t, required_height: int64_t)
    -GetVideoWidth(): int64_t
    -GetVideoHeight(): int64_t
  }

  class ANGLESurfaceManager {
    +«get» width: int32_t
    +«get» height: int32_t
    +«get» handle: HANDLE

    +HandleResize(width: int32_t, height: int32_t)
    +Draw(draw_callback: std::function<void()>)
    +Read()
    +MakeCurrent(value: bool)
    -CreateEGLDisplay()
    -SwapBuffers()
    -Create()
    -CleanUp(release_context: bool)
    -CreateD3DTexture()
    -CreateEGLDisplay()
    -CreateAndBindEGLSurface()

    -IDXGIAdapter* adapter_
    -int32_t width_
    -int32_t height_
    -HANDLE internal_handle_
    -HANDLE handle_
    -HANDLE mutex_
    -ID3D11Device* d3d_11_device_
    -ID3D11DeviceContext* d3d_11_device_context_
    -Microsoft::WRL::ComPtr<ID3D11Texture2D> internal_d3d_11_texture_2D_
    -Microsoft::WRL::ComPtr<IDXGISwapChain> d3d_11_texture_2D_
    -EGLSurface surface_
    -EGLDisplay display_
    -EGLContext context_
    -EGLConfig config_
  }

GNU/Linux

%%{
  init: {
    'themeVariables': {
      'fontFamily': 'BlinkMacSystemFont, Segoe UI, Noto Sans, Helvetica, Arial, Apple Color Emoji, Segoe UI Emoji'
    }
  }
}%%
classDiagram

  FlutterMpvVideoPlugin "1" *-- "1" VideoOutputManager: Create VideoOutput(s) with VideoOutputManager for handle passed through platform channel
  VideoOutputManager "1" *-- "*" VideoOutput: Takes FlTextureRegistrar as reference
  VideoOutput "1" *-- "1" TextureGL: For H/W rendering.
  TextureGL "1" o-- "1" VideoOutput: Take VideoOutput as reference
  VideoOutput "1" *-- "1" TextureSW: For S/W rendering.
  TextureSW "1" o-- "1" VideoOutput: Take VideoOutput as reference
  TextureGL "1" <-- "1" FlTextureGL
  TextureSW "1" <-- "1" FlTexture

  class FlutterMpvVideoPlugin {
    -FlMethodChannel* channel
    -VideoOutputManager* video_output_manager
  }

  class VideoOutputManager {
    -GHashTable* video_outputs
    -FlTextureRegistrar* texture_registrar
    +video_output_manager_create(self: VideoOutputManager*, handle: gint64, width: gint64, height: gint64, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_manager_dispose(self: VideoOutputManager*, handle: gint64)
  }

  class VideoOutput {
    -TextureGL* texture_gl
    -GdkGLContext* context_gl
    -mpv_handle* handle
    -mpv_render_context* render_context
    -gint64 width
    -gint64 height
    -TextureUpdateCallback texture_update_callback
    -gpointer texture_update_callback_context
    -FlTextureRegistrar* texture_registrar
    +video_output_set_texture_update_callback(self: VideoOutput*, texture_update_callback: TextureUpdateCallback, texture_update_callback_context: gpointer)
    +video_output_get_render_context(self: VideoOutput*): mpv_render_context*
    +video_output_get_width(self: VideoOutput*): gint64
    +video_output_get_height(self: VideoOutput*): gint64
    +video_output_get_texture_id(self: VideoOutput*): gint64
    +video_output_notify_texture_update(self: VideoOutput*);
  }

  class TextureGL {
    -guint32 name
    -guint32 fbo
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_gl_populate_texture(texture: FlTextureGL*, target: guint32*, name: guint32*, width: guint32*, height: guint32*, error: GError**): gboolean
  }

  class TextureSW {
    -guint32 current_width
    -guint32 current_height
    -VideoOutput* video_output
    texture_sw_copy_pixels(texture: FlPixelBufferTexture*, buffer: const uint8_t**, width: uint32_t*, height: uint32_t*, error: GError**): gboolean
  }

Web

TODO: documentation.

Implementation #

libmpv is used for audio & video playback. It seems the best possible option, since it supports a wide variety of audio & video formats, provides hardware acceleration & keeps the bundle size minimal (only the required decoders etc. are selected in FFmpeg/mpv).

Another major advantage is that a large part of the implementation (80%+) is shared across platforms using FFI. This makes the behavior of the package very similar on all supported platforms & makes maintenance easier (since there is less code & most of it is within Dart).

Alternative backends may be implemented in the future to meet certain demands (& the project architecture makes this possible).

flutter_mpv #

flutter_mpv is entirely written in Dart. It uses dart:ffi to invoke the native C API of libmpv through its shared libraries. All the callback management, event Streams & other methods to control audio/video playback are implemented in Dart with the help of FFI. Event management (i.e. the position, duration, bitrate & audioParams Streams) is important for rendering changes in the UI.
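As a rough, hypothetical sketch of what invoking libmpv's C API through dart:ffi looks like (not the package's actual bindings): mpv_create is a real libmpv function, while the shared-library filename is an assumption.

```dart
import 'dart:ffi';

// Hypothetical minimal binding; the real package resolves the shared
// library path per platform. 'libmpv.so' here is an assumption.
typedef MpvCreateNative = Pointer<Void> Function();
typedef MpvCreateDart = Pointer<Void> Function();

void main() {
  final lib = DynamicLibrary.open('libmpv.so');
  // mpv_create() returns an opaque mpv_handle* on success.
  final mpvCreate =
      lib.lookupFunction<MpvCreateNative, MpvCreateDart>('mpv_create');
  final handle = mpvCreate();
  assert(handle.address != 0);
}
```

The returned mpv_handle* address is the same primitive int value that is later handed to the video package.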

A big limitation of FFI in the Dart SDK has been that it does not support async callbacks from another thread. Learn more about this at: dart-lang/sdk#37022. The following situation explains this better:

If you pass a function pointer from Dart to C code, you can invoke it fine. But as soon as you invoke it from some other thread on the native side, the Dart VM will instantly crash. This feature is important because most events take place on a background thread.

However, I could easily do this within Dart because libmpv offers an "event polling"-like way to listen for events. I got the idea to spawn a background Isolate and run the event loop there. I get the memory address of each event and forward it outside the Isolate with the help of a ReceivePort, where I finally interpret it using more FFI code. I have explained this in detail within the in-code comments of initializer.dart, where I had to perform a lot more trickery to get this to work.
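A minimal sketch of the described pattern, assuming a hypothetical waitEvent() FFI binding that blocks until libmpv produces an event (the package's actual code lives in initializer.dart):

```dart
import 'dart:isolate';

// Hypothetical FFI binding: blocks until an event is available and
// returns the event's memory address as a plain int.
int waitEvent() => throw UnimplementedError('bind mpv_wait_event via FFI');

// Runs inside the background Isolate: poll forever & forward addresses.
void eventLoop(SendPort port) {
  while (true) {
    port.send(waitEvent());
  }
}

Future<void> main() async {
  final events = ReceivePort();
  await Isolate.spawn(eventLoop, events.sendPort);
  events.listen((address) {
    // Re-interpret `address` with Pointer.fromAddress & more FFI code.
  });
}
```

Only primitive values (here, an int address) cross the Isolate boundary, which is what makes this workaround possible.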

Thus, invoking native methods & handling events etc. could be done in 100% Dart using FFI. This is enough for audio playback & supports both the Flutter SDK & the Dart VM, with event handling working entirely within Dart. Later, it was discovered that going beyond a certain number of simultaneous instances caused a deadlock (dart-lang/sdk#51254 & dart-lang/sdk#51261), freezing the UI along with any other Dart code in execution. To deal with this, a new package, package:flutter_mpv_native_event_loop, was created. Adding package:flutter_mpv_native_event_loop to pubspec.yaml automatically resolves this issue without any changes to code!

Update: The above issue is resolved in Dart SDK 3.1.0. NativeCallable can now be used to make async C callbacks.
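For illustration, a hedged sketch of the NativeCallable approach: NativeCallable.listener (Dart SDK >= 3.1.0) yields a function pointer that native code may invoke from any thread without crashing the VM.

```dart
import 'dart:ffi';

// C signature of the callback, e.g. libmpv's wakeup callback:
//   void callback(void* ctx)
typedef WakeupNative = Void Function(Pointer<Void>);

void onWakeup(Pointer<Void> ctx) {
  // Invoked on the Dart side even when the native call originates
  // from a background thread.
}

void main() {
  final callable = NativeCallable<WakeupNative>.listener(onWakeup);
  // callable.nativeFunction can now be passed to native code,
  // e.g. mpv_set_wakeup_callback, via FFI.
  callable.close(); // Release once no longer needed.
}
```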

However, no such "event polling"-like API is possible for video rendering. So the best idea seemed to be a new package, flutter_mpv_video, specifically offering the platform-specific video embedding implementation, which internally handles Flutter's Texture Registry API & libmpv's OpenGL rendering API. This package only consumes the mpv_handle* of the instance (created with flutter_mpv through FFI), which can easily be shared as a primitive int value, to set up a new viewport. The detailed implementation is discussed below.

package:flutter_mpv_native_event_loop #

Platform specific threaded event handling for flutter_mpv. Enables support for higher number of concurrent instances.

The package contains a minimal C++ implementation which spawns a detached std::thread. This runs the mpv_wait_event loop & forwards the events to the Dart VM using postCObject, SendPort & ReceivePort. The necessary mutex synchronization also takes place.

The Isolate-based event loop is avoided once this package is added to the project.

package:flutter_mpv_video #

Android

On Android, the texture registry API is based on android.graphics.SurfaceTexture.

libmpv can render directly onto an android.view.Surface after setting --wid. Creating a new android.view.Surface requires a reference to an existing android.graphics.SurfaceTexture, which can be consumed from the texture entry created by Flutter itself.

This requires --hwdec=mediacodec for hardware decoding, along with --vo=mediacodec_embed and --wid=(intptr_t)(*android.view.Surface).

More details may be found at: https://mpv.io/manual/stable/#video-output-drivers-mediacodec-embed
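In mpv.conf-style form, the options described above look like the following fragment (the --wid value is supplied at runtime from the JNI global reference, not hard-coded):

```
hwdec=mediacodec
vo=mediacodec_embed
wid=<intptr_t value of the JNI global reference to android.view.Surface>
```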

Obtaining a global reference pointer to a Java object (android.view.Surface in our case) requires JNI. For this, a custom shared library is used; you can find its implementation at media-kit/media-kit-android-helper. Since compiling this would require the NDK (& make the process tedious), pre-built shared libraries are bundled for each architecture at development/build time.

Since package:flutter_mpv is a Dart package (which works independently of Flutter), accessing assets was a challenging part. The mentioned shared libraries generated by media-kit/media-kit-android-helper help access assets bundled inside the Android APK from Dart (using FFI, without depending on Flutter).

iOS

iOS shares much of its implementation with macOS. The only difference is that OpenGL ES is used instead of OpenGL.

macOS

On macOS, the current implementation is based on libmpv and can be summarized as follows:

  1. H/W video decoding: the mpv option hwdec is set to auto; this does not depend on a pixel buffer.
  2. OpenGL rendering to an OpenGL texture backed by a pixel buffer, which makes it interoperable with Metal (CVPixelBuffer).

Windows

Video output on Windows is hardware accelerated, i.e. GPU-backed buffers are used. This is a performant approach, easily capable of rendering 4K 60 FPS videos; the rest depends on the hardware. Since the libmpv API is OpenGL based & the Texture API in Flutter is Direct3D based, ANGLE (Almost Native Graphics Layer Engine) is used for interop, which translates the OpenGL ES 2.0 calls into Direct3D.

This hardware-accelerated video output requires DirectX 11 or higher. Most Windows systems with either an integrated or discrete GPU should already support this. On systems where Direct3D fails to load (due to missing graphics drivers, an unsupported feature level or DirectX version etc.), a fallback pixel-buffer based software renderer is used. This means that video is rendered by the CPU & every frame is copied back to RAM. This will cause some redundant load on the CPU, result in decreased battery life & may not play higher-resolution videos properly. However, it works well.

Windows 7 & 8.x also work correctly.


You may visit the experimentation repository to see a minimal example showing OpenGL ES usage in Flutter Windows.

GNU/Linux

On Flutter Linux, both OpenGL (H/W) & pixel buffer (S/W) APIs are available for rendering onto the Texture widget.

Web

Video & audio playback is handled by embedding an HTML <video> element.

Examples #

See the examples/ directory for complete working examples:

Example Description Complexity Best For
advanced_player_example Full player with 12 presets & advanced settings ⭐⭐⭐⭐⭐ Production apps, learning
flutter_mpv_test Minimal feature tests ⭐⭐ Quick testing, debugging

Quick Start with Examples #

# Navigate to example
cd examples/advanced_player_example

# Install dependencies
flutter pub get

# Run on your device
flutter run

Note: The example uses pubspec_overrides.yaml to reference local packages during development. When copying to your own project, remove this file and use published versions from pub.dev.

advanced_player_example Features #

  • 🎯 3 Player Modes: Basic, Presets, and Advanced
  • 🎨 12 Performance Presets: From "Low-End Device" to "Reference Quality"
  • ⚙️ Advanced Settings: Fine-tune 20+ video parameters
  • 💾 Settings Persistence: Saves preferences via SharedPreferences
  • 📱 Material Design: Beautiful, responsive UI

Learning Path #

  1. Start with flutter_mpv_test - Understand basic API
  2. Try advanced_player_example - Learn advanced configuration
  3. Build your own player - Combine learnings

For more details, see the Examples README.


License #

Copyright © 2021 & onwards, Hitesh Kumar Saini <saini123hitesh@gmail.com>

This project & the work under this repository are governed by the MIT license that can be found in the LICENSE file.
