flutter_sound 3.1.1

Flutter Sound #

This plugin provides simple recorder and player functionality for both `android` and `ios` platforms. It supports only the default file extension for each platform, can play a file from a remote URL, and can handle a playback stream from the native side (to sync exact time with bridging).

Breaking News #

  • The 3.0.0 release introduced breaking changes. The following features were added by the work code-named flauto in #243:
    • OGG/OPUS support on iOS
    • Playing in lock screen
    • Playing in notification
    • Support tracking

    Please honor all wonderful contributors Larpoux, bsutton, salvatore373 🎉!

Free Read #

Medium Blog

Getting Started #

For help getting started with Flutter, view our online documentation.

Install #

For help on adding as a dependency, view the documentation.

Add flutter_sound as a dependency in pubspec.yaml. The current version is ^3.0.0.

The Flutter-Sound sources are here.

dependencies:
  flutter:
    sdk: flutter
  flutter_sound: ^3.0.0

FFmpeg #

flutter_sound makes use of flutter_ffmpeg. Please look at the flutter_ffmpeg documentation to see how to add it to your app.

  • On iOS you will have to enter something like this in your Podfile:
  # Prepare symlinks folder. We use symlinks to avoid having Podfile.lock
  # referring to absolute paths on developers' machines.
  system('rm -rf .symlinks')
  system('mkdir -p .symlinks/plugins')
  plugin_pods = parse_KV_file('../.flutter-plugins')
  plugin_pods.each do |name, path|
    symlink = File.join('.symlinks', 'plugins', name)
    File.symlink(path, symlink)
    if name == 'flutter_ffmpeg'
        pod name+'/audio-lts', :path => File.join(symlink, 'ios')
    else
        pod name, :path => File.join(symlink, 'ios')
    end
  end
  • On Android you will have to add the following line to your android/build.gradle file.
ext.flutterFFmpegPackage = 'audio-lts'

Post Installation #

On iOS you need to add a usage description to Info.plist:

<key>NSMicrophoneUsageDescription</key>
<string>This sample uses the microphone to record your speech and convert it to text.</string>
<key>UIBackgroundModes</key>
<array>
	<string>audio</string>
</array>

On Android you need to add a permission to AndroidManifest.xml:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />

Migration Guide #

To migrate to 3.0.0 you must migrate your Android app to AndroidX by following the Migrating to AndroidX Guide.
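
For reference, the heart of that migration is enabling AndroidX in android/gradle.properties (the guide covers the full procedure, including migrating your dependencies):

android.useAndroidX=true
android.enableJetifier=true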

Methods #

| Func | Param | Return | Description |
| :--- | :--- | :--- | :--- |
| initialize | | void | Initializes the media player and all the callbacks for the player and the recorder. This procedure is implicitly called by the Flutter Sound constructors, so you will probably not use this function yourself. |
| releaseMediaPlayer | | void | Resets the media player and cleans up the device resources. This must be called when the player is no longer needed. |
| setSubscriptionDuration | double sec | String message | Sets the subscription timer in seconds. Default is 0.010 if this method is not used. |
| startRecorder | String uri, int sampleRate, int numChannels, t_CODEC codec | String uri | Starts recording. Returns the uri used. |
| stopRecorder | | String message | Stops recording. |
| pauseRecorder | | String message | Pauses recording. |
| resumeRecorder | | String message | Resumes recording. |
| startPlayer | String fileUri, t_CODEC codec, whenFinished() | | Starts playing the file at the given URI. |
| startPlayerFromBuffer | Uint8List dataBuffer, t_CODEC codec, whenFinished() | String message | Starts playing from a buffer encoded with the given codec. |
| stopPlayer | | String message | Stops playing. |
| pausePlayer | | String message | Pauses playing. |
| resumePlayer | | String message | Resumes playing. |
| seekToPlayer | int milliSecs (position to go to) | String message | Seeks audio to the selected position in milliseconds. The parameter should be less than the audio duration to be placed correctly. |
| iosSetCategory | SESSION_CATEGORY, SESSION_MODE, options | Boolean | Sets the session category on iOS. |
| androidAudioFocusRequest | int focusGain | Boolean | Defines the Android focus request to use in subsequent requests to get audio focus. |
| setActive | bool enabled | Boolean | Requests or abandons the audio focus. |
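
To see how these verbs fit together, here is a minimal record-then-play round trip (a sketch based on the table above; error handling and UI omitted):

// Initialize the modules, record a few seconds to the default temporary
// file, then play the result back.
FlutterSoundRecorder recorder = await FlutterSoundRecorder().initialize();
FlutterSoundPlayer player = await FlutterSoundPlayer().initialize();

String uri = await recorder.startRecorder(codec: t_CODEC.CODEC_AAC);
await Future.delayed(Duration(seconds: 3));
await recorder.stopRecorder();
await player.startPlayer(uri);

// Release the device resources when the modules are no longer needed.
await recorder.release();
await player.release();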

Subscriptions #

| Subscription | Return | Description |
| :--- | :--- | :--- |
| onRecorderStateChanged | <RecordStatus> | Subscription to listen to once the recorder starts. |
| onPlayerStateChanged | <PlayStatus> | Subscription to listen to once the player starts. |

Default uri path #

When the uri path is not set in the call to startRecorder or startPlayer, records are saved to (or read from) a temporary directory that depends on the platform.

Codec compatibility #

Currently, the following codecs are supported by flutter_sound:

| | AAC | OGG/Opus | CAF/Opus | MP3 | OGG/Vorbis | PCM |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| iOS encoder | Yes | Yes | Yes | No | No | No |
| iOS decoder | Yes | Yes | Yes | Yes | No | Yes |
| Android encoder | Yes | No | No | No | No | No |
| Android decoder | Yes | Yes | No | Yes | Yes | Yes |

This table will be updated as more codecs are added.
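
Rather than hard-coding this table, you can query support at runtime with isEncoderSupported()/isDecoderSupported() (added in 1.9.0). A short sketch:

// Check whether the current platform can encode/decode OPUS before using it.
bool canEncode = await flutterSoundRecorder.isEncoderSupported(t_CODEC.CODEC_OPUS);
bool canDecode = await flutterSoundPlayer.isDecoderSupported(t_CODEC.CODEC_OPUS);
if (!canEncode) {
  print('No OPUS encoder on this platform; falling back to AAC');
}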

FlutterSoundRecorder Usage #

Creating instance.

In your view/page/dialog widget's State class, create an instance of FlutterSoundRecorder. Before accessing the FlutterSoundRecorder API, you must initialize it with initialize(). When finished with this FlutterSoundRecorder instance, you must release it with release().

FlutterSoundRecorder flutterSoundRecorder = await FlutterSoundRecorder().initialize();

...
...

flutterSoundRecorder.release();

Starting recorder with listener.

Future<String> result = flutterSoundRecorder.startRecorder(codec: t_CODEC.CODEC_AAC,);

result.then((path) {
	print('startRecorder: $path');

	_recorderSubscription = flutterSoundRecorder.onRecorderStateChanged.listen((e) {
		DateTime date = new DateTime.fromMillisecondsSinceEpoch(e.currentPosition.toInt());
		String txt = DateFormat('mm:ss:SS', 'en_US').format(date);
	});
});

The recorded file will be stored in a temporary directory. If you want to use your own path, specify it as shown below. We are using path_provider here, so you may have to add it to your dependencies.

Directory tempDir = await getTemporaryDirectory();
File outputFile = await File ('${tempDir.path}/flutter_sound-tmp.aac');
String path = await flutterSoundRecorder.startRecorder(outputFile.path, codec: t_CODEC.CODEC_AAC,);

On iOS, you can currently choose between three encoders:

  • AAC (this is the default)
  • CAF/OPUS
  • OGG/OPUS

Recently, Apple added support for encoding with the standard OPUS codec. Unfortunately, Apple encapsulates the data in its own proprietary envelope: CAF. This is really stupid, but this is Apple. To encode with OPUS you do the following:

await flutterSoundRecorder.startRecorder(outputFile.path, codec: t_CODEC.CODEC_OPUS,)

On Android the OPUS codec is not yet supported by the flutter_sound recorder (but the player handles it fine on Android).
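
Until then, a simple workaround is to pick the codec per platform; a sketch (CAF/OPUS on iOS as explained above, AAC elsewhere):

import 'dart:io' show Platform;

// Record with OPUS where an encoder exists (iOS); fall back to AAC on Android.
t_CODEC codec = Platform.isIOS ? t_CODEC.CODEC_CAF_OPUS : t_CODEC.CODEC_AAC;
String path = await flutterSoundRecorder.startRecorder(codec: codec);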

Stop recorder

Future<String> result = flutterSoundRecorder.stopRecorder();

result.then((value) {
	print('stopRecorder: $value');

	if (_recorderSubscription != null) {
		_recorderSubscription.cancel();
		_recorderSubscription = null;
	}
});

You MUST ensure that the recorder has been stopped when your widget is detached from the UI. Overload your widget's dispose() method to stop the recorder when your widget is disposed.

@override
void dispose() {
	flutterSoundRecorder.release();
	super.dispose();
}

Pause recorder

On Android this API verb needs at least SDK 24.

Future<String> result = await flutterSoundRecorder.pauseRecorder();

Resume recorder

On Android this API verb needs at least SDK 24.

Future<String> result = await flutterSoundRecorder.resumeRecorder();

Using the amplitude meter

The amplitude meter allows displaying a basic representation of the input sound. When enabled, it returns values ranging from 0 to 120 dB.

// By default this option is disabled; you can enable it by calling
await flutterSoundRecorder.setDbLevelEnabled(true);
// You can tweak the frequency of updates by calling this function (unit is seconds)
await flutterSoundRecorder.setDbPeakLevelUpdate(0.8);
// You need to subscribe in order to receive the value updates
_dbPeakSubscription = flutterSoundRecorder.onRecorderDbPeakChanged.listen((value) {
  setState(() {
    this._dbLevel = value;
  });
});
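
To render the level, the bundled example maps the dB value onto a LinearProgressIndicator. A stripped-down version of that mapping (inside a State class, with _dbLevel fed by the subscription above; the example normalizes against a 160 dB full scale):

// Render the 0-120 dB peak level as a progress bar.
Widget dbMeter() {
  return LinearProgressIndicator(
    value: 100.0 / 160.0 * (_dbLevel ?? 1) / 100,
    valueColor: AlwaysStoppedAnimation<Color>(Colors.green),
    backgroundColor: Colors.red,
  );
}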

FlutterSoundPlayer Usage #

Creating instance.

In your view/page/dialog widget's State class, create an instance of FlutterSoundPlayer. Before accessing the FlutterSoundPlayer API, you must initialize it with initialize(). When finished with this FlutterSoundPlayer instance, you must release it with release().

FlutterSoundPlayer flutterSoundPlayer = await FlutterSoundPlayer().initialize();

...
...

flutterSoundPlayer.release();

Start player

  • To start playback of a record from a URL, call startPlayer.
  • To start playback of a record from a memory buffer, call startPlayerFromBuffer.

You can use either startPlayer or startPlayerFromBuffer to play a sound. The former takes a URI that points to the file to play, while the latter takes a buffer containing the file to play and the codec needed to decode that buffer.

Both functions accept an optional whenFinished: () parameter specifying what to do when playback finishes.

// An example audio file
final fileUri = "https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3";

String result = await flutterSoundPlayer.startPlayer
	(
		fileUri,
		whenFinished: ()
		{
			 print( 'I hope you enjoyed listening to this song' );
		},
	);
// Load a local audio file and get it as a buffer
Uint8List buffer = (await rootBundle.load('samples/audio.mp3'))
    	.buffer
    	.asUint8List();

String result = await flutterSoundPlayer.startPlayerFromBuffer
	(
		buffer,
		whenFinished: ()
		{
			 print( 'I hope you enjoyed listening to this song' );
		},
	);

You must wait for the return value to complete before attempting to add any listeners to ensure that the player has fully initialised.

Directory tempDir = await getTemporaryDirectory();
File fin = await File ('${tempDir.path}/flutter_sound-tmp.aac');
Future<String> result = flutterSoundPlayer.startPlayer(fin.path);

result.then((path) {
	print('startPlayer: $path');

	_playerSubscription = flutterSoundPlayer.onPlayerStateChanged.listen((e) {
		if (e != null) {
			DateTime date = new DateTime.fromMillisecondsSinceEpoch(e.currentPosition.toInt());
			String txt = DateFormat('mm:ss:SS', 'en_US').format(date);
			this.setState(() {
				this._isPlaying = true;
				this._playerTxt = txt.substring(0, 8);
			});
		}
	});
});

Start player from buffer

For playing data from a memory buffer instead of a file, you can do the following:

Uint8List buffer =  (await rootBundle.load(assetSample[_codec.index])).buffer.asUint8List();
String result = await flutterSoundPlayer.startPlayerFromBuffer
	(
		buffer,
		codec: t_CODEC.CODEC_AAC,
		whenFinished: ()
		{
			 print( 'I hope you enjoyed listening to this song' );
		},
	);

Stop player

Future<String> result = flutterSoundPlayer.stopPlayer();

result.then((value) {
	print('stopPlayer: $value');
	if (_playerSubscription != null) {
		_playerSubscription.cancel();
		_playerSubscription = null;
	}
});

You MUST ensure that the player has been stopped when your widget is detached from the UI. Overload your widget's dispose() method to stop the player when your widget is disposed.

@override
void dispose() {
	flutterSoundPlayer.stopPlayer();
	super.dispose();
}

Pause player

Future<String> result = await flutterSoundPlayer.pausePlayer();

Resume player

Future<String> result = await flutterSoundPlayer.resumePlayer();

iosSetCategory(), androidAudioFocusRequest() and setActive() - (optional)

Those three functions are optional. If you do not control the audio focus with setActive(), flutter_sound will request the audio focus each time startPlayer() is called and will release it when the sound is finished or when you call stopPlayer().

Before controlling the focus with setActive() you must call iosSetCategory() on iOS or androidAudioFocusRequest() on Android. setActive() and androidAudioFocusRequest() are useful if you want to duck others. Those functions are probably called just once, when the app starts. After that, the caller is responsible for correctly using setActive(), probably before startRecorder or startPlayer, and after stopPlayer and stopRecorder.

You can refer to iOS documentation to understand the parameters needed for iosSetCategory() and to the Android documentation to understand the parameter needed for androidAudioFocusRequest().

Remark: those three functions do not work on Android before SDK 26.

if (_duckOthers)
{
	if (Platform.isIOS)
		await flutterSoundPlayer.iosSetCategory( t_IOS_SESSION_CATEGORY.PLAY_AND_RECORD, t_IOS_SESSION_MODE.DEFAULT, IOS_DUCK_OTHERS |  IOS_DEFAULT_TO_SPEAKER );
	else if (Platform.isAndroid)
		await flutterSoundPlayer.androidAudioFocusRequest( ANDROID_AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK );
} else
{
	if (Platform.isIOS)
		await flutterSoundPlayer.iosSetCategory( t_IOS_SESSION_CATEGORY.PLAY_AND_RECORD, t_IOS_SESSION_MODE.DEFAULT, IOS_DEFAULT_TO_SPEAKER );
	else if (Platform.isAndroid)
		await flutterSoundPlayer.androidAudioFocusRequest( ANDROID_AUDIOFOCUS_GAIN );
}
...
...
flutterSoundPlayer.setActive(true); // Get the audio focus
flutterSoundPlayer.startPlayer(aSound);
flutterSoundPlayer.startPlayer(anotherSong);
flutterSoundPlayer.setActive(false); // Release the audio focus

Seek player

To seek to a new location the player must already be playing.

String result = await flutterSoundPlayer.seekToPlayer(milliSecs);

Setting subscription duration (optional). The default value is 0.010 when not set.

/// 0.01 is default
flutterSoundPlayer.setSubscriptionDuration(0.01);

Setting volume.

/// 1.0 is default
/// Currently, volume can only be changed while the player is running. Try to manage this right after the player starts.
String path = await flutterSoundPlayer.startPlayer(fileUri);
await flutterSoundPlayer.setVolume(0.1);

Release the player

You MUST ensure that the player has been released when your widget is detached from the UI. Overload your widget's dispose() method to release the player when your widget is disposed. This resets the player and cleans up the device resources, but the player will no longer be usable.

@override
void dispose() {
	flutterSoundPlayer.release();
	super.dispose();
}

TrackPlayer #

TrackPlayer is a new flutter_sound module which is able to show playback controls on the lock screen. Using TrackPlayer is very simple: just use the TrackPlayer constructor instead of the regular FlutterSoundPlayer one.

trackPlayer = TrackPlayer();

You must call startPlayerFromTrack to play a sound. This function takes 1 required argument and 3 optional arguments:

  • a Track, which is the track that the player is going to play;
  • whenFinished: (): a callback specifying what to do when the song is finished;
  • onSkipBackward: (): a callback specifying what to do when the user presses the skip-backward button on the lock screen;
  • onSkipForward: (): a callback specifying what to do when the user presses the skip-forward button on the lock screen.

path = await trackPlayer.startPlayerFromTrack
(
	track,
	whenFinished: ( )
	{
		print( 'I hope you enjoyed listening to this song' );
	},

	onSkipBackward: ( )
	{
		print( 'Skip backward' );
		stopPlayer( );
		startPlayer( );
	},
	onSkipForward: ( )
	{
		print( 'Skip forward' );
		stopPlayer( );
		startPlayer( );
	},

);

Create a Track object

In order to play a sound when you have initialized the player with the audio player features, you must create a Track object to pass to startPlayerFromTrack.

The Track class is provided by the flutter_sound package. Its constructor takes one required argument (either trackPath or dataBuffer) and several optional arguments:

  • trackPath: a String representing the path that points to the audio file to play. This must be provided if dataBuffer is null, but you cannot provide both;
  • dataBuffer: a Uint8List, a buffer that contains an audio file. This must be provided if trackPath is null, but you cannot provide both;
  • trackTitle: the String to display in the notification as the title of the track;
  • trackAuthor: the String to display in the notification as the name of the author of the track;
  • albumArtUrl: a String representing the URL that points to the image to display in the notification as album art;
  • or albumArtAsset: the name of an asset to show in the notification.
// Create with the path to the audio file
Track track = new Track(
	trackPath: "https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3", // An example audio file
	trackTitle: "Track Title",
	trackAuthor: "Track Author",
	albumArtUrl: "https://file-examples.com/wp-content/uploads/2017/10/file_example_PNG_1MB.png", // An example image
);

// Load a local audio file and get it as a buffer
Uint8List buffer = (await rootBundle.load('samples/audio.mp3'))
    	.buffer
    	.asUint8List();
// Create with the buffer
Track track = new Track(
	dataBuffer: buffer,
	trackTitle: "Track Title",
	trackAuthor: "Track Author",
	albumArtUrl: "https://file-examples.com/wp-content/uploads/2017/10/file_example_PNG_1MB.png", // An example image
);

Information on a record #

There are two utility functions that you can use to get information about a file.

  • FlutterSoundHelper.FFmpegGetMediaInformation(<A_file_path>);
  • FlutterSoundHelper.duration(<A_file_path>)

The information obtained with FFmpegGetMediaInformation() is documented here. The integer returned by flutterSoundHelper.duration() is the number of milliseconds for the given record.

int duration = await flutterSoundHelper.duration( this._path[_codec.index] );
Map<dynamic, dynamic> info = await flutterSoundHelper.FFmpegGetMediaInformation( uri );
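
The 3.1.0 release also added helpers to inspect the last FFmpeg run. A sketch of how they could be used, assuming they return a Future<int> and a Future<String> respectively:

// After an FFmpeg-backed call, inspect the return code and console output.
int rc = await flutterSoundHelper.getLastFFmpegReturnCode();
String output = await flutterSoundHelper.getLastFFmpegCommandOutput();
print('FFmpeg returned $rc\n$output');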

TODO #

  • [x] Seeking example in Example project
  • [x] Volume Control
  • [x] Sync timing for recorder callback handler

DEBUG #

When you face the error below:

* What went wrong:
A problem occurred evaluating project ':app'.
> versionCode not found. Define flutter.versionCode in the local.properties file.

Please add the following to your example/android/local.properties file.

flutter.versionName=1.0.0
flutter.versionCode=1
flutter.buildMode=debug

Help Maintenance #

I've been maintaining quite a few repos these days and am slowly burning out. If you could help me cheer up, buying me a cup of coffee will make my life really happy and give me a lot of energy. Buy Me A Coffee Paypal

3.1.0 #

  • flutter_sound modules are re-entrant #250 and #232
    • We can open several FlutterSoundPlayer at the same time
    • We can open several FlutterSoundRecorder at the same time
  • Add new API verbs : #244
    • flutterSoundHelper.getLastFFmpegReturnCode()
    • flutterSoundHelper.getLastFFmpegCommandOutput()
    • flutterSoundHelper.FFmpegGetMediaInformation() which returns info on the given record
    • flutterSoundHelper.duration() which returns the number of milliseconds for the given record
  • Add new API verbs : #242
    • FlutterSoundRecorder.pauseRecorder()
    • FlutterSoundRecorder.resumeRecorder()
  • flutter_sound is now compatible with permission_handler 5.x.x #259
  • API to control the audiofocus #219
  • API to set the audio-category (i.e. duck-others) #219
  • AndroidX and Android embedded-V2 support #203
  • Add a parameter to startPlayer to specify a callback when the song is finished #215
  • License is now LGPL 3.0 instead of MIT

3.0.0+1 #

3.0.0 #

2.1.1 #

2.0.5 #

  • Hotfix #221
  • Use AAC-LC format instead of MPEG-4 #209

2.0.4 #

  • OGG/OPUS support on iOS #199

2.0.3 #

  • Resolve #194
    • stopRecorder resolves path.
  • Resolve #198
    • Improve static handler in android.

2.0.1 #

  • Add compatibility for android sdk 19.
  • Add androidx compatibility.
  • Resolve #193
    • Restore default startRecorder

1.9.0 #

  • Fix issue #175
    • add functions isEncoderSupported(t_CODEC codec) and isDecoderSupported(t_CODEC codec)
    • add property 'audioState'
    • check if codec is really supported before doing 'startRecorder'
    • modify the example app: disable buttons when the button is not compatible with the current state
    • in the example, add sound assets encoded with the various encoders
    • modify the example to play from assets
    • modify the example to allow selection of various codecs

1.7.0 #

  • startPlayerFromBuffer, to play from a buffer #170

1.6.0 #

  • Set Android default encoding option to AAC.
  • Fix poor default sound quality on Android.

1.5.2 #

  • Postfix GetDirectoryType to avoid conflicts #147

1.5.1 #

  • Set android recorder encoder default value to AndroidEncoder.DEFAULT.

1.5.0 #

  • Use NSCachesDirectory instead of NSTemporaryDirectory #141

1.4.8 #

1.4.7 #

  • Resolve a few issues with the iOS record path.
  • Resolve an issue with playing status so the player can resume.
  • Resolve #134
  • Resolve #135

1.4.4 #

  • Stopped recording generating infinite db values #131

1.4.3 #

  • Improved db calcs #123

1.4.2 #

  • Fixed 'mediaplayer went away with unhandled events' bug #104

1.4.1 #

  • Fixed 'mediaplayer went away with unhandled events' bug #83

1.4.0 #

  • AndroidX compatibility improved #68
  • iOS: Fixes for seekToPlayer #72
  • iOS: Setup configuration for using bluetooth microphone recording input #73

1.3.6 #

  • Android: Adds a single threaded command scheduler for all recording related commands.
  • Switch source & target compatibility to Java 8
  • Bump gradle plugin version dependencies

1.3.+ #

  • Support db/meter #41
  • Show wrong recorder timer text #47
  • Add ability to specify Android & iOS encoder #49
  • Adjust db range and fix nullable check in ios #59
  • Android: Recording operations on a separate command queue #66
  • Android: Remove reference to non-AndroidX classes which improves compatibility

1.2.+ #

  • Fixed sound distorting when playing recorded audio again. Issue #14.
  • Fixed seekToPlayer for android. Issue #10.
  • Expose recorder sampleRate and numChannel.
  • Do not append tmp when filePath provided in ios.
  • Resolve in 1.2.3 a regression issue which was introduced in 1.2.2.
  • Reduce the size of audio file in 1.2.4. Related #26.
  • Fixed recording issue in android in 1.2.5.
  • Changed seekToPlayer to seek to the exact position in seconds instead of adding to the current position.
  • Fix file URI for recording and playing in iOS.

1.1.+ #

  • Released 1.1.0 with beautiful logo from mansa.
  • Improved readme.
  • Resolve #7.
  • Fixed missing break in switch statement.

1.0.9 #

  • Reimport intl which is needed to format date in Dart.

1.0.8 #

  • Implemented setVolume method.
  • Specific error messages given in android.
  • Manage ios player thread when audio is not loaded.

1.0.7 #

  • Safer handling of progressUpdate in ios when audio is invalid.

1.0.6 #

  • Fixed bug in platform specific code.

1.0.5 #

  • Fixed bug in seekToPlayer in ios.

1.0.3 #

  • Added license.

1.0.0 #

  • Released preview version for audio recorder and player.

example/lib/main.dart

/*
 * This file is part of Flutter-Sound (Flauto).
 *
 *   Flutter-Sound (Flauto) is free software: you can redistribute it and/or modify
 *   it under the terms of the Lesser GNU General Public License
 *   version 3 (LGPL3) as published by the Free Software Foundation.
 *
 *   Flutter-Sound (Flauto) is distributed in the hope that it will be useful,
 *   but WITHOUT ANY WARRANTY; without even the implied warranty of
 *   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 *   GNU General Public License for more details.
 *
 *   You should have received a copy of the Lesser GNU General Public License
 *   along with Flutter-Sound (Flauto).  If not, see <https://www.gnu.org/licenses/>.
 */

import 'dart:async';
import 'dart:io';
import 'dart:math';
import 'dart:typed_data' show Uint8List;

import 'package:flutter/material.dart';
import 'package:path_provider/path_provider.dart' show getTemporaryDirectory;
import 'package:flutter/services.dart' show rootBundle;
import 'package:intl/date_symbol_data_local.dart';
import 'package:intl/intl.dart' show DateFormat;
import 'package:flutter_sound/flauto.dart';
import 'package:flutter_sound/flutter_sound_player.dart';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:flutter_sound/track_player.dart';
import 'package:flutter_sound/flutter_sound_recorder.dart';

enum t_MEDIA {
  FILE,
  BUFFER,
  ASSET,
  STREAM,
  REMOTE_EXAMPLE_FILE,
}

/// Boolean to specify if we want to test the re-entrance/concurrency feature.
/// If true, we start two instances of FlautoPlayer when the user hits the "Play" button.
/// If true, we start two instances of FlautoRecorder and one instance of FlautoPlayer when the user hits the Record button.
const bool REENTRANCE_CONCURENCY = false;
final exampleAudioFilePath = "https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3";
final albumArtPath = "https://file-examples.com/wp-content/uploads/2017/10/file_example_PNG_500kB.png";

void main() {
  runApp(new MyApp());
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => new _MyAppState();
}

class _MyAppState extends State<MyApp> {
  bool _isRecording = false;
  List<String> _path = [null, null, null, null, null, null, null];
  StreamSubscription _recorderSubscription;
  StreamSubscription _dbPeakSubscription;
  StreamSubscription _playerSubscription;
  StreamSubscription _playbackStateSubscription;

  FlutterSoundPlayer playerModule;
  FlutterSoundRecorder recorderModule;
  FlutterSoundPlayer playerModule_2; // Used if REENTRANCE_CONCURENCY
  FlutterSoundRecorder recorderModule_2; // Used if REENTRANCE_CONCURENCY

  String _recorderTxt = '00:00:00';
  String _playerTxt = '00:00:00';
  double _dbLevel;

  double sliderCurrentPosition = 0.0;
  double maxDuration = 1.0;
  t_MEDIA _media = t_MEDIA.FILE;
  t_CODEC _codec = t_CODEC.CODEC_AAC;

  bool _encoderSupported = true; // Optimist
  bool _decoderSupported = true; // Optimist

  // Whether the user wants to use the audio player features
  bool _isAudioPlayer = false;
  bool _duckOthers = false;

  double _duration = null;

  Future<void> _initializeExample(FlutterSoundPlayer module) async {
    playerModule = module;
    await module.initialize();
    await playerModule.setSubscriptionDuration(0.01);
    await recorderModule.setSubscriptionDuration(0.01);
    initializeDateFormatting();
    setCodec(_codec);
    setDuck();
  }

  Future<void> init() async {
    playerModule = await FlutterSoundPlayer().initialize();
    recorderModule = await FlutterSoundRecorder().initialize();
    await _initializeExample(playerModule);

    await recorderModule.setDbPeakLevelUpdate(0.8);
    await recorderModule.setDbLevelEnabled(true);
    if (REENTRANCE_CONCURENCY) {
      playerModule_2 = await FlutterSoundPlayer().initialize();
      await playerModule_2.setSubscriptionDuration(0.01);

      recorderModule_2 = await FlutterSoundRecorder().initialize();
      await recorderModule_2.setSubscriptionDuration(0.01);
      await recorderModule_2.setDbPeakLevelUpdate(0.8);
      await recorderModule_2.setDbLevelEnabled(true);
    }
  }

  @override
  void initState() {
    super.initState();
    init();
  }

  t_AUDIO_STATE get audioState {
    if (playerModule != null) {
      if (playerModule.isPlaying) return t_AUDIO_STATE.IS_PLAYING;
      if (playerModule.isPaused) return t_AUDIO_STATE.IS_PAUSED;
    }
    if (recorderModule != null) {
      if (recorderModule.isPaused) return t_AUDIO_STATE.IS_RECORDING_PAUSED;
      if (recorderModule.isRecording) return t_AUDIO_STATE.IS_RECORDING;
    }
    return t_AUDIO_STATE.IS_STOPPED;
  }

  void cancelRecorderSubscriptions() {
    if (_recorderSubscription != null) {
      _recorderSubscription.cancel();
      _recorderSubscription = null;
    }
    if (_dbPeakSubscription != null) {
      _dbPeakSubscription.cancel();
      _dbPeakSubscription = null;
    }
  }

  void cancelPlayerSubscriptions() {
    if (_playerSubscription != null) {
      _playerSubscription.cancel();
      _playerSubscription = null;
    }

    if (_playbackStateSubscription != null) {
      _playbackStateSubscription.cancel();
      _playbackStateSubscription = null;
    }
  }

  @override
  void dispose() {
    super.dispose();
    cancelPlayerSubscriptions();
    cancelRecorderSubscriptions();
    releaseFlauto();
  }

  Future<void> setDuck() async {
    if (_duckOthers) {
      if (Platform.isIOS)
        await playerModule.iosSetCategory(t_IOS_SESSION_CATEGORY.PLAY_AND_RECORD, t_IOS_SESSION_MODE.DEFAULT, IOS_DUCK_OTHERS | IOS_DEFAULT_TO_SPEAKER);
      else if (Platform.isAndroid) await playerModule.androidAudioFocusRequest(ANDROID_AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK);
    } else {
      if (Platform.isIOS)
        await playerModule.iosSetCategory(t_IOS_SESSION_CATEGORY.PLAY_AND_RECORD, t_IOS_SESSION_MODE.DEFAULT, IOS_DEFAULT_TO_SPEAKER);
      else if (Platform.isAndroid) await playerModule.androidAudioFocusRequest(ANDROID_AUDIOFOCUS_GAIN);
    }
  }

  Future<void> releaseFlauto() async {
    try {
      await playerModule.release();
      await recorderModule.release();
      await playerModule_2.release();
      await recorderModule_2.release();
    } catch (e) {
      print('Released unsuccessful');
      print(e);
    }
  }

  static const List<String> paths = [
    'flutter_sound_example.aac', // DEFAULT
    'flutter_sound_example.aac', // CODEC_AAC
    'flutter_sound_example.opus', // CODEC_OPUS
    'flutter_sound_example.caf', // CODEC_CAF_OPUS
    'flutter_sound_example.mp3', // CODEC_MP3
    'flutter_sound_example.ogg', // CODEC_VORBIS
    'flutter_sound_example.wav', // CODEC_PCM
  ];

  void startRecorder() async {
    try {
      // String path = await flutterSoundModule.startRecorder
      // (
      //   paths[_codec.index],
      //   codec: _codec,
      //   sampleRate: 16000,
      //   bitRate: 16000,
      //   numChannels: 1,
      //   androidAudioSource: AndroidAudioSource.MIC,
      // );
      Directory tempDir = await getTemporaryDirectory();

      String path = await recorderModule.startRecorder(
        uri: '${tempDir.path}/${recorderModule.slotNo}-${paths[_codec.index]}',
        codec: _codec,
      );
      print('startRecorder: $path');

      _recorderSubscription = recorderModule.onRecorderStateChanged.listen((e) {
        if (e != null && e.currentPosition != null) {
          DateTime date = new DateTime.fromMillisecondsSinceEpoch(e.currentPosition.toInt(), isUtc: true);
          String txt = DateFormat('mm:ss:SS', 'en_GB').format(date);

          this.setState(() {
            this._recorderTxt = txt.substring(0, 8);
          });
        }
      });
      _dbPeakSubscription = recorderModule.onRecorderDbPeakChanged.listen((value) {
        print("got update -> $value");
        setState(() {
          this._dbLevel = value;
        });
      });
      if (REENTRANCE_CONCURENCY) {
        try
        {
          Uint8List dataBuffer = (await rootBundle.load( assetSample[_codec.index] )).buffer.asUint8List( );
          await playerModule_2.startPlayerFromBuffer( dataBuffer, codec: _codec, whenFinished: ( )
          {
            //await playerModule_2.startPlayer(exampleAudioFilePath, codec: t_CODEC.CODEC_MP3, whenFinished: () {
            print( 'Secondary Play finished' );
          } );
        } catch(e) {
          print('startRecorder error: $e');
        }
        await recorderModule_2.startRecorder(
          uri: '${tempDir.path}/flutter_sound_recorder2.aac',
          codec: t_CODEC.CODEC_AAC,
        );
        print("Secondary record is '${tempDir.path}/flutter_sound_recorder2.aac'");
      }

      this.setState(() {
        this._isRecording = true;
        this._path[_codec.index] = path;
      });
    } catch (err) {
      print('startRecorder error: $err');
      setState(() {
        stopRecorder();
        this._isRecording = false;
        if (_recorderSubscription != null) {
          _recorderSubscription.cancel();
          _recorderSubscription = null;
        }
        if (_dbPeakSubscription != null) {
          _dbPeakSubscription.cancel();
          _dbPeakSubscription = null;
        }
      });
    }
  }

  Future<void> getDuration() async {
    switch (_media) {
      case t_MEDIA.FILE:
      case t_MEDIA.BUFFER:
        int d = await flutterSoundHelper.duration(this._path[_codec.index]);
        _duration = d != null ? d / 1000.0 : null;
        break;
      case t_MEDIA.ASSET:
        _duration = null;
        break;
      case t_MEDIA.REMOTE_EXAMPLE_FILE:
        _duration = null;
        break;
    }
    setState(() {});
  }

  void stopRecorder() async {
    try {
      String result = await recorderModule.stopRecorder();
      print('stopRecorder: $result');
      cancelRecorderSubscriptions();
      if (REENTRANCE_CONCURENCY) {
        await recorderModule_2.stopRecorder();
        await playerModule_2.stopPlayer();
      }
      getDuration();
    } catch (err) {
      print('stopRecorder error: $err');
    }
    this.setState(() {
      this._isRecording = false;
    });
  }

  Future<bool> fileExists(String path) async {
    return await File(path).exists();
  }

  // In this simple example, we just load a file in memory. This is stupid, but it is just for demonstration of startPlayerFromBuffer().
  Future<Uint8List> makeBuffer(String path) async {
    try {
      if (!await fileExists(path)) return null;
      File file = File(path);
      file.openRead();
      var contents = await file.readAsBytes();
      print('The file is ${contents.length} bytes long.');
      return contents;
    } catch (e) {
      print(e);
      return null;
    }
  }

  List<String> assetSample = [
    'assets/samples/sample.aac',
    'assets/samples/sample.aac',
    'assets/samples/sample.opus',
    'assets/samples/sample.caf',
    'assets/samples/sample.mp3',
    'assets/samples/sample.ogg',
    'assets/samples/sample.wav',
  ];

  void _addListeners() {
    cancelPlayerSubscriptions();
    _playerSubscription = playerModule.onPlayerStateChanged.listen((e) {
      if (e != null) {
        maxDuration = e.duration;
        if (maxDuration <= 0) maxDuration = 0.0;

        sliderCurrentPosition = min(e.currentPosition, maxDuration);
        if (sliderCurrentPosition < 0.0) {
          sliderCurrentPosition = 0.0;
        }

        DateTime date = new DateTime.fromMillisecondsSinceEpoch(e.currentPosition.toInt(), isUtc: true);
        String txt = DateFormat('mm:ss:SS', 'en_GB').format(date);
        this.setState(() {
          //this._isPlaying = true;
          this._playerTxt = txt.substring(0, 8);
        });
      }
    });
  }

  Future<void> startPlayer() async {
    try {
      //final albumArtPath =
      //"https://file-examples.com/wp-content/uploads/2017/10/file_example_PNG_500kB.png";

      String path;
      Uint8List dataBuffer;
      String audioFilePath;
      if (_media == t_MEDIA.ASSET) {
        dataBuffer = (await rootBundle.load(assetSample[_codec.index])).buffer.asUint8List();
      } else if (_media == t_MEDIA.FILE) {
        // Do we want to play from buffer or from file ?
        if (await fileExists(_path[_codec.index])) audioFilePath = this._path[_codec.index];
      } else if (_media == t_MEDIA.BUFFER) {
        // Do we want to play from buffer or from file ?
        if (await fileExists(_path[_codec.index])) {
          dataBuffer = await makeBuffer(this._path[_codec.index]);
          if (dataBuffer == null) {
            throw Exception('Unable to create the buffer');
          }
        }
      } else if (_media == t_MEDIA.REMOTE_EXAMPLE_FILE) {
        // We have to play an example audio file loaded via a URL
        audioFilePath = exampleAudioFilePath;
      }

      // Check whether the user wants to use the audio player features
      if (_isAudioPlayer) {
        String albumArtUrl;
        String albumArtAsset;
        if (_media == t_MEDIA.REMOTE_EXAMPLE_FILE)
          albumArtUrl = albumArtPath;
        else {
          if (Platform.isIOS) {
            albumArtAsset = 'AppIcon';
          } else if (Platform.isAndroid) {
            albumArtAsset = 'AppIcon.png';
          }
        }

        final track = Track(
          trackPath: audioFilePath,
          dataBuffer: dataBuffer,
          codec: _codec,
          trackTitle: "This is a record",
          trackAuthor: "from flutter_sound",
          albumArtUrl: albumArtUrl,
          albumArtAsset: albumArtAsset,
        );

        TrackPlayer f = playerModule;
        path = await f.startPlayerFromTrack(
          track,
          /*canSkipForward:true, canSkipBackward:true,*/
          whenFinished: () {
            print('I hope you enjoyed listening to this song');
            setState(() {});
          },
          onSkipBackward: () {
            print('Skip backward');
            stopPlayer();
            startPlayer();
          },
          onSkipForward: () {
            print('Skip forward');
            stopPlayer();
            startPlayer();
          },
        );
      } else {
        if (audioFilePath != null) {
          path = await playerModule.startPlayer(audioFilePath, codec: _codec, whenFinished: () {
            print('Play finished');
            setState(() {});
          });
        } else if (dataBuffer != null) {
          path = await playerModule.startPlayerFromBuffer(dataBuffer, codec: _codec, whenFinished: () {
            print('Play finished');
            setState(() {});
          });
        }

        if (path == null) {
          print('Error starting player');
          return;
        }
      }
      _addListeners();
      if (REENTRANCE_CONCURENCY && _media != t_MEDIA.REMOTE_EXAMPLE_FILE) {
          Uint8List dataBuffer = (await rootBundle.load(assetSample[_codec.index])).buffer.asUint8List();
          await playerModule_2.startPlayerFromBuffer(dataBuffer, codec: _codec, whenFinished: () {

          //playerModule_2.startPlayer(exampleAudioFilePath, codec: t_CODEC.CODEC_MP3, whenFinished: () {
          print('Secondary Play finished');
        });
      }

      print('startPlayer: $path');
      // await flutterSoundModule.setVolume(1.0);
    } catch (err) {
      print('error: $err');
    }
    setState(() {});
  }

  Future<void> stopPlayer() async {
    try {
      String result = await playerModule.stopPlayer();
      print('stopPlayer: $result');
      if (_playerSubscription != null) {
        _playerSubscription.cancel();
        _playerSubscription = null;
      }
      sliderCurrentPosition = 0.0;
    } catch (err) {
      print('error: $err');
    }
    if (REENTRANCE_CONCURENCY) {
      try {
        String result = await playerModule_2.stopPlayer();
        print('stopPlayer_2: $result');
      } catch (err) {
        print('error: $err');
      }
    }

    this.setState(() {
      //this._isPlaying = false;
    });
  }

  pauseResumePlayer() {
    if (playerModule.isPlaying) {
      playerModule.pausePlayer();
      if (REENTRANCE_CONCURENCY) {
        playerModule_2.pausePlayer();
      }
    } else {
      playerModule.resumePlayer();
      if (REENTRANCE_CONCURENCY) {
        playerModule_2.resumePlayer();
      }
    }
  }

  pauseResumeRecorder() {
    if (recorderModule.isPaused) {
      {
        recorderModule.resumeRecorder();
        if (REENTRANCE_CONCURENCY) {
          recorderModule_2.resumeRecorder();
        }
      }
    } else {
      recorderModule.pauseRecorder();
      if (REENTRANCE_CONCURENCY) {
        recorderModule_2.pauseRecorder();
      }
    }
  }

  void seekToPlayer(int milliSecs) async {
    String result = await playerModule.seekToPlayer(milliSecs);
    print('seekToPlayer: $result');
  }

  Widget makeDropdowns(BuildContext context) {
    final mediaDropdown = Row(
      mainAxisAlignment: MainAxisAlignment.start,
      crossAxisAlignment: CrossAxisAlignment.center,
      children: <Widget>[
        Padding(
          padding: const EdgeInsets.only(right: 16.0),
          child: Text('Media:'),
        ),
        DropdownButton<t_MEDIA>(
          value: _media,
          onChanged: (newMedia) {
            if (newMedia == t_MEDIA.REMOTE_EXAMPLE_FILE) _codec = t_CODEC.CODEC_MP3; // The remote example file is the only MP3 we use in this example
            _media = newMedia;
            getDuration();
            setState(() {});
          },
          items: <DropdownMenuItem<t_MEDIA>>[
            DropdownMenuItem<t_MEDIA>(
              value: t_MEDIA.FILE,
              child: Text('File'),
            ),
            DropdownMenuItem<t_MEDIA>(
              value: t_MEDIA.BUFFER,
              child: Text('Buffer'),
            ),
            DropdownMenuItem<t_MEDIA>(
              value: t_MEDIA.ASSET,
              child: Text('Asset'),
            ),
            DropdownMenuItem<t_MEDIA>(
              value: t_MEDIA.REMOTE_EXAMPLE_FILE,
              child: Text('Remote Example File'),
            ),
          ],
        ),
      ],
    );

    final codecDropdown = Row(
      mainAxisAlignment: MainAxisAlignment.start,
      crossAxisAlignment: CrossAxisAlignment.center,
      children: <Widget>[
        Padding(
          padding: const EdgeInsets.only(right: 16.0),
          child: Text('Codec:'),
        ),
        DropdownButton<t_CODEC>(
          value: _codec,
          onChanged: (newCodec) {
            setCodec(newCodec);
            _codec = newCodec;
            getDuration();
            setState(() {});
          },
          items: <DropdownMenuItem<t_CODEC>>[
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_AAC,
              child: Text('AAC'),
            ),
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_OPUS,
              child: Text('OGG/Opus'),
            ),
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_CAF_OPUS,
              child: Text('CAF/Opus'),
            ),
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_MP3,
              child: Text('MP3'),
            ),
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_VORBIS,
              child: Text('OGG/Vorbis'),
            ),
            DropdownMenuItem<t_CODEC>(
              value: t_CODEC.CODEC_PCM,
              child: Text('PCM'),
            ),
          ],
        ),
      ],
    );

    return Padding(
      padding: const EdgeInsets.all(8.0),
      child: Column(
        mainAxisAlignment: MainAxisAlignment.start,
        crossAxisAlignment: CrossAxisAlignment.center,
        children: <Widget>[
          Padding(
            padding: const EdgeInsets.only(bottom: 16.0),
            child: mediaDropdown,
          ),
          codecDropdown,
        ],
      ),
    );
  }

  onPauseResumePlayerPressed() {
    switch (audioState) {
      case t_AUDIO_STATE.IS_PAUSED:
        return pauseResumePlayer;
        break;
      case t_AUDIO_STATE.IS_PLAYING:
        return pauseResumePlayer;
        break;
      case t_AUDIO_STATE.IS_STOPPED:
        return null;
        break;
      case t_AUDIO_STATE.IS_RECORDING:
        return null;
        break;
      case t_AUDIO_STATE.IS_RECORDING_PAUSED:
        return null;
        break;
    }
  }

  onPauseResumeRecorderPressed() {
    switch (audioState) {
      case t_AUDIO_STATE.IS_PAUSED:
        return null;
        break;
      case t_AUDIO_STATE.IS_PLAYING:
        return null;
        break;
      case t_AUDIO_STATE.IS_STOPPED:
        return null;
        break;
      case t_AUDIO_STATE.IS_RECORDING:
        return pauseResumeRecorder;
        break;
      case t_AUDIO_STATE.IS_RECORDING_PAUSED:
        return pauseResumeRecorder;
        break;
    }
  }

  onStopPlayerPressed() {
    return audioState == t_AUDIO_STATE.IS_PLAYING || audioState == t_AUDIO_STATE.IS_PAUSED ? stopPlayer : null;
  }

  onStartPlayerPressed() {
    if (_media == t_MEDIA.FILE || _media == t_MEDIA.BUFFER) // A file must be already recorded to play it
    {
      if (_path[_codec.index] == null) return null;
    }
    if (_media == t_MEDIA.REMOTE_EXAMPLE_FILE && _codec != t_CODEC.CODEC_MP3) // in this example we use just a remote mp3 file
      return null;

    // Disable the button if the selected codec is not supported
    if (!_decoderSupported) return null;
    return (isStopped()) ? startPlayer : null;
  }

  void startStopRecorder() {
    if (recorderModule.isRecording)
      stopRecorder();
    else
      startRecorder();
  }

  onStartRecorderPressed() {
    if (_media == t_MEDIA.ASSET || _media == t_MEDIA.BUFFER || _media == t_MEDIA.REMOTE_EXAMPLE_FILE) return null;
    // Disable the button if the selected codec is not supported
    if (!_encoderSupported) return null;
    if (audioState != t_AUDIO_STATE.IS_RECORDING && audioState != t_AUDIO_STATE.IS_RECORDING_PAUSED && audioState != t_AUDIO_STATE.IS_STOPPED) return null;
    return startStopRecorder;
  }

  bool isStopped() => (audioState == t_AUDIO_STATE.IS_STOPPED);

  AssetImage recorderAssetImage() {
    if (onStartRecorderPressed() == null) return AssetImage('res/icons/ic_mic_disabled.png');
    return audioState == t_AUDIO_STATE.IS_STOPPED ? AssetImage('res/icons/ic_mic.png') : AssetImage('res/icons/ic_stop.png');
  }

  setCodec(t_CODEC codec) async {
    _encoderSupported = await recorderModule.isEncoderSupported(codec);
    _decoderSupported = await playerModule.isDecoderSupported(codec);

    setState(() {
      _codec = codec;
    });
  }

  audioPlayerSwitchChanged() {
    if (!isStopped()) return null;
    return ((newVal) async {
      try {
        if (playerModule != null) await playerModule.release();

        _isAudioPlayer = newVal;
        if (!newVal) {
          _initializeExample(FlutterSoundPlayer());
        } else {
          _initializeExample(TrackPlayer());
        }
        setState(() {});
      } catch (err) {
        print(err);
      }
    });
  }

  duckOthersSwitchChanged() {
    return ((newVal) async {
      _duckOthers = newVal;

      try {
        setDuck();
        setState(() {});
      } catch (err) {
        print(err);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    final recorderProgressIndicator = _isRecording
        ? LinearProgressIndicator(
            value: 100.0 / 160.0 * (this._dbLevel ?? 1) / 100,
            valueColor: AlwaysStoppedAnimation<Color>(Colors.green),
            backgroundColor: Colors.red,
          )
        : Container();
    final playerControls = Row(
      children: <Widget>[
        Container(
          width: 56.0,
          height: 56.0,
          child: ClipOval(
            child: FlatButton(
              onPressed: onStartPlayerPressed(),
              padding: EdgeInsets.all(8.0),
              child: Image(
                image: AssetImage(onStartPlayerPressed() != null ? 'res/icons/ic_play.png' : 'res/icons/ic_play_disabled.png'),
              ),
            ),
          ),
        ),
        Container(
          width: 56.0,
          height: 56.0,
          child: ClipOval(
            child: FlatButton(
              onPressed: onPauseResumePlayerPressed(),
              padding: EdgeInsets.all(8.0),
              child: Image(
                width: 36.0,
                height: 36.0,
                image: AssetImage(onPauseResumePlayerPressed() != null ? 'res/icons/ic_pause.png' : 'res/icons/ic_pause_disabled.png'),
              ),
            ),
          ),
        ),
        Container(
          width: 56.0,
          height: 56.0,
          child: ClipOval(
            child: FlatButton(
              onPressed: onStopPlayerPressed(),
              padding: EdgeInsets.all(8.0),
              child: Image(
                width: 28.0,
                height: 28.0,
                image: AssetImage(onStopPlayerPressed() != null ? 'res/icons/ic_stop.png' : 'res/icons/ic_stop_disabled.png'),
              ),
            ),
          ),
        ),
      ],
      mainAxisAlignment: MainAxisAlignment.center,
      crossAxisAlignment: CrossAxisAlignment.center,
    );
    final playerSlider = Container(
        height: 56.0,
        child: Slider(
            value: min(sliderCurrentPosition, maxDuration),
            min: 0.0,
            max: maxDuration,
            onChanged: (double value) async {
              await playerModule.seekToPlayer(value.toInt());
            },
            divisions: maxDuration == 0.0 ? 1 : maxDuration.toInt()));

    final dropdowns = makeDropdowns(context);
    final trackSwitch = Padding(
      padding: const EdgeInsets.all(8.0),
      child: Row(
        children: <Widget>[
          Padding(
            padding: const EdgeInsets.only(right: 4),
            child: Text('"Flauto":'),
          ),
          Switch(
            value: _isAudioPlayer,
            onChanged: audioPlayerSwitchChanged(),
          ),
          Padding(
            padding: const EdgeInsets.only(right: 4.0),
            child: Text('Duck Others:'),
          ),
          Switch(
            value: _duckOthers,
            onChanged: duckOthersSwitchChanged(),
          ),
        ],
      ),
    );

    Widget recorderSection = Column(crossAxisAlignment: CrossAxisAlignment.center, mainAxisAlignment: MainAxisAlignment.center, children: <Widget>[
      Container(
        margin: EdgeInsets.only(top: 12.0, bottom: 16.0),
        child: Text(
          this._recorderTxt,
          style: TextStyle(
            fontSize: 35.0,
            color: Colors.black,
          ),
        ),
      ),
      _isRecording ? LinearProgressIndicator(value: 100.0 / 160.0 * (this._dbLevel ?? 1) / 100, valueColor: AlwaysStoppedAnimation<Color>(Colors.green), backgroundColor: Colors.red) : Container(),
      Row(
        children: <Widget>[
          Container(
            width: 56.0,
            height: 50.0,
            child: ClipOval(
              child: FlatButton(
                onPressed: onStartRecorderPressed(),
                padding: EdgeInsets.all(8.0),
                child: Image(
                  image: recorderAssetImage(),
                ),
              ),
            ),
          ),
          Container(
            width: 56.0,
            height: 50.0,
            child: ClipOval(
              child: FlatButton(
                onPressed: onPauseResumeRecorderPressed(),
                disabledColor: Colors.white,
                padding: EdgeInsets.all(8.0),
                child: Image(
                  width: 36.0,
                  height: 36.0,
                  image: AssetImage(onPauseResumeRecorderPressed() != null ? 'res/icons/ic_pause.png' : 'res/icons/ic_pause_disabled.png'),
                ),
              ),
            ),
          ),
        ],
        mainAxisAlignment: MainAxisAlignment.center,
        crossAxisAlignment: CrossAxisAlignment.center,
      ),
    ]);

    Widget playerSection = Column(
      crossAxisAlignment: CrossAxisAlignment.center,
      mainAxisAlignment: MainAxisAlignment.center,
      children: <Widget>[
        Container(
          margin: EdgeInsets.only(top: 12.0, bottom: 16.0),
          child: Text(
            this._playerTxt,
            style: TextStyle(
              fontSize: 35.0,
              color: Colors.black,
            ),
          ),
        ),
        Row(
          children: <Widget>[
            Container(
              width: 56.0,
              height: 50.0,
              child: ClipOval(
                child: FlatButton(
                  onPressed: onStartPlayerPressed(),
                  disabledColor: Colors.white,
                  padding: EdgeInsets.all(8.0),
                  child: Image(
                    image: AssetImage(onStartPlayerPressed() != null ? 'res/icons/ic_play.png' : 'res/icons/ic_play_disabled.png'),
                  ),
                ),
              ),
            ),
            Container(
              width: 56.0,
              height: 50.0,
              child: ClipOval(
                child: FlatButton(
                  onPressed: onPauseResumePlayerPressed(),
                  disabledColor: Colors.white,
                  padding: EdgeInsets.all(8.0),
                  child: Image(
                    width: 36.0,
                    height: 36.0,
                    image: AssetImage(onPauseResumePlayerPressed() != null ? 'res/icons/ic_pause.png' : 'res/icons/ic_pause_disabled.png'),
                  ),
                ),
              ),
            ),
            Container(
              width: 56.0,
              height: 50.0,
              child: ClipOval(
                child: FlatButton(
                  onPressed: onStopPlayerPressed(),
                  disabledColor: Colors.white,
                  padding: EdgeInsets.all(8.0),
                  child: Image(
                    width: 28.0,
                    height: 28.0,
                    image: AssetImage(onStopPlayerPressed() != null ? 'res/icons/ic_stop.png' : 'res/icons/ic_stop_disabled.png'),
                  ),
                ),
              ),
            ),
          ],
          mainAxisAlignment: MainAxisAlignment.center,
          crossAxisAlignment: CrossAxisAlignment.center,
        ),
        Container(
            height: 30.0,
            child: Slider(
                value: min(sliderCurrentPosition, maxDuration),
                min: 0.0,
                max: maxDuration,
                onChanged: (double value) async {
                  await playerModule.seekToPlayer(value.toInt());
                },
                divisions: maxDuration == 0.0 ? 1 : maxDuration.toInt())),
        Container(
          height: 30.0,
          child: Text(_duration != null ? "Duration: $_duration sec." : ''),
        ),
      ],
    );

    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: const Text('Flutter Sound'),
        ),
        body: ListView(
          children: <Widget>[
            recorderSection,
            playerSection,
            dropdowns,
            trackSwitch,
          ],
        ),
      ),
    );
  }
}

Use this package as a library

1. Depend on it

Add this to your package's pubspec.yaml file:


dependencies:
  flutter_sound: ^3.1.1

2. Install it

You can install packages from the command line:

with pub:


$ pub get

with Flutter:


$ flutter pub get

Alternatively, your editor might support pub get or flutter pub get. Check the docs for your editor to learn more.

3. Import it

Now in your Dart code, you can use:


import 'package:flutter_sound/flutter_sound.dart';
  
Scores:

  • Popularity (how popular the package is relative to other packages): 96
  • Health (code health derived from static analysis): 26
  • Maintenance (how tidy and up-to-date the package is): 80
  • Overall (weighted score of the above): 71

We analyzed this package on Apr 1, 2020, and provided a score, details, and suggestions below. The analysis was completed using:

  • Dart: 2.7.1
  • pana: 0.13.6
  • Flutter: 1.12.13+hotfix.8

Health issues and suggestions

Fix lib/flutter_sound_recorder.dart. (-68.83 points)

Analysis of lib/flutter_sound_recorder.dart failed with 4 errors, 3 hints, including:

line 262 col 9: The name 'PermissionGroup' isn't a type so it can't be used as a type argument.

line 262 col 63: The method 'PermissionHandler' isn't defined for the class 'FlutterSoundRecorder'.

line 262 col 103: Undefined name 'PermissionGroup'.

line 263 col 20: Undefined name 'PermissionGroup'.

line 22 col 8: Unused import: 'dart:typed_data'.

Fix lib/flauto.dart. (-6.31 points)

Analysis of lib/flauto.dart reported 13 hints, including:

line 18 col 8: Unused import: 'dart:convert'.

line 20 col 8: Unused import: 'dart:io'.

line 21 col 8: Unused import: 'dart:io'.

line 22 col 8: Unused import: 'dart:typed_data'.

line 28 col 8: Unused import: 'package:flutter/services.dart'.

Fix lib/flutter_sound_player.dart. (-4.41 points)

Analysis of lib/flutter_sound_player.dart reported 9 hints, including:

line 25 col 8: Unused import: 'package:flutter_sound/android_encoder.dart'.

line 26 col 8: Unused import: 'package:flutter_sound/ios_quality.dart'.

line 30 col 8: Unused import: 'package:permission_handler/permission_handler.dart'.

line 96 col 14: Name types using UpperCamelCase.

line 97 col 14: Name types using UpperCamelCase.

Fix lib/flutter_sound.dart. (-3.93 points)

Analysis of lib/flutter_sound.dart reported 8 hints, including:

line 18 col 8: Unused import: 'dart:convert'.

line 20 col 8: Unused import: 'dart:io'.

line 21 col 8: Unused import: 'dart:io'.

line 24 col 8: Unused import: 'package:flutter/services.dart'.

line 30 col 8: Unused import: 'package:path_provider/path_provider.dart'.

Fix lib/track_player.dart. (-3.93 points)

Analysis of lib/track_player.dart reported 8 hints, including:

line 73 col 19: This function has a return type of 'Future', but doesn't end with a return statement.

line 220 col 5: Don't explicitly initialize variables to null.

line 221 col 5: Don't explicitly initialize variables to null.

line 329 col 5: Don't explicitly initialize variables to null.

line 330 col 5: Don't explicitly initialize variables to null.

Maintenance issues and suggestions

No valid SDK. (-20 points)

The analysis could not detect a valid SDK that can use this package.

Dependencies

| Package | Constraint | Resolved | Available |
| :--- | :--- | :--- | :--- |
| **Direct dependencies** | | | |
| Dart SDK | >=1.19.0 <3.0.0 | | |
| flutter | | 0.0.0 | |
| flutter_ffmpeg | ^0.2.10 | 0.2.10 | |
| path | >=1.0.0 <2.0.0 | 1.6.4 | |
| path_provider | >=1.0.0 <2.0.0 | 1.6.5 | |
| permission_handler | >=4.4.0 <6.0.0 | 5.0.0+hotfix.3 | |
| **Transitive dependencies** | | | |
| collection | | 1.14.11 | 1.14.12 |
| meta | | 1.1.8 | |
| path_provider_macos | | 0.0.4 | |
| path_provider_platform_interface | | 1.0.1 | |
| permission_handler_platform_interface | | 2.0.0 | |
| platform | | 2.2.1 | |
| plugin_platform_interface | | 1.0.2 | |
| sky_engine | | 0.0.99 | |
| typed_data | | 1.1.6 | |
| vector_math | | 2.0.8 | |