
I'm trying to provide my audioHandler to my Player class, but something weird is happening.

When I enter the screen, the StreamBuilder goes active just fine, but if I pop and navigate to the screen again, the stream connection stays on 'waiting' forever unless I play the audio. This causes some weird behavior. What am I doing wrong?

Relevant code:

Player class

final audioHandlerProvider = Provider<AudioHandler>((ref) {
  AudioHandler _audioHandler = ref.read(audioHandlerServiceProvider);

  return _audioHandler;
});

class _PlayerClicVozzState extends State<PlayerClicVozz> {
  @override
  Widget build(BuildContext context) {

    return Scaffold(
      extendBodyBehindAppBar: true,
      backgroundColor: Color(0xff131313),
      appBar: AppBar(
        automaticallyImplyLeading: false,
        actions: [
          IconButton(
            icon: Icon(Icons.clear, color: Colors.white),
            onPressed: () => Navigator.of(context).pop(),
          ),
        ],
        backgroundColor: Colors.transparent,
        elevation: 0,
      ),
      body: Center(
        child: Consumer(builder: (context, watch, child) {
          final res = watch(audioHandlerProvider);
          return StreamBuilder<MediaState>(
            stream: _mediaStateStream(res),
            builder: (context, snapshot) {
              final mediaState = snapshot.data;
              return SeekBar(
                duration: mediaState?.mediaItem?.duration ?? Duration.zero,
                position: mediaState?.position ?? Duration.zero,
                onChangeEnd: (newPosition) {
                  res.seek(newPosition);
                },
              );
            },
          );
...


AudioService init

late AudioHandler _audioHandler;

final audioHandlerServiceProvider = Provider<AudioHandler>((ref) {
  return _audioHandler;
});

Future<void> main() async {
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: AudioServiceConfig(
      androidNotificationChannelId: 'com.mycompany.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
... 

My AudioHandler is exactly the same as the plugin example:


import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';

class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
  static final _item = MediaItem(
    id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
    album: "Science Friday",
    title: "A Salute To Head-Scratching Science",
    artist: "Science Friday and WNYC Studios",
    duration: const Duration(milliseconds: 5739820),
    artUri: Uri.parse(
        'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
  );

  final _player = AudioPlayer();

  /// Initialise our audio handler.
  AudioPlayerHandler() {
    // So that our clients (the Flutter UI and the system notification) know
    // what state to display, here we set up our audio handler to broadcast all
    // playback state changes as they happen via playbackState...
    _player.playbackEventStream.map(_transformEvent).pipe(playbackState);
    // ... and also the current media item via mediaItem.
    mediaItem.add(_item);

    // Load the player.
    _player.setAudioSource(AudioSource.uri(Uri.parse(_item.id)));
  }

  // In this simple example, we handle only 4 actions: play, pause, seek and
  // stop. Any button press from the Flutter UI, notification, lock screen or
  // headset will be routed through to these 4 methods so that you can handle
  // your audio playback logic in one place.

  @override
  Future<void> play() => _player.play();

  @override
  Future<void> pause() => _player.pause();

  @override
  Future<void> seek(Duration position) => _player.seek(position);

  @override
  Future<void> stop() => _player.stop();

  /// Transform a just_audio event into an audio_service state.
  ///
  /// This method is used from the constructor. Every event received from the
  /// just_audio player will be transformed into an audio_service state so that
  /// it can be broadcast to audio_service clients.
  PlaybackState _transformEvent(PlaybackEvent event) {
    return PlaybackState(
      controls: [
        MediaControl.rewind,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.fastForward,
      ],
      systemActions: const {
        MediaAction.seek,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      },
      androidCompactActionIndices: const [0, 1, 3],
      processingState: const {
        ProcessingState.idle: AudioProcessingState.idle,
        ProcessingState.loading: AudioProcessingState.loading,
        ProcessingState.buffering: AudioProcessingState.buffering,
        ProcessingState.ready: AudioProcessingState.ready,
        ProcessingState.completed: AudioProcessingState.completed,
      }[_player.processingState]!,
      playing: _player.playing,
      updatePosition: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
      queueIndex: event.currentIndex,
    );
  }
}

MediaStateStream and QueueStateStream

  Stream<MediaState> _mediaStateStream(AudioHandler audioHandler) {
    return Rx.combineLatest2<MediaItem?, Duration, MediaState>(
        audioHandler.mediaItem,
        AudioService.position,
        (mediaItem, position) => MediaState(mediaItem, position));
  }

  Stream<QueueState> _queueStateStream(AudioHandler audioHandler) {
    return Rx.combineLatest2<List<MediaItem>?, MediaItem?, QueueState>(
        audioHandler.queue,
        audioHandler.mediaItem,
        (queue, mediaItem) => QueueState(queue, mediaItem));
  }
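
For reference, MediaState and QueueState aren't shown in the question; assuming they match the plain data classes from the audio_service example, they look like this:

class MediaState {
  final MediaItem? mediaItem;
  final Duration position;

  MediaState(this.mediaItem, this.position);
}

class QueueState {
  final List<MediaItem>? queue;
  final MediaItem? mediaItem;

  QueueState(this.queue, this.mediaItem);
}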
  • Isn't `_mediaStateStream(res)` the key part that causes your problem? I think it would be helpful to provide that code. Whatever code is behind this, I am guessing it is probably solved by using a `BehaviorSubject` from rxdart wrapped around this stream - unless you've done that already, but you need to share the code for that so we can see. – Ryan Heise Nov 17 '21 at 07:43
  • Hey, Ryan, this is the mediaStateStream. It's the example's code, but I just turned it into a method: Stream _mediaStateStream(AudioHandler audioHandler) { return Rx.combineLatest2( audioHandler.mediaItem, AudioService.position, (mediaItem, position) => MediaState(mediaItem, position)); } – Luciano Victor Nov 17 '21 at 12:18
  • It seems like MediaState is 'triggered' (for lack of a better word) when the app is initialised or when I interact with a button – Luciano Victor Nov 17 '21 at 12:21
  • Can you edit your question? That will make it easier for someone to help you. – Ryan Heise Nov 17 '21 at 12:42
  • edited as requested – Luciano Victor Nov 17 '21 at 13:01

1 Answer


When you subscribe to a stream, you only start receiving events that are emitted after the moment you subscribe, so there may be a period of waiting before that next event arrives.
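
To illustrate that point in isolation (a minimal Dart sketch, not taken from your code): a late listener on an ordinary broadcast stream misses earlier events and waits for the next one, while an rxdart BehaviorSubject, the approach suggested below, replays its latest value to every new listener right away.

import 'dart:async';

import 'package:rxdart/rxdart.dart';

Future<void> main() async {
  // A plain broadcast stream: events emitted before a listener subscribes are lost,
  // so a late listener just waits for the next event.
  final controller = StreamController<int>.broadcast();
  controller.add(1); // emitted before anyone listens
  controller.stream.listen((v) => print('controller: $v')); // stays silent

  // A BehaviorSubject remembers the latest event and replays it to new listeners.
  final subject = BehaviorSubject<int>.seeded(1);
  subject.listen((v) => print('subject: $v')); // prints 'subject: 1' immediately

  await Future<void>.delayed(Duration.zero);
  await controller.close();
  await subject.close();
}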

In your implementation of _mediaStateStream you are making use of AudioService.position, which only emits events while the position is changing (i.e. not while paused or stalled). So even though the stream may have emitted position events in the past, if you subscribe to it again while paused or stalled, you will sit in the waiting state until the next position event arrives, which only happens once playback resumes.

I would suggest wrapping your stream in rxdart's BehaviorSubject so that it retains a memory of the last event and re-emits it to new listeners. You can also seed the BehaviorSubject with an initial value so there is no waiting period even for the very first listener:

_mediaStateSubject = BehaviorSubject.seeded(MediaState(
    handler.mediaItem.valueOrNull,
    handler.playbackState.value.position))
  ..addStream(_mediaStateStream(handler));

Then you can listen to _mediaStateSubject instead of _mediaStateStream.
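
On the widget side that means pointing the StreamBuilder at the long-lived subject instead of building a fresh combined stream on every visit to the screen. A sketch based on the Consumer from the question, assuming _mediaStateSubject is created once (e.g. alongside the handler) rather than inside build:

return StreamBuilder<MediaState>(
  stream: _mediaStateSubject, // long-lived BehaviorSubject, replays the last state
  builder: (context, snapshot) {
    final mediaState = snapshot.data;
    return SeekBar(
      duration: mediaState?.mediaItem?.duration ?? Duration.zero,
      position: mediaState?.position ?? Duration.zero,
      onChangeEnd: (newPosition) => res.seek(newPosition),
    );
  },
);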

Ryan Heise