Overview

Record to Stream

To record live PCM data, when calling the verb startRecorder() you specify the parameter toStream:, toStreamFloat32: or toStreamInt16: with your Stream sink, instead of the parameter toFile:. This parameter is a Dart StreamSink; you listen to the corresponding stream to process the audio data.

  • The parameter toStream: is used when you want to record interleaved data to a <Uint8List> Stream Sink
  • The parameter toStreamFloat32: is used when you want to record non-interleaved data (Planar mode) to a <List<Float32List>> Stream Sink
  • The parameter toStreamInt16: is used when you want to record non-interleaved data (Planar mode) to a <List<Int16List>> Stream Sink

Interleaved

Interleaved data are given as <Uint8List>. These are the raw data, where each sample is coded with 2 bytes (Codec.pcm16) or 4 bytes (Codec.pcmFloat32). This is convenient when you want to handle the data globally as a raw buffer, for example when you want to send the raw data to a remote server.
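
For instance, with Codec.pcm16 and two channels, the buffer contains 16-bit samples ordered frame by frame (left, right, left, right, …). Here is a minimal sketch of extracting the samples, assuming little-endian byte order (which is what the underlying platforms normally produce); the function name is illustrative:

  import 'dart:typed_data';

  // A sketch, assuming Codec.pcm16, numChannels == 2, little-endian:
  // samples are ordered frame by frame (L0, R0, L1, R1, ...).
  void processInterleaved(Uint8List buffer) {
    final ByteData bytes = ByteData.sublistView(buffer);
    for (int i = 0; i + 4 <= buffer.length; i += 4) {
      int left = bytes.getInt16(i, Endian.little);
      int right = bytes.getInt16(i + 2, Endian.little);
      // ... Process the stereo frame
    }
  }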

Non interleaved (or Planar)

Non-interleaved data are coded as <List<Float32List>> or <List<Int16List>>, depending on the codec selected. The number of elements in the List is equal to the number of channels (1 for monophony, 2 for stereophony). This is convenient when you want to access the real audio data as Float32 or Int16 values.
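
For example, here is a small sketch, not part of the Flutter Sound API, that computes the RMS level of each channel from a planar Float32 buffer:

  import 'dart:math';
  import 'dart:typed_data';

  // One RMS value per channel, from a buffer as delivered by toStreamFloat32:.
  List<double> rmsPerChannel(List<Float32List> buffer) {
    return buffer.map((Float32List channelData) {
      double sumOfSquares = 0;
      for (double sample in channelData) {
        sumOfSquares += sample * sample;
      }
      return channelData.isEmpty ? 0.0 : sqrt(sumOfSquares / channelData.length);
    }).toList();
  }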

startRecorder()

The main parameters for the verb startRecorder() are:

  • codec: : The codec (Codec.pcm16 or Codec.pcmFloat32)
  • The Stream sink:
    • toStream: : When you want to record interleaved data
    • toStreamFloat32: : When you want to record non-interleaved Float32 data
    • toStreamInt16: : When you want to record non-interleaved Int16 data
  • sampleRate: : The sample rate
  • numChannels: : The number of channels (1 for monophony, 2 for stereophony, or more …)

Interleaved Example

You can look at this simple example:

  StreamController<Uint8List> recordingDataController = StreamController<Uint8List>();
  _mRecordingDataSubscription =
      recordingDataController.stream.listen((Uint8List buffer) {
    ... // Process the audio frame
  });
  await _mRecorder.startRecorder(
    toStream: recordingDataController.sink,
    codec: Codec.pcm16,
    numChannels: 2,
    sampleRate: 48000,
  );
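
When you stop recording, remember to release the objects created for the session. A minimal cleanup sketch, reusing the names of the example above:

  await _mRecorder.stopRecorder();
  if (_mRecordingDataSubscription != null) {
    await _mRecordingDataSubscription!.cancel();
    _mRecordingDataSubscription = null;
  }
  await recordingDataController.close();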

Non Interleaved Example

You can look at the same simple example provided with Flutter Sound:

  StreamController<List<Float32List>> recordingDataController =
      StreamController<List<Float32List>>();
  _mRecordingDataSubscription =
      recordingDataController.stream.listen((List<Float32List> buffer) {
    for (int channel = 0; channel < cstChannelCount; ++channel) {
      Float32List channelData = buffer[channel];
      for (int n = 0; n < channelData.length; ++n) {
        double sample = channelData[n];
        ... // Process the sample
      }
    }
  });
  await _mRecorder.startRecorder(
    toStreamFloat32: recordingDataController.sink,
    codec: Codec.pcmFloat32,
    numChannels: 2,
    sampleRate: 48000,
  );
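
The toStreamInt16: variant is similar. A minimal sketch, assuming the same setup as above:

  StreamController<List<Int16List>> recordingDataController =
      StreamController<List<Int16List>>();
  _mRecordingDataSubscription =
      recordingDataController.stream.listen((List<Int16List> buffer) {
    // One Int16List per channel
    ... // Process the samples
  });
  await _mRecorder.startRecorder(
    toStreamInt16: recordingDataController.sink,
    codec: Codec.pcm16,
    numChannels: 2,
    sampleRate: 48000,
  );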


Play from Stream

To play a live stream, you start playing with the verb startPlayerFromStream() instead of the regular startPlayer() verb.

The main parameters for the verb startPlayerFromStream() are:

  • codec: : The codec (Codec.pcm16 or Codec.pcmFloat32)
  • sampleRate: : The sample rate
  • numChannels: : The number of channels (1 for monophony, 2 for stereophony, or more …)
  • interleaved: : A boolean specifying whether the data to be played are interleaved

await myPlayer.startPlayerFromStream(
    codec: Codec.pcmFloat32,
    numChannels: 2,
    sampleRate: 48000,
    interleaved: true,
);

interleaved:

This parameter specifies whether the data to be played are interleaved or not. When the data are interleaved, you use _mPlayer.uint8ListSink to play data; the data provided by the app must be coded as Uint8List. This is convenient when you have raw data to be played, for example received from a remote server. When the data are not interleaved, you use _mPlayer.float32Sink or _mPlayer.int16Sink, depending on the codec used; the data are provided as List<Float32List> or List<Int16List>, with a List length equal to the number of channels.
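
If your data do not arrive in the layout expected by the sink you want to use, you can convert between the two layouts yourself. Here is a sketch, not part of the Flutter Sound API, that interleaves a planar Float32 buffer:

import 'dart:typed_data';

// Interleave one Float32List per channel into a single Float32List,
// ordered frame by frame: ch0[0], ch1[0], ch0[1], ch1[1], ...
Float32List interleave(List<Float32List> planar) {
  int numChannels = planar.length;
  int numFrames = planar[0].length;
  Float32List out = Float32List(numChannels * numFrames);
  for (int frame = 0; frame < numFrames; ++frame) {
    for (int channel = 0; channel < numChannels; ++channel) {
      out[frame * numChannels + channel] = planar[channel][frame];
    }
  }
  return out;
}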


whenFinished:

This parameter cannot be used: after startPlayerFromStream(), the player stays on until stopPlayer() is called. The app can provide audio data whenever it wants, even after a long time without any audio data.


Playback without back pressure (without flow control)

  • _mPlayer.uint8ListSink is a Stream Sink used when the data are interleaved and you have Uint8List buffers to be played
  • _mPlayer.float32Sink is a Stream Sink used when the data are not interleaved and you have Float32 data to be played
  • _mPlayer.int16Sink is a Stream Sink used when the data are not interleaved and you have Int16 data to be played
Uint8List d;
...
_mPlayer.uint8ListSink.add(d);

List<Float32List> d; // A List of `numChannels` Float32List
...
_mPlayer.float32Sink.add(d);

List<Int16List> d; // A List of `numChannels` Int16List
...
_mPlayer.int16Sink.add(d);

The App calls myPlayer.uint8ListSink.add(d), _mPlayer.float32Sink.add(d) or _mPlayer.int16Sink.add(d) each time it wants to play some data. No need to await, no need to verify that the previous buffers have finished playing. All the data added to the Stream Sink are buffered and played sequentially. The App continues to work without knowing when the buffers are actually played.
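
As an illustration, here is a sketch that generates one second of a 440 Hz tone and pushes it to the sink without any await. It assumes a player started with codec: Codec.pcm16, numChannels: 1, sampleRate: 48000 and interleaved: true:

import 'dart:math';
import 'dart:typed_data';

Uint8List makeTone({int sampleRate = 48000, double freq = 440.0}) {
  ByteData bytes = ByteData(sampleRate * 2); // 2 bytes per Int16 sample
  for (int n = 0; n < sampleRate; ++n) {
    int sample = (32767.0 * sin(2 * pi * freq * n / sampleRate)).round();
    bytes.setInt16(2 * n, sample, Endian.little);
  }
  return bytes.buffer.asUint8List();
}

_mPlayer.uint8ListSink.add(makeTone()); // Returns immediately; playback is buffered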

This means three things:

  • If the App adds buffers to the sink very fast, it can consume a lot of memory for the waiting buffers.
  • When the App has finished feeding the sink, it cannot just call myPlayer.stopPlayer(): there may be many buffers not yet played. If it calls stopPlayer(), all the waiting buffers will be flushed, which is probably not what it wants.
  • The App cannot know when the audio data are actually played.
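
For the second point, one possible workaround, a sketch and not an official Flutter Sound API, is to keep track of how much audio has been fed and to wait approximately that long before stopping (this ignores device latency, so it is only an approximation):

import 'dart:typed_data';

int totalSamplesFed = 0;

void feed(Uint8List buffer) {
  _mPlayer.uint8ListSink.add(buffer);
  totalSamplesFed += buffer.length ~/ 2; // Codec.pcm16, mono: 2 bytes per sample
}

// When done feeding, wait (approximately) until everything has been played:
await Future.delayed(
    Duration(milliseconds: totalSamplesFed * 1000 ~/ 48000)); // sampleRate
await _mPlayer.stopPlayer();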

Examples

You can look at the provided examples.


Playback with back pressure (with flow control)

Playing live data without flow control is very simple because you don't have to await or handle Futures. But sometimes it can be interesting to manage flow control:

  • When you generate huge data and cannot just loop feeding your Stream Sink.
  • When you want to know when the data have been played, to generate data on demand.
  • When you just want to know when your previous packet has been played.

If the App wants to keep synchronized with what is played, it uses the verbs feedUint8FromStream(), feedInt16FromStream() or feedF32FromStream() to play data.

It is really very important not to call another feedFromStream() before the completion of the previous Future. When each Future completes, the App can be sure that the provided data have been correctly played, or at least put into the low-level internal buffers, and it knows that it is safe to feed another buffer.

Example:

You can look at this example:

await myPlayer.startPlayerFromStream(
    codec: Codec.pcm16,
    numChannels: 1,
    sampleRate: 48000,
    interleaved: true,
);

await myPlayer.feedFromStream(aBuffer);
await myPlayer.feedFromStream(anotherBuffer);
await myPlayer.feedFromStream(myOtherBuffer);

await myPlayer.stopPlayer();

You must await, or use then(), on each call to feedFromStream().
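
For instance, with then() instead of await, using the same hypothetical buffers:

myPlayer
    .feedFromStream(aBuffer)
    .then((_) => myPlayer.feedFromStream(anotherBuffer))
    .then((_) => myPlayer.feedFromStream(myOtherBuffer))
    .then((_) => myPlayer.stopPlayer());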

Notes:

  • This new functionality requires at least Android SDK 23.

Examples

You can look at the provided examples.

