Tutorial: Playing sound files

This tutorial covers how to open and play sound files. This includes some important classes for handling sound files in JUCE.

Level: Intermediate

Platforms: Windows, Mac OS X, Linux

Classes: AudioFormatManager, AudioFormatReader, AudioFormatReaderSource, AudioTransportSource, FileChooser, ChangeListener, File

Getting started

Download the demo project for this tutorial here: tutorial_playing_sound_files.zip. Unzip the project and open it in your IDE.

If you need help with this step, see Tutorial: Getting started with the Projucer.

The demo project

The demo project presents a three-button interface for controlling the playback of a sound file. The three buttons are:

  • A button to present a file chooser to the user for them to select the sound file.
  • A button to play the sound.
  • A button to stop the sound.

The interface is shown in the following screenshot:

tutorial_playing_sound_files_screenshot1.png
A three-button interface to control sound file playback.

Helpful classes

The AudioSource class

While we can generate audio sample-by-sample in the getNextAudioBlock() function of the Audio Application template, JUCE provides some built-in tools for generating and processing audio. These allow us to link together high-level building blocks to form powerful audio applications, without having to process each and every sample of audio within our application code (JUCE does this on our behalf). These building blocks are based on the AudioSource class. In fact, if you have followed any of the tutorials based on the AudioAppComponent class, such as Tutorial: Simple synthesis (noise), then you have been making use of the AudioSource class already. The AudioAppComponent class itself inherits from the AudioSource class and, importantly, contains an AudioSourcePlayer object that streams the audio between the AudioAppComponent and the audio hardware device. While we can generate audio samples directly in the getNextAudioBlock() function, we can instead chain a number of AudioSource objects together to form a series of processes. We make use of this feature in this tutorial.
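The chaining idea can be illustrated with a small framework-free sketch. This is plain C++, not JUCE's actual classes (the names SimpleSource, ConstantSource, and GainSource are illustrative only): each source implements a getNextAudioBlock()-style function, and a downstream source pulls blocks from the source it wraps.

```cpp
#include <vector>

// Hypothetical minimal analogue of JUCE's AudioSource:
// anything that can fill a block of samples on request.
struct SimpleSource
{
    virtual ~SimpleSource() = default;
    virtual void getNextBlock (std::vector<float>& block) = 0;
};

// A source that produces a constant value (a stand-in for a real generator).
struct ConstantSource : SimpleSource
{
    explicit ConstantSource (float v) : value (v) {}

    void getNextBlock (std::vector<float>& block) override
    {
        for (auto& sample : block)
            sample = value;
    }

    float value;
};

// A source that wraps another source and scales its output, loosely
// mirroring how one AudioSource can process the output of another.
struct GainSource : SimpleSource
{
    GainSource (SimpleSource& src, float g) : input (src), gain (g) {}

    void getNextBlock (std::vector<float>& block) override
    {
        input.getNextBlock (block);   // pull from the upstream source first
        for (auto& sample : block)
            sample *= gain;           // then process the pulled block
    }

    SimpleSource& input;
    float gain;
};
```

Pulling a block from the GainSource pulls from the ConstantSource in turn, so the whole chain is driven from its output end. This is how JUCE's audio callback drives a chain of AudioSource objects.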

Audio formats

JUCE provides a number of tools for reading and writing sound files in a variety of formats. In this tutorial we make use of several of these tools; in particular, we use the following classes:

  • AudioFormatManager: This class contains a list of audio formats (such as WAV, AIFF, Ogg Vorbis, and so on) and can create suitable objects for reading audio data from these formats.
  • AudioFormatReader: This class handles the low-level file reading operations on the audio file and allows us to read audio in a consistent format (generally this means arrays of float values). When an AudioFormatManager object is asked to open a particular file, it creates instances of this class.
  • AudioFormatReaderSource: This is a subclass of the AudioSource class. It can read audio data from an AudioFormatReader object and render the audio via its getNextAudioBlock() function.
  • AudioTransportSource: This class is another subclass of the AudioSource class. It can control the playback of an AudioFormatReaderSource object. This control includes starting and stopping the playback of the AudioFormatReaderSource object. It can also perform sample rate conversion and it can buffer audio ahead of time if we wish.

Putting it together

We will now bring together these classes along with suitable user interface classes to make our sound file playing application. It is useful at this point to think about the various phases — or transport states — of playing an audio file. Once the audio file is loaded we can consider these four possible states:

  • Stopped: Audio playback is stopped and ready to be started.
  • Starting: Audio playback hasn't yet started but it has been told to start.
  • Playing: Audio is playing.
  • Stopping: Audio is playing but playback has been told to stop, after this it will return to the Stopped state.

To represent these states, we create an enum within our MainContentComponent class:

enum TransportState
{
    Stopped,
    Starting,
    Playing,
    Stopping
};

Initialising the interface

In the constructor for our MainContentComponent class, we configure the three buttons:

MainContentComponent()
    : state (Stopped)
{
    addAndMakeVisible (&openButton);
    openButton.setButtonText ("Open...");
    openButton.addListener (this);

    addAndMakeVisible (&playButton);
    playButton.setButtonText ("Play");
    playButton.addListener (this);
    playButton.setColour (TextButton::buttonColourId, Colours::green);
    playButton.setEnabled (false);

    addAndMakeVisible (&stopButton);
    stopButton.setButtonText ("Stop");
    stopButton.addListener (this);
    stopButton.setColour (TextButton::buttonColourId, Colours::red);
    stopButton.setEnabled (false);

    // ...

Notice in particular that we disable the Play and Stop buttons initially. The Play button is enabled once a valid file is loaded. We can see here that we have added our MainContentComponent object as a listener for each of these three buttons (see Tutorial: Listeners and broadcasters). We also initialise our transport state in the constructor's initialiser list.

Other initialisation

In addition to the three TextButton objects we have four other members of our MainContentComponent class:

AudioFormatManager formatManager;
ScopedPointer<AudioFormatReaderSource> readerSource;
AudioTransportSource transportSource;
TransportState state;

Here we see the AudioFormatManager, AudioFormatReaderSource, and AudioTransportSource classes mentioned earlier.

In the MainContentComponent constructor we need to initialise the AudioFormatManager object to register a list of standard formats [1]:

formatManager.registerBasicFormats(); // [1]

As a minimum this will enable the AudioFormatManager object to create readers for the WAV and AIFF formats. Other formats may be available depending on the platform and the options enabled in the juce_audio_formats module within the Projucer project as shown in the following screenshot:

tutorial_playing_sound_files_screenshot2.png
The juce_audio_formats module options showing audio format options.

In the MainContentComponent constructor we also add our MainContentComponent object as a listener [2] to the AudioTransportSource object so that we can respond to changes in its state (for example, when it stops):

transportSource.addChangeListener (this); // [2]
Note
The function name is addChangeListener() in this case, rather than simply addListener() as it is with many other listener classes in JUCE.

Responding to AudioTransportSource changes

When changes in the transport are reported, the changeListenerCallback() function will be called. This will be called asynchronously on the message thread:

void changeListenerCallback (ChangeBroadcaster* source) override
{
    if (source == &transportSource)
    {
        if (transportSource.isPlaying())
            changeState (Playing);
        else
            changeState (Stopped);
    }
}

You can see that this just calls a member function changeState().

Changing states

All changes to the transport state are localised in this single changeState() function. This helps keep the logic for this functionality in one place. The function updates the state member and triggers any changes to other objects that need to take place when entering the new state.

Note
More experienced readers may wish to use the state design pattern as an alternative way of structuring this code.
void changeState (TransportState newState)
{
    if (state != newState)
    {
        state = newState;

        switch (state)
        {
            case Stopped:                           // [3]
                stopButton.setEnabled (false);
                playButton.setEnabled (true);
                transportSource.setPosition (0.0);
                break;

            case Starting:                          // [4]
                playButton.setEnabled (false);
                transportSource.start();
                break;

            case Playing:                           // [5]
                stopButton.setEnabled (true);
                break;

            case Stopping:                          // [6]
                transportSource.stop();
                break;
        }
    }
}

  • [3]: When the transport returns to the Stopped state it disables the Stop button, enables the Play button, and resets the transport position back to the start of the file.
  • [4]: The Starting state is triggered by the user clicking the Play button; this tells the AudioTransportSource object to start playing. At this point we disable the Play button too.
  • [5]: The Playing state is triggered by the AudioTransportSource object reporting a change via the changeListenerCallback() function. Here we enable the Stop button.
  • [6]: The Stopping state is triggered by the user clicking the Stop button, so we tell the AudioTransportSource object to stop.
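The cycle these four cases produce can be traced with a small framework-free sketch. This is plain C++, not JUCE code: the TransportModel type and its transportPlaying flag are illustrative stand-ins for the real AudioTransportSource and its change notifications.

```cpp
// Hypothetical stand-in for the tutorial's transport state machine.
enum class State { Stopped, Starting, Playing, Stopping };

struct TransportModel
{
    State state = State::Stopped;
    bool transportPlaying = false;

    // Mirrors changeState(): each request updates the state and, where the
    // real code would call start()/stop(), flips the playing flag instead.
    void changeState (State newState)
    {
        if (state == newState)
            return;

        state = newState;

        if (state == State::Starting) { transportPlaying = true;  onChange(); }
        if (state == State::Stopping) { transportPlaying = false; onChange(); }
    }

    // Mirrors changeListenerCallback(): the "transport" reports its new
    // playing state and we move on to the settled state.
    void onChange()
    {
        changeState (transportPlaying ? State::Playing : State::Stopped);
    }
};
```

Requesting Starting settles the model in Playing, and requesting Stopping settles it back in Stopped, matching the transitions described in [3] to [6]. (In the real application the callback arrives asynchronously on the message thread; here it is called synchronously for simplicity.)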

Processing the audio

The audio processing in this demo project is very straightforward: we simply hand off the processing to the AudioTransportSource object by passing it the AudioSourceChannelInfo struct that we have been passed via the AudioAppComponent class:

void getNextAudioBlock (const AudioSourceChannelInfo& bufferToFill) override
{
    if (readerSource == nullptr)
    {
        bufferToFill.clearActiveBufferRegion();
        return;
    }

    transportSource.getNextAudioBlock (bufferToFill);
}

Notice that we check if there is a valid AudioFormatReaderSource object first and simply zero the output if not (using the convenient AudioSourceChannelInfo::clearActiveBufferRegion() function). The AudioFormatReaderSource member is stored in a ScopedPointer object because we need to create these objects dynamically based on the user's actions. It also allows us to check for nullptr for invalid objects.

We also need to remember to pass the prepareToPlay() callback to any other AudioSource objects we are using:

void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
{
    transportSource.prepareToPlay (samplesPerBlockExpected, sampleRate);
}

And the releaseResources() callback too:

void releaseResources() override
{
    transportSource.releaseResources();
}

Handling the button clicks

In our buttonClicked() function we call some sensibly named helper functions to perform the appropriate action:

void buttonClicked (Button* button) override
{
    if (button == &openButton)  openButtonClicked();
    if (button == &playButton)  playButtonClicked();
    if (button == &stopButton)  stopButtonClicked();
}

Opening a file

To open a file we pop up a FileChooser object in response to the Open... button being clicked:

void openButtonClicked()
{
    FileChooser chooser ("Select a Wave file to play...",
                         File::nonexistent,
                         "*.wav");                                          // [7]

    if (chooser.browseForFileToOpen())                                      // [8]
    {
        File file (chooser.getResult());                                    // [9]
        AudioFormatReader* reader = formatManager.createReaderFor (file);   // [10]

        if (reader != nullptr)
        {
            ScopedPointer<AudioFormatReaderSource> newSource = new AudioFormatReaderSource (reader, true); // [11]
            transportSource.setSource (newSource, 0, nullptr, reader->sampleRate);                         // [12]
            playButton.setEnabled (true);                                                                  // [13]
            readerSource = newSource.release();                                                            // [14]
        }
    }
}
Note
Storing the newly allocated AudioFormatReaderSource object in a temporary ScopedPointer object has the added benefit of being exception-safe. An exception could be thrown during the call to AudioTransportSource::setSource(), in which case the ScopedPointer object will delete the AudioFormatReaderSource object that is no longer needed. If a raw pointer had been used at this point to store the AudioFormatReaderSource object then there could be a memory leak, since the pointer would be left dangling if an exception were thrown.
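ScopedPointer behaves much like the standard library's std::unique_ptr. The ownership pattern used in steps [11] to [14] can be sketched without JUCE (the names Resource, member, and openSomething are illustrative, not JUCE's):

```cpp
#include <memory>

struct Resource { int id = 42; };   // stand-in for AudioFormatReaderSource

// Stand-in for the readerSource member of MainContentComponent.
std::unique_ptr<Resource> member;

// Sketch of the pattern: hold the new object in a scope-owned pointer
// while risky work happens, then hand it over to a longer-lived owner.
void openSomething()
{
    auto fresh = std::make_unique<Resource>();   // like the temporary ScopedPointer [11]
    // ... work that might throw would go here; if it threw,
    // `fresh` would delete the Resource automatically ...
    member = std::move (fresh);                  // like readerSource = newSource.release() [14]
}
```

If anything between allocation and hand-over throws, the scope-owned pointer cleans up; only once the risky work succeeds does ownership move to the long-lived member.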

Playing and stopping the file

Since we have already set up the code to actually play the file, we need only call our changeState() function with the appropriate argument to play the file. When the Play button is clicked, we do the following:

void playButtonClicked()
{
    changeState (Starting);
}

Stopping the file is similarly straightforward; when the Stop button is clicked we do the following:

void stopButtonClicked()
{
    changeState (Stopping);
}
Exercise
Change the third (filePatternsAllowed) argument when creating the FileChooser object to allow the application to load AIFF files too. The file patterns can be separated by a semicolon so this should be "*.wav;*.aif;*.aiff" to allow for the two common file extensions for this format.
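To see why the semicolon-separated form works as a list of patterns, here is a framework-free sketch in plain C++ (this is a naive suffix check, not JUCE's actual wildcard matcher; the function names are illustrative):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split a pattern list such as "*.wav;*.aif;*.aiff" on semicolons.
std::vector<std::string> splitPatterns (const std::string& patterns)
{
    std::vector<std::string> result;
    std::stringstream stream (patterns);
    std::string item;

    while (std::getline (stream, item, ';'))
        result.push_back (item);

    return result;
}

// Naive check: does the file name end with the pattern's extension
// (the part after the leading '*')?
bool matchesAny (const std::string& fileName, const std::string& patterns)
{
    for (const auto& pattern : splitPatterns (patterns))
    {
        const std::string suffix = pattern.substr (1);   // "*.wav" -> ".wav"

        if (fileName.size() >= suffix.size()
            && fileName.compare (fileName.size() - suffix.size(),
                                 suffix.size(), suffix) == 0)
            return true;
    }

    return false;
}
```

With the pattern string "*.wav;*.aif;*.aiff", a file named song.aiff matches (via the third pattern) while song.mp3 does not.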

Adding pause functionality

We will now walk through the steps needed to add pause functionality to the application. Here, the Play button will become a Pause button while the file is playing (instead of simply being disabled), and the Stop button will become a Return to zero button while the file is paused.

First of all we need to add two states, Pausing and Paused, to our TransportState enum:

enum TransportState
{
    Stopped,
    Starting,
    Playing,
    Pausing,
    Paused,
    Stopping
};

Our changeState() function needs to handle the two new states and the code for the other states needs to be updated too:

void changeState (TransportState newState)
{
    if (state != newState)
    {
        state = newState;

        switch (state)
        {
            case Stopped:
                playButton.setButtonText ("Play");
                stopButton.setButtonText ("Stop");
                stopButton.setEnabled (false);
                transportSource.setPosition (0.0);
                break;

            case Starting:
                transportSource.start();
                break;

            case Playing:
                playButton.setButtonText ("Pause");
                stopButton.setButtonText ("Stop");
                stopButton.setEnabled (true);
                break;

            case Pausing:
                transportSource.stop();
                break;

            case Paused:
                playButton.setButtonText ("Resume");
                stopButton.setButtonText ("Return to Zero");
                break;

            case Stopping:
                transportSource.stop();
                break;
        }
    }
}

We enable and disable the buttons appropriately, and update the button text correctly in each state.

Notice that we actually stop the transport when asked to pause in the Pausing state. In the changeListenerCallback() function, we need to change the logic to move to the correct state depending on whether a pause or stop request was made:

void changeListenerCallback (ChangeBroadcaster* source) override
{
    if (source == &transportSource)
    {
        if (transportSource.isPlaying())
            changeState (Playing);
        else if ((state == Stopping) || (state == Playing))
            changeState (Stopped);
        else if (state == Pausing)
            changeState (Paused);
    }
}

We need to change the code when the Play button is clicked:

void playButtonClicked()
{
    if ((state == Stopped) || (state == Paused))
        changeState (Starting);
    else if (state == Playing)
        changeState (Pausing);
}

And when the Stop button is clicked:

void stopButtonClicked()
{
    if (state == Paused)
        changeState (Stopped);
    else
        changeState (Stopping);
}

And that's it: you should be able to build and run the application now.

Note
The source code for this modified version of the application can be found in the MainComponent_02.cpp file in the Source directory of the demo project.
Exercise
Add a Label object to the interface that displays the current time position of the AudioTransportSource object. You can use the AudioTransportSource::getCurrentPosition() function to obtain this position. You will also need to make the MainContentComponent class inherit from the Timer class and perform periodic updates in your timerCallback() function to update the label. You could even use the RelativeTime class to convert the raw time in seconds to a more useful format in minutes, seconds, and milliseconds.
Note
The source code for this exercise can be found in the MainComponent_03.cpp file in the Source directory of the demo project.
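The conversion this exercise mentions, from raw seconds into minutes, seconds, and milliseconds (which the RelativeTime class could do for you), can be sketched without JUCE (the formatPosition name is illustrative):

```cpp
#include <cstdio>
#include <string>

// Format a transport position in seconds as "m:ss:mmm"
// (minutes, zero-padded seconds, zero-padded milliseconds).
std::string formatPosition (double seconds)
{
    const int totalMillis = static_cast<int> (seconds * 1000.0);
    const int minutes     = totalMillis / 60000;
    const int secs        = (totalMillis / 1000) % 60;
    const int millis      = totalMillis % 1000;

    char buffer[32];
    std::snprintf (buffer, sizeof (buffer), "%d:%02d:%03d", minutes, secs, millis);
    return buffer;
}
```

A position of 61.5 seconds would be shown as "1:01:500". In the exercise you would call something like this from your timerCallback() with the value returned by AudioTransportSource::getCurrentPosition() and pass the result to Label::setText().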

Summary

In this tutorial we have introduced the reading and playing of sound files. In particular we have covered the following things:

  • Opening sound files using the FileChooser class.
  • Reading audio data using the AudioFormatManager, AudioFormatReader, and AudioFormatReaderSource classes.
  • Controlling playback using the AudioTransportSource class.
  • Managing simple transport states (Stopped, Starting, Playing, and so on) and responding to transport changes using a ChangeListener.

Notes

The second and third arguments to the AudioTransportSource::setSource() function allow you to control look ahead buffering on a background thread. The second argument is the buffer size to use and the third argument is a pointer to a TimeSliceThread object, which is used for the background processing. In this example we use a zero buffer size and a nullptr value for the thread object, which is the default.

See also