
Synced Web Audio Playback

This is the second instalment of our blog series on how we crafted the Madeon Adventure Machine. Our first post was “Audio Formats for Gapless Web Playback” which covered which audio format we chose and why.

The third and final instalment will be about the WebMIDI API and how we made it possible for you to use your Launchpad MIDI device with a website.

But this instalment, well, this is about how we made it so that all the samples played in perfectly synced harmony.

The samples were all at 110bpm and two bars long (one bar consists of 4 beats in 4/4 time). So with 110 beats per minute and 8 beats per sample, we needed to figure out how long each sample has to play before it loops.

var loopDelay = (60 / 110) * 8; // 4.3636363636... seconds

So we needed a new set of samples to start playing every 4.3636363636 seconds. We needed the browser to perform this quantization smoothly and precisely. "Quantization" is the process of aligning notes, or in this case samples, to beats or fractions of beats. But to keep things simple we'll call the end result a "looper", simply because it loops audio in time.

JavaScript timing functions

The obvious thing to start with was to have an array of "to play" samples. Each time a user pressed one of the buttons, we pushed that sample into the array, and a simple JavaScript timed loop played whatever was queued every 4.X seconds.

setInterval(function() {
    playNextSamples();
}, 4363.63);
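
As a rough sketch of that queue (the toPlay and queueSample names are purely illustrative, not the production code):

var toPlay = []; // samples queued by button presses, waiting for the next loop

// called whenever a user presses one of the pads
function queueSample(sample) {
    toPlay.push(sample);
}

// called once per loop: start everything that has been queued
function playNextSamples() {
    toPlay.forEach(function(sample) {
        sample.play();
    });
    toPlay = [];
}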

Now one thing to know about JavaScript and timing functions is that this probably won't work very well, as the JavaScript clock is horrifically imprecise.

So we thought you know what, we're gonna try and be clever. We're gonna use requestAnimationFrame. rAF is much better from a timing perspective, and is much more precise. So we built a simple wrapper, called rafInterval, which used the same syntax as setInterval but used rAF in the background.

var rafInterval = function(cb, time) {
    var lastLoop = (new Date()).getTime(),
        loop = function() {
            // only fire the callback once `time` milliseconds have passed
            if (lastLoop + time < (new Date()).getTime()) {
                cb();
                lastLoop = (new Date()).getTime();
            }
            requestAnimationFrame(loop);
        };
    requestAnimationFrame(loop); // kick off the loop
};
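
Hooked up to the looper, it would be called something like this (playNextSamples being the same illustrative name as above):

rafInterval(playNextSamples, 4363.63);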

Now it was at a point that we were happy with; we knew it was far from perfect, but thought "this is fun". Hugo, on the other hand, said it was "comically bad" due to how out of time it was.

The Eureka Moment

We did some digging and discovered that with the WebAudio API you can actually schedule audio to play at a specific time in the future.

Now we had to refactor the code so that we knew when the next loop should occur, so we simply kept track of when the next loop should be. One thing to know is that AudioContext has a currentTime attribute, which gives the current time in seconds, read from the underlying hardware clock, counting from when the audioContext began.
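
As a rough sketch of the setup (the variable names here are illustrative, matching the snippets below):

// currentTime starts counting, in seconds, from roughly this moment
var audioContext = new (window.AudioContext || window.webkitAudioContext)();

var loopDelay = (60 / 110) * 8;           // 4.3636... seconds per loop
var loopStart = audioContext.currentTime; // when the very first loop began
var nextLoop  = loopStart + loopDelay;    // when the next loop should start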

So we had a looper running to make sure that the next loop time, nextLoop, was correct:

setNextLoop = function() {
  // snap back to the most recent loop boundary, then add one loopDelay
  nextLoop = audioContext.currentTime - ((audioContext.currentTime - loopStart) % loopDelay) + loopDelay;
};
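
To make that concrete with some assumed numbers: if loopStart is 0 and currentTime is 10.0, then (10.0 - 0) % 4.3636 is roughly 1.27, so nextLoop becomes 10.0 - 1.27 + 4.36, about 13.09 seconds, which is exactly the next multiple of the loop length.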

Why do we have this in a separate function, you ask? Well, because requestAnimationFrame stops firing (and timers get throttled) when the browser window is in a background tab. So we also had:

window.onfocus = function() {  
  setNextLoop();
};

Audio Library

As for the audio playback itself: we had been using the standard <audio> element to control playback, and we wanted a simple drop-in that would use the WebAudio API where supported.

We discovered the Audia.js library, which is a super simple drop-in replacement. Audia even falls back to the simple <audio> element for browsers that don't support the AudioContext.
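
Audia handles that detection internally; the support check boils down to something like this generic sketch (not Audia's actual source):

// true if the browser exposes the WebAudio API, prefixed or not
var hasWebAudio = !!(window.AudioContext || window.webkitAudioContext);

if (hasWebAudio) {
    // route playback through an AudioContext and buffer sources
} else {
    // fall back to plain <audio> elements
}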

However, Audia doesn't support the scheduled playback we needed, so we had to modify the library slightly to add it in. We modified the play function to take an argument for the future time at which we wanted the audio to start, and that was relatively simple:

bufferSource.start(startAtTime);  

The bufferSource is simply the audio data set up to play. The argument passed to start is the exact time (in seconds, according to audioContext.currentTime) that the audio should begin playing. So now we have audio looping in beautiful harmony.
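
As a rough sketch of the sort of scheduling code involved, assuming the sample has already been decoded into an AudioBuffer (the playAt name and its arguments are ours for illustration, not Audia's API):

function playAt(audioContext, audioBuffer, startAtTime) {
    // a fresh AudioBufferSourceNode is needed for each playback
    var bufferSource = audioContext.createBufferSource();
    bufferSource.buffer = audioBuffer;
    bufferSource.connect(audioContext.destination);

    // startAtTime is in seconds on the same clock as audioContext.currentTime;
    // passing it to start() schedules playback instead of starting immediately
    bufferSource.start(startAtTime);
    return bufferSource;
}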

Next up in the series is how we enabled users to use their Launchpads with the web app.

Look out for our next post on how we used the new experimental WebMIDI API.
