
Use the HTML5 Web Audio API to get creative with your web apps' sounds

See demos and experiments that use the HTML5 Web Audio API specification, which processes and synthesizes audio in web applications.

A couple of years ago I wrote about implementing and using the HTML5 Audio element with examples and fallback options. In this article, I will review the HTML5 Web Audio API specification, which is a high-level JavaScript API that processes and synthesizes audio in web applications.

According to the W3C Working Draft, the API is designed to be used in conjunction with other APIs and elements on the web platform, notably the XMLHttpRequest (using the responseType and response attributes), and in particular for games and interactive applications. It is also anticipated to be used with the canvas 2D and WebGL 3D graphics APIs.

I will present a list of browsers that support it, a brief introductory overview, and a selection of tutorials, demonstrations, and experiments that others have made available on the web using the creative features of the Web Audio API.

Browser support

According to the Can I Use… website, the Web Audio API is partially supported by Firefox 23.0+, Chrome 10.0+, Safari 6.0+, Opera 15.0+, iOS Safari 6.0+, and Chrome for Android 29.0.
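Given that partial and prefixed support, a little feature detection goes a long way. Here is a minimal sketch, assuming the webkit-prefixed constructor used by Chrome and Safari builds of this era; the fallback branch is left to your own implementation:

        // Feature-detect the Web Audio API; some browsers expose a prefixed constructor
        var AudioCtx = window.AudioContext || window.webkitAudioContext;

        if (AudioCtx) {
            var context = new AudioCtx();
            // Safe to build an audio graph with the context here
        } else {
            // Fall back to the HTML5 audio element or another playback option
        }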

AudioContext interface

AudioContext is the interface that represents a set of AudioNode objects and their connections, allowing signals to be routed arbitrarily to an AudioDestinationNode, which is what the user ultimately hears. According to the specification, in most use cases only a single AudioContext is used per web document. An AudioContext is constructed as follows:

        var context = new AudioContext();

A very simple graphical representation of an example AudioContext interface is displayed in Figure A.

Figure A


Each audio node can also represent a filter, distortion, or other audio effect placed between the sources (nodes) and the destination (output).
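As a brief sketch of that routing (assuming the context created above; the filter type and cutoff value are arbitrary, illustrative choices), a source node can be wired through an effect node to the destination like this:

        // Create a tone source and a lowpass filter effect node
        var oscillator = context.createOscillator();
        var filter = context.createBiquadFilter();
        filter.type = 'lowpass';
        filter.frequency.value = 800;          // illustrative cutoff frequency in Hz

        // Route the signal: source -> effect -> destination (speakers)
        oscillator.connect(filter);
        filter.connect(context.destination);
        oscillator.start(0);                   // begin playing immediately

Note that very early WebKit implementations used noteOn(0) and noteOff(0) in place of start() and stop().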

Attributes

Currently there are five attributes that can be used:

  • destination – an AudioDestinationNode with a single input, representing the final destination for all audio to be rendered to the audio hardware.
  • sampleRate – the rate, in sample frames per second, at which the AudioContext handles audio.
  • currentTime – the time in seconds, starting at zero when the context is created and incrementing in real time.
  • listener – an AudioListener used for 3D spatialization.
  • activeSourceCount – the number of AudioBufferSourceNodes that are currently playing.
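As a quick, hedged illustration, these read-only values can be logged from an existing context like so:

        // Inspect the context's attributes described above
        console.log(context.destination);   // AudioDestinationNode routed to the audio hardware
        console.log(context.sampleRate);    // e.g. 44100 sample frames per second
        console.log(context.currentTime);   // seconds elapsed since the context was created
        console.log(context.listener);      // AudioListener used for 3D spatialization
        // activeSourceCount is omitted here, since support for it varies between implementations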

For more information on the specifications for Web Audio API interfaces, parameters, and other effects, check out the W3C Web Audio API specification.

Tutorials, demonstrations, and experiments

The following tutorials, demonstrations, and experiments use the Web Audio API in whole or as an integral part of their implementations.

The HTML5 Rocks website has a great introductory tutorial entitled Getting Started with Web Audio API, authored by Boris Smus (Figure B). The tutorial has sample code snippets and sound demonstrations that let you experience playing sounds with rhythm, changing the volume of a sound, toggling play/pause functionality, cross-fading between two sounds, applying simple filters to alter the sound, and more.
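A load-and-play pattern along the lines of what the tutorial covers might look like the following sketch; the file name sound.mp3 is a hypothetical placeholder, and the gain value simply illustrates the volume control mentioned above:

        // Fetch an audio file as binary data, decode it, and play it through a gain node
        var request = new XMLHttpRequest();
        request.open('GET', 'sound.mp3', true);     // hypothetical audio file
        request.responseType = 'arraybuffer';

        request.onload = function () {
            context.decodeAudioData(request.response, function (buffer) {
                var source = context.createBufferSource();
                var gainNode = context.createGain();

                source.buffer = buffer;
                gainNode.gain.value = 0.5;          // illustrative volume (50%)

                source.connect(gainNode);           // source -> volume control
                gainNode.connect(context.destination);
                source.start(0);                    // play now
            });
        };
        request.send();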

Figure B


Check out the Chrome Experiment ToneCraft by DinahMoe, which lets you create tunes by adding or removing seven colored blocks from the left sidebar. Each block represents an individual sound note and can be placed in any combination onto a grid in a 3D environment similar to Minecraft. You can stack and arrange blocks for unique tones that get sampled through a loop, and then press the spacebar to play or stop the tones. I played around with ToneCraft and found it to be a fun and creative audio instrument, creating this interesting tone loop (Figure C).

Figure C


Plink is another DinahMoe experiment. It allows you to choose from eight instruments, represented by the colors on the right sidebar; clicking and dragging the cursor (your instrument) up and down the musical staff plays notes up and down the scale. The multiplayer musical game experience lets users create a temporary Plink username that attaches to their cursor, and you can join others who are playing to create interesting music together (Figure D).

Figure D


Frequency Modulation (FM) with Web Audio API by Gaëtan Renaudeau (who works as a web architect with Zenexity, the Paris startup behind Play Framework) includes an informative series of audio FM synthesis examples. Demonstrations in his post include a Low-Frequency Oscillation (LFO) demo, a modulator in the audible range, frequency ratios that produce harmonic or dissonant sounds, and an envelope for automating amplitude. Another example shows how fine-tuning the modulator's frequency very close to the carrier's frequency creates interesting audio effects (Figure E).
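The core FM idea behind those demos can be sketched with two oscillators, where the modulator's output drives the carrier's frequency parameter through a gain node; the frequencies and modulation depth below are arbitrary, illustrative values, and the context is assumed to exist already:

        // Carrier: the tone you actually hear
        var carrier = context.createOscillator();
        carrier.frequency.value = 440;        // illustrative carrier frequency in Hz

        // Modulator: oscillates the carrier's frequency
        var modulator = context.createOscillator();
        modulator.frequency.value = 110;      // audible-range modulator (a 4:1 ratio)

        // Modulation depth, applied by scaling the modulator's output
        var modulationGain = context.createGain();
        modulationGain.gain.value = 100;      // illustrative depth in Hz

        // Wire it up: modulator -> depth -> carrier.frequency, carrier -> speakers
        modulator.connect(modulationGain);
        modulationGain.connect(carrier.frequency);
        carrier.connect(context.destination);

        carrier.start(0);
        modulator.start(0);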

Figure E


The second exploration by Gaëtan Renaudeau uses Beez, WebRTC, and the Web Audio API to create a real-time web audio experiment that turns smartphones into synthesizer effect controllers. The experiment consists of controlling an audio stream running on a desktop web page with audio effect pads running on phones via a mobile web interface. It only works in Chrome on Android devices, since WebRTC support is required (Figure F).

Figure F


JAM with Chrome, which is part of the Google Chrome Web Audio API Demos, allows you to play music online with your friends. The web application lets you select from 18 instruments, including various bass, electric, and acoustic guitars as well as several drum kits and keyboards. With each instrument, you can select Easy or Pro mode. Several Autoplay modes are also offered, along with six chords and knobs (depending on the instrument) that allow you to adjust the chorus, stereo delay, auto-wah, slapback delay, and more. In addition to the Web Audio API, the technology behind JAM includes WebSockets, HTML5 Canvas, CSS3, and the Go programming language (Figure G).

Figure G


Brian Rinaldi created the Retro Game Music effects demo, which is featured on his Flippin' Awesome site. His experiment is a recreation of the Legend of Zelda musical theme using band.js and the Web Audio API. Band.js, created by Cody Lundquist, is a music composer interface for the Web Audio API that supports rhythms, multiple instruments, repeating sections, and complex time signatures. The Zelda retro music code is also available on JSFiddle, where you can view the HTML, CSS, and JavaScript, as well as play, pause, or stop the sound from the live results (Figure H).
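Without pulling in band.js itself, the retro flavor of that demo can be approximated in plain Web Audio API code by scheduling square-wave notes against the context's clock; the notes and durations below are arbitrary placeholders (not the Zelda theme), and the context is assumed to exist already:

        // Schedule a short chiptune-style phrase with square-wave oscillators
        var notes = [440, 494, 523, 587];            // illustrative frequencies in Hz
        var noteLength = 0.25;                       // seconds per note
        var startTime = context.currentTime;

        notes.forEach(function (frequency, index) {
            var osc = context.createOscillator();
            osc.type = 'square';                     // classic retro-game timbre
            osc.frequency.value = frequency;
            osc.connect(context.destination);

            // Start and stop each note on the context's timeline
            osc.start(startTime + index * noteLength);
            osc.stop(startTime + (index + 1) * noteLength);
        });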

Figure H


What this means for web developers

The shift toward more web-delivered audio and video multimedia content continues to pick up steam, with online listening and viewing growing every day. Interfaces such as the HTML5 Web Audio API and other multimedia APIs are sure to see more implementations in the near future, especially as applications that use audio expand across the web and mobile devices.
