Posted on: October 15, 2018
I developed a mobile, headphone-mounted interface that tracks the orientation of the listener’s head and delivers dynamic, real-time spatialised audio. This means that a sound can be positioned anywhere within a 360-degree audio field, and that its perceived position remains fixed in physical space as the listener’s head and ears move.
This was initially prototyped by mapping the magnetometer readings from an iPhone to OSC (Open Sound Control) messages using the GyrOSC [1] mobile app. These messages controlled the Envelop [2] surround sound panner, inserted on a single track playing a piece of music in Ableton Live, with the music delivered over wireless headphones to allow the listener to rotate freely through 360 degrees.
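For illustration, the same bridging could also be scripted directly. The minimal sketch below (Node.js with the publicly available osc package) listens for GyrOSC’s compass messages and rebroadcasts the heading as a normalised panner azimuth; the /gyrosc/comp address, the ports and the outgoing /panner/azimuth address are assumptions to be adapted to the actual GyrOSC and panner mappings.

// Minimal sketch: forward GyrOSC compass data as a panner azimuth message.
// Assumes GyrOSC's /gyrosc/comp address; the outgoing /panner/azimuth
// address is a placeholder for whatever the surround panner listens on.
var osc = require("osc");

var udpPort = new osc.UDPPort({
  localAddress: "0.0.0.0",
  localPort: 9000,           // port set as the target in the GyrOSC app
  remoteAddress: "127.0.0.1",
  remotePort: 9001           // port the panner/DAW listens on
});

udpPort.on("message", function (msg) {
  if (msg.address === "/gyrosc/comp") {
    var heading = msg.args[0]; // compass heading in degrees (0-360)
    udpPort.send({
      address: "/panner/azimuth",
      args: [{ type: "f", value: heading / 360 }] // normalise to 0-1
    });
  }
});

udpPort.open();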
Using the above prototype, the music’s perceived source in physical space could be located quite accurately by spinning around on an office chair, or by standing up and shuffling around on the spot. This was the case for both on-ear headphones and bone-conducting headphones, the latter producing, to varying degrees, the perception of a virtual audio source within the natural ambience of the physical space.
Because the panner was controlled by the hand-held magnetometer in the iPhone, its failure to react to movements of just the head (and the ears) quickly became apparent; this had also been experienced previously with other smartphone-sensor-based mobile sound experiences. It was therefore decided that a head-tracking device would perform much better.
Alongside the above experiment, I had been exploring the use of Bluetooth beacons as a way of deploying physical markers in space, either for positioning virtual audio sources or for tracking the position of listeners.
Other than reading a beacon’s RSSI (Received Signal Strength Indicator) value as a proxy for its proximity to a smartphone via a Bluetooth scanning app, most other features proved difficult or impossible to carry out. These included estimating distance from multiple beacon sources (known as ‘ranging’) and adjusting a beacon’s TX power, Major and Minor values. This is because most beacons are ‘locked’, via their UUIDs, into their manufacturers’ APIs in order to fulfil the IPS (Indoor Positioning Service) offered to customers who purchase them.
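For reference, ranging is usually derived from RSSI with a log-distance path-loss model based on the beacon’s calibrated TX power (its RSSI at one metre). A minimal sketch is below; the path-loss exponent is an assumed, environment-dependent value, not a calibrated one.

// Rough distance estimate from RSSI using the log-distance path-loss model.
// txPower is the beacon's calibrated RSSI at 1 m; n is a path-loss exponent
// (roughly 2 in free space, higher indoors). Values here are illustrative.
function estimateDistance(rssi, txPower, n) {
  n = n || 2.5;
  return Math.pow(10, (txPower - rssi) / (10 * n)); // metres
}

// e.g. a beacon advertising txPower -59 dBm, measured at -75 dBm:
console.log(estimateDistance(-75, -59)); // ~4.4 m, very approximate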
It also became clear that scanning, receiving and forwarding the RSSI values of multiple beacons to a central DAW (Digital Audio Workstation), or some other kind of audio engine, would require a BLE (Bluetooth Low Energy) gateway, or GATT (Generic Attribute Profile) server. Again, there are mobile apps available that can do this, but most, if not all, are specific to a particular beacon manufacturer.
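As a sketch of that gateway role, a programmable beacon could itself scan and forward RSSI values. The example below uses Espruino’s NRF.setScan on a Puck.js (introduced below) and relays readings as JSON lines over its Bluetooth UART; the watched device IDs and the message format are assumptions.

// Minimal gateway sketch (Espruino): scan for advertising packets and
// forward each watched device's RSSI over the Bluetooth UART as a JSON
// line, for a central machine to pass on to an audio engine.
// The beacon addresses below are made-up placeholders.
var beacons = ["e1:23:45:67:89:ab random", "c9:87:65:43:21:fe random"];

NRF.setScan(function (device) {
  if (beacons.indexOf(device.id) >= 0) {
    Bluetooth.println(JSON.stringify({ id: device.id, rssi: device.rssi }));
  }
});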
This emerging investigation into BLE networks, with both central and peripheral roles for BLE beacon devices, led me to the Puck.js [3], which presented itself as a potentially useful and versatile research tool in this area.
The Puck.js is a JavaScript-programmable BLE beacon. Built around a Nordic BLE chip, it is programmed using Espruino, an open-source JavaScript interpreter. As well as being a BLE beacon, Puck.js can be programmed to act as a GATT server, scan for other BLE devices, and send and receive data over a Bluetooth connection. It is approximately 35mm in diameter and around 8mm thick, ships with an on-board magnetometer, ambient light sensor and temperature sensor, and has a physical push-button on its surface. It can also be expanded with other sensors and features, such as GPS.
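As an indication of how directly these values can be configured, the minimal sketch below uses Espruino’s documented ble_ibeacon module to have a Puck.js advertise as an iBeacon, exactly the kind of adjustment that proved impossible with the locked commercial beacons above. The UUID, Major, Minor and calibrated RSSI values are arbitrary examples.

// Minimal sketch: advertise the Puck.js as an iBeacon using Espruino's
// ble_ibeacon module. All values here are arbitrary examples.
require("ble_ibeacon").advertise({
  uuid: [0x4d, 0x9f, 0x3c, 0x6e, 0x50, 0x52, 0x4f, 0x54,
         0x4f, 0x54, 0x59, 0x50, 0x45, 0x30, 0x30, 0x31], // 16-byte UUID
  major: 1,   // e.g. a room number
  minor: 42,  // e.g. a beacon number within that room
  rssi: -59   // calibrated RSSI at 1 m
});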
The small, lightweight and mobile Puck.js enabled me to test a headphone- or head-mountable magnetometer, and opened up further avenues of experimentation, such as listener positioning, scanning for the proximity of other beacons, and relaying data to and from other BLE-capable devices (Figure 1).
Figure 1. The Puck.js mounted on top of a set of Bluetooth headphones.
<html>
<head>
  <script src="https://www.puck-js.com/puck.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/resonance-audio/build/resonance-audio.min.js"></script>
  <style>
    button { width:200px; padding:20px; margin-bottom:10px; }
  </style>
</head>
<body>
  <button id="connectButton">CONNECT</button><br/>
  <button id="playButton">PLAY</button>
  <p id="magValue"></p>
  <script type="text/javascript">
    var connectButton = document.getElementById("connectButton");
    var playButton = document.getElementById("playButton");

    // Create an AudioContext
    let audioContext = new AudioContext();

    // Create a (first-order Ambisonic) Resonance Audio scene and pass it
    // the AudioContext.
    let resonanceAudioScene = new ResonanceAudio(audioContext);

    // Connect the scene's binaural output to stereo out.
    resonanceAudioScene.output.connect(audioContext.destination);

    // Define room dimensions.
    // By default, room dimensions are undefined (0m x 0m x 0m).
    let roomDimensions = {
      width: 10,
      height: 3.5,
      depth: 10,
    };

    // Define materials for each of the room's six surfaces.
    // Room materials have different acoustic reflectivity.
    let roomMaterials = {
      // Room wall materials
      left: 'grass',
      right: 'grass',
      front: 'grass',
      back: 'grass',
      // Room floor
      down: 'grass',
      // Room ceiling
      up: 'transparent',
    };

    // Add the room definition to the scene.
    resonanceAudioScene.setRoomProperties(roomDimensions, roomMaterials);

    // Create an AudioElement.
    let audioElement = document.createElement('audio');

    // Load an audio file into the AudioElement.
    audioElement.src = 'resources/EMR_recording_samples.mp3';

    // Generate a MediaElementSource from the AudioElement.
    let audioElementSource = audioContext.createMediaElementSource(audioElement);

    // Add the MediaElementSource to the scene as an audio input source.
    let source = resonanceAudioScene.createSource();
    audioElementSource.connect(source.input);

    // Called when we get a line of data
    function onLine(v) {
      console.log("Received: " + JSON.stringify(v));
      // Parse the compass bearing (radians) sent by the Puck.js
      var bearing = parseFloat(v);
      // Display source position value
      document.getElementById("magValue").innerHTML = "Source Position: " + v;
      // Set the source position relative to the user's compass bearing
      source.setPosition(Math.cos(bearing), 0, Math.sin(bearing));
    }

    // When clicked, connect or disconnect
    var connection;
    connectButton.addEventListener("click", function() {
      if (connection) {
        connection.close();
        connection = undefined;
      }
      Puck.connect(function(c) {
        if (!c) {
          alert("Couldn't connect!");
          return;
        }
        connection = c;
        // Handle the data we get back, and call 'onLine'
        // whenever we get a line
        var buf = "";
        connection.on("data", function(d) {
          buf += d;
          var i = buf.indexOf("\n");
          while (i >= 0) {
            onLine(buf.substr(0, i));
            buf = buf.substr(i + 1);
            i = buf.indexOf("\n");
          }
        });
      });
    });

    // Play the audio
    playButton.onclick = function(event) {
      audioElement.play();
    };
  </script>
</body>
</html>
Figure 2. The progressive web application source code
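For completeness, a companion sketch along these lines would run on the Puck.js itself, streaming a bearing derived from the on-board magnetometer as the newline-terminated values the web application reads. The 5 Hz sample rate and the uncalibrated atan2 bearing are simplifying assumptions; per-device offsets would need calibrating in practice.

// Companion sketch running on the Puck.js (Espruino): stream a compass
// bearing, derived from the on-board magnetometer, as newline-terminated
// text over the Bluetooth UART connection that the web application reads.
Puck.magOn(5); // sample the magnetometer at 5 Hz

Puck.on('mag', function (xyz) {
  // Bearing in radians in the horizontal plane; assumes the Puck is
  // mounted flat on top of the headband, as in Figure 1.
  var bearing = Math.atan2(xyz.y, xyz.x);
  Bluetooth.println(bearing);
});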
Through Espruino’s GATT server module, the Puck.js also enables experimentation with listener-mounted beacons acting as either central or peripheral devices within a BLE network.
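A minimal sketch of that GATT server role is below, using Espruino’s NRF.setServices and NRF.updateServices; the 128-bit UUIDs are made-up placeholders, and the single notifying characteristic stands in for whatever data a headset would actually share.

// Minimal sketch: expose a custom GATT service from the Puck.js.
// The UUIDs below are made-up placeholders.
NRF.setServices({
  "f8b23a4d-89ad-4220-8c9f-d81756009f0c": {
    "f8b23a4d-89ad-4220-8c9f-d81756009f0d": {
      value: [0],    // initial value
      maxLen: 4,
      readable: true,
      notify: true   // allow connected centrals to subscribe to updates
    }
  }
});

// Later, push a new value to any subscribed central:
NRF.updateServices({
  "f8b23a4d-89ad-4220-8c9f-d81756009f0c": {
    "f8b23a4d-89ad-4220-8c9f-d81756009f0d": {
      value: [42],
      notify: true
    }
  }
});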
This head-mounted position could also prove useful for indoor positioning and for communication between participants’ headsets, as it may eliminate much of the interference from human traffic that Bluetooth signals are prone to, especially when communicating with other beacons positioned directly overhead or attached to a ceiling. With this in view, the device could eventually be embedded in the headband of the headphones, or provided as a clip-on accessory.
Additional prototypes were made using Espruino’s BLE MIDI module and Web Bluetooth capabilities: the first sent the calibrated magnetometer data as MIDI control messages over Bluetooth to a DAW, and the second sent it to a Bluetooth-enabled web application. The latter utilised a combination of Web Bluetooth and the Web Audio API to realise a standalone, mobile, dynamic and responsive surround sound experience (Figure 2). This was achieved by mapping the Puck.js’ magnetometer data to the Web Audio API’s PannerNode parameters. Although this approach was successful in moving the audio source dynamically around the listener, the Web Audio API also provides an AudioListener interface with orientation properties, and it would make better sense, both semantically and functionally, to use this going forward.
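A sketch of that suggested change is below: rather than orbiting the source around a fixed listener, the head bearing received from the Puck.js rotates the AudioContext’s AudioListener. The helper name is an assumption, as is the fixed up-vector, which presumes the listener stays upright.

// Sketch of the suggested approach: rotate the Web Audio AudioListener
// with the head bearing instead of moving the source. 'audioContext' is
// assumed to be the AudioContext from the Figure 2 application, and
// 'bearing' the value (in radians) parsed in its onLine() handler.
function setListenerBearing(audioContext, bearing) {
  // Forward vector in the horizontal plane; up vector fixed to +Y.
  audioContext.listener.setOrientation(
    Math.cos(bearing), 0, Math.sin(bearing), // forward (x, y, z)
    0, 1, 0                                  // up (x, y, z)
  );
}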
Figure 3. A sketch of the prototype system.
Potential future studies
To summarise, some of the possible future avenues of study this initial prototyping has revealed include:
All of the above objectives could potentially form part of one carefully designed study.
Things that require attention
Some points that need addressing based on the use and evaluation of the outlined prototypes are:
Refinement and clarification of my research area
The outlined prototyping activities have led me to think about several possible refinements, and some clarifications, of my area of research:
Potential practice-based studies
This focus has given the practice-based aspect of my research more clarity too. I can now identify potential applications and target collaborative practice, with the developing framework built around them. These include:
Notes
References