Posted on: July 22, 2020
Artist Janet Cardiff, perhaps best known for her soundwalks, and sound studies theorist Michael Bull both demonstrate how cinematic experiences can be realised by augmenting everyday life with mobile audio content delivery technologies [3, 4, 5].
Personal stereo listeners, and more recently, and with greater efficiency, iPod and iPhone listeners, rely on serendipitous collisions between linear, non-dynamic audio content and their experiences of everyday life, such as those depicted in Bull’s Sounding Out the City, to realise what is described as a filmic experience.
Cardiff’s soundwalks, along with other locative listening experiences, extend this filmic experience beyond the aestheticizing potential of linear listening’s serendipitous encounters. This is typically achieved by purposely authoring interactive audio experiences which trigger specific segments of pre-recorded audio content based on a listener’s GPS coordinates.
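The GPS-triggered approach described above amounts to a geofence check: each pre-recorded segment is authored against a coordinate and a trigger radius, and a segment fires when the listener's reported position falls inside its zone. The sketch below illustrates this under assumed names and an assumed authoring format (a list of name/latitude/longitude/radius tuples); it is not the format any particular soundwalk platform uses.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    R = 6371000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def triggered_segments(listener, zones):
    """Return the names of audio segments whose trigger zone contains the listener.

    `zones` is a list of (segment_name, lat, lon, radius_m) tuples --
    a hypothetical authoring format for a GPS-triggered soundwalk.
    """
    lat, lon = listener
    return [name for name, zlat, zlon, r in zones
            if haversine_m(lat, lon, zlat, zlon) <= r]
```

Because raw GPS positions are only accurate to several metres, real soundwalks typically use generous radii and crossfades rather than hard cuts at the zone boundary.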
Recent advances in Simultaneous Localisation and Mapping (SLAM) technology within mobile application authoring frameworks, together with the increasing availability of consumer-level headphones with embedded sensors, have made the authoring of Audio Augmented Reality (AAR) experiences possible.
AAR enables the finer-grained audio augmentation of specific parts of a location, or an architectural structure, as well as the objects or artefacts within. It also enables, through the determination of the listener’s orientation, the authoring of interactive and spatialised, binaural soundscapes for the listener to explore.
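As a rough illustration of how listener orientation feeds into a spatialised soundscape, the sketch below computes a virtual source's azimuth relative to the listener's heading, then derives left/right gains from an equal-power pan law. This is a deliberately crude stand-in for the HRTF-based binaural rendering a real AAR engine would perform, and the planar (x, z) simplification and all names here are assumptions for illustration.

```python
import math

def relative_azimuth(listener_pos, listener_heading_deg, source_pos):
    """Angle of a virtual source relative to the listener's facing direction,
    in degrees (-180..180; negative = left, positive = right).

    Positions are (x, z) metres in a local planar frame, with heading 0
    meaning the listener faces the +z axis -- a 2-D simplification of the
    full orientation data a sensor-equipped headset provides.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    bearing = math.degrees(math.atan2(dx, dz))
    return (bearing - listener_heading_deg + 180) % 360 - 180

def equal_power_pan(rel_azimuth_deg):
    """(left_gain, right_gain) from an equal-power pan law.

    Clamps to the frontal arc: -90 deg (hard left) .. +90 deg (hard right).
    """
    a = max(-90.0, min(90.0, rel_azimuth_deg))
    theta = (a + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)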
It is proposed that an AAR experience can go beyond the capabilities of linear mobile listening, and beyond the capabilities of GPS enabled soundwalks, in terms of reframing a listener’s perception of location. Furthermore, it is proposed that an immersive, live, cinematic and fictional encounter with a real world location can be experienced.
I would therefore like to invite listeners to enter into, and become composers of, their own horror movie soundscape, a live cinematic experience that invites exploration, promises intrigue, creates suspense and rattles a few nerves. A live cinematic experience where reality is the screen.
Adaptable to the circumstances of the current ‘lock-down’, it is proposed that this installation environment can be remotely deployed, allowing users to author experiences within their homes that both they and their fellow housemates can participate in.
It is proposed that audio archive content from the BBC’s Sound Effects Archive (available to use for research purposes under the terms of the RemArc Licence) will be used to provide audio content for the experience, along with repurposed, public domain horror movie soundtrack music from the Prelinger Archives and specifically produced audio content.
The application uses SLAM technology, which enables the placement and anchoring of virtual content (in this case digital audio content) within a real world location. Narrative elements and interactions within the experience can be authored in relation to the determined distance and angle between the user and the virtual audio content, in both the real world and the system’s virtual world model. Users are able to attach virtual sound sources to specific locations, such as rooms within their homes, as well as to specific objects, such as radios, TVs, paintings and pieces of furniture, which will then act as entry-points, way-markers, features, characters or destinations within the interactive and narrative structure.
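The distance-based narrative triggering described above might be modelled as follows: each anchored sound source carries a role (entry-point, way-marker, destination, and so on) and a trigger radius, and a linear narrative sequence advances only when the user physically walks within the next anchor's radius. The data model and function names are illustrative assumptions, not the application's actual code.

```python
from dataclasses import dataclass

@dataclass
class AudioAnchor:
    """A virtual sound source pinned to a real-world position -- a
    hypothetical model of the anchors a SLAM session maintains."""
    name: str
    position: tuple       # (x, y, z) metres in the SLAM world frame
    role: str             # e.g. "entry-point", "way-marker", "destination"
    trigger_radius: float = 2.0

def within(anchor, user_pos):
    """True if the user is inside the anchor's trigger radius."""
    dx, dy, dz = (a - u for a, u in zip(anchor.position, user_pos))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= anchor.trigger_radius

def advance(sequence, user_pos, current):
    """Advance a linear narrative: the next anchor in `sequence` fires
    only when the user walks inside its trigger radius."""
    if current + 1 < len(sequence) and within(sequence[current + 1], user_pos):
        return current + 1
    return current
```

Branching structures could replace the linear list with a graph of anchors, with the same proximity test deciding which edge the user has followed.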
Figure 1. An outline of the application’s architecture.
Figure 2. Prototype installation studies at the Science Museum along with participant feedback.
Access requirements for participation
It is proposed that the experience be deployed as an iPhone application, made available to download via Apple’s App Store or TestFlight. A supporting webpage can provide access and installation instructions, and detail the minimum required technical specifications. The current prototype application supports the iPhone SE 1st Generation (2016) model or higher, and the only additional equipment required is a standard set of stereo headphones. It is also proposed that the application will be compatible with Bose AR enabled headphones as an additional user option, which would enable headtracking, rather than device tracking, for the control of the interactive spatialised audio content for those users in possession of any of Bose’s AR enabled headphones or Frames.