Laurence Cliffe

intermedia arts | creative technology | research through design

An audio augmented reality cinematic soundscape experience

Artist Janet Cardiff, who is perhaps best known for her soundwalks, along with sound studies theorist Michael Bull, both demonstrate how cinematic experiences can be realised through the augmenting of everyday life with mobile audio content delivery technologies [3, 4, 5].

The personal stereo listener, and more recently the iPod and iPhone listener, relies on serendipitous collisions between linear, non-dynamic audio content and the experience of everyday life, such as those depicted in Bull’s Sounding Out the City [5], for the realisation of what Bull describes as a filmic experience.

Cardiff’s soundwalks, along with other locative listening experiences, extend this filmic experience beyond the aestheticizing potential of linear listening’s serendipitous encounters. This is typically achieved by purposefully authoring interactive audio experiences that trigger specific segments of pre-recorded audio content based on the listener’s GPS coordinates.

Recent advances in Simultaneous Location and Mapping (SLAM) technology within mobile application authoring frameworks, together with the increasing availability of consumer-level headphones embedded with sensors, now make the authoring of Audio Augmented Reality (AAR) experiences possible.

AAR enables finer-grained audio augmentation of specific parts of a location or an architectural structure, as well as of the objects or artefacts within it. By determining the listener’s orientation, it also enables the authoring of interactive, spatialised binaural soundscapes for the listener to explore.
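The orientation-aware spatialisation described above can be illustrated with a minimal sketch. This is purely illustrative, not the installation’s implementation: the function name, the inverse-distance roll-off and the constant-power panning law are all assumptions, and a real AAR pipeline would render full binaural audio rather than simple stereo gains.

```python
import math

def stereo_gains(listener_pos, listener_yaw, source_pos):
    """Pan a mono source by its azimuth relative to the listener's facing direction."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    azimuth = math.atan2(dy, dx) - listener_yaw   # bearing relative to facing direction
    distance = math.hypot(dx, dy)
    attenuation = 1.0 / max(distance, 1.0)        # simple inverse-distance roll-off
    # Constant-power pan; under this sign convention a positive azimuth
    # pans the source towards the right channel.
    pan = math.sin(azimuth)                       # -1 (left) .. +1 (right)
    left = attenuation * math.cos((pan + 1) * math.pi / 4)
    right = attenuation * math.sin((pan + 1) * math.pi / 4)
    return left, right
```

A source directly ahead of the listener yields equal gains in both channels; as the listener turns, the gains shift accordingly, which is what allows head or device tracking to drive the soundscape.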

It is proposed that an AAR experience can go beyond the capabilities of linear mobile listening, and beyond the capabilities of GPS enabled soundwalks, in terms of reframing a listener’s perception of location. Furthermore, it is proposed that an immersive, live, cinematic and fictional encounter with a real world location can be experienced.

I would therefore like to invite listeners to enter into, and become composers of, their own horror movie soundscape, a live cinematic experience that invites exploration, promises intrigue, creates suspense and rattles a few nerves. A live cinematic experience where reality is the screen.

Adaptable to the circumstances of the current ‘lock-down’, it is proposed that this installation environment can be deployed remotely, allowing users to author experiences within their homes in which both they and their fellow housemates can participate.

It is proposed that audio archive content from the BBC’s Sound Effects Archive [6] (available to use for research purposes under the terms of the RemArc Licence [7]) will be used to provide audio content for the experience, along with repurposed, public domain horror movie soundtrack music from the Prelinger Archives [8] and specifically produced audio content.

Technical description

The application uses Simultaneous Location and Mapping (SLAM) technology which enables the placement and anchoring of virtual content (in this case digital audio content) within a real world location. Narrative elements and interactions within the experience can be authored in relation to the determined distance and angle between the user and the virtual audio content in both the real world and the system’s virtual world model. Users are able to attach virtual sound sources to specific locations, such as rooms within their homes as well as specific objects, such as radios, TVs, paintings and pieces of furniture, that will then act as entry-points, way-markers, features, characters or destinations within the interactive and narrative structure.
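The anchoring and distance-based triggering described above can be sketched as follows. This is an illustrative outline only: the class and field names are hypothetical, and a real deployment would use the SLAM framework’s own anchor types and the full 3D pose it reports.

```python
import math
from dataclasses import dataclass

@dataclass
class AnchoredSource:
    name: str                    # e.g. "radio in the kitchen" (hypothetical label)
    position: tuple              # world coordinates from the SLAM session
    radius: float                # distance at which the anchor triggers its narrative role

def active_anchors(listener_pos, anchors):
    """Return the anchors whose trigger radius the listener has entered."""
    def dist(a):
        return math.hypot(a.position[0] - listener_pos[0],
                          a.position[1] - listener_pos[1])
    return [a for a in anchors if dist(a) <= a.radius]
```

Each anchor returned can then act as an entry-point, way-marker, feature, character or destination, with its narrative behaviour authored against the listener’s distance and angle to it.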

Figure 1. An outline of the application’s architecture.

Figure 2. Prototype installation studies at the Science Museum along with participant feedback.

Access requirements for participation

It is proposed that Horror-Fi Me be deployed as an iPhone application, made available to download via Apple’s App Store or TestFlight. A supporting webpage can provide access and installation instructions and detail the minimum required technical specifications. The current prototype supports the iPhone SE 1st Generation (2016) or later, and the only additional equipment required is a standard set of stereo headphones. It is also proposed that the application be compatible with Bose AR enabled headphones as an additional option, enabling head tracking, rather than device tracking, to control the interactive spatialised audio content for users who own any of Bose’s AR enabled headphones or Frames [9].

For more info visit:


  1. Cliffe L, Mansell J, Cormac J, Greenhalgh C & Hazzard A. 2019. The Audible Artefact: Promoting Cultural Exploration and Engagement with Audio Augmented Reality. In Proceedings of the 14th International Audio Mostly Conference. New York: ACM Press.
  2. Cliffe L, Mansell J, Greenhalgh C & Hazzard A. 2020. Materialising contexts: virtual soundscapes for real-world exploration. Personal and Ubiquitous Computing. New York: Springer.
  3. Aceti L. 2013. Not Here, Not There: An Analysis of an International Collaboration to Survey Augmented Reality Art. Leonardo Electronic Almanac, 19(1), pp. 1–16.
  4. Hazzard A, Spence J, Greenhalgh C & McGrath S. 2017. The Rough Mile. In Proceedings of the 12th International Audio Mostly Conference, pp. 1–8. New York: ACM Press.
  5. Bull M. 2000. Sounding Out the City: Personal Stereos and the Management of Everyday Life. Oxford and New York: Berg.
  6. BBC. 2020. BBC Sound Effects. Available at:
  7. BBC. 2020. Terms of use for the BBC’s digital services. Available at:
  8. Internet Archive. 2020. Prelinger Archives. Available at:
  9. Bose. 2020. Introducing Bose AR: Audio-first approach to augmented reality. Available at: