Posted on: July 18, 2018
This proposal outlines an artist-led and practice-based research project that intends to develop a framework through which visitors to a series of site-specific installations and interactive performance spaces can perform and create personal, collaborative and interactive musical experiences.
The resulting framework would be intended for use by electronic music performers, sound artists and DJs, and aims to help these users realise collaborative and interactive shared musical performances and experiences for their audiences and participants.
In relation to this, the criteria for the success of the project could be framed as:
The project intends to make a contribution to knowledge and learning through the following:
Within the disciplines of cultural and sound studies, there exists an opinion that recordings of location-based noise, which combine to form a soundscape, have the potential to offer us a different way of understanding our environments, and include elements that visual landscape recordings, as traditionally represented by photography and painting, may be less capable of revealing (Attali, 2009 & Thompson, 2017).
Key to this idea is the physical nature of sound and the way in which it can travel largely uninhibited from its source to its receiver, giving away information about its origins, making it less easy to frame, to tame, or to present in any way other than that which it is (Attali, 2009 & Schafer, 1993). Similarly, both Thompson (2017) and Isofat (2009) recognise sound’s ubiquitous nature, concluding ‘sounds cannot be ignored directionally in the way that sight can be directed in space’ (Isofat, 2009, p.48).
Within music there is a rich history of incorporating noise to create new and innovative musical styles and genres (Thompson, 2017). We could therefore conclude that it is this reliable, reflective and revealing nature of noise that provides new music with some of its innovative qualities, giving it its relevance, its urgency and its excitement. The French composer Edgard Varèse described music as ‘organised sound’ and defined the role of the musician as that of ‘a worker in rhythms, frequencies and intensities’ (Varèse & Wen-chung, 1966).
Additionally, both Cage (1961) and Attali (2009) point towards the empowerment of the listener as a performer, and the embracing of the indeterminate nature of found sounds, or noises, as being key compositional factors in the realisation of new musical experiences.
With this view, we could ask the question: How can we organise the noises from our soundscapes to create innovative musical experiences, musical experiences that are not only specific to a location, but also promote a connection between the listener and the location?
Along with rich and varied soundscapes, our contemporary environments are also awash with data and potential data sources, such as wi-fi signals, mobile network activity, travel and weather data, along with the personal data sources of the location’s occupants. Just as we can think of all the sounds and noises that are present at a specific location as being a soundscape, we could perhaps think of all the data that is present at a specific location, including the personal data elements of its inhabitants, as being the datascape.
In relation to the evolution of headphone-based interactive musical experiences, Dobda (2013) details how a greater level of musical interaction and a more personalised musical experience can be facilitated by the inclusion of personal data elements such as biofeedback and location data. This, he suggests, can be used to realise end-listener mixing which, he goes on to conclude, would better integrate listeners within the work and improve their connection to it.
There have been several projects in recent years that have developed frameworks for turning soundscapes into music in real-time, namely Vawter (2006) (Figure 1) and Gaye et al. (2003) (Figure 2). I propose that such frameworks could be used as a foundation to consider ways of generating music from the datascape as well.
Figure 1. Vawter’s (2006) graph illustrating the categorisation of noises within a soundscape.
Figure 2. Gaye et al.’s (2003) diagram illustrating the mapping of discrete and continuous soundscape factors to musical parameters.
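The kind of mapping Gaye et al. (2003) diagram can be sketched in a few lines of code. This is an illustrative sketch only, not their implementation: the sensor names, input ranges and musical targets below are all assumptions chosen for the example.

```python
# Hypothetical sketch: mapping continuous and discrete "scape" factors to
# musical parameters, in the spirit of Gaye et al.'s (2003) diagram.
# Sensor names, ranges and targets are illustrative assumptions.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map a continuous sensor reading onto a musical range."""
    value = max(in_min, min(in_max, value))  # clamp to the input range
    ratio = (value - in_min) / (in_max - in_min)
    return out_min + ratio * (out_max - out_min)

# Continuous factor: ambient noise level (dB) -> filter cutoff (Hz)
cutoff_hz = scale(65.0, 30.0, 90.0, 200.0, 8000.0)

# Discrete factor: a detected sound event triggers a one-shot sample
EVENT_SAMPLES = {"footsteps": "perc_01.wav", "traffic": "drone_03.wav"}

def on_event(event_name):
    """Return the sample mapped to a discrete event, or None if unmapped."""
    return EVENT_SAMPLES.get(event_name)
```

The continuous branch (clamp, normalise, rescale) is the standard pattern for driving synthesis parameters from sensor data; the discrete branch is a simple lookup, which is how one-off soundscape events might trigger samples.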
It is perhaps through a combination of both soundscape and datascape organisation that innovative and interactive musical experiences could be created that are unique to, and representative of, the listener and their location. This conclusion is drawn largely from Cage’s (1961), Thompson’s (2017) and Attali’s (2009) suggestions that it is the inclusion and appropriation of noise that drives musical innovation and creates new musical genres.
Additionally, and in relation to this, noise, defined as an unwanted disturbance, presents an interesting opposition common to both soundscape and datascape (music vs. noise and signal vs. noise) and hints at the creative possibilities of the ground in between.
The hypothesis is that noise, in both its data and aural forms, could be used as a raw material for the construction of interactive musical performances, rather than being discarded, filtered or ‘cleaned’.
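This hypothesis can be illustrated with a minimal sketch: a conventional filter separates a sensor stream into a smoothed ‘signal’ and a noisy residual, but instead of discarding the residual it is kept as a musical control. The smoothing constant, example readings and the mapping to a modulation depth are all assumptions made for the example.

```python
# Sketch of the hypothesis: rather than discarding the noisy residual of a
# sensor stream, retain it as musical raw material. The smoothing constant
# and the data values are illustrative assumptions.

def split_signal_and_noise(samples, alpha=0.2):
    """Exponential smoothing: return (smoothed 'signal', residual 'noise')."""
    smoothed, signal, noise = samples[0], [], []
    for s in samples:
        smoothed = alpha * s + (1 - alpha) * smoothed
        signal.append(smoothed)
        noise.append(s - smoothed)  # what filtering would normally discard
    return signal, noise

readings = [10.0, 10.2, 9.9, 14.0, 10.1]  # e.g. a jittery sensor stream
signal, noise = split_signal_and_noise(readings)
mod_depth = max(abs(n) for n in noise)  # noisier data -> deeper modulation
```

Here the ‘cleaned’ signal could drive a stable musical parameter while the residual, usually thrown away, drives something expressive such as modulation depth.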
The sound artist Alvin Lucier’s concept of architecture as an acoustic lens (Lucier in Kelly, 2011), although largely related to soundwaves and the absorbing and reflective qualities of materials, also refers more generally to the dimensions and structure of architectural space, and echoes Gaye et al.’s (2003) concept of the city, or architectural structure, as interface. This concept is also explored in Lucier’s installations I Am Sitting In A Room (1969), Quasimodo the Great Lover (1970) and Vespers (1972), the latter of which is described as ‘a sonic architectural portrait’ (Collins, 2010, p.98).
Figure 3. A still from a video showing a performance of Lucier’s Vespers, in which multiple participants can be seen interacting and performing with echo-location devices.
Additionally, Brejzek (2010) suggests that the increase in networked social activities has altered our perception of space as being solid, and that social spaces are now thought of as ‘mixed reality playgrounds’. This analogy of social space as playground nicely introduces silent disco, another interesting and inspiring point of reference, particularly with regard to Dobda’s (2013) description of Austin Silent Disco’s multi-channel prototype. Dobda (2013) presents a prototype that extends the possibilities of end-user mixing well beyond the selection of one of three broadcast channels, by introducing volume controls for each channel, plus two receiving channels. This essentially enables listeners to personalise their listening experience by mixing different amounts of the various input sources together. Additionally, the introduction of coloured LEDs to indicate, and effectively broadcast to other listeners, a listener’s currently active channel represents an interesting participatory and collaborative development of this technology.
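The core of such end-user mixing, each listener weighting several broadcast channels by a personal volume setting, reduces to a weighted sum of sample streams. The following is a minimal sketch under assumed channel names and toy sample values, not a description of the Austin Silent Disco hardware.

```python
# Illustrative sketch of multi-channel end-user mixing: each listener sets a
# per-channel volume and the receiver sums the weighted channels.
# Channel names and sample values are assumptions for the example.

def mix(channels, volumes):
    """Mix parallel sample streams, weighting each by its listener-set volume."""
    length = min(len(stream) for stream in channels.values())
    out = []
    for i in range(length):
        sample = sum(volumes.get(name, 0.0) * stream[i]
                     for name, stream in channels.items())
        out.append(max(-1.0, min(1.0, sample)))  # clip to the valid range
    return out

channels = {"dj_a": [0.5, 0.5], "dj_b": [0.2, -0.2], "ambient": [0.1, 0.1]}
volumes = {"dj_a": 1.0, "dj_b": 0.5, "ambient": 0.0}  # one listener's mix
personal_mix = mix(channels, volumes)
```

Each listener holds their own `volumes` dictionary, so the same broadcast channels yield a different mix per participant, which is exactly the personalisation the prototype demonstrates.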
Figure 4. An example of a multi-channel silent disco event with colour-coded audio channels.
The French cultural theorist Guy Debord’s influence stems from his psychogeographical theory of urban exploration, the dérive, and his representation of the experience of location, the sonic equivalent of which Isofat (2009) refers to as psychosonography.
Isofat (2009) asserts that the recording of a location’s soundscape functions as a foundation for psychosonography, ‘a source of data for closer examination’ (Isofat, 2009, p.49). However, he recognises that a recording of the soundscape alone cannot fulfil the expressive condition he sets forth, which he defines as ‘a sonic representation of place as an expression of a mental image’ (Isofat, 2009, p.48).
Isofat (2009) goes on to suggest the use of musical instruments as a means by which the soundscape can be complemented, or augmented. I would suggest that the study and subsequent expressive sonification of location-based data sources, through the use of instrumentation and sounds that convey meaning associated with place, offers another dimension through which place can be represented.
Sonic City was realised in an era prior to the smartphone, and relied upon a laptop-based technical solution with interfaces, micro-controllers, microphones, various sensors and their connecting wires, which, as Gaye et al. (2003) point out, hindered usability, long-term testing and technical reliability.
In terms of both individual and collaborative musical interaction, it is anticipated that, as Dobda (2013) suggests, it will be participants’ movement in space and their proximity to other participants, and the effect these have on the data sources, that will define their personal data and, in turn, their musical experience.
In relation to the above, and to the project’s interior, installation-based context, the research and development of an effective Indoor Positioning System (IPS) would be required. With GPS proving weak and ineffective indoors (He & Shin, 2017), previous and related installation examples, such as Embodied iSound (Gimenes et al., 2016), have relied largely on iBeacon-based Bluetooth systems (Figures 5 & 6).
Figure 5. Installation plan for Embodied iSound (Gimenes et al., 2016), showing the iBeacon-based Indoor Positioning System (IPS).
With a reported accuracy of around 2 metres (He & Shin, 2017), an iBeacon system has limited granularity and would therefore provide only limited musical control within the context of the proposed framework. An IPS combining Bluetooth technologies with wi-fi signal or geomagnetic mapping, as documented by He & Shin (2017) and as applied commercially by IndoorAtlas (2018), may however enable more accurate positioning, appropriate for fine-grained musical control based on participants’ positions and trajectories.
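To illustrate why Bluetooth beacon positioning is coarse, the usual approach estimates distance from received signal strength (RSSI) via the log-distance path-loss model; small fluctuations in RSSI translate into large distance errors. The calibration values below (transmit power at 1 m, path-loss exponent) are assumed for the example and would need measuring in a real installation space.

```python
# Hedged sketch: estimating distance from a beacon's received signal
# strength (RSSI) using the log-distance path-loss model.
# tx_power (expected RSSI at 1 m) and the path-loss exponent n are
# assumed values that would need calibrating for a real space.

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Estimate distance in metres from an RSSI reading (in dBm)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

d_near = rssi_to_distance(-59.0)  # reading equal to the 1 m calibration value
d_far = rssi_to_distance(-79.0)   # 20 dB weaker
```

Because distance depends exponentially on RSSI, a few dB of radio noise can shift the estimate by metres, which is consistent with the roughly 2-metre accuracy reported for iBeacon systems and motivates fusing Bluetooth with wi-fi or geomagnetic mapping.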
Moreover, within the context of a site-specific installation, it may be possible to undertake extensive mapping and treatment of the performance space to improve the performance of a system such as IndoorAtlas. This could include the effective positioning and calibration of a supporting wi-fi network, or customising the magnetic signature of the space to best suit the system, rather than relying on an existing network infrastructure or the default structural geomagnetic signature.
Figure 6. The Embodied iSound installation in use (Gimenes, 2017).
With the realisation of an effective IPS, it becomes possible to view participants as Dynamic Musical Objects (DMOs) within the interface of the installation environment, each with their own set of data, sonification parameters and audio preferences that affect and interact with the data sources of the space. This directly reflects the more traditional concept of the DMO as a data package containing both audio and metadata, which enables the production and consumption of innovative musical experiences (Perez-Carrillo et al., 2016).
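A participant-as-DMO could be modelled as a small data structure bundling IPS position, sonification parameters and audio preferences, with inter-participant proximity available as a musical control input. This is a hypothetical sketch; all field names are assumptions, not Perez-Carrillo et al.’s schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a participant modelled as a Dynamic Musical Object
# (DMO): positional data, sonification parameters and audio preferences
# bundled together. All field names are illustrative assumptions.

@dataclass
class ParticipantDMO:
    participant_id: str
    position: tuple                                     # (x, y) from the IPS, in metres
    sonification: dict = field(default_factory=dict)    # e.g. {"instrument": "pad"}
    preferences: dict = field(default_factory=dict)     # e.g. per-channel volumes

    def distance_to(self, other):
        """Proximity between two DMOs, a candidate musical control input."""
        dx = self.position[0] - other.position[0]
        dy = self.position[1] - other.position[1]
        return (dx * dx + dy * dy) ** 0.5

a = ParticipantDMO("a", (0.0, 0.0))
b = ParticipantDMO("b", (3.0, 4.0))
proximity = a.distance_to(b)
```

Keeping the participant’s data, parameters and preferences in one object mirrors the DMO idea of a self-describing package of audio and metadata, and makes pairwise proximity trivial to compute as positions update.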
In terms of the collaborative production of the musical experience, this concept offers some exciting opportunities. For example, Thalmann et al. (2016) present the idea of dynamic spatialisation using compass and geolocation controls mapped to participants’ orientation, which could potentially enable listeners to walk around a virtual musical space. Within the context of a collaborative and interactive musical production environment, each participant’s position and trajectory in physical space could become a dynamic sound source within the virtual soundscape, thus facilitating collaborative and interactive musical performance. This concept is also mentioned by Dobda (2013), who gives it the term active spatial sound.
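A minimal form of such dynamic spatialisation can be sketched as follows: given a listener’s position and compass heading, another participant’s position is converted to a bearing and panned with a standard equal-power law. The coordinate convention and example positions are assumptions for the sketch, not Thalmann et al.’s implementation.

```python
import math

# Illustrative sketch of dynamic spatialisation / "active spatial sound":
# another participant's position becomes a sound source, panned relative to
# the listener's compass heading. Coordinates and headings are assumptions.

def pan_gains(listener_pos, listener_heading_deg, source_pos):
    """Equal-power (left, right) gains for a source, given the listener's heading."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))            # 0 deg = straight ahead
    relative = (bearing - listener_heading_deg + 180) % 360 - 180
    pan = max(-1.0, min(1.0, relative / 90.0))            # -1 hard left, +1 hard right
    angle = (pan + 1) * math.pi / 4                       # equal-power panning law
    return math.cos(angle), math.sin(angle)

# A source 45 degrees to the listener's right
left, right = pan_gains((0.0, 0.0), 0.0, (1.0, 1.0))
```

As the listener turns (changing `listener_heading_deg`) or either party moves, the gains update, so participants effectively walk and turn within a virtual soundscape built from each other’s positions.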
I aim to take an artist-led and practice-based approach to my research, where artistic practice can provide research data obtained from the ethnographic study of audience and participant interaction with deployed sound installations and performances.
Evaluation of this data, which would include the results of both qualitative and quantitative approaches (for example: observations, surveys, semi-structured interviews, workshops, videos, field notes and data logs), can then inform my ongoing research and contribute to an understanding of how the deployed technologies and experiences perform ‘in the wild’ with real audiences and participants.
Potential theories and frameworks can then be formulated and applied to future technological and experiential deployments which would be grounded in the effective analysis of the results of recognised research methodologies. This approach is illustrated in Benford & Giannachi (2012) and essentially forms a cyclical process where practice contributes to theory, and theory, in turn, contributes to practice.
Additionally, as the realisation of research and development projects as part of this PhD will require working in both the Arts and in Human Computer Interaction (HCI), it is possible that learning can take place not just from participants, but also from my own repeated experiences with the projects, via a process of self-evaluation informed by the dual nature of the practice. A similar complementary approach is identified by Taylor et al. (2011), who suggest that the personal experiences of the researcher as participant play an important part within this cyclical and iterative design process: in combination with the analysis of each design iteration, they can help to identify emerging and embedded trends that could potentially compromise either artistic integrity or effective HCI practice. Additionally, Taylor et al. (2011) point out that this dual design and participatory role can lead to dialogue and exchange with users that would ordinarily be absent, and can yield insights that may be unobtainable in a traditional design process, both of which have the potential to drive development forward.
Examples of this artist-led and practice-based approach can be found in related works such as: Hazzard (2016) Guidelines for Composing Locative Soundtracks, Lacey (2016) Sonic Rupture: A Practice-led Approach to Urban Soundscape Design, Taylor et al. (2011) Designing from within: humanaquarium, and in Chaparro & Duenas (2015) Psychogeographical Sound-drift.
In Gaye et al. (2003) Sonic City: The Urban Environment as a Musical Interface, the authors describe a ‘multi-disciplinary and iterative development process [where] ethnographic studies, scenarios, and workshops provided insight into the user experiences’ (Gaye et al., 2003).
It should be mentioned that this approach is not without its challenges. Developing installation-based and mobile artistic experiences in the real world, for real audiences, will most often require working to real-world performance schedules, venue and festival timetables, and other time constraints and deadlines associated with the platforms that can be utilised to deploy such projects ‘in the wild’. This potential issue is identified in Benford et al. (2013). In relation to this, Taylor et al. (2011) suggest that this may, at times, require compromise, and that the complexity of a specific design iteration may need to be adapted to facilitate an effective deployment, which effectively constitutes a trade-off between the ease and importance of the deployment and the speed of the project’s development.
Another consideration, concerning the dual role of practitioner and researcher, is the importance of clarifying which role is active when involved in verbal or written communication regarding the project, a point which is mentioned by both Hazzard (2016) and Jacobs et al. (2013).
Additionally, Benford et al. (2013) identify issues pertaining to the effective ethnographic study of art experiences in general, and mobile art experiences in particular, where the presence of a researcher may compromise either the artistic integrity of the work or the immersion of the participant within the experience. In these cases Benford et al. (2013) recognise the need for the direct recruitment of participants, and for the direct participation of the ethnographer.
Indeed, this ‘artist as researcher’ approach is described in Benford et al. (2013) as having three distinct perspectives: practicing, studying and theorizing, through which the individual will need to navigate and adopt appropriate positions and ‘shifts in perspectives’ to suit relevant activities.
Interestingly, Benford et al. (2013) also predict that this ‘artist as researcher’ approach may become more popular as artists engage in interdisciplinary PhDs, and specifically those engaged with HCI development.
Various ethical issues, and other potentially problematic features of the project can be identified, two examples being the use of personal microphones, and the collection of personal locative data. Whilst the latter can be dealt with through a process of effective anonymisation, the use of personal microphones would appear to require prior consent, and clear and transparent communication to potential participants regarding its role.
Although there are now additional considerations concerning compliance with GDPR and School of Computer Science ethics procedures, further ethical challenges arise as a consequence of the proposed artist-led and practice-based research approach.
This approach effectively proposes using artistic practice as a vehicle for research, and Benford et al. (2015) outline some of the challenges of such a research approach as:
It should perhaps be noted that these specific challenges are present alongside the overarching challenges and implications presented by existing ethical approval processes in relation to experimental and practice-based research approaches, and more specifically research projects that are going to develop through each design iteration.
Additionally, and especially if the audio content is to be delivered to participants via headphones, one should not underestimate the immersive quality of interactive sound experiences. Whilst participants are immersed within the context of an interactive audio experience they can become isolated from the context of their immediate physical environment, and others around them, thus raising issues of personal safety and security.