Laurence Cliffe

intermedia arts | creative technology | research through design

Communicating musical performances through haptic and visual feedback

This project involved extensive research into the historical relationships between colour and musical frequencies, along with practical experiments converting musical performances into vibrations from tactile transducers and vibration motors. A Max patch was designed that enabled the audio signal captured from a microphone, or the playback of a pre-recorded piece of music, to be interpreted through light and vibration. The patch converted the frequencies of the audio signal into PWM values and colour codes (via an FFT peak frequency analysis approach), which were then sent to an Arduino to control tactile motors and smart LEDs respectively.

Controls within the Max patch allow fine adjustment and channelling of different frequency bands, producing responsive and reflective visual and haptic interpretations of a wide variety of music. The patch also lets the user select from a number of colour-frequency scales, including the Scriabin, Bernard Klein and Zieverink scales, for visually interpreting the musical input.

Two possible designs for an audience-based interface were also envisaged, each a mobile, self-contained unit housing the associated electronics and software, including a built-in microphone for capturing and converting both the musical performance and the ambient acoustic atmosphere of the venue and audience. Research was also undertaken into the development of a haptic, or tactile, vibrational frequency scale capable of transposing audible frequencies into tactile ones.

Whilst it was originally envisaged that such a system would help engage d/Deaf and hard of hearing (DHH) audiences with live musical performances, a myriad of other applications remain possible.

Music by Meg Baird – Mosquito Hawks © wichitarecordings.
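As a rough illustration of the colour-mapping stage, the C++ sketch below reduces a detected peak frequency to its pitch class and looks it up in a Scriabin-style colour table. The RGB values and the frequency-to-pitch-class conversion are assumptions for illustration only, not the scale implementations used in the Max patch itself.

```cpp
// Sketch: map a detected peak frequency (Hz) to an approximate Scriabin colour.
// The RGB values below are illustrative approximations of Scriabin's
// pitch-class/colour associations, not values taken from the project.
#include <cmath>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Approximate Scriabin associations, indexed by pitch class (C = 0 ... B = 11).
static const Rgb kScriabin[12] = {
    {255,   0,   0},  // C  - red
    {143,   0, 255},  // C# - violet
    {255, 255,   0},  // D  - yellow
    {100, 120, 130},  // D# - steel grey
    {135, 206, 250},  // E  - sky blue
    {139,   0,   0},  // F  - dark red
    {  0,   0, 255},  // F# - bright blue
    {255, 127,   0},  // G  - orange
    {186,  85, 211},  // G# - purple-violet
    {  0, 255,   0},  // A  - green
    {112, 128, 144},  // A# - steel grey
    {176, 224, 230}   // B  - pearly blue
};

// Reduce a frequency to its pitch class and look up the colour.
Rgb frequencyToColour(double hz) {
    if (hz <= 0.0) return {0, 0, 0};                    // guard against silence
    int semitonesFromA4 = (int)std::lround(12.0 * std::log2(hz / 440.0));
    int pitchClass = ((semitonesFromA4 + 9) % 12 + 12) % 12;  // shift so C = 0
    return kScriabin[pitchClass];
}
```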
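The audible-to-tactile transposition described above could, in principle, work by octave-shifting a peak frequency into the band where vibrotactile sensitivity is strongest. The sketch below assumes a 40–250 Hz tactile band and a simple linear frequency-to-PWM mapping; both are placeholders rather than the project's actual tactile scale.

```cpp
// Sketch: transpose an audible peak frequency down (or up) into a vibrotactile
// band by repeated octave shifting, then scale it to an 8-bit PWM intensity.
#include <cstdint>

const double kTactileLow  = 40.0;   // assumed lower bound of the tactile band (Hz)
const double kTactileHigh = 250.0;  // assumed upper bound of the tactile band (Hz)

// Shift the frequency by octaves until it falls inside the tactile band,
// preserving its pitch class.
double transposeToTactile(double hz) {
    while (hz > kTactileHigh) hz /= 2.0;
    while (hz < kTactileLow)  hz *= 2.0;
    return hz;
}

// Map the tactile frequency linearly onto a 0-255 PWM duty value,
// so higher tactile frequencies drive the motor harder.
uint8_t tactilePwm(double tactileHz) {
    double t = (tactileHz - kTactileLow) / (kTactileHigh - kTactileLow);
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return (uint8_t)(t * 255.0);
}
```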
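On the Arduino side, a minimal receiver might read the PWM value and colour code sent by the Max patch over serial and drive the hardware directly. The comma-separated message format, pin assignments and use of the Adafruit_NeoPixel library below are assumptions standing in for the project's own firmware.

```cpp
// Arduino-side sketch: receive "pwm,r,g,b\n" lines over serial (an assumed
// message format) and drive a vibration motor via PWM and one smart LED.
#include <Adafruit_NeoPixel.h>

const int MOTOR_PIN = 9;   // PWM-capable pin wired to the motor driver (placeholder)
const int LED_PIN   = 6;   // data pin of the addressable LED (placeholder)
Adafruit_NeoPixel pixel(1, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(115200);
  pinMode(MOTOR_PIN, OUTPUT);
  pixel.begin();
  pixel.show();            // start with the LED off
}

void loop() {
  if (Serial.available()) {
    // Parse one comma-separated message: motor duty plus an RGB colour.
    int pwm = Serial.parseInt();
    int r   = Serial.parseInt();
    int g   = Serial.parseInt();
    int b   = Serial.parseInt();
    if (Serial.read() == '\n') {
      analogWrite(MOTOR_PIN, constrain(pwm, 0, 255));
      pixel.setPixelColor(0, pixel.Color(r, g, b));
      pixel.show();
    }
  }
}
```

Keeping the firmware this simple leaves the frequency analysis, scale selection and fine adjustment entirely in the Max patch, which matches the division of labour described above.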