I’m particularly excited about the development of recorded music that never sounds the same twice—varying randomly or with listener interaction—Brian Eno’s generative music being the first and best-known example.
The tech lovers at last week’s MEX Mobile User Experience conference in London were treated to all manner of fantastical visions of our ever more mobile-empowered futures: big data, connected cars, smart homes, the Internet of Things, gestural interfaces, personal mini-drones—the lot.
Few presentations this year will be complete without at least a passing reference to the game-changing nature or dystopian social implications of soon-to-be-unleashed Google Glass. Surprisingly, however, a couple of jaw-dropping demonstrations were enough to leave many of those attending wondering whether we might be missing a slightly quieter revolution taking hold. Could immersive audio be about to come of age in mobile user experience?
Having played second fiddle to the visual interface for decades, so often the preserve of experimental art installations or niche concepts for the blind, audio has yet to find mass-market interaction applications beyond alarms, alerts, ringtones and the occasional novelty bottle opener. All of this, however, could be set to change, if the two fields of binaural sound and dynamic music can find their way into the repertoire of interaction designers.
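For readers unfamiliar with the mechanics, binaural sound works by recreating the tiny timing and level differences between the two ears that the brain uses to locate a source in space. The sketch below is purely illustrative—the function names and constants are my own, and it uses Woodworth's classic spherical-head approximation for the interaural time difference rather than a real HRTF, which is what production binaural renderers actually convolve with:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, in air at roughly room temperature
HEAD_RADIUS = 0.0875     # m, a commonly assumed average adult head radius

def interaural_time_difference(azimuth_deg):
    """Woodworth's spherical-head model: the arrival-time gap (seconds)
    between the ears for a source at the given azimuth
    (0 = straight ahead, +90 = directly to the right)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def binaural_pan(mono, azimuth_deg, sample_rate=44100):
    """Toy binaural placement of a mono sample list: delay and attenuate
    the far (contralateral) ear. Returns (left, right) channels."""
    itd = abs(interaural_time_difference(azimuth_deg))
    delay = int(round(itd * sample_rate))           # lag in whole samples
    gain = 1.0 / (1.0 + abs(azimuth_deg) / 90.0)    # ad-hoc level drop
    near = list(mono)
    far = ([0.0] * delay + [s * gain for s in mono])[:len(mono)]
    far += [0.0] * (len(mono) - len(far))           # pad to equal length
    if azimuth_deg >= 0:
        return far, near    # source on the right: left ear is the far ear
    return near, far
```

At 90 degrees the model yields an ITD of roughly 0.65 milliseconds—about 29 samples at CD rate—which is consistent with the commonly cited maximum for human listeners, and is all the brain needs to hear a sound as coming from one side.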