Intelligent Sonic Environments


I have recently fantasized about clubs (or “concert” halls) as intelligent spaces, in which the DJ/musician/sound artist would be even less of a producer, and even more of a conduit or channel through which the sound and the crowd’s collective identity would move: the club would sense the size and mood of the crowd, would even measure conversation/listening/dancing levels, and the music would organically mutate and evolve accordingly. It seems to me to be a logical evolution away from the production/consumption divide, and the art/life one as well. But as I have absolutely zilch geek cred, I have no idea how much of a fantasy this is – I know there are people developing all kinds of applications with similar aims, though. Via Anne Galloway, for example, I have discovered the Sonic City Project, one of many creative and innovative projects at the PLAY research studio in Sweden. In this project, the city is imagined as an interactive environment:

“We are designing a mobile audio experience which maps real-time perception of personal state and environmental factors to music creation. In Sonic City, music is created algorithmically as a direct result of a user’s state, actions, path through the streets, the physical landscape, activities nearby, as well as the way the system is worn. Our first prototype involves a set of wearables which sense the user’s context when walking through the city. Continuous and discrete factors are perceived and mapped in real-time to dynamic parameters in musical creation.”
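Even a toy sketch makes that mapping concrete: continuous factors (walking pace, ambient light) modulate continuous musical parameters, while discrete factors (say, stepping into a tunnel) trigger one-off changes. The snippet below is not Sonic City’s code; the sensor names, ranges, and mappings are invented purely for illustration:

```python
# Toy illustration of the kind of mapping the quote describes: continuous
# contextual factors modulate musical parameters, discrete events trigger
# changes. None of this is Sonic City's actual code; the sensor names,
# ranges, and mappings are invented for the sake of the example.
import random
import time

def read_context() -> dict:
    """Stand-in for the wearable sensors; returns fake readings."""
    return {
        "pace": random.uniform(0.0, 2.0),         # continuous: walking speed, m/s
        "light": random.uniform(0.0, 1.0),        # continuous: ambient light, 0..1
        "entered_tunnel": random.random() < 0.1,  # discrete: an environmental event
    }

def map_to_music(ctx: dict) -> dict:
    """Map sensed context to dynamic parameters of a music generator."""
    params = {
        "tempo_bpm": round(60 + 40 * ctx["pace"]),            # faster walk, faster music
        "filter_cutoff_hz": round(200 + 4000 * ctx["light"]),  # darker street, darker timbre
    }
    if ctx["entered_tunnel"]:                     # discrete factors fire one-off changes
        params["event"] = "switch_to_reverb_heavy_patch"
    return params

if __name__ == "__main__":
    for _ in range(5):                            # a few turns of the real-time loop
        print(map_to_music(read_context()))
        time.sleep(0.2)
```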

Anyone interested in ubiquitous computing, ethnographic research into technology in everyday life, or more interactive music technology should have a look at the project’s specs.
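And to return to the club fantasy for a moment: stripped to its core, such a space would be a loop that turns crowd measurements into musical parameters. The sketch below is purely illustrative; every sensor, threshold, and number in it is invented:

```python
# Toy sketch of the club fantasy from the top of this post: crowd measurements
# in, musical parameters out. Every sensor, threshold, and number here is
# invented; it only shows the shape of the idea.
from dataclasses import dataclass

@dataclass
class CrowdState:
    size: int            # people currently in the room
    conversation: float  # 0.0 (silent) .. 1.0 (everyone talking)
    dancing: float       # 0.0 (still floor) .. 1.0 (whole floor moving)

def music_parameters(crowd: CrowdState) -> dict:
    """Turn the sensed crowd state into parameters for a generative music system."""
    # More dancing pushes the tempo up; heavy conversation pulls it back down.
    tempo = 90 + 50 * crowd.dancing - 20 * crowd.conversation
    # A bigger, livelier crowd gets a denser, brighter texture.
    density = min(1.0, crowd.size / 300) * (0.5 + 0.5 * crowd.dancing)
    brightness = 0.3 + 0.7 * crowd.dancing
    return {
        "tempo_bpm": round(tempo),
        "density": round(density, 2),
        "brightness": round(brightness, 2),
    }

# A packed floor, little chatter, lots of movement:
print(music_parameters(CrowdState(size=250, conversation=0.2, dancing=0.8)))
```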