Tuesday 10 May 2011

Adaptive Physical Controllers

Well, I haven't updated this for a while, but that's because I have been busy writing my dissertation for my MA in Sonic Arts. There are some videos of my project coming soon once I edit things together, but for now I will serialize my dissertation for you all to read if you're interested in these topics. It covers both the theory behind alternative controllers and what I have done with my own work. I'll be posting a couple of sections every day; although it's very text-heavy, I hope it will be interesting to those of you who are also interested in these types of projects, and it will explain a bit about it all. When I've released the whole thing on here I will also put up a download link to a printable PDF version which will include all the images, bibliography, etc.

So without further ado, here is the first section, which covers the introduction and some discussion of existing digital input devices for music.


Adaptive Physical Controllers

Introduction

I will be looking in particular at the linking of Max/MSP and Ableton Live, as it lets us create complex controller interfaces and devices while giving us access to the API of a powerful live music system. Harnessing the flexibility of Max/MSP alongside the more traditional, time-domain-orientated approach of Ableton Live allows the performer to create an adaptive system in which a single controller can perform multiple tasks and control any parameter: for example the pitch, timbre and velocity of a synth, the tempo of a song, a parameter within an effect, which sounds are playing, and so on. It is this adaptive nature of home-built, personalized controllers that allows us to explore new ways of interacting with computers and music. Projects such as David Jeffrey Merrill's Flexigesture (Merrill 2004), “a personalizable and expressive physical input device”, and Onyx Ashanti's “Beatjazz” project (Ashanti 2010) move towards this goal, attempting to combine the best aspects of traditional hardware controllers with the possibilities that audio programming languages and custom controllers present to us.
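To make this concrete, here is a minimal sketch of the kind of adaptive routing described above, written for the [js] object in Max for Live. It assumes a single sensor value arriving as an integer 0–127 and uses the LiveAPI object to push it to whichever Live parameter the current mode points at; the parameter paths, ranges and mode names are placeholders that would depend entirely on the actual Live set.

```javascript
// adaptive_route.js -- a minimal sketch for Max for Live's [js] object.
// One incoming sensor value (0..127) is routed to whichever Live
// parameter the current "mode" points at. The paths and ranges below
// are placeholders; they depend on the layout of your own Live set.

inlets = 1;

var modes = {
    tempo:  { path: "live_set",                                 prop: "tempo", min: 80, max: 160 },
    volume: { path: "live_set tracks 0 mixer_device volume",    prop: "value", min: 0,  max: 1 },
    filter: { path: "live_set tracks 0 devices 0 parameters 1", prop: "value", min: 0,  max: 1 }
};

var current = "volume";

// the message "mode tempo" (etc.) reassigns what the sensor controls
function mode(name) {
    if (modes[name]) current = name;
}

// a raw sensor value arrives as an int 0..127
function msg_int(v) {
    var m = modes[current];
    var scaled = m.min + (v / 127.0) * (m.max - m.min);
    var api = new LiveAPI(m.path);   // look up the target in the Live set
    api.set(m.prop, scaled);
}
```

Because the mapping lives in software, a single message such as "mode tempo" repurposes the same physical gesture without touching the hardware at all.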
In my project I have attempted to make a pair of gloves that can be used to create and manipulate music within Ableton Live. The aim is to be able to play live, improvised electronic dance music without interacting with the computer directly. In many areas of live electronic music the excitement of performance has been lost; there is often little interaction between the performer and the computer, and even if the music is composed and created in real time there is little for the audience to visually identify with. Unlike a performance with traditional instruments, it is almost impossible for an audience member to visualize what the performer is doing. When using a computer, the ‘wall of the screen’ separates the performer from the audience and obscures their actions: “Conventional sonic display technologies create a ‘plane-of-separation’ between the source/method of sound production and the intended consumer. This creates a musical/social context that is inherently and intentionally presentation (rather than process) orientated” (Bahn, Hahn et al. 2001, 1). It is one of the central paradoxes for the electronic performer: although they have a box that is capable of creating almost any sound imaginable, the central mechanisms for creating these sounds are obscured from all but the user themselves. I wish to overcome this problem by creating a system that allows access to the many features computer music software offers whilst removing the user from a fixed position in front of the computer screen and creating direct visual feedback for the audience.

There are clear benefits to modeling controllers on traditional instruments: by doing so you provide a safe reference point for the user and, in theory, reduce the learning curve required to play it (provided the user has previous instrument training). By working in a familiar framework you play to the existing strengths of the performer. However, there are limitations to traditional instruments that I believe make them unsuitable for use as a modern-day controller. Traditional use of a computer requires many keys and key combinations to perform specific functions; this is relatively easy using a keyboard and mouse, as every key is individually marked and key combinations are easily pressed. If you translate this idea of a grid of keys to the fretboard of a guitar, however, you can begin to see the problems that may occur. It is very difficult to translate a vast number of controls to a small number of keys, and direct MIDI instrument mapping often runs into the problem that software will only allow you to control one parameter at a time, while many MIDI sequencers and live performance programs do not allow you to easily switch between channels and instruments. One way to ease the first problem is to decode combinations of key presses, as sketched below.
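As a sketch of that combination idea (again for Max's [js] object; the button count and the command names are invented purely for illustration), a handful of physical buttons can address many more functions than they have keys:

```javascript
// chord_decode.js -- sketch for Max's [js] object: decoding key
// combinations so a few physical buttons can address many functions.
// Button states arrive as "key <index> <0/1>" messages; the chord
// names below are hypothetical and just label downstream commands.

inlets = 1;
outlets = 1;

var held = [0, 0, 0, 0];            // four hypothetical buttons

var chords = {
    "1000": "play",
    "0100": "stop",
    "1100": "record",               // buttons 0 and 1 held together
    "1010": "next_scene",
    "0110": "prev_scene"
};

function key(index, state) {
    held[index] = state;
    var pattern = held.join("");    // e.g. "1100"
    if (state === 1 && chords[pattern] !== undefined) {
        outlet(0, chords[pattern]); // the patch downstream interprets the command
    }
}
```

With four buttons this already yields fifteen possible combinations, which is the same multiplying trick keyboard shortcuts rely on.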
There is obviously a great difference between an instrument that attempts simply to recreate the analogue in a digital domain and a controller that seeks to redefine performer and computer interaction. For example, the Yamaha WX5[1] seeks to recreate the experience of playing a woodwind instrument but with extra keys for computer control, shifting octaves and so on; if you are simply looking to replace a traditional instrument with a digital one, instruments like this are an effective choice. However, as we are looking to create a new type of computer control interface, the mechanics and implementation of these instruments are less relevant to us than something such as the Eigenharp[2], which bills itself as ‘The most expressive electronic instrument ever made’ and attempts to go further than simply recreating existing instrument designs. Indeed, it looks to incorporate aspects of many existing instruments and to allow the user to play VST plugins, creating a hybrid design that straddles both traditional and digital instrument designs. Undoubtedly the quality of the keys, their velocity sensing and the ability to move them both horizontally and vertically go a long way towards allowing the player to perform all the traditional expressions associated with musical instruments[3] in a way that has not been available in the past, and design features such as the inclusion of a breath controller and excellent instrument models allow the player to easily replicate, for example, wind instrument sounds. However, it is the more strictly digital interactions with this controller that may leave the end user wanting.
The Eigenharp, in a desire to remain as traditional as possible, takes the approach of using a complex set of key presses and lights to navigate through unnamed menus on the instrument. Whilst usable, this requires that the user become familiar with a menu tree that has little visual guidance; without the names of the menus appearing, and with only colored lights to mark where you are or which option is active, it is all too easy to choose the wrong option. In addition, built-in requirements such as having to reselect the instrument you are playing in order to exit the menu tree add unnecessary complexity. In this case the desire to pretend that the computer the instrument is plugged into does not exist feels like a denial of the capability of the device, and negates much of the goal of presenting an instrument that can be quickly mastered by the user. Although there is a limited computer interface provided with the Eigenharp, this is mainly for choosing sounds, scale modes and VSTs, and as such is more of a pre-performance configuration tool than something that can be used ‘on the fly’.
I believe that the main fault with the Eigenharp model is that it binds the user to a specific predefined interface. The benefit of creating an alternative controller is that you can create an interface that fits your intuitive workflow and techniques. When using a powerful ‘virtual’ instrument that is linked to the computer, you have the opportunity to let the user reprogram settings to work in a way that suits their needs. This is one of the central tenets of adaptive controller design: the end user can specify how to work with the tool that is used for interaction. For playability the Eigenharp undoubtedly succeeds in creating an instrument that can replicate the experience and sound of playing a ‘real’ instrument with all its associated traits, but it is the user interface that stops it from being truly revolutionary, and which does not allow the user to access the full capabilities of the instrument in a way that is complementary to their workflow.

“1st law of alternative controllers; adapt to the new reality. 2nd law of alternative controllers; adapt reality.” - Onyx Ashanti

“We feel that the physical gesture and sonic feedback are key to maintaining and extending social and instrumental traditions within a technological context” (Bahn, Hahn et al. 2001, 1)

I believe that a radical approach to instrument control systems is required to get the most from modern computers and audio software. Audio programming languages such as Max/MSP or Pure Data and hardware interfaces such as the Arduino make it easy for a musician to design their own instrument and define their interaction with the computer in the way that is most appropriate to their performance. It is possible to create a “dynamic interactive system” (Cornock and Edmonds 1973) where the performer and computer feed back to each other to create ever-changing interactive situations. It has become simple to create a system whereby the action of different sensors is easily assignable and changeable from within the software, and it is this flexibility and almost unlimited expandability that make these tools suitable for creating a truly futuristic control system.
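As an illustration of this kind of runtime reassignment, here is a hedged sketch for Max's [js] object. It assumes sensor readings from an Arduino have already been parsed into "sensor <index> <value>" messages by an upstream [serial] chain, and simply lets each sensor's destination be re-routed on the fly; the outlet roles and assignments are arbitrary examples, not a prescription.

```javascript
// sensor_patchbay.js -- sketch for Max's [js] object, sitting after a
// [serial] parsing chain that delivers "sensor <index> <value>"
// messages from an Arduino. Each sensor's destination outlet can be
// reassigned at runtime, so the same glove hardware can drive
// different parts of a patch from moment to moment.

inlets = 1;
outlets = 4;                         // e.g. pitch, filter, tempo, fx sends

var routing = [0, 1, 2, 3];          // sensor index -> outlet number

// the message "assign 2 0" re-routes sensor 2 to outlet 0
function assign(sensorIndex, outletIndex) {
    if (outletIndex >= 0 && outletIndex < outlets) {
        routing[sensorIndex] = outletIndex;
    }
}

// the message "sensor 2 87" forwards value 87 to wherever sensor 2 points
function sensor(index, value) {
    var dest = routing[index];
    if (dest !== undefined) {
        outlet(dest, value);
    }
}
```

The point is not this specific routing but that the hardware never needs to change: a message like "assign 2 0" is enough to give a glove sensor a completely new role mid-performance.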


[1] see Appendix A 1.1 for picture
[2] see Appendix A 2.1 for picture
[3] Vibrato, pitch bend, slides, etc.
