Cyborg violin is my, rather nerdy, name for a violin enhanced with odd electronics. I’ve attached hardware consisting of an IMU (accelerometer, gyroscope and magnetometer), a ribbon controller, a light sensor and some multi-colour LEDs.
How It Started
About a year ago I bought another electric violin with the intention of doing something wacky with it. At the time I didn’t know what that would be, but was sure I would think of something. Initially I tried experimenting with different tunings, and for most of its life before it became the cyborg it had an octave D string in place of the G and so was strung D’DAE – which was interesting … but not very!
In the meantime I was experimenting with other interfaces for the CSMA project with Stuart Russell and one of the things I enjoyed messing with was the Wii controller. Nothing radical here, Wii controllers have been a staple of experimental composer/performers since they became available, but it was fun to play with and we did a fun video of one controlling a soft synth, and even one with my (very much hardware) Moog Little Phatty.
While experimenting with this I started wondering how hard it would be to attach the Wii controller hardware to a violin so that the instrument ‘knew’ its position in space and the player could modulate the sounds using gestures while playing. This would bring a new dimension to a performance and probably, if done well, look interesting too. To be quite honest I’m not a hugely expressive violinist; I don’t do the ducking and diving that some rock violinists do, I tend to stay and brood and look mysterious mostly. Though that does seem to work for me, as people have remarked favourably on it.
Chatting around I realised the Wii was not a particularly accurate instrument; it also needs 2xAA batteries and is therefore quite heavy. I did like the buttons on it though, so wandered around the internet for alternatives that might fit the bill better. I was recommended to investigate the X-OSC, a small Wi-Fi-equipped board that already contained the IMU as well as a multitude of analogue and digital outputs. I ordered one in October of last year – about the same time as I was playing with Arduino too.
The X-OSC is very small, but violins are also small instruments and it took quite some thought to work out how to attach it to the violin without it getting in the way of playing or stopping it from fitting into a case. I asked a couple of luthiers, but I suspect they thought I was a madwoman and ignored me in the hope I would go away. So I did, and worked things out for myself. The X-OSC board now sits under the fingerboard (secured with Blu-Tack and elastic bands!) and attached to it are an LED ring on the headstock, a four-button switch array (on the side of the instrument), a soft-pot ribbon controller on the side of the fingerboard and a light sensor on the top of the body. All of this connects over Wi-Fi to Max on my laptop. The X-OSC can create an ad-hoc network for gigging purposes where there is no trustworthy Wi-Fi network. Perfect!
X-OSC mounted on violin and Max for Live patch
The software started out as a standalone Max patch, mainly because I was learning to interpret the accelerometer and gyroscope outputs to give me a reliable orientation for the violin and to test the function of the other peripherals. I wanted the outputs to control effects on the laptop so I then put them into a Max For Live device and tried to work out how best to attach them to the Ableton Live effects and VSTs I wanted to use.
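For the curious, the core of interpreting the accelerometer and gyroscope outputs is a sensor-fusion step. My patch does this in Max, but a rough Python sketch of the standard complementary-filter idea (function names are mine, purely illustrative) looks like this:

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (radians) from raw accelerometer axes: stable over
    time but noisy, and confused by any movement of the instrument."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One axis of orientation: blend the fast-but-drifting integrated
    gyro rate with the noisy-but-stable accelerometer angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Run per sample (dt being the time between X-OSC packets) and the estimate tracks the gyro in the short term while the accelerometer slowly corrects the drift.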
My first attempt had me sending MIDI output from the Max patch and then using the MIDI mapping feature of Live to attach it to effects functions. To do this I had to send the output of the Max processor to the internal IAC device and then back into Live. This worked, and for initial tests seemed OK. It was very easy to map X-OSC outputs to different effects in Live and see what worked best for the violin movements. However it soon became clear that the latency involved was just far too long. There was so much MIDI data being sent that Ableton was getting clogged up, and messages were being lost or dropped or delayed horribly. Something better was needed.
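To give a feel for the flood: each IMU output has to be squashed into a 7-bit CC value, and if you send a CC for every incoming packet the stream clogs. A hypothetical Python sketch (not my actual Max patch – the names are mine) of the scaling, plus the obvious mitigation of only sending values that have changed:

```python
def to_cc(value, lo=-1.0, hi=1.0):
    """Scale a sensor reading in [lo, hi] to a 7-bit MIDI CC value (0-127)."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

class CCThinner:
    """Drop repeated CC values so a fast sensor doesn't flood the MIDI bus."""
    def __init__(self):
        self.last = None

    def update(self, value):
        cc = to_cc(value)
        if cc != self.last:
            self.last = cc
            return cc   # worth sending
        return None     # unchanged, suppress
```

Even with thinning, every message still crosses the IAC bus and Live's MIDI-mapping layer, which is where the latency crept in.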
I was recommended, on Twitter, to use the Live Object Model (LOM) for this purpose. I’d played with the LOM a little before and found it rather complex to use. It would also mean hard-coding device and parameter numbers into the patch, which I wasn’t keen on – at least while the project was still under heavy development. Anyway, I did a little test to see how the latency of mapped MIDI vs the LOM stacked up. No contest – the LOM was much more responsive; this was obviously the answer. So I wrote a small Max patch to encapsulate the LOM operations transparently, and then I could use the LOM.Navigator device to work out where the effects were that I wanted to control. Actually this is, in some ways, even easier than mapping MIDI in Live. When mapping MIDI from the IMU there are 6 outputs constantly sending data, so you have to mute 5 of them to be sure of only mapping the one you want – it’s actually quite annoying. With the LOM encapsulation all I needed to do was provide 3 numbers – track, device, parameter – and it just worked!
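In Max the encapsulation resolves those three numbers via live.path and live.object; as a hypothetical illustration in Python (the class is mine, not part of any real API), the three indices simply assemble the LOM path that identifies the parameter:

```python
class LomParam:
    """Illustrative stand-in for my Max encapsulation: address a Live
    parameter by (track, device, parameter) indices alone."""
    def __init__(self, track, device, parameter):
        self.indices = (track, device, parameter)

    def path(self):
        # LOM path syntax: "live_set tracks N devices N parameters N"
        return "live_set tracks %d devices %d parameters %d" % self.indices
```

Because the indices are just three numbers, swapping which effect a gesture controls is a matter of changing them – no muting of the other five IMU streams required.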
There is one output on the violin too, which I alluded to earlier: a ring of 12 RGB LEDs on the headstock of the instrument. These can be set (using the buttons on the violin) to display either a circling ring of blue & yellow, a colour that changes depending on the accelerometer data, or a sound-to-light display made using VIZZIE in a separate Max For Live patch on a send track in the Live set.
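The accelerometer-driven colour mode is easy to sketch: tilt direction picks a hue and overall magnitude picks brightness. A Python approximation (the real mapping lives in the Max patch, and this function name is mine):

```python
import math
import colorsys

def accel_to_rgb(ax, ay, az):
    """Map accelerometer axes (in g) to an RGB colour for the LED ring:
    tilt direction sets the hue, overall magnitude sets the brightness."""
    hue = (math.atan2(ay, ax) / (2 * math.pi)) % 1.0
    mag = min(1.0, math.sqrt(ax * ax + ay * ay + az * az))
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, mag)
    return tuple(int(round(c * 255)) for c in (r, g, b))
```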
The buttons turned out to be the least useful thing in the project. I thought that 4 would be too few, and bought 2 rows of 4 just in case. However, because playing violin uses both hands quite intensively there is rarely time to press buttons; foot pedals are far more convenient. That’s why I relegated the buttons to choosing the LED effects. All of the other controls are on the SoftStep and MIDI-mapped in the usual way.
The demo video here was done before I had decided to use the LOM, so it shows some of the delays and uncertainty of control that are now all fixed and working. Sometime in the next week or so I’ll do a new video showing it working better, but I think this video shows roughly what the instrument can do. The LEDs are better in real life than they seem here, as the video camera just ‘whites out’ in the dark and loses all the colours that they can produce. It also shows me wearing a head torch which flashes into the light sensor. This is where the delay and packet-dropping of the MIDI messages are most apparent! It is now much better 😀
I can’t wait to get a chance to perform live with this instrument!