qi last night

I ran Qi again last night on the walls of Kennedy Theater at UH Manoa as part of the preliminaries to the fiftieth anniversary celebration/performance of the dance program. Here is a short video of a few of the people who used it. Don’t miss the little person at about 2:40. I didn’t ask permission from the people in here and don’t know who they are, so if you see this and you are in it and don’t want to be, let me know and I’ll edit you out.

Performance can take some interesting turns. Performativity comes from the idea that your performances and utterances do not just say something but do something in the world. Over the past two Friday nights, as part of the preliminaries for the fiftieth anniversary dance concert at UH Manoa’s Kennedy Theater, dance professor Kara Miller invited history professor Richard Rath to set up his interactive motion-to-music installation “Qi,” projected on an outside wall of the theater before each show.

In it, dancers, cars, and passersby “do something” with their motions, namely make music. The results were projected on the side of Kennedy Theater, and the music made by the audience filled the air outside the theater on the two nights of the installation. A couple of ideas are at work in the installation in conjunction with performativity. One is that music is often synesthetic, transforming actions in one sense modality, say vision — as in sheet music — or touch, as in the fingers on a guitar, into another, hearing. In this case the transformation is fairly direct: music gets made by moving the balls of green energy, the “qi,” around the screen with one’s motions. The installation is also meant to break down the distinction between audience and performer, as people often discover the instrument by being in the picture on the wall that they are looking at and hearing their motions come out as music. In other times and places, especially before the advent of recorded music, music making was something everyone participated in, without the formal distinction between audience and performer. Qi points to the artificial nature of that divide while looking forward in new ways rather than back to some romanticized version of the past.

Two highlights that I did not capture on video were a little girl, about ten, who just had a blast with it, and an older couple who did Tai Chi, which worked really well.

There has been a great series of articles on motion-to-music controllers over the past week or so at Create Digital Music. All of them are cool, but they require some extra piece of hardware like an iPhone strapped to your wrist, a Leap Motion, or a Kinect. Qi is a Max/MSP patch that I wrote using the cv.jit computer vision modules. The only hardware is the video camera that comes with the laptop (OK, I used a cheap USB cam so I could aim it away from me, but the onboard cam works too). People seem to get the idea immediately and have fun doing it, although the occasional detached arm waving in the picture is me giving the two-second tutorial: “up for higher notes, down for lower, different synths on left and right sides, bigger circles = louder.”
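For anyone curious what that boils down to, here is a minimal sketch of the same mapping idea in Python with OpenCV and mido rather than Max/MSP and cv.jit. The thresholds, note range, scaling, and the note-on-only output are illustrative stand-ins, not what the actual patch does.

```python
# A rough Python/OpenCV analog of the qi mapping: find motion blobs by frame
# differencing, then map vertical position to pitch, left/right to synth, and
# blob size to loudness. All numbers here are made-up placeholders.
import cv2
import mido

out = mido.open_output()                  # default MIDI output; point it at your synths
cap = cv2.VideoCapture(0)                 # laptop cam or a cheap USB one
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)        # where did anything move since the last frame?
    prev = gray
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    h, w = gray.shape
    for c in contours:
        area = cv2.contourArea(c)
        if area < 500:                    # ignore small flickers
            continue
        x, y, bw, bh = cv2.boundingRect(c)
        cx, cy = x + bw / 2, y + bh / 2
        note = int(96 - 48 * cy / h)      # up for higher notes, down for lower
        channel = 0 if cx < w / 2 else 1  # different synths on left and right sides
        velocity = min(127, int(area / 200))   # bigger blobs = louder
        # a real patch would also send matching note_offs; this just fires notes
        out.send(mido.Message('note_on', note=note, velocity=velocity, channel=channel))

    cv2.imshow('qi sketch', mask)         # stand-in for the projection on the wall
    if cv2.waitKey(30) == 27:             # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```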

 

Qi tonight, Kennedy Theater

Just a reminder that my motion-into-music contraption will be making music as long as somebody is moving outside of Kennedy Theater before the performance tonight. UH Manoa, 7-8 PM. The outside stuff is free, so if you are around, come by; last week was a blast. To get an idea of what it is, see the videos (note to self: smile next time, it’s fun).

qi motion to music at UH Friday night this week and next

My long-term motion-to-music software, now named “qi,” is going to be available for everyone to move to and create music with on Friday, November 15 and 22. It will be part of the dance program’s 50th anniversary show preliminaries, projected on the wall outside Kennedy Theater starting about 7 PM until the show starts at 8. This Friday and next. Hope you can make it! As usual, I think Facebook people need to come to way.net to view the video.
Qi motion to music demo

control issues.

(n.b., If you just want to see and hear the Hot Hand in action, scroll to the video at the bottom of the page)

I have control issues. Guitarists have it rough in the all-digital world. Both hands are full with the playing of the thing. The foot (two is harder) is useful on stompboxes and wah pedals, but less so in the digital world, where effects can be free but seldom come in a box.

Hot Hand

Enter into the arena the SourceAudio Hot Hand, a little box about a third of the size of a flash drive, attached to a rubber strap that you can wrap around whatever you wish, the default being a finger. It has an accelerometer built in and sends three signals, one for each axis of the three dimensions of the space we live in. Move it along an axis and it sends out MIDI control changes, which can be pumped into the digital guitar rig and attached to anything a MIDI signal can control. It takes some calibrating and “fingering” out to actually control the thing, but when it works, you can flail your picking hand around in the air or move it subtly, both of which guitarists have been known to do, and the MIDI flows to whatever you hook it up to. I opted for the obvious here and hooked it up to filter cutoffs (three of them), resonance (three of those too), distortion amount (one on the Z axis, which takes the most conscious effort to get off of zero), and filter attack.
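For the curious, the routing looks roughly like this in code. This is a hedged sketch using Python and mido; the port names and CC numbers are made up for illustration, and the CCs your Hot Hand actually sends and the parameters your rig listens on will differ.

```python
# Sketch of re-routing the Hot Hand's three per-axis control changes to
# parameters in the rig. Port names and CC numbers below are hypothetical.
import mido

hothand_in = mido.open_input('Hot Hand USB')   # hypothetical input port name
rig_out = mido.open_output('Guitar Rig In')    # hypothetical virtual port into the rig

# incoming axis CC -> (destination CC, scale factor)
AXIS_MAP = {
    20: (74, 1.0),    # X axis -> filter cutoff (CC 74, assumed)
    21: (71, 1.0),    # Y axis -> resonance (CC 71, assumed)
    22: (94, 0.5),    # Z axis -> distortion amount, scaled down since it is touchy
}

for msg in hothand_in:                          # blocks, yielding messages as they arrive
    if msg.type != 'control_change' or msg.control not in AXIS_MAP:
        continue
    dest_cc, scale = AXIS_MAP[msg.control]
    value = max(0, min(127, int(msg.value * scale)))
    rig_out.send(mido.Message('control_change', control=dest_cc, value=value))
```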

After messing about the whole day I got a mostly good take that demonstrates some of the possibilities for expression in the Hot Hand. The first part sets up a loop with the instruments modulated by the ring, then some lead over that with some extra expression, and a drum drop where I use the Hot Hand to mangle the loop full-time without playing guitar, then on to the glorious finish where I heroically try to stomp out all the loops and drums at once.

I am really enjoying the thing so far, but it will take more calibrating and more learning on my part to get expressive control over it. I relearned what pitch, roll, and yaw are to try to figure it out, but it does not exactly work on those principles. Instead it is more like a carpenter’s level, the little tubes with the bubbles in them, and I have to figure out how to control where the bubbles float to, all three at once.
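To make the carpenter’s-level idea concrete, here is a small back-of-the-envelope sketch (my own math, not how the Hot Hand firmware actually works) of turning raw accelerometer readings into smoothed tilt values, the “bubbles” I am trying to steer.

```python
# A resting accelerometer mostly measures gravity, so each axis tells you how
# far the sensor is tilted rather than how it is moving. Assumed units: g.
import math

def tilt_to_cc(ax, ay, az, state, alpha=0.2):
    """Map static tilt to two smoothed 0-127 values (a MIDI-friendly range)."""
    pitch = math.atan2(ax, math.hypot(ay, az))   # tilt around one axis, -90..+90 degrees
    roll = math.atan2(ay, math.hypot(ax, az))    # tilt around the other
    for name, angle in (('pitch', pitch), ('roll', roll)):
        target = (angle / math.pi + 0.5) * 127   # -90..+90 degrees -> 0..127
        state[name] = state.get(name, target) * (1 - alpha) + target * alpha
    return int(state['pitch']), int(state['roll'])

state = {}
print(tilt_to_cc(0.0, 0.0, 1.0, state))   # flat: both bubbles sit near the middle (63, 63)
print(tilt_to_cc(0.7, 0.0, 0.7, state))   # tipped on one axis: that value drifts upward
```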

I had much fun, and I hope you enjoy the video!  

Some Background . . .

Expression is my dream: a way of manipulating tone and timbre on the guitar as complex and rewarding as the fingers/frets/wood/amp/speakers/ears combo. Textures, spaces, shapes: I have a little synesthesia in my dreams of expressive control. There is a remarkable amount of expressive power packed into stringed instruments. MIDI controllers, with their one-parameter-at-a-time functioning, seem rigid in comparison. Sure, tones can be shifted about, but real expression remains elusive — though not impossible, given that the instrument itself has a lot of expressive wallop already.

I have a couple of useful foot controllers.  One is the venerable Behringer FCB1010, which is a bit of a bear to program and somewhat big to carry around.  If I am going to replace my amp with a laptop, I don’t want a giant floorboard.

SoftStep

I recently got Keith McMillen’s SoftStep as a replacement. It is small, sleek, and has really neat blue and red LED lighting so you can see what you are doing. The SoftStep is used to control the drums in the video. It works well for triggering, though the “steps” are a little small for feet with shoes on. It can send several different continuous controller MIDI signals, but in practice they are not that easy to control with foot pressure. It does have a plug for a good old expression pedal, though. Both it and the Behringer are pretty involved to program, but I have to give the SoftStep the edge there, even though the software crashes every time you close it and fails to save the MIDI device it is supposed to send to.

vPedal

I also have a vPedal transcription pedal set to spit out mouse clicks. That works well if you can move the mouse to where you are going to click next, which can be tricky, but having a mouse click handy is useful. It’s hella loud in the mechanics, though. I have a Stealth Switch II that is a bit quieter, but it does not trigger until you release it, which is difficult but interesting for timing — I have to pull the time out of it instead of stomping it into it. That gets used for my Linux mouse-click needs rather than on the main music laptop — I am one of those odd non-Ableton, non-Mac electronic musicians. I use a combination of Plogue Bidule with a little bit of Max/MSP here and there and a few hundred (not kidding) VSTs in Windows, plus Linux when I need a realtime kernel and gnarly guitar sounds via Guitarix and JACK. (I am down to 64 samples of latency with this combo on a five-year-old computer — I can’t hear it so much as feel it. The low latency makes playing seem a bit more responsive.)
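For a sense of scale, here is the arithmetic behind that buffer size, assuming a 48 kHz sample rate (the rate is my guess, not a measurement from the actual setup).

```python
# 64 samples of buffer at 48 kHz works out to a little over a millisecond per buffer.
buffer_samples = 64
sample_rate = 48_000                          # Hz; 44.1 kHz would give ~1.45 ms instead
print(1000 * buffer_samples / sample_rate)    # ~1.33 ms
```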

I rigged up a Wacom Bamboo pen tablet to spit out OSC and MIDI, and it sounds cool and is quite expressive, but the pen takes a hand, so it mostly gets used as its own instrument, a sort of theremin on steroids. I have a Max patch that can control MIDI through video gestures, like a Kinect but without all the gizmos, but it is not done yet and pretty much has to be run by itself. No guitar signal there yet, though I am working on it.

Envelope followers are helpful but limited, and LFOs sound too regular when they are regular and too random when randomized, even though I use them a lot.  They are prefab instead of expressive.  I have explored some fractal and game-of-life controllers (like in this song), but they are too wild for everyday use.  All of this still sounds more toward the artificial end of artificial intelligence.

Mercurial STC-1000

I have two hand controllers, the underrated and out-of-business Mercurial STC-1000 touch surface and a Korg nanoKontrol, but when I touch the touchpad or twiddle those knobs, I can only play guitar with one hand. I have often tried to work out a way to get an accelerometer attached to a guitar that then produced MIDI or OSC, even buying a WiiMote when I have no Wii in order to get it to send MIDI via GlovePie. I never figured that one out in any useful way, and the WiiMote would have to be in a hand or strapped onto the guitar somehow. An out-of-business company called FreePlayer almost got it right, but they could not get a wireless version to market and it required modding the guitar and a USB cable on the guitar, which was a no-go for me.

As usual, lots more music is on the blog and on the waydio at way music.  Pop it out and listen to it in the background.