Thoughts upon watching Wayne Krantz

Reminder to self: fretted instruments get their character in no small part from the voicings their tunings make easy. The guitar sounds the way it does for a reason. Use the hand patterns that feel funky and you may well be coaxing out the inherent characteristic funkiness of the guitar. Don’t be overly anxious to jump into music-theoretical chord patterns. Don’t be afraid to use plenty of open strings, and patterns and time signatures that don’t really make much sense but feel good.

(I wrote this down on a piece of paper a few months ago when watching this.)

Bringing NUI to VST

Some of my friends know me to have a near-obsession with the emerging NUI paradigm, known to most people through little things like Apple’s iPhone or Jeff Han’s famous video. When I first saw the video, it struck me, as it did many others, that a big change was coming. Others have felt similarly, but some fail to grasp the full picture and dismiss this revolutionary development out of hand when they try it with their favorite software and find it less than usable. I too encountered this disappointment years ago, when I first tried the experiment. The touch hardware is only one side of the story, but that takes a while to realize.

I started recording music on a Tascam four-track in high school and, being the nerd that I am, quickly moved into DAW land before that term really had a meaning (it’s still rather nebulous), and have been recording for my own sake ever since. As music and technology advanced (and they both have), and as I grew, I began to tire of recording. I love to hate on Ableton, but their Live software really did shake up the game. Here, finally, was an application that truly transformed the computer into an interactive musical instrument. Recording had finally broken free of the linear time it had been bound to ever since the days of tape (well, Robert Fripp might take exception to that assertion, but he himself is exceptional).

As I began to play more with the software, I of course became frustrated with the mouse interface. I was trying to make music, not check my e-mail, damn it! Taking my hands off the guitar, picking up a mouse, finding the cursor on the screen to know how to move it, clicking a weird button… the ergonomics were just too frustrating for continued use in a studio environment, let alone in performance. Being a highly process-oriented individual, I began to think of alternate solutions. A foot controller like Behringer’s FCB1010 is of great utility, but lacks the dynamic feedback capabilities that make modern screens so beautiful. It became clear that a touch screen would offer the best of all worlds. So I bought one. Only it turns out all that software really was designed for a mouse and keyboard. The icons were just too small for my fat fingers to press accurately! On top of that, things like rotary faders don’t have a consistent feel that maps to fingers. The mouse pointer can disappear when the button is clicked, but my finger stays put. Do I really want to drag up and down with my finger, well away from the display of the rotary control, in order to rotate it?

Fast forward. We are on the verge of cheap, omnipresent multitouch-capable hardware with operating-system-level support. This will be felt most radically in the multimedia and artistic realms, as intuitive interfaces diminish the barrier to entry caused by software learning curves. Design interfaces to respond to the expressive nature of gesture, and people lose their fear of experimentation. Create interesting parameter mappings between the physical input and the digital result, and configuration choices cease to be overwhelming.

All this is a bit long-winded to get to what I really want to address. Steinberg’s VST audio plug-in architecture is a de facto standard with a mind-boggling assortment of third-party offerings. Most plug-ins also offer custom GUIs for editing their parameters. Unfortunately, the vast majority of these work extremely poorly with touch. How hard would it be to add some extra exports to a VST library to expose a NUI presentation in parallel with the common GUI? Would it be possible to retrofit a layer to fit over existing VSTs? Perhaps some recommendations, if not a formal specification, for the pieces of VSTGUI to avoid in a NUI case: CCursorType, for instance, doesn’t really make sense at all, and neither does CMouseWheelAxis. But there are more subtle questions: which knob mode works best for touch screens? In my experience, linear click-drag (kLinearMode) doesn’t make much sense for touch, while for kCircularMode, care must be taken that the knob’s representation is sized large enough for fat fingers to manipulate accurately. After all, they get none of the tactile feedback of a real physical knob, so proper placement is pretty much all visual. In any case, it would be great to see some guidance on retrofitting the myriad VSTs out there to transition them from WIMP to OCGM in a timely fashion. I suppose, as always, it’ll just take some time for the adjustment to soak into the collective consciousness. And, as always, I’ll be patient but eager.
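To make the kCircularMode point concrete, here’s a minimal sketch of how a circular-mode knob might derive its value from a touch point. This is Python with geometry I’ve made up for illustration, not VSTGUI’s actual API; the function name, dead-zone size, and minimum radius are all assumptions.

```python
import math

def knob_value_from_touch(cx, cy, tx, ty, dead_zone_deg=90.0, min_radius=20.0):
    """Map a touch point to a 0..1 value for a circular-mode knob.

    Screen coordinates (y grows downward). The knob sweeps clockwise
    from lower-left to lower-right, with a dead zone at the bottom,
    like most on-screen rotary controls. Returns None if the touch is
    too close to the center, where the angle under a fat finger is
    ambiguous -- one reason the knob needs to be drawn large enough.
    """
    dx, dy = tx - cx, ty - cy
    if math.hypot(dx, dy) < min_radius:
        return None
    # Angle of the touch, measured clockwise from straight down, in [0, 360).
    theta = math.degrees(math.atan2(-dx, dy)) % 360.0
    half_dead = dead_zone_deg / 2.0
    sweep = 360.0 - dead_zone_deg
    value = (theta - half_dead) / sweep
    return max(0.0, min(1.0, value))
```

A touch straight above the center lands at mid-travel; touches inside the bottom dead zone clamp to the nearest end of the range, which is why the finger never has to leave the knob’s visual footprint the way a linear click-drag forces it to.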

RPM ’07 Failed

February has come to an end, and with it my chance to complete the RPM ’07 Challenge. Despite my best efforts, I failed to record a whole album. I was admittedly working alone, and I nevertheless got quite a bit accomplished, at least in terms of learning. I may not have anything finished, but I’ve got quite a few nice starts. I suppose I should’ve devoted more time to this, but as always, I had my hands in too many projects (some of which I’ll detail in future posts), not to mention entertaining visiting friends, hanging out, etc. I thought I should go ahead and post what I have done so far. Perhaps some critiques at this stage of development will lead me in another direction on some of these experimental ideas.

These five aren’t the only things I did during February, but they are the least experimental and, though I hate to admit it, the most developed. I will certainly continue working on these, and record more for them. I’ve written lyrics for a few of them, and will record the singing as soon as I can compel my roommate to stop rearranging the basement mics every twenty minutes.

ChucK Composition

Though I first looked at ChucK quite a while back, I recently decided to give it another look and an actual try for more than a few minutes. After playing around with it for an evening, I ended up liking it a lot more than I had previously. The feature set has improved markedly since I last tried it, as have the consistency and thoroughness of the examples. There are still a few areas I would definitely like to see added, mostly centered on integration with other audio applications. Being able to load ChucK scripts as a VST/AU/LADSPA plug-in would be a nice advantage, although routing via ReWire or JACK would be about as good. If my interest in this language keeps up, I may get involved with the development and implement some of these myself.

I’ve wondered a lot about alternative scores for music performance, and while a ChucK script may not be exactly what I’ve been thinking about, it might make a lot of sense for virtual accompaniment. A properly written score could direct the human performer, playing a specific part such as guitar or voice, via visual cues; at the same time, the performer’s input could be processed by the program to effect changes in the machine’s performance. Louder RMS values on the input could be reflected by more rapid note generation from the computer, more periodic transients could lead from ambient soundscapes to rhythmic sequences, different instruments could noodle about on riffs within the current chord, etc. Of course, all the typical MIDI and OSC control tricks still apply. All this could surely be done with Ableton Live or any other host with sufficient plug-ins, but setting up some of the more complicated things with that particular model could be roundabout, to say the least. Obviously everything could be done in C++ as well, but it would take forever and be tough to maintain across platforms. The point is that ChucK seems to be a good middle ground. While it isn’t a ready-to-go solution, it offers a direct and quick path to implementing any of the silly audio ideas floating around.
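The first mapping above, louder input driving more rapid note generation, could be sketched roughly like this. I’m writing it in Python rather than ChucK, and the threshold and rate values are arbitrary numbers I’ve picked for illustration, not anything from an actual score.

```python
import math

def block_rms(samples):
    """Root-mean-square level of one block of input samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def notes_per_second(rms, quiet=0.01, loud=0.5, min_rate=1.0, max_rate=16.0):
    """Map input level to a note-generation rate: louder playing -> faster notes.

    quiet/loud are the RMS levels treated as the floor and ceiling of the
    performer's dynamics; min_rate/max_rate bound the machine's response.
    All four are made-up illustrative values.
    """
    # Normalize RMS into 0..1 between the quiet and loud thresholds, clamped.
    t = (rms - quiet) / (loud - quiet)
    t = max(0.0, min(1.0, t))
    return min_rate + t * (max_rate - min_rate)
```

In a real score the loop would call something like this once per analysis block and schedule the accompaniment’s next onset accordingly; the same shape works for the transient-periodicity idea, just with a different feature on the input side.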

I’ve attached my first little experimental composition script, along with a sample performance of it. It uses the Stk instruments, so it sounds a bit similar to many of the examples, but I think it’s a bit nicer. It’s short, but it gave me a nice introduction and greatly increased my comfort with the language. Hopefully I’ll soon be capable of writing some longer, nicer compositions.