The National Conference on Undergraduate Research (NCUR) is, as its name suggests, the country’s premier outlet for scholarly and creative work by undergraduates. UVU dance student Molly Buonforte, who participated in the Utah Conference on Undergraduate Research (UCUR), and I were able to make the trip to the University of Kentucky to present a reworked version of Dance Loops. Following the nomenclature of software releases, this version was the “Golden Master,” the production-ready version of the software. This was our largest audience by far, as well as the first performance on an actual theatre stage (yay!). It was also the first performance with original music, as I created two pieces in GarageBand for the occasion.
Despite the “Golden Master” nomenclature, a string of technical difficulties nearly prevented the performance: the extension cable for the Kinect didn’t work, then the extension cable for the USB webcam didn’t work, then I couldn’t set up the Mira app over wi-fi to control the effects, then I couldn’t set it up over a private connection. Eventually we moved the entire performance about six feet downstage so I could sit at the edge of the stage and control the laptop manually. Sub-optimal, but it worked. It’s always nice to know that if Plans A, B, C, and D don’t work, there is still a Plan E.
The video for this performance, while still amateurish, is better than the others. Enjoy!
After learning a little more about what to do and what NOT to do with your first rendition of Dance Loops (i.e., the “alpha release” @ UCUR), we had a chance to do a few things over for our “open beta” (AKA the “nearly there” version). This time, we were at the Scholarship of Teaching and Engagement Conference (SoTE) at my home school, Utah Valley University, in Orem, Utah. Superstar UVU dancer Hannah Braegger McKeachnie reprised her role from Dance Loops and performed the first section to the music of Julia Kent (an edited version of “Gardermoen”). We still performed in a sub-optimal environment – a partitioned meeting room, in this case – and we still have abominable video but, otherwise, things went beautifully. We also got to meet some wonderful people from other schools who were interested in the piece and may be able to contribute in some way in the future. Very exciting! But, for now, here is our monkeywrench video:
In the software world, the “alpha release” is the “not-quite-ready-for-primetime” version. It is usually circulated internally so the bugs can be worked out, although there are occasionally public alpha releases by very daring (or foolish) companies. I’m not totally sure which of the two camps we fall into, but here is an extremely non-professional video – we like to call it the “bootleg version” – of our first public performance of Dance Loops.
Now, a few alpha release issues with this performance.
The video is shot way off to the side and aimed wrong. The primary video camera didn’t work and, well, this is what we have. Better than nothing (but maybe not by much).
It’s in a classroom auditorium with a very shallow stage and no theatre lighting, but that’s the nature of this event.
The projections are way too fuzzy for this situation; we wanted them a little fuzzy but on this shiny surface it was really exaggerated.
The videos are projected too high; we wanted to avoid the wood rail but learned that the videos need to be on the same level as the dancer and the same size to work best, wood rail be damned.
We thought that there was too much synchronization in the projections during the last rehearsal, so I removed a lot of the unity from the programming for this performance. Big mistake; it just looked jumbled. Never change things without rehearsing first!
We also told the dancers that they didn’t need to follow their phrases so closely and could just play around with them. They did exactly what we told them to but, again, it looked too mushy. Again, never change things without rehearsing!
So, we learned some important lessons. Nevertheless, it was a good experience. Hannah will get to perform her part again in a few weeks, and Molly will do a variation on hers (and another) a week after that. We’re learning!
Like the previous chapter, Chapter 19, “Informal Music Learning Instruments,” is more show-and-tell than hands-on. In this chapter, VJ Manzo shows how Max/MSP/Jitter can be used to create programs that greatly facilitate the exploration of musical concepts like harmony. Beyond this pedagogical goal, though, I also see applications of many of these principles and patches to my own hoped-for work on live looping with my saxophone, especially since VJ’s patches might be good for harmonizing. Hmm… we’ll have to see later this summer.
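As a rough illustration of the kind of diatonic-harmony exploration these patches enable – this is my own Python sketch using MIDI note numbers, not code from the chapter – a harmonizer only needs a scale and a rule for stacking thirds:

```python
# Semitone offsets of the major scale from the tonic
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]

def diatonic_triad(degree, tonic=60):
    """Build the triad on a scale degree (0-6) of a major key, as MIDI notes.

    Stacks diatonic thirds: the root, plus the notes two and four scale
    steps above it, bumping up an octave when a step wraps past the 7th.
    """
    return [
        tonic + MAJOR_SCALE[(degree + step) % 7] + 12 * ((degree + step) // 7)
        for step in (0, 2, 4)
    ]
```

For example, degree 0 in C major (tonic = MIDI 60) yields the C major triad, and degree 4 yields the G major dominant triad. A patch could route each incoming melody note through a table like this to harmonize it in real time.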
Max/MSP/Jitter for Music, Ch. 19: Informal Music Learning Instruments (0 exercises)
Chapter 15, “Audio Effects and Processing,” of VJ Manzo’s book Max/MSP/Jitter for Music shows how to manually create some simple effects – delays and white noise, in particular – and how to manipulate and visualize them. Along the way, the chapter shows how to:
Create a umenu object to provide a dropdown list of effects (although it may work better to provide checkboxes that allow multiple effects to be selected at once)
Create a gate~ object (the MSP audio version) to direct the audio signal towards the appropriate effect patch depending on the effect selected in the umenu
Use the tapin~ object to store snippets of audio and the tapout~ object to delay playback
Use the transport object to provide global control of timing and playback, as well as the ability to specify timing in samples instead of milliseconds
Create a umenu to provide a list of filter options and cascade~ and filtergraph~ objects to allow manual modification of those filters
Create a noise~ object to generate white noise
Use a preset object to save settings for the entire patch
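The core of the tapin~/tapout~ delay line and the noise~ generator can be approximated outside Max in a few lines of Python. This is just a conceptual sketch assuming NumPy – the function names and parameters are mine, not from the book – but it mirrors the patch’s signal flow: tapin~ writes into a circular buffer, tapout~ reads from it at a fixed offset, and feedback routes the delayed signal back into the buffer:

```python
import numpy as np

def delay_effect(signal, sr=44100, delay_ms=250, feedback=0.4, mix=0.5):
    """Feedback delay: a rough analogue of an MSP tapin~/tapout~ pair."""
    delay_samples = int(sr * delay_ms / 1000)
    buf = np.zeros(delay_samples)          # circular buffer (tapin~'s memory)
    out = np.empty_like(signal)
    idx = 0
    for i, x in enumerate(signal):
        delayed = buf[idx]                 # tapout~: read the delayed sample
        out[i] = (1 - mix) * x + mix * delayed
        buf[idx] = x + feedback * delayed  # tapin~: write input plus feedback
        idx = (idx + 1) % delay_samples
    return out

def white_noise(n, amp=0.1, seed=None):
    """White-noise generator: a rough analogue of MSP's noise~ object."""
    rng = np.random.default_rng(seed)
    return amp * rng.uniform(-1.0, 1.0, n)
```

Feeding an impulse through `delay_effect` produces the dry click followed by progressively quieter echoes spaced `delay_ms` apart – the same behavior you hear when patching tapout~’s output back toward tapin~ in Max.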
My next big goal is to set up my KMI SoftStep foot controller to activate effects and possibly modify them while live looping, as both hands will be on my saxophone.
Max/MSP/Jitter for Music, Ch. 15: Audio Effects and Processing (10 exercises)
Woo hoo! It’s a tiny step but an important one, as this is our first successful experiment with live video looping, which will be central to our Dance Loops project at Utah Valley University. This video is based on motion-sensitive recording and processing in Max/MSP/Jitter via my Mac’s iSight camera. You’ll notice that the movement from the first half persists during the second half, in which additional movement is layered. The looped videos for Dance Loops will be filmed with a Kinect video/depth camera and will probably be played back in a very different way, but this represents the first step in that direction.
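The two ideas at work here – detecting movement to trigger recording, and layering the recorded loop over the live image – can be sketched numerically. This is a rough analogue of what the Jitter patch does, not the patch itself; the threshold, function names, and use of NumPy arrays in place of jit.matrix frames are all my own assumptions:

```python
import numpy as np

MOTION_THRESHOLD = 10.0  # assumed: mean pixel difference that counts as "movement"

def motion_detected(prev_frame, frame, threshold=MOTION_THRESHOLD):
    """Crude motion test: mean absolute pixel difference between two frames."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    return float(diff.mean()) > threshold

def layer_loop(live_frame, loop_frame, loop_alpha=0.5):
    """Blend a frame from the recorded loop over the live frame."""
    blended = (1 - loop_alpha) * live_frame + loop_alpha * loop_frame
    return blended.astype(live_frame.dtype)
```

In a real pipeline, `motion_detected` would gate the recording of incoming camera frames into a loop buffer, and `layer_loop` would composite the buffer’s playback over the live feed – which is roughly why the first half’s movement appears to persist under the second half’s new movement.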