This month I’ve been teaching music production and composition as part of NYU’s IMPACT program. A participant named Michelle asked me to critique some of her original compositions. I immediately said yes, and then immediately wondered how I was actually going to do it. I always want to evaluate music on its own terms, and to do that, I need to know what the terms are. I barely know Michelle. I’ve heard her play a little classical piano and know that she’s quite good, but beyond that, I don’t know her musical culture or intentions or style. Furthermore, she’s from China, and her English is limited.
I asked Michelle to email me audio files, and also MIDI files if she had them. Then I had an epiphany: I could just remix her MIDIs, and give my critique totally non-verbally.
Can the computer be an improvisation partner? Can it generate musical ideas of its own in real time that aren’t the product of random number generators or nonsensical Markov chains?
In Joel Chadabe's "Settings For Spirituals," he uses pitch-tracking to perform various effects on a recording of a singer: pitch shifting, chorus, reverb. The result is effectively an avant-garde remix. It isn't exactly my speed, but I like the spirit of the piece – remixing existing recordings is a central pillar of current interactive electronic music. I'm less taken with Chadabe's 1978 "Solo" for Synclavier controlled by theremin. The idea of dynamically controlling a computer's compositions is an intriguing one, and I like the science-fictional visual effect of using two giant theremin antennae to control note durations, and to fade instrumental sounds in and out. Chadabe set the Solo system up to intentionally produce unpredictable results, giving the feeling of an improvisational partner. He describes "Solo" as being "like a conversation with a clever friend." Who wouldn't want such an experience?
Update: I now have a functioning prototype of my app. If you’d like to try it, get in touch.
My NYU masters thesis is a drum programming tutorial system for beginner musicians. It uses a novel circular interface for displaying the drum patterns. This presentation explains the project’s goals, motivations and scholarly background.
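As a rough sketch of the circular-display idea (my own illustration, not the thesis system's actual code), each step of a looped drum pattern can be mapped to a point on a circle, with step 0 at 12 o'clock and the steps proceeding clockwise so the loop's cyclical nature is visible at a glance:

```python
import math

def step_positions(num_steps=16, radius=1.0):
    """Map each step of a looped drum pattern to an (x, y) point on a circle.

    Step 0 sits at 12 o'clock; subsequent steps proceed clockwise.
    Coordinates are standard math coordinates (y increases upward).
    """
    points = []
    for step in range(num_steps):
        # Start at the top (pi/2 radians) and subtract to move clockwise.
        angle = math.pi / 2 - 2 * math.pi * step / num_steps
        points.append((radius * math.cos(angle), radius * math.sin(angle)))
    return points

points = step_positions()
print(points[0])  # step 0 at the top of the circle: (0.0, 1.0), give or take float error
```

A drawing layer would then place a marker at each point, highlighting the active step as the loop plays.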
I was motivated to create a surround remix of a Beatles song by hearing the Beatles Love album in class.
I chose “Here Comes The Sun” because I have the multitracks, and because I heard potential to find new musical ideas within it. Remixing an existing recording is always an enjoyable undertaking, but the process takes on new levels of challenge and reward when the source material is so well-known and widely revered. Much as I enjoy Beatles Love, I feel that it didn’t take enough liberties with the original tracks. I wanted to depart further from the original mix and structure of “Here Comes The Sun.”
For Paul Geluso’s Advanced Audio Production midterm, we were assigned to choose two tracks from his recommended listening list, and compare and contrast them sonically. I chose “Regiment” by David Byrne and Brian Eno, and “Little Fluffy Clouds” by The Orb.
Recorded ten years apart using very different technology, both tracks nevertheless share a similar structure: dance grooves at medium-slow tempos centered around percussion and bass, overlaid with radically decontextualized vocal samples. Both are dense and abstract soundscapes with an otherworldly quality. However, the two tracks have some profound sonic differences as well. “Regiment” is played by human instrumentalists into analog gear, giving it a roiling organic murk. “Little Fluffy Clouds” is a pristine digital recording built entirely from DJ tools, quantized neatly and clinically precise.
Discussing “Silver Apples Of The Moon” puts me in a quandary. I like Morton Subotnick personally, and very much enjoyed studying with him. I appreciate his desire to liberate the world from the shackles of keyboard-centric thinking. There’s no question that his music is personal, original and forward-thinking. But I find myself unable to emotionally connect.
Allmusic’s artist profiles include user-submitted “moods.” The Allmusic artist moods for Subotnick are: Cerebral, Clinical, Detached, Reserved, and Hypnotic. I couldn’t have described “Silver Apples” any better. Subotnick certainly isn’t reserved in person; his willingness to sing and dance spontaneously in class is his most charming quality. But like the work of most of his high-modernist cohort, Subotnick’s music is austere.
For my NYU masters thesis in Music Technology, I’m designing a beginner-oriented music learning app for the iPad and similar devices. It will approach music the way I wish I had been taught it, and the way I’ve been teaching it to my private students.
I’m motivated in this project by a few axiomatic beliefs:
Everyone is born with the capacity to learn music. That capacity just needs to be activated in the right way.
Anyone can and should participate in music actively. Like cooking or sports, music need not be totally mastered to benefit the participant, and it should definitely not be the exclusive province of specialists.
Beginners should study music they’re familiar with, and that they like.
Music teaching for beginners should follow an Afrocentric paradigm that relates to pop, rock and hip-hop. That means starting with rhythm, and treating melodic instruments as percussion.
So here’s what this means for music teaching.
Beginners should learn pentatonics first, then mixolydian. Music education customarily begins with the major scale, but pentatonics and mixolydian are closer to pop, rock, hip-hop and dance common practice.
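To make the comparison concrete, here is a minimal sketch (my own, not part of the app) of how these scales can be represented as semitone offsets from a root, and turned into MIDI note numbers:

```python
# Scales expressed as semitone offsets from the root note.
MAJOR_PENTATONIC = [0, 2, 4, 7, 9]
MINOR_PENTATONIC = [0, 3, 5, 7, 10]
MIXOLYDIAN = [0, 2, 4, 5, 7, 9, 10]  # the major scale with a flatted seventh

def scale_midi_notes(root, intervals):
    """Return one octave of a scale as MIDI note numbers, starting from root."""
    return [root + i for i in intervals]

# E minor pentatonic starting from E3 (MIDI note 52)
print(scale_midi_notes(52, MINOR_PENTATONIC))  # [52, 55, 57, 59, 62]
```

Note how Mixolydian differs from the major scale by a single interval, the flatted seventh – a small change on paper that moves the sound much closer to pop, rock, and blues practice.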
Beginners should work modally for a long time. Being constrained to a certain unvarying group of notes frees up mental bandwidth to think melodically and rhythmically. The best mode to work in is the ambiguous major/minor tonality of the blues. Again, this reflects the majority of the American mainstream.
Only after becoming familiar with blues should students embark on the major scale and diatonic harmony. Traditional music theory pedagogy is based on rules laid down in the eighteenth century. While these rules are of historical interest, their conflict with current music makes them tedious and alienating.
The app will start with drum programming, giving you templates for basic dance styles (hip-hop, techno, rock) and letting you customize them. Once you have some mastery of loop programming and rhythms, the app takes you into basic MIDI sequencing, first with single-note basslines, then simple pentatonics, and on to chords. For the visual aesthetic, I plan to avoid skeuomorphism entirely: the interface will consist of geometric shapes in flat colors and large text. Here’s a concept image:
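To sketch how the style templates might be represented under the hood (the specific patterns below are my own illustrative guesses, not the app's actual presets), each style can be a set of 16-step strings that the learner then customizes one step at a time:

```python
# 16-step patterns, one character per sixteenth note: "x" = hit, "." = rest.
# These specific patterns are illustrative placeholders, not actual app presets.
TEMPLATES = {
    "hip-hop": {
        "kick":  "x..x..x...x.....",
        "snare": "....x.......x...",
        "hat":   "x.x.x.x.x.x.x.x.",
    },
    "techno": {
        "kick":  "x...x...x...x...",
        "snare": "....x.......x...",
        "hat":   "..x...x...x...x.",
    },
    "rock": {
        "kick":  "x.......x.x.....",
        "snare": "....x.......x...",
        "hat":   "x.x.x.x.x.x.x.x.",
    },
}

def toggle_step(pattern, step):
    """Flip one step between hit and rest -- the basic customization gesture."""
    chars = list(pattern)
    chars[step] = "." if chars[step] == "x" else "x"
    return "".join(chars)

# Add a kick on the second eighth note of a four-on-the-floor techno pattern.
print(toggle_step(TEMPLATES["techno"]["kick"], 2))  # "x.x.x...x...x..."
```

Starting from a working groove and toggling steps, rather than facing an empty grid, keeps beginners inside a musical context from the first touch.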
As I contemplate my masters thesis, I’m looking for good examples of beginner-centric musical user interface design. Propellerhead’s new Figure app has been a source of inspiration for me. It’s mostly wonderful, and even its design flaws are instructive.
I have a long history with Propellerhead’s software, beginning with Rebirth in 1998. I’ve made a lot of good music with their stuff, but have also experienced a lot of frustration, mostly due to their insistence on slathering everything with unhelpfully “realistic” design.
I’ve toyed around with several iPhone and iPad music apps. Many are intriguing and fun, but few have inspired me into making “real” music. In preparation for the next Disquiet Junto project, I downloaded Nodebeat and tried some improvisation. I like the result:
The app combines randomness and control in an intriguing way, and I like the fine microtonal control it gives you. It can also act as a MIDI controller for other software, though I haven’t tried that yet. If you want to try it for yourself and you don’t have an iOS or Android device, you can snag the desktop version, for free no less.