Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research on a new rhythm interface for blind and low-vision novice musicians
I propose a new web-based accessible rhythm instrument called QWERTYBeats.
Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.
For his birthday, Milo got a book called Welcome to the Symphony by Carolyn Sloan. We finally got around to showing it to him recently, and now he’s totally obsessed.
The book has buttons along the side which you can press to hear little audio samples. They include each orchestra instrument playing a short Beethoven riff. All of the string instruments play the same “bum-bum-bum-BUMMM” so you can compare the sounds easily. All the winds play a different little phrase, and the brass another. The book itself is fine and all, but the thing that really hooked Milo is triggering the riffs one after another, Ableton-style, and singing merrily along.
The MusEDLab and Soundfly just launched Theory For Producers, an interactive music theory course. The centerpiece of the interactive component is a MusEDLab tool called the aQWERTYon. You can try it by clicking the image below. (You need to use Chrome.)
In this post, I’ll talk about why and how we developed the aQWERTYon.
Here’s what happened in my life as an educator this past semester, and what I have planned for the coming semester.
Montclair State University Intro To Music Technology
I wonder how much longer “music technology” is going to exist as a subject. They don’t teach “piano technology” or “violin technology.” It makes sense to teach specific areas like audio recording or synthesis or signal theory as separate classes. But “music technology” is such a broad term as to be meaningless. The unspoken assumption is that we’re teaching “musical practices involving a computer,” but even that is both too big and too small to structure a one-semester class around. On the one hand, every kind of music involves computers now. On the other hand, to focus just on the computer part is like teaching a word processing class that’s somehow separate from learning how to write.
The newness and vagueness of the field of study gives me and my fellow music tech educators wide latitude to define our subject matter. I see my job as providing an introduction to pop production and songwriting. The tools we use for the job at Montclair are mostly GarageBand and Logic, but I don’t spend a lot of time on the mechanics of the software itself. Instead, I teach music: How do you express yourself creatively using sample libraries, or MIDI, or field recordings, or pre-existing songs? What kinds of rhythms, harmonies, timbres and structures make sense aesthetically when you’re assembling these materials in the DAW? Where do you get ideas? How do you listen to recorded music analytically? Why does Thriller sound so much better than any other album recorded in the eighties? We cover technical concepts as they arise in the natural course of producing and listening. My hope is that they’ll be more relevant and memorable that way.
I’m a proud member of the NYU Music Experience Design Lab, a research group that crosses the disciplines of music education, technology, and design. Here’s an overview of our many ongoing projects.
The folks at Olympia Noise Co recently came out with a new circular drum machine for iOS called Patterning, and it’s pretty fabulous.
The app’s futuristic look jumps right out at you: flat-colored geometric shapes with zero adornment, in the spirit of Propellerhead Figure. There’s nothing on the screen that doesn’t function in some way. It’s a little dense at first glance, but a complex tool is bound to have a complex interface, and Patterning reveals itself easily through exploration.
I’m part of a research group at NYU called the Music Experience Design Lab. One of our projects is called Play With Your Music, a series of online interactive music courses. We’re currently developing the latest iteration, called Play With Your Music: Theory. Each module presents a “musical simple,” a short and memorable loop of melody or rhythm. Each simple is a window into one or more music theory concepts. Users can learn and play with the simples using a new interface called the aQWERTYon, which maps scales and chords to the regular computer keyboard.
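To make the keyboard-mapping idea concrete, here is a minimal sketch of how an aQWERTYon-style interface might assign a row of QWERTY keys to scale degrees. The row choice, root note, and function names here are my own assumptions for illustration, not the actual aQWERTYon implementation.

```python
# Hypothetical sketch of an aQWERTYon-style mapping: each key in a QWERTY row
# becomes one degree of a scale, wrapping up an octave when the scale repeats.
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale

def build_keymap(row="asdfghjkl;", root=60):
    """Map each key in a keyboard row to a MIDI note of the scale on `root`."""
    keymap = {}
    for i, key in enumerate(row):
        octave, degree = divmod(i, len(MAJOR_SCALE))
        keymap[key] = root + 12 * octave + MAJOR_SCALE[degree]
    return keymap

keys = build_keymap()
print(keys["a"])  # 60: middle C, the root
print(keys["k"])  # 72: the root an octave up
```

With a mapping like this, a novice can play only in-scale notes no matter which keys they press, which is the core accessibility idea behind the tool.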
My newest music student is a gentleman named Rob Precht. As is increasingly the case with people I teach privately, Rob lives many time zones away, and he and I have never met face to face. Instead, we’ve been conducting lessons via a combination of Skype and Splice. It’s the first really practical remote music teaching method I’ve used, and I can’t recommend it highly enough.
Rob came to me via this very blog. He’s a semi-retired lawyer who took some piano lessons as a kid but doesn’t have much other music training or experience. He approached me because he wanted to compose original music, and he thought (correctly) that computer-based production would be the best way to go about it. He had made a few tracks with GarageBand, but quickly switched over to Ableton Live after hearing me rave about it. We decided that the best approach would be to have him just continue to stumble through making original tracks, and I would help him refine and develop them.
If you want to understand the cultural struggle taking place in music education right now, you could do worse than to start with the harmonica.
This unassuming little instrument was designed in central Europe in the 19th century to play the music popular in that time and place: waltzes, oom-pah music, and the like. All of this music is diatonic, meaning that it’s based around the major scale, the do-re-mi you learned in school. It’s also the music that you learn if you take a formal music theory class.
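The do-re-mi major scale mentioned above follows a fixed pattern of whole and half steps (W-W-H-W-W-W-H). A quick sketch, purely illustrative, shows how that pattern generates the scale from any root note:

```python
# Build a major (diatonic) scale from the whole/half-step pattern W-W-H-W-W-W-H.
STEPS = [2, 2, 1, 2, 2, 2, 1]  # semitones between successive scale degrees

def major_scale(root=60):  # 60 is middle C in MIDI numbering
    notes = [root]
    for step in STEPS:
        notes.append(notes[-1] + step)
    return notes

print(major_scale())  # [60, 62, 64, 65, 67, 69, 71, 72]: C major, root to octave
```

Every major scale, whatever the root, is the same pattern transposed, which is why diatonic instruments like the harmonica can be built around a single key.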