I’m working on a long paper right now with my colleague at Montclair State University, Adam Bell. The premise is this: In the past, metaphors came from hardware, which software emulated. In the future, metaphors will come from software, which hardware will emulate.
The first generation of digital audio workstations has taken its metaphors from multitrack tape, the mixing desk, keyboards, analog synths, printed scores, and so on. Even the purely digital audio waveforms and MIDI clips behave like segments of tape. Sometimes the metaphors are graphically abstracted, as they are in Pro Tools. Sometimes the graphics are more literal, as in Logic. Propellerhead Reason is the most skeuomorphic software of them all. This image from the Propellerhead website makes the designers' intent crystal clear; the original analog synths dominate the image.
In Ableton Live, by contrast, hardware follows software. The metaphor behind Ableton’s Session View is a spreadsheet. Many of the instruments and effects have no hardware predecessor.
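If it helps to picture the spreadsheet metaphor, here's a toy sketch: columns are tracks, rows are scenes, and launching a row launches one clip per track. All the names are invented for illustration; this is not Ableton's actual data model.

```python
# A toy model of the Session View metaphor: a grid whose columns are
# tracks and whose rows are scenes, like a spreadsheet. Hypothetical
# names throughout; not Ableton's actual internals.

session = [
    # drums      bass       lead
    ["beat_A",   "bass_A",  None],         # scene 1
    ["beat_A",   "bass_B",  "lead_riff"],  # scene 2
]

def launch_scene(session, row):
    """Return the clips that start playing when you launch a scene row."""
    return [clip for clip in session[row] if clip is not None]

print(launch_scene(session, 1))  # ['beat_A', 'bass_B', 'lead_riff']
```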
My music-making life has revolved heavily around Ableton Live for the past few years, and now the same thing is happening to my music-teaching life. I’m teaching Live at NYU’s IMPACT program this summer, and am going to find ways to work it into my future classes as well. My larger ambition is to develop an all-around electronic music composition/improvisation/performance curriculum centered around Live.
While the people at Ableton have done a wonderful job documenting their software, they mostly presume that users know what they want to accomplish and just don't know how to get there. But my experience of beginner Ableton users (and newbie producers generally) is that they don't even know what the possibilities are, what the workflow looks like, or how to get a foothold. My goal is to fill that vacuum, and I'll be documenting the process extensively here on the blog.
Soundation uses the same basic interface paradigm as other audio recording and editing programs like Pro Tools and Logic. Your song consists of a list of tracks, each of which contains a particular sound. The tracks all play back at the same time, so you can use them to blend sounds together as you see fit. You can record your own sounds, use the loops included with Soundation, or both. The image below shows six tracks: the first two contain loops of audio, and the other four contain MIDI, which I'll explain later in the post.
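To make the paradigm concrete, here's a minimal sketch of a song as a list of simultaneous tracks. The file names and fields are hypothetical, not Soundation's internals.

```python
# A minimal sketch of the multitrack paradigm: a song is a list of
# tracks that all play back at once, each holding either audio or MIDI.

song = [
    {"name": "Drum loop",   "type": "audio", "content": "drums.wav"},
    {"name": "Guitar loop", "type": "audio", "content": "guitar.wav"},
    {"name": "Bass",        "type": "midi",  "content": [36, 36, 43, 36]},
    {"name": "Keys",        "type": "midi",  "content": [60, 64, 67, 72]},
]

def play(song):
    # A real audio engine sums every track's output sample by sample;
    # this just shows that the tracks sound together, not in sequence.
    for track in song:
        print(f"playing {track['name']} ({track['type']})")

play(song)
```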
I’ve undergone some evolution in my thinking about the intended audience for my thesis app. My original idea was to aim it at the general public. But the general public is maybe not quite so obsessed with breakbeats as I am. Then I started working with Alex Ruthmann, and he got me thinking about the education market. There are certainly a lot of kids in the schools with iPads, so that’s an attractive idea. But hip-hop and techno are a tough sell for traditionally-minded music teachers. I realized that I’d find a much more receptive audience among math teachers. I’ve been thinking about the relationship between music and math for a long time, and it would be cool to put some of those ideas into practice.
The design I’ve been using for the Drum Loop UI poses some problems for math class. Since early on, I’ve had the centers of the cells line up with the cardinal angles. However, if you’re going to measure angles, the grid lines really need to fall on the cardinal angles instead. Here’s the math-friendly design:
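To put numbers on the difference, here's a quick sketch of the two layouts, assuming a 16-step loop with angles measured clockwise from the top of the circle. This is just an illustration of the geometry, not the app's actual layout code.

```python
STEPS = 16
CELL = 360 / STEPS  # 22.5 degrees per sixteenth note

# Original design: cell centers sit on the grid angles, so each cell's
# edges straddle them by half a cell width.
center_aligned = [((k * CELL) - CELL / 2, (k * CELL) + CELL / 2)
                  for k in range(STEPS)]

# Math-friendly design: the grid lines are the cell boundaries, so
# cell k spans [k * 22.5, (k + 1) * 22.5) and angles read off cleanly.
edge_aligned = [(k * CELL, (k + 1) * CELL) for k in range(STEPS)]

print(center_aligned[0])  # (-11.25, 11.25): the first cell straddles 0
print(edge_aligned[0])    # (0.0, 22.5): the first cell starts at 0
```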
My thesis is supposed to include a quantitative research component. This had been causing me some anxiety. It’s educational and creative software. What exactly could I measure? I had this vague notion of testing people’s rhythmic ability before and after using the app. But how do you quantify rhythmic ability? Even if I had a meaningful numerical representation, how could I possibly measure a big enough sample size over a long enough time to get a statistically significant result? The development of my app was going okay, but I was really stressing about the experimental component.
Then my advisor introduced me to Andrew Brown’s notion of software development as research, or SoDaR. As Brown puts it, “SoDaR involves computers, but is about people.” Humans are complex, our interactions with computers are complex, and the way we learn is complex. The only method of inquiry that can encompass all that complexity is qualitative, anthropological inquiry, involving a substantial amount of introspection on the part of the researcher.
For this week’s reading on experience design for music education, we moved up a level to think about experience design generally. A lot of design theory tends to boil down to “Design things better!” Marc Hassenzahl’s book falls into that trap a little, but he does have some useful specific ideas. His main thesis is that designers of technology aren’t just designing the technology itself. They’re designing the felt experience of using the technology, intentionally or not. People care less about the technology itself and more about how they feel while using it.
Nearly getting scooped by Loopseque lit a fire under me to put together some more concept images for my thesis app. So here are some examples of the beat programming lessons that form the intellectual heart of my project. The general idea is that you’re given an existing drum pattern: a famous breakbeat, or something more generic. Some of the beats are locked down, guaranteeing that anything you do will sound musical. Click each one to see it bigger.
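Here's a sketch of the locked-beats idea: a 16-step kick pattern where the downbeats are fixed and the student can only toggle the other steps. The pattern, names, and rules are illustrative, not the app's actual lesson format.

```python
pattern = [1, 0, 0, 0] * 4                 # four-on-the-floor kick
locked  = [True, False, False, False] * 4  # downbeats can't be changed

def toggle(step):
    """Flip a step on or off, unless the lesson has locked it down."""
    if locked[step]:
        return  # locked steps guarantee the groove stays intact
    pattern[step] = 1 - pattern[step]

toggle(3)  # allowed: adds a syncopated hit before the second downbeat
toggle(0)  # ignored: the downbeat stays put
print(pattern)
```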
Everyone in the class has to maintain a blog documenting their design process. (Wouldn’t it be cool if every teacher of everything had their students blog about their class work?) My music education experience design project is going to be my thesis, which I’m already blogging about. So instead I’ll use these posts for some public-facing note-taking.