Designing music learning experiences with technology

This semester I’m working as a research assistant to Alex Ruthmann at NYU. The job includes helping him with a new joint music education and music technology class, Designing Technologies & Experiences for Music Making, Learning and Engagement. Here’s the bibliography. The central class project is to create a music education technology experience — a lesson plan or classroom activity, a piece of software or hardware, or something outside those categories.

Everyone in the class has to maintain a blog documenting their design process. (Wouldn’t it be cool if every teacher of everything had their students blog about their class work?) My music education experience design is going to be my thesis, which I’m already blogging about. So instead I’ll use these posts for some public-facing note-taking.

Most existing technology for music teaching is, shall we say, not good. Software designed for the pros is too difficult for novices; even GarageBand is too complex. The stuff for beginners is too often boring drills with a thin layer of game on top — exercises lacking musical context and any intrinsic motivation for playing. It’s rare for music ed tech to be tested in the classroom at all. The complexity of the classroom environment does make it hard to run conclusive tests that control for all the confounding factors, but anecdotal and qualitative observations are better than nothing.

All teachers are really experience designers — the experience is whatever takes place in the classroom. And experience designers are de facto teachers, whether they realize it or not. Interface metaphors, presets and defaults are all lessons that need to be learned.

This didn’t come up in class, but I’ve heard teachers compared to DJs. David Wiley says this:

Both [DJs and teachers] start with a collection of existing materials – acoustic resources like songs, sound effects, and samples, and educational resources like simulations, tutorials, and articles. Both sequence and blend these materials in interesting ways. Both do quite a bit of planning (think syllabus as playlist), perform in discrete blocks of time (think course meeting as set); and both have to make meaningful connections between the resources they choose to employ (think lecturing and discussion leading as beat matching).

Clubbers vote with their feet, and generally do so very overtly. Learners vote with their attention, and generally do so very covertly. How do we, as teachers, “keep the dance floor full?” A skilled DJ can feel the energy coming off a crowd and respond very quickly when that group is starting to feel restless (and starting to abandon the dance floor). A skilled teacher can feel the energy coming off a class and respond very quickly when that group is starting to get restless (and starting to doodle, read books, play games on their cell phones, etc.). The DJ responds by playing different music, sticking with genres that the crowd likes. How does the teacher respond? By using different examples, sticking with the kinds of explanations that the learners resonate with? By understanding the rhythm of the class, by knowing when to “play a slow song?”

Music teachers and experience designers can keep the kids engaged by working with actual music with some cultural value to it. The music doesn’t have to be current pop; kids get exposed to a wide variety of things in TV and movies. Alex is enthusiastic about the potential of the Echo Nest and Spotify APIs to open up the ocean of recordings to creative software designers. Kids are eager to participate in music that they know and like. Technology makes it possible to use recorded music as raw material for new work. This has been possible for many years, but software like Ableton Live is out of reach of most school budgets. Alex points to web-based remix tools like the Bohemian Rhapsichord by Jennie and Paul Lamere as a better direction for music ed tech.

The Bohemian Rhapsichord

The Bohemian Rhapsichord is made possible by the rapid proliferation of detailed musical metadata. The feature extraction employed by the Echo Nest is basically just very organized, very granular musicology.
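To make that concrete, here’s a rough sketch of the idea in Python. The Echo Nest analysis really does describe a song as a list of timed segments with pitch information, but the segment values below are invented for illustration, and `build_keyboard` is my own hypothetical helper, not part of any API.

```python
# Sketch of the idea behind the Bohemian Rhapsichord: take per-segment
# metadata of the kind the Echo Nest analysis provides (start time,
# duration, dominant pitch class) and group segments by pitch, so each
# "key" of an on-screen instrument triggers every moment in the song
# that features that pitch. Segment values are invented, not real
# analysis data from "Bohemian Rhapsody".

from collections import defaultdict

PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F",
               "F#", "G", "G#", "A", "A#", "B"]

def build_keyboard(segments):
    """Map each pitch name to the (start, duration) pairs that feature it."""
    keyboard = defaultdict(list)
    for seg in segments:
        keyboard[PITCH_NAMES[seg["pitch_class"]]].append(
            (seg["start"], seg["duration"]))
    return dict(keyboard)

segments = [
    {"start": 0.0,  "duration": 0.45, "pitch_class": 10},  # A#
    {"start": 0.45, "duration": 0.38, "pitch_class": 7},   # G
    {"start": 0.83, "duration": 0.52, "pitch_class": 10},  # A#
    {"start": 1.35, "duration": 0.41, "pitch_class": 0},   # C
]

keyboard = build_keyboard(segments)
print(keyboard["A#"])  # both A# segments, playable as one key
```

Once the segments are grouped this way, “playing” the instrument is just scheduling playback of the audio slices behind whichever key gets pressed.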

The Rhapsichord also functions as an attractive visualization scheme. Visualization is an invaluable tool for music educators. Music notation is basically just a non-interactive visualization scheme.

Margaret Mead said: “If anthropologists were fish, the one thing they wouldn’t notice is water.” Alex asked the class, what is the water of music ed? Some possibilities:

  • Recordings. Most of us hear much more recorded music than live music. Our expectations have changed as a result. We’re used to flawless renditions of professionally written material, mixed beautifully. Live performance can’t help but be a disappointment.
  • Producers and engineers. At this point, the people behind the mixing board have as much impact on the music as the performers, if not more. But you rarely hear about the recording process in the classroom. It isn’t just pop fans who need to understand concepts like mixing and space; they matter in classical music too. The physical arrangement of orchestras and choruses is a form of mixing.
  • All this listening to disembodied recordings produced by mysterious means puts us in an alienated, almost hallucinatory state that R. Murray Schafer called “schizophonia.” Maybe the main job of a music teacher now is to cure schizophonia.

From Pro Tools to Sibelius, most professional music software opens with a blank slate. A new Pro Tools session defaults to zero tracks. A new Sibelius document looks like blank staff paper. That’s great for the pros, but terrifying for novices. Untrained listeners hear music as solid blocks of sound — they can follow lyrics or rhythms, but they have a hard time parsing out individual instrument lines. Given GarageBand, a lot of kids lay out one loop at a time, in a sequential stairstep pattern; they don’t know what a finished track is supposed to look like until they see one. Alex has an exercise where kids start with a cacophonous mass of loops and have to turn it into a decent-sounding track through subtraction alone. Mute and Solo are magical teaching tools, especially when applied to multitracks from well-known songs.
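The subtraction exercise rests on a simple audibility rule that’s easy to model. The sketch below is a hypothetical minimal mixer, not any DAW’s actual code; the rule it implements (mute silences a track, and any active solo silences every non-soloed track) matches common DAW behavior.

```python
# A minimal model of Mute and Solo as teaching tools: the class hears
# only the tracks that survive the usual DAW audibility rule. Track
# names are hypothetical stand-ins for stems of a well-known song.

def audible(tracks):
    """Return the names of the tracks the class would hear."""
    any_solo = any(t["solo"] for t in tracks)
    return [t["name"] for t in tracks
            if not t["mute"] and (t["solo"] or not any_solo)]

mix = [
    {"name": "drums",  "mute": False, "solo": False},
    {"name": "bass",   "mute": True,  "solo": False},
    {"name": "vocals", "mute": False, "solo": False},
]

print(audible(mix))   # ['drums', 'vocals'] -- bass is muted out
mix[2]["solo"] = True
print(audible(mix))   # ['vocals'] -- solo isolates one line for the ear
```

The pedagogical point is exactly this rule: each Mute strips one layer out of the cacophony, and each Solo isolates a single line that untrained ears couldn’t pick out of the full mix.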

Music is customarily about filling time, but it also fills space. Visualization lives in space, very literally. Software can unite the visualization and the related sound into a single experience. Alex demoed an as-yet unreleased iPad app called Melody Morph, which lets you lay out shapes that trigger different notes, then play them back by tracing paths across and around them with a fingertip. When Alex recorded and played back a path through the space of the app, it was a real “aha” moment for the class.
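Melody Morph’s internals aren’t public, but one plausible mechanism for a canvas like that is simple hit-testing: replay the recorded fingertip path against circular note shapes and trigger a note each time the path enters a shape. The shapes and notes below are invented for illustration.

```python
# A plausible sketch of a Melody Morph-style canvas (not the app's
# actual implementation): hit-test a recorded fingertip path against
# circular shapes, playing a shape's note when the path crosses into it.

import math

shapes = [
    {"x": 1.0, "y": 1.0, "r": 0.5, "note": "C4"},
    {"x": 3.0, "y": 1.0, "r": 0.5, "note": "E4"},
    {"x": 5.0, "y": 1.0, "r": 0.5, "note": "G4"},
]

def notes_along(path):
    """Replay a path of (x, y) points, emitting a note each time the
    fingertip enters a shape it wasn't already inside."""
    played, inside = [], None
    for x, y in path:
        hit = next((s for s in shapes
                    if math.hypot(x - s["x"], y - s["y"]) <= s["r"]), None)
        if hit is not None and hit is not inside:
            played.append(hit["note"])
        inside = hit
    return played

# Dragging left to right across all three shapes plays an arpeggio;
# replaying the same recorded path reproduces it exactly.
path = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0),
        (3.0, 1.0), (4.0, 1.0), (5.0, 1.0)]
print(notes_along(path))  # ['C4', 'E4', 'G4']
```

Because the melody is just a stored path, recording and replaying it is trivial — which is exactly what made Alex’s demo land as an “aha” moment.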

The class is not just going to be about software. It’s possible to quickly create new physical interfaces on sheets of paper using conductive ink or regular old pencil graphite, communicating capacitance changes to the computer or phone via a cheap Bluetooth transmitter. Motion sensing is becoming an accessible mainstream technology, and eye tracking is close on its heels. New controller schemes can help to liberate us from the tyranny of the mouse, much like Morton Subotnick wants analog synths to liberate us from the tyranny of the keyboard.
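On the software side, the paper-interface idea mostly reduces to turning a stream of capacitance readings into touch events. Here’s a hedged sketch: a plain threshold with hysteresis, so noisy readings near the boundary don’t chatter on and off. The threshold values and samples are illustrative, not from any particular sensor or transmitter.

```python
# Turn raw capacitance samples (as they might arrive over Bluetooth
# from a pencil-graphite pad) into touch events. Hysteresis: a touch
# registers above the press threshold and releases only below the
# lower release threshold, so borderline noise doesn't flicker.

def touch_events(readings, press=100.0, release=80.0):
    """Return ('down', i) / ('up', i) events from raw samples."""
    touching = False
    events = []
    for i, value in enumerate(readings):
        if not touching and value >= press:
            touching = True
            events.append(("down", i))
        elif touching and value <= release:
            touching = False
            events.append(("up", i))
    return events

samples = [40, 55, 120, 130, 95, 85, 70, 45, 110, 60]
print(touch_events(samples))
# [('down', 2), ('up', 6), ('down', 8), ('up', 9)]
```

Each “down” event would then trigger whatever note or sound the paper pad is mapped to, the same way a key press would.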

In his book Experience Design, Nathan Shedroff lists three components of an educational experience (or any kind of experience).

  1. Attraction — the surface qualities.
  2. Engagement — the experience itself.
  3. Conclusion — the win condition, the presentation, the performance.

Shedroff’s evaluation criteria for experience design:

  • Breadth
  • Intensity
  • Duration
  • Triggers
  • Interaction
  • Significance

Looks like we have our work cut out for us. Should be a fun semester.