Affordances and Constraints

Note-taking for User Experience Design with June Ahn

Don Norman discusses affordances and constraints in The Design of Everyday Things, Chapter Four: Knowing What To Do.

Don Norman - The Design of Everyday Things

User experience design is easy in situations where there’s only one thing that the user can possibly do. But as the possibilities multiply, so do the challenges. We can deal with new things using information from our prior experiences, or by being instructed. The best-designed things include the instructions for their own use, like video games whose first levels act as tutorials, or doors with handles whose shape and placement communicate how you should operate them.


QWERTYBeats research

Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research about a new rhythm interface for blind and low-vision novice musicians


I propose a new web-based accessible rhythm instrument called QWERTYBeats.

QWERTYBeats logo

Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.

Rohan lays beats

The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.

Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.

Indigo lays beats


Milo meets Beethoven

For his birthday, Milo got a book called Welcome to the Symphony by Carolyn Sloan. We finally got around to showing it to him recently, and now he’s totally obsessed.

Welcome To The Symphony by Carolyn Sloan

The book has buttons along the side which you can press to hear little audio samples. They include each orchestra instrument playing a short Beethoven riff. All of the string instruments play the same “bum-bum-bum-BUMMM” so you can compare the sounds easily. All the winds play a different little phrase, and the brass another. The book itself is fine and all, but the thing that really hooked Milo is triggering the riffs one after another, Ableton-style, and singing merrily along.


How should we be teaching music technology?

This semester, I had the pleasure of leading an independent study for two music students at Montclair State University. One was Matt Skouras, a grad student who wants to become the music tech teacher in a high school. First of all, let me just say that if you’re hiring for such a position in New Jersey, you should go right ahead and hire Matt; he’s an exceptionally serious and well-versed musician and technologist. But the reason for this post is a question that Matt asked me after our last meeting yesterday: What should he be studying in order to teach music tech?

Matt is a good example of a would-be music tech teacher. He’s a classical trumpet player by training who has found little opportunity to use that skill after college. Wanting to keep his life as a musician moving forward, he started learning guitar, and, in his independent study with me, has been producing adventurous laptop music with Ableton Live. Matt is a broad-minded listener and a skilled audio engineer, but his exposure to non-classical music is limited in the way typical of people who came up through the classical pipeline. It was at Matt’s request that I put together this electronic music tasting menu.

So. How to answer Matt’s question? How does one go about learning to teach music technology? My first impulse was to say, I don’t know, but if you find out, please tell me. The answer I gave him was less flip: that the field is still taking shape, and it evolves rapidly as the technology does. Music tech is a broad and sprawling subject, and you could approach it from any number of different philosophical and technical angles. I’ll list a few of them here.

Composing for controllerism

My first set of attempts at controllerism used samples of the Beatles and Michael Jackson. For the next round, I thought it would be good to try to create something completely from scratch. So this is my first piece of music created specifically with controllerism in mind.

The APC40 has forty trigger pads. You can use more than forty loops, but it’s a pain. I created eight loops that fit well together, and then made four additional variations of each one. That gave me a set of loops that fit tidily onto the APC40 grid. The instruments are 808 drum machine, latin percussion, wood blocks, blown tube, synth bass, bells, arpeggiated synth and an ambient pad.
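The arithmetic works out neatly: eight loops plus four variations of each makes forty clips, one per pad. Here is a minimal Python sketch of that layout; the instrument names are from the post, but the variation numbering is my own illustration, not taken from the actual Ableton session:

```python
# Sketch: mapping 8 instruments x 5 variations (the original loop
# plus four variants) onto the APC40's 8-column x 5-row pad grid.
instruments = [
    "808 drums", "latin percussion", "wood blocks", "blown tube",
    "synth bass", "bells", "arpeggiated synth", "ambient pad",
]

# Each column holds one instrument; each row holds one variation.
grid = [[f"{inst} v{row + 1}" for inst in instruments] for row in range(5)]

assert sum(len(row) for row in grid) == 40  # fits the 40 pads exactly
```

In Ableton terms, each column is a track and each row is a scene, so a whole row of related variations can be launched with a single scene button.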

40 loops


Ableton Session View and instrument design

We usually think of “recorded” and “live” as two distinct and opposed forms of music. But technology has been steadily eroding the distinction between the two. Controllerism is a performance method using specialized control surfaces to trigger sample playback and manipulate effects parameters with the full fluidity and expressiveness of a conventional instrument. Such performance can take place on stage or in the studio.

Controllerism is attractive to me because I came to music through improvisation: blues, jazz, jam bands. I spent years improvising electronic music with Babsy Singer, though she did the beats and loops, not me. My life as a producer, meanwhile, has involved very little improvisation. Making music with the computer has been more like carefully writing scores. Improvisation and composition are really the same thing, but the timescales are different. Improvisation has an immediacy that composing on paper doesn’t. The computer shortens the loop from thought to music, but there’s still a lot of obligatory clicking around.

It’s certainly possible to improvise on the computer with MIDI controllers, either the usual keyboard variety or the wackier and more exotic ones. Improvising with MIDI and then cleaning up the results more meticulously is pretty satisfying, though my lack of piano skills makes it almost as slow and tedious an input system as the mouse. Jamming on iPhone and iPad apps like Animoog or GarageBand is better. What they lack in screen real estate, they make up for with form factor. Making music on the computer comes to feel like office work after a while. But you can use the phone or the tablet while lying in bed or on the ground, or while pacing around, or basically anywhere. Multitouch also restores some of the immediacy of playing instruments.

There’s also the option of recording a lot of vocal or instrumental improvisation, and then sorting out all the audio afterwards. This is the most satisfying strategy for infusing electronic music with improvisation that I’ve found so far. You get all the free-flowing, body-centered immediacy of live jamming, with no pressure whatsoever to be flawless. However, then you have to do the editing. It’s easier now than it was five or ten years ago, but it’s still labor-intensive. It can take an hour of work to get a few minutes of improv into musical shape.

All of this time, I’ve had severe DJ envy, since their gear is designed for immediacy and improvisation. It’s a lame DJ indeed who meticulously stitches together a set ahead of time in an audio editor. However, DJ tools operate at the level of entire songs. It’s not easy to use Serato to write a new track. I’ve been wanting a tool that gives me the same sense of play, but at the scale of individual samples rather than entire songs.

Enter the APC40. The form factor resembles an MPC, and you can use it that way, to trigger one-shot samples like drum hits or chord stabs. But the intended use case is for Ableton Session View, starting and stopping the playback of loops. By default, loop playback is quantized to the bar, so whenever you hit a pad, the loop begins playing cleanly on the next downbeat. (You can set the quantization interval to be as wide or narrow as you want, or disable it completely.) Playing your loops live makes happy accidents more likely. Of course, unhappy accidents are more likely too. But those are easy to fix in Arrange view. When I discovered that NYU has a little-used APC, I signed it out and started teaching myself controllerism. Here’s a picture of it.
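Launch quantization is simple to model. Here is an illustrative Python sketch of the behavior described above (my own model of "snap to the next bar," not Ableton's actual implementation):

```python
import math

def next_launch_beat(press_beat, quantize_beats=4.0):
    """Beat at which a clip triggered at `press_beat` actually starts.

    With the default of 4 beats (one bar in 4/4), playback snaps to
    the next downbeat. quantize_beats=0 means launch immediately,
    mirroring the "disable it completely" option.
    """
    if quantize_beats == 0:
        return press_beat
    return math.ceil(press_beat / quantize_beats) * quantize_beats

# Press a pad partway through bar 2 (beat 5.3) and the loop waits
# until beat 8.0, the start of the next bar.
```

This snapping is what makes sloppy pad presses come out clean: you can be a third of a beat late and the clip still lands exactly on the downbeat.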

It seems complex, and it is. The Starship Enterprise quality appeals to my tech nerd side. Creating an Ableton session for APC playing is like inventing a new musical instrument, every time. After you design your instrument, then you have to learn how to play it. On the other hand, if you design your instrument right, the actual playing of it can be fun and easy. When I set up the APC with some Michael Jackson samples and let Milo try it, he figured out the concept immediately.


Digital audio basics

Before you can understand how digital audio works, you need to know a few things about the physics of sound. This animation shows a sound wave radiating through the air from a circular source — imagine that it’s a drum or cymbal.

Spherical pressure waves

As you can see, sound is a wave, like a ripple in a pond. Imagine that your ear is at the bottom center of this image. The air pressure against your inner ear is rhythmically increasing and decreasing. Your brain senses how wide those swings in air pressure are (the wave’s amplitude) and how often they’re happening (its frequency), and you experience the result as a sound.
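Those two quantities are all it takes to describe a simple tone: amplitude, heard as loudness, and frequency, heard as pitch. Here is a minimal Python sketch of that pressure swing as a sampled sine wave; the 44,100 samples per second is the CD-standard rate, and the function itself is just for illustration:

```python
import math

SAMPLE_RATE = 44100  # CD-standard samples per second

def sine_wave(freq_hz, amplitude, duration_s):
    """Model the pressure swing at the ear as a sine wave.

    `amplitude` sets how wide the swings are (perceived loudness);
    `freq_hz` sets how often they happen per second (perceived pitch).
    """
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

# A440, the standard tuning pitch, swings 440 times per second:
wave = sine_wave(440, 0.5, 0.01)
```

Digital audio is exactly this move: measuring the continuously varying air pressure at regular intervals and storing the measurements as a list of numbers.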


The great music interface metaphor shift

I’m working on a long paper right now with my colleague at Montclair State University, Adam Bell. The premise is this: In the past, metaphors came from hardware, which software emulated. In the future, metaphors will come from software, which hardware will emulate.

The first generation of digital audio workstations has taken its metaphors from multitrack tape, the mixing desk, keyboards, analog synths, printed scores, and so on. Even the purely digital audio waveforms and MIDI clips behave like segments of tape. Sometimes the metaphors are graphically abstracted, as they are in Pro Tools. Sometimes the graphics are more literal, as in Logic. Propellerhead Reason is the most skeuomorphic software of them all. This image from the Propellerhead web site makes the intent of the designers crystal clear; the original analog synths dominate the image.

Reason with its inspiration

In Ableton Live, by contrast, hardware follows software. The metaphor behind Ableton’s Session View is a spreadsheet. Many of the instruments and effects have no hardware predecessor.

Loops in session view
