The aQWERTYon pitch wheels and the future of music theory visualization

The MusEDLab will soon be launching a revamped version of the aQWERTYon with some enhancements to its visual design, including a new scale picker. Beyond our desire to make our stuff look cooler, the scale picker represents a challenge that we’ve struggled with since the earliest days of aQW development. On the one hand, we want to offer users a wide variety of intriguing and exotic scales to play with. On the other hand, our audience of beginner and intermediate musicians is likely to be horrified by a list of terms like “Lydian dominant mode.” I recently had the idea to represent all the scales as colorful icons, like so:

Read more about the rationale and process behind this change here. In this post, I’ll explain what the icons mean, and how they can someday become the basis for a set of new interactive music theory visualizations.

Continue reading

Noteflight as a DAW

The good people at Noteflight have started doing weekly challenges. I love constraint-based music prompts, like the ones in the Disquiet Junto, so I thought I would try this one: compose a piece of music using only four notes.

Noteflight weekly challenge

The music side of this wasn’t hard. My material tends not to use that many pitches anyway. If you really want to challenge me, tell me I can’t use any rhythmic subdivisions finer than a quarter note. Before you listen to my piece, though, let’s talk about this word, “compose.” When you write using notation, the presumption is that you’re creating a set of instructions for a human performer. Actually getting your composition performed is a challenge, however, unless you have a band or ensemble at your disposal. I work in two music schools, and even I would have a hard time making it happen. (When I have had my music performed, the musicians either used a prose score, learned by ear from a recording, or just improvised.) Noteflight’s target audience of kids in school is vanishingly unlikely to ever hear their work performed, or at least performed well. Matt McLean formed the Young Composers and Improvisers Workshop to address this problem, and he’s doing amazing work, but most Noteflight compositions will only ever exist within the computer.

Given this fact, I wanted to create a piece of music that would actually sound good when played back within Noteflight. This constraint turned out to be a significantly greater challenge than using four notes. I started with the Recycled Percussion instrument, and chose the notes B, E, F, and G, because they produce the coolest sounds. Then I layered in other sounds, chosen because they sound reasonably good. Here’s what I came up with: Continue reading

Designing a more welcoming aQWERTYon experience

This post documents my final project for User Experience Design with June Ahn

The best aQWERTYon screencap

Overview of the problem

The aQWERTYon is a web-based music performance and theory learning interface designed by the NYU Music Experience Design Lab. The name is a play on “QWERTY accordion.” The aQWERTYon invites novices to improvise and compose using a variety of scales and chords normally available only to advanced musicians. Notes map onto the computer keyboard such that the rows play scales and the columns play chords. The user cannot play any wrong notes, which encourages free and playful exploration. The aQWERTYon has a variety of instrument sounds to choose from, and it can also act as a standard MIDI controller for digital audio workstations (DAWs) like GarageBand, Logic, and Ableton Live. As of this writing, there have been 32,000 aQWERTYon sessions.
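To make the rows-play-scales, columns-play-chords idea concrete, here is a minimal Python sketch of one way such a mapping could work. This is an illustration, not the aQWERTYon’s actual code: the scale, root note, and the “each row up skips a scale step” rule are all assumptions made for the example.

```python
# Hypothetical sketch of the aQWERTYon's core mapping idea (not the
# actual implementation): each keyboard row walks up the chosen scale,
# and each column stacks chord tones built from that scale.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees as semitone offsets

# Bottom row of the keyboard first, number row last.
ROWS = ["zxcvbnm,./", "asdfghjkl;", "qwertyuiop", "1234567890"]

def key_to_midi(key, root=60, scale=C_MAJOR):
    """Map a keyboard key to a MIDI note number.

    Moving right along a row walks up the scale one degree at a time;
    moving up a column skips a scale step per row, so a column plays
    stacked thirds, i.e. a chord.
    """
    for row_index, row in enumerate(ROWS):
        if key in row:
            col = row.index(key)
            degree = col + 2 * row_index          # columns stack thirds
            octave, step = divmod(degree, len(scale))
            return root + 12 * octave + scale[step]
    return None  # key is not part of the playing surface

# Under this scheme the column "z", "a", "q", "1" spells a
# C major seventh chord (C, E, G, B):
print([key_to_midi(k) for k in "zaq1"])  # -> [60, 64, 67, 71]
```

The appeal of a scheme like this is that wrong notes are impossible by construction: every key is pinned to a degree of the chosen scale before any sound is made.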

Continue reading

QWERTYBeats design documentation

QWERTYBeats logo

QWERTYBeats is a proposed accessible, beginner-friendly rhythm performance tool with a basic built-in sampler. By simply holding down different combinations of keys on a standard computer keyboard, users can play complex syncopations and polyrhythms. If the app is synced to the tempo of a DAW or other music playback system, the user can easily perform good-sounding rhythms over any song.
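To see why holding two keys yields a polyrhythm, here is a small Python sketch of the underlying arithmetic. The key-to-subdivision assignments are invented for illustration; the proposal itself would decide which keys map to which rhythms.

```python
# A minimal sketch of the QWERTYBeats idea as described above: each
# held key repeats a hit at its own subdivision of the beat, so holding
# two keys at once yields a polyrhythm. Subdivision choices here are
# assumptions for the example, not the app's actual mapping.

def onset_times(subdivision, tempo_bpm=120, beats=1):
    """Return hit times in seconds for one key's repeating rhythm."""
    beat_len = 60.0 / tempo_bpm          # seconds per beat
    step = beat_len / subdivision        # seconds between hits
    total = int(beats * subdivision)
    return [round(i * step, 4) for i in range(total)]

# Holding a hypothetical "triplet" key and an "eighth-note" key
# together gives a 3-against-2 polyrhythm within each beat:
triplets = onset_times(3)   # 3 evenly spaced hits per beat
eighths  = onset_times(2)   # 2 evenly spaced hits per beat
print(sorted(set(triplets + eighths)))
# at 120 BPM: [0.0, 0.1667, 0.25, 0.3333]
```

Because every key’s grid is derived from the same tempo, any combination of held keys stays locked to the beat, which is what makes the results sound good over a synced DAW.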

This project is part of Design For The Real World, an NYU ITP course. We are collaborating with the BEAT Rockers, the Lavelle School for the Blind, and the NYU Music Experience Design Lab. Read some background research here. Continue reading

QWERTYBeats research

Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research about a new rhythm interface for blind and low-vision novice musicians


I propose a new web-based accessible rhythm instrument called QWERTYBeats.

QWERTYBeats logo

Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project. Continue reading

Milo meets Beethoven

For his birthday, Milo got a book called Welcome to the Symphony by Carolyn Sloan. We finally got around to showing it to him recently, and now he’s totally obsessed.

Welcome To The Symphony by Carolyn Sloan

The book has buttons along the side which you can press to hear little audio samples. They include each orchestra instrument playing a short Beethoven riff. All of the string instruments play the same “bum-bum-bum-BUMMM” so you can compare the sounds easily. All the winds play a different little phrase, and the brass another. The book itself is fine and all, but the thing that really hooked Milo is triggering the riffs one after another, Ableton-style, and singing merrily along.

Continue reading

Teaching reflections

Here’s what happened in my life as an educator this past semester, and what I have planned for the coming semester.

Montclair State University Intro To Music Technology

I wonder how much longer “music technology” is going to exist as a subject. They don’t teach “piano technology” or “violin technology.” It makes sense to teach specific areas like audio recording or synthesis or signal theory as separate classes. But “music technology” is such a broad term as to be meaningless. The unspoken assumption is that we’re teaching “musical practices involving a computer,” but even that is both too big and too small to structure a one-semester class around. On the one hand, every kind of music involves computers now. On the other hand, to focus just on the computer part is like teaching a word processing class that’s somehow separate from learning how to write.

MSU Intro to Music Tech

The newness and vagueness of the field of study gives me and my fellow music tech educators wide latitude to define our subject matter. I see my job as providing an introduction to pop production and songwriting. The tools we use for the job at Montclair are mostly GarageBand and Logic, but I don’t spend a lot of time on the mechanics of the software itself. Instead, I teach music: How do you express yourself creatively using sample libraries, or MIDI, or field recordings, or pre-existing songs? What kinds of rhythms, harmonies, timbres and structures make sense aesthetically when you’re assembling these materials in the DAW? Where do you get ideas? How do you listen to recorded music analytically? Why does Thriller sound so much better than any other album recorded in the eighties? We cover technical concepts as they arise in the natural course of producing and listening. My hope is that they’ll be more relevant and memorable that way. Continue reading