QWERTYBeats design documentation

QWERTYBeats logo

QWERTYBeats is a proposed accessible, beginner-friendly rhythm performance tool with a basic built-in sampler. By simply holding down different combinations of keys on a standard computer keyboard, users can play complex syncopations and polyrhythms. If the app is synced to the tempo of a DAW or other music playback system, the user can easily perform good-sounding rhythms over any song.

This project is part of Design for the Real World, an NYU ITP course. We are collaborating with the BEAT Rockers, the Lavelle School for the Blind, and the NYU Music Experience Design Lab. Read some background research here.

User persona

The intended QWERTYBeats user is a student at the Lavelle School between the ages of twelve and eighteen. She enjoys music, but has little experience participating in it, and no formal training. She has limited vision, along with some additional motor and cognitive challenges. She has access to internet-connected mobile devices and/or desktop computers in her music classroom, and possibly at home as well.

Pedagogical goals

My goal with QWERTYBeats is to create an expressive, playful rhythm instrument for beginner-level musicians with low or no vision. Within a few minutes of self-guided play, novice users should be able to produce appealing beats. The app should also be deep and versatile enough to reward extensive exploration, and to support meaningful music-making by intermediate or advanced beatmakers or instrumentalists. My design process follows Mitch Resnick’s principle of “low floors, wide walls, and high ceilings.”

Our assignment in Design for the Real World is to create a music interface that supports the work of the BEAT Rockers in the Lavelle School and elsewhere. In addition to their musical activities, the BEAT Rockers are helping Lavelle students with speech therapy activities via beatboxing. Speech therapy requires extensive practice that young people find tedious. Beatboxing can transform speech practice from a chore into an outlet for creative social music making.

For most beginner-level musicians, timekeeping is a major obstacle to sounding good. (It continues to be an obstacle for many advanced musicians as well.) QWERTYBeats will enable entry-level beatboxers to record their own mouth sounds, and then play them back in flawlessly quantized rhythms. In my experience, drum programming can lay the cognitive groundwork for better musical timekeeping and a stronger rhythmic sense generally. I hope that the sampler functionality of QWERTYBeats will embolden users to try out rhythmic ideas that would otherwise be beyond their abilities, to beatbox along, and, eventually, to feel confident performing with or without the computer’s help.

Desktop version

QWERTYBeats was originally conceived as a web app controlled via the standard QWERTY keyboard. It follows the model of the aQWERTYon, a browser-based melody and harmony app developed by the Music Experience Design Lab.

aQWERTYon screencap

The aQWERTYon was designed to turn an ordinary desktop computer into an expressive, beginner-friendly musical instrument without the need for any additional software or hardware. While it was not designed with accessibility as an explicit goal, the keyboard is a natural tactile interface, conveniently organized into rows and columns. QWERTYBeats builds on the aQWERTYon’s “no wrong notes” design, adding a visual interface that takes accessibility more fully into account.

QWERTYBeats desktop - load screen

Desktop – Perform

The app defaults to the Perform screen. Here, each key on the QWERTY keyboard is mapped to one of four drum sounds and one of eight rhythmic values. Tapping a key plays the drum sound once. Holding it down retriggers the drum sound until the key is released.

QWERTYBeats desktop perform - no keys held down
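
As a minimal sketch of this hold-to-retrigger behavior, the logic might look like the following. The playSample() helper and the keyMap lookup table are hypothetical placeholders (the full layout appears later in this document), not the app’s actual code:

```ts
// Sketch of the hold-to-retrigger behavior: the first hit plays on key press,
// then repeats at the key's rhythmic interval until the key is released.
// keyMap and playSample() are illustrative placeholders, not an existing API.
const keyMap: Record<string, { sound: string; beats: number }> = {
  "1": { sound: "kick", beats: 1 },     // quarter note
  "2": { sound: "kick", beats: 1 / 2 }, // eighth note
  // ...remaining keys omitted
};

let bpm = 120;
const heldKeys = new Map<string, number>(); // key -> interval timer id

function playSample(sound: string): void {
  console.log(`play ${sound}`); // would trigger a Web Audio buffer source in practice
}

document.addEventListener("keydown", (e) => {
  const mapping = keyMap[e.key];
  if (!mapping || heldKeys.has(e.key)) return; // ignore keyboard auto-repeat
  playSample(mapping.sound);                    // first hit on key press
  const periodMs = (60000 / bpm) * mapping.beats;
  heldKeys.set(e.key, window.setInterval(() => playSample(mapping.sound), periodMs));
});

document.addEventListener("keyup", (e) => {
  const timer = heldKeys.get(e.key);
  if (timer !== undefined) {
    clearInterval(timer);
    heldKeys.delete(e.key);
  }
});
```

In practice, a setInterval-based clock drifts and is not sample-accurate; a production version would more likely schedule hits ahead of time against the Web Audio clock.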

The user can set the tempo numerically via a text field, by clicking the tap tempo link, or by tapping the Tab key (a common DAW convention).
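
As a rough illustration of how tap tempo might work (an assumption on my part, not a description of the final implementation), the app could average the intervals between the last few Tab presses:

```ts
// Sketch of a tap-tempo estimator: average the gaps between recent taps
// and convert milliseconds-per-beat into BPM. The five-tap window is arbitrary.
const tapTimes: number[] = [];

function registerTap(): number | null {
  tapTimes.push(performance.now());
  if (tapTimes.length > 5) tapTimes.shift();
  if (tapTimes.length < 2) return null; // need at least two taps to measure an interval
  const intervals = tapTimes.slice(1).map((t, i) => t - tapTimes[i]);
  const avgMs = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  return Math.round(60000 / avgMs);
}

document.addEventListener("keydown", (e) => {
  if (e.key !== "Tab") return;
  e.preventDefault(); // keep Tab from moving keyboard focus
  const bpm = registerTap();
  if (bpm !== null) console.log(`tempo: ${bpm} BPM`);
});
```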

When a key is held down, the screen shows which sound is playing and at what rhythmic interval.

QWERTYBeats desktop perform

It is possible to perform with QWERTYBeats without ever looking at or being able to see the screen. But for sighted users, I expect that the visual appeal of the subdivided circles will inspire aural rhythmic exploration, and vice versa.

It is not possible to vary note velocity with the QWERTY keyboard, since keys are either pressed or not. The app can compensate for this somewhat by slightly randomizing the velocity of each playback event.
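
A sketch of that compensation, assuming playback through the Web Audio API; the specific jitter range here is only an illustration:

```ts
// Play a sample at a slightly randomized velocity so that repeated hits
// do not sound machine-gun identical. The 0.85-1.0 range is illustrative.
function triggerSample(ctx: AudioContext, buffer: AudioBuffer): void {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  const gain = ctx.createGain();
  gain.gain.value = 0.85 + 0.15 * Math.random(); // "velocity" between 0.85 and 1.0
  source.connect(gain).connect(ctx.destination);
  source.start();
}
```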

The idea of representing rhythms as subdivided circles came to me via many different sources, including the wonderful Figure and Rhythm Necklace apps, and the work of Godfried Toussaint as translated into the Groove Pizza. The specific graphic feel of the circles draws on the cover artwork from Selected Ambient Works Volume II by Aphex Twin.

Aphex Twin - Selected Ambient Works Volume II - CD back cover

Desktop – rhythm layout

The table below shows the keys assigned to each drum sound:

kick        snare       hi-hat      clap
1   2       3   4       5   6       7   8
Q   W       E   R       T   Y       U   I
A   S       D   F       G   H       J   K
Z   X       C   V       B   N       M   ,

Within each pair of columns, the keys retrigger at the following rhythmic intervals:

Rhythmic values for each sound

Left column                   Right column
1     quarter note            1/2   eighth note
3/4   dotted eighth note      1/4   sixteenth note
2/3   quarter note triplet    1/3   eighth note triplet
3/16  dotted sixteenth note   1/6   sixteenth note triplet

The fractions in this table (and in the graphics above) presume that the quarter note has the beat, as is the case in 4/4 time. All other rhythmic intervals are shown as subdivisions of the quarter note. At 120 beats per minute, a quarter note would be 0.5 seconds long; an eighth note would be 0.25 seconds long; and so on.
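
The arithmetic is simple enough to express directly; this small helper (illustrative only) reproduces the examples above:

```ts
// Length in seconds of a rhythmic value expressed as a fraction of a quarter note.
// At a given BPM, one quarter note lasts 60 / BPM seconds.
function noteLengthSeconds(bpm: number, fractionOfQuarter: number): number {
  return (60 / bpm) * fractionOfQuarter;
}

noteLengthSeconds(120, 1);     // quarter note        -> 0.5 s
noteLengthSeconds(120, 1 / 2); // eighth note         -> 0.25 s
noteLengthSeconds(120, 3 / 4); // dotted eighth note  -> 0.375 s
noteLengthSeconds(120, 1 / 3); // eighth-note triplet -> ~0.167 s
```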

Desktop – Sampler

Users can record or import their own drum sounds via the Sampler page.

QWERTYBeats desktop sampler

To audition a sound, the user clicks the Play icon to the left of the waveform. To upload a different sound in that slot, the user clicks the arrow icon to the right of the waveform. To record new sounds, the user clicks the waveform itself, opening the Record screen.

QWERTYBeats desktop full-screen recording

To record, the user simply holds down the space bar. When the space bar is released, recording stops automatically, and the newly recorded sound plays back. The app automatically trims silence from the beginning of the sound by detecting an amplitude change above a certain threshold.
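
A sketch of that trimming step, assuming the recording has already been decoded into a Web Audio AudioBuffer; the threshold value here is only a placeholder:

```ts
// Remove leading silence by scanning for the first sample whose amplitude
// exceeds a threshold, then copying the remainder into a new buffer.
function trimLeadingSilence(ctx: AudioContext, buffer: AudioBuffer, threshold = 0.02): AudioBuffer {
  const data = buffer.getChannelData(0);
  let start = 0;
  while (start < data.length && Math.abs(data[start]) < threshold) start++;

  const trimmed = ctx.createBuffer(
    buffer.numberOfChannels,
    Math.max(1, buffer.length - start),
    buffer.sampleRate
  );
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    trimmed.copyToChannel(buffer.getChannelData(ch).subarray(start), ch);
  }
  return trimmed;
}
```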

QWERTYBeats desktop recording - keep or redo

If the user likes their sound, they press Enter to return to the Sampler page. Otherwise they can re-record using the space bar as above.

Desktop – kit menu

QWERTYBeats comes preloaded with seven drum kits across a variety of timbres: electronic, beatbox, and acoustic.

QWERTYBeats desktop kit menu

Clicking a kit automatically loads it in the Perform page.

Desktop – user flow

QWERTYBeats desktop user flow

Accessibility notes

It is my goal to make the app accessible to users with the widest possible diversity of abilities. The menus and text commands will use plain text, CSS and JavaScript to support screen readers. The only mode I have tested significantly is Perform, which is easily discoverable by trial and error without any reference to the screen. The details of the Sampler user interface will probably need to be adjusted based on further testing.

QWERTYBeats desktop - welcome

QWERTYBeats desktop - perform - accessibility

QWERTYBeats desktop - sampler - accessibility

QWERTYBeats desktop - record - accessibility

Mobile version

How can an interface designed around the QWERTY keyboard function on a mobile device? My solution is to recreate the four-by-eight grid of keys as regions on the touchscreen.

Mobile – Perform

To perform with mobile QWERTYBeats, the user simply taps and holds the onscreen circles as if they were keys on the keyboard.

QWERTYBeats mobile perform - no circles touched

The target area for each circle includes the entire “grid cell” that it occupies. When a circle is activated, its color palette inverts to give visual reinforcement of the auditory feedback.

QWERTYBeats mobile perform
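
A sketch of how a touch point might be resolved to a grid cell, treating the full cell as the target area; the function and variable names here are hypothetical:

```ts
// Map a touch position to one of the 8-column by 4-row grid cells.
// Each pair of columns forms one sound zone, mirroring the desktop layout.
const sounds = ["kick", "snare", "hi-hat", "clap"];

function cellForTouch(x: number, y: number, screenWidth: number, screenHeight: number) {
  const col = Math.min(7, Math.floor((x / screenWidth) * 8));
  const row = Math.min(3, Math.floor((y / screenHeight) * 4));
  return { row, col, sound: sounds[Math.floor(col / 2)] };
}

// Example: a touch near the left edge, halfway down a 375 x 667-point screen
cellForTouch(10, 333, 375, 667); // -> { row: 1, col: 0, sound: "kick" }
```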

The user can set the tempo by tapping the upper left corner of the screen. Alternatively, the user can tap and hold on the tempo indicator, which can be edited via ordinary system text entry.

I have had several discussions about the idea of creating a tactile overlay for the screen to assist blind users in discovery. This is a direction I am eager to pursue in the future. In the immediate term, however, I am optimistic that blind users will be able to learn the interface through spatial memory. Several of my colleagues work with students who have limited vision along with other physical and cognitive challenges, and they have successfully incorporated mobile apps like Thumbjam and iKaossilator. The touchscreen seems to pose little obstacle in and of itself.

Mobile – rhythm layout

The layout of the grid is identical to the desktop version.

kick            snare           hi-hat          clap
1     1/2       1     1/2       1     1/2       1     1/2
3/4   1/4       3/4   1/4       3/4   1/4       3/4   1/4
2/3   1/3       2/3   1/3       2/3   1/3       2/3   1/3
3/16  1/6       3/16  1/6       3/16  1/6       3/16  1/6

Mobile – Sampler

The mobile Sampler’s functionality is broadly the same as in the desktop version, with some features removed to keep the screen uncluttered. The main difference is that the mobile version does not support uploading existing audio files. I am open to exploring this functionality in future versions, but browsing audio files with a screen reader poses significant accessibility challenges that I would need to resolve through extensive testing. Also, the mobile Sampler has no link to the kits menu.

QWERTYBeats mobile sampler

As in the desktop version, sounds can be auditioned with the play button next to each waveform. To add new sounds, the user taps the waveform itself to open the Record screen.

QWERTYBeats mobile full-screen record

Here, the user interface is modeled on Snapchat and Vine: the user holds the screen to record. When the user releases, recording stops and the sound is automatically played back.

QWERTYBeats mobile - keep or reject

As in the desktop version, the user may choose to add this new sound to the kit, or re-record it.

Mobile – kit menu

As in the desktop version, tapping the name of a kit automatically loads it in the Perform screen.

QWERTYBeats mobile kit menu

Mobile – user flow

QWERTYBeats mobile - user flow

Accessibility

QWERTYBeats mobile - perform - accessibility

QWERTYBeats mobile - sampler - accessibility

Ableton prototypes

So far, I have carried out most design and testing using prototypes built in Ableton Live’s Session view. Each prototype consists of four samplers, each controlled by a grid of MIDI clips playing the appropriate rhythms. Each MIDI clip’s launch button is mapped to a key on the keyboard.

QWERTYBeats - Ableton prototype

Because it is easy to remap the keys and replace samples in Live, I have been able to quickly produce and test a variety of keyboard layouts controlling a variety of drum sounds. The SoundCloud playlist below showcases some of my own tests. All of these tracks are unedited improvisations.

While QWERTYBeats was intended as a drum performance tool, it has proven to be a surprisingly gratifying melodic instrument as well. Several of my prototype sessions use pitched sounds like tuned gongs. As it turns out, melodies restricted to only four pitches that interact in rhythmically complex ways nearly always sound good. Performances by my testers and by me evoke gamelan music, Philip Glass, Steve Reich, and Terry Riley. I have had similar experiences with the Rhythm Necklace app, another rhythm tool with unexpected melodic applications.

Rhythm Necklace

Playtesting alternative layout schemes

My initial approach to the keyboard layout was to have each row play a different sound and each column retrigger at a different rhythmic value. So the top row of keys (1, 2, 3…) would play the kick drum, the second row (Q, W, E…) would play the snare, the third row would play hi-hats, and the fourth row would play claps. Meanwhile, the first column (1, Q, A, Z) would play quarter notes, the second column (2, W, S, X) would play eighth notes, and so on. This layout has an appealing simplicity and is easily discoverable. Also, since each row is between eleven and thirteen keys wide, it permits a wide variety of rhythmic values. Unfortunately, the simple rows-and-columns layout immediately proved awkward for actual performance. I found myself having to contort my fingers uncomfortably to reach desired rhythmic combinations.

Through testing, I discovered that the most ergonomic performance layout would need to gather the sounds into compact clusters rather than stretch them across the entire keyboard. My current design divides the keyboard into “zones” of two columns each. The first pair of columns, (1, Q, A, Z) and (2, W, S, X), plays the kick. The second pair of columns, (3, E, D, C) and (4, R, F, V), plays the snare. The third pair of columns plays hi-hats, and the fourth pair plays claps. The zone layout allows eight rhythmic values per sound: not as many as the rows-and-columns layout, but still plenty of variety.
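
To make the zone structure concrete, here is one way the key-to-sound-and-rhythm mapping could be generated programmatically. This is a sketch under my own naming assumptions, not the app’s actual code:

```ts
// Build the key map from the zone layout: each pair of keyboard columns is one
// sound zone, and every zone repeats the same eight rhythmic values
// (expressed as fractions of a quarter note, per the tables above).
const keyRows = [
  ["1", "2", "3", "4", "5", "6", "7", "8"],
  ["q", "w", "e", "r", "t", "y", "u", "i"],
  ["a", "s", "d", "f", "g", "h", "j", "k"],
  ["z", "x", "c", "v", "b", "n", "m", ","],
];
const zoneSounds = ["kick", "snare", "hi-hat", "clap"];
const zoneRhythms = [
  [1, 1 / 2],      // row 1: quarter, eighth
  [3 / 4, 1 / 4],  // row 2: dotted eighth, sixteenth
  [2 / 3, 1 / 3],  // row 3: quarter-note triplet, eighth-note triplet
  [3 / 16, 1 / 6], // row 4: 3/16, sixteenth-note triplet
];

const zoneKeyMap: Record<string, { sound: string; beats: number }> = {};
for (let row = 0; row < keyRows.length; row++) {
  for (let col = 0; col < keyRows[row].length; col++) {
    zoneKeyMap[keyRows[row][col]] = {
      sound: zoneSounds[Math.floor(col / 2)],
      beats: zoneRhythms[row][col % 2],
    };
  }
}
// zoneKeyMap["3"] -> { sound: "snare", beats: 1 } (quarter note in the snare zone)
```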

The QWERTY keyboard contains ten complete columns, enough for five two-column sound zones. However, I decided to limit my design to four sounds. This is partly for simplicity, and partly to keep the screen from becoming too cluttered. Also, adding a fifth pair of columns to the mobile performance interface would make the touch targets undesirably small. Perhaps a future version could offer an “advanced mode” that includes a fifth sound.

With the basic zone layout established, the next question was how to arrange the rhythmic values within each zone. At first, it seemed obvious to simply list them in decreasing order of size down the left column: quarter note, dotted eighth note, quarter note triplet, eighth note. The list would then continue down the right column: dotted sixteenth note, eighth note triplet, sixteenth note, sixteenth note triplet. This scheme is reasonably clear pedagogically, but, like the rows-and-columns layout, it turns out to make for awkward performances. During playtesting, whenever I landed on a key, my fingers naturally explored its horizontal neighbors, and the decreasing-size layout gave no clear relationship between the pairs of keys in each zone.

In struggling with this problem, I found that organizing rhythm values by size has some of the same shortcomings as organizing pitches by size: proximity gives no cues about relatedness. Consider the piano: C and C-sharp are adjacent keys, but they clash harmonically in most contexts. C is more closely related in harmonic terms to more distant notes like E, G, and, of course, C in other octaves. We usually hear pitches as related when their frequencies form small-number ratios: 2/1 (the octave), 3/2 (the perfect fifth), and 5/4 (the major third). Rhythms work the same way: it is small-number ratios that make two rhythmic values feel related, not closeness in size.

After extensive trial and error, I arrived at the current layout. It seems idiosyncratic on the page, but of all the variations I tested, its ratios between paired rhythm values are the most gratifying for actual performance. I treat the quarter note in the upper left corner of each zone as “home base.” The next key to the right is half the quarter note’s value. From there, moving one key down cuts the note value in half again. I have generally tried to make each pairwise combination of keys play rhythms that are musically complementary in some way. A few of the pairings are asymmetrical, but the lack of symmetry paradoxically makes it easier to find my way around the keyboard intuitively.

While I find the present layout satisfying, it is certainly possible that a better one exists, and I have continued to test alternatives. So far I have assumed that I should optimize for rhythmic variety rather than sonic variety, prioritizing a large selection of rhythmic intervals over a large selection of drum sounds. I have two main motivations for this assumption:

  1. My intended users are young students of color at the Lavelle School in the Bronx, whose musical tastes center on rhythmically sophisticated genres like hip-hop and salsa. My own tastes are similar.
  2. A variety of rhythms creates a pleasing graphical variety as well, inviting users to consider the visual relationship between the fractions in parallel to the auditory relationships.

However, it is possible that rhythmic variety should not be the most important consideration. Conversations with some of my early user testers prompted me to wonder whether it would be better to offer more sounds and fewer rhythmic values. One tester pointed out that users can create all the rhythmic variety they want through playing. This led me to create an alternative layout offering twice as many sounds, at the expense of having only half as many different rhythmic values.

QWERTYBeats 8-sound layout

There is much to recommend the eight-sound layout. It is difficult to create authentic-sounding hip-hop, dance or Afro-Latin beats using only four different drum sounds, but eight sounds is more than enough. Offering twice as many sounds would also dramatically expand the possibilities for melody. The eight sound slots could, for example, be filled with one octave’s worth of a seven-note scale or mode. However, the eight-sound layout has its drawbacks. First, there is the lack of rhythmic variety. Second, the Sampler window would be twice as complex.

QWERTYBeats desktop (8 sounds) - sampler

My fellow NYU doctoral student Brian McFee made an intriguing suggestion: have the top row of keys play eight different sounds, and use the other keys to dynamically control each sound’s retrigger value, perhaps by cycling through a list. This idea is appealing, but it will have to wait for future iterations.

User testing, evaluation, and future work

It is difficult to carry out research in any school setting, and doubly difficult when that school serves a vulnerable population. For that reason, my user tests have so far been carried out with my friends, colleagues, and family. It is easy to simulate blindness: I simply dim the computer’s screen brightness to zero and/or ask my testers to keep their eyes closed. As of this writing, I have had one opportunity to visit the Lavelle School, accompanying a team from Design for the Real World that was testing a recording and looping app. I felt that the time would be best spent observing their test, since their app’s recording functionality overlaps with mine. I also wanted to be free to observe and participate in classroom activities generally.

The music classroom at the Lavelle School is a joyful, chaotic environment, and as befits its location in the Bronx, a hip-hop aesthetic dominates. The older students’ class has a similar feeling to cyphers and jam sessions with the Ed Sullivan Fellows. The room is well equipped, with a PA system, mixer, loop station, desktop computer, guitars, amps, a drum kit, an accordion, and an extensive collection of hand percussion.

The older students spent their entire session in a loose hip-hop jam, with an electronic beat, looped beatboxing, keyboards, singing, rapping, and percussion. The younger students had a more subdued group instrument lesson, with the goal of moving toward a holiday concert next month. Some younger students beatboxed, while others worked on beginner guitar. The Lavelle students’ musical ability and experience ranges from complete beginner to different levels of intermediate. Many of them have prior experience using iOS music apps. This diverse group demands a “low floors, high ceilings” approach to musical experience design.

One reason I chose to observe rather than test on my first visit was simply the noise level in the room. Nothing can be heard over the joyous din unless it goes through the PA system. The other ITP group did their testing with headphones, and that would work for QWERTYBeats too. But headphones are isolating, and would be incompatible with participation in the jam session. In theory, we could put headphones on everyone, but the resulting tangle of cables would not be a good match for a group of blind students. For future testing sessions, I plan to bring the appropriate combination of patch cords and connectors to make sure the QWERTYBeats testers can hear themselves without creating too much of a trip hazard.

Going into this semester, I would have assumed that graphic design would be a secondary concern for an interface intended for blind users. But I quickly learned that the word “blindness” covers a wide range of visual acuity, including several forms of partial and limited vision. Blind and low-vision users naturally appreciate robust assistive and accessibility features, but many of them use graphical interfaces as well, and we need to accommodate their needs. Fortunately, the design guidelines for low-vision users are the same ones that benefit fully sighted users: large type, strong color contrast, uncluttered screen layouts, and clear visual hierarchy make interfaces more attractive and easier for everyone to use.

Based on the BEAT Rockers’ descriptions of the students at Lavelle and my own observations, different age groups will probably benefit from different introductions to QWERTYBeats. Older students will likely enjoy diving right into the app in the jam session context. Younger students will probably benefit from some time experimenting with it alone or at home first. This is probably going to be even more true for the sampling functionality. The older students at Lavelle are more extroverted and more self-confident, and will not be shy about recording and playing back their own voices. Younger students will happily record themselves in private, but might hesitate to do so in front of the group. They will probably prefer to perform with the built-in sounds for a while in class before recording their own. Ultimately, only more user testing will tell.

I do not plan to include any sequencing functionality in QWERTYBeats, or recording beyond one-shot samples, as I have neither the desire nor the ability to create an entire accessible browser-based digital audio workstation. (My Ableton Live prototypes have the advantage of working within an existing DAW that I can use to record testing sessions at will.) If QWERTYBeats is successful enough that students want to use it for recording beats and songs, we can explore ways to integrate it with existing recording, production and notation software. The aQWERTYon is a good model here. While it has its own very rudimentary recording function, it is far more satisfying to use it as a MIDI input tool for programs like Live, GarageBand, or Noteflight. Similarly, the best way to record with QWERTYBeats will probably be to use it as a MIDI controller for other programs.
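
As a purely speculative sketch of what that integration could look like in the browser, the Web MIDI API makes it straightforward to forward a hit to an external program as a note-on/note-off pair; the note number and timing below are placeholders:

```ts
// Send a single drum hit to the first available MIDI output.
// Channel 10 (status bytes 0x99 / 0x89) is the conventional drum channel.
async function sendDrumHit(note = 36, velocity = 100): Promise<void> {
  const access = await navigator.requestMIDIAccess();
  const output = [...access.outputs.values()][0];
  if (!output) return; // no MIDI destinations available
  output.send([0x99, note, velocity]);                   // note on
  output.send([0x89, note, 0], performance.now() + 100); // note off 100 ms later
}
```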

Beyond direct user testing and ethnographic study, QWERTYBeats supports more data-driven evaluation methods. Because it is a web app, it is easy to record user behavior on a keystroke-by-keystroke basis. The Music Experience Design Lab is currently gathering aggregate user data on several of our web apps, and we are still exploring the research and evaluation possibilities of all of this data. The quantitative study of web-based music apps is generally in its infancy. It remains to be seen what new learning opportunities all of this data will present.
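
A sketch of the kind of keystroke-level logging this would involve; the endpoint and payload format are hypothetical:

```ts
// Buffer every key-down and key-up event with a high-resolution timestamp,
// then periodically flush the buffer to a (hypothetical) analytics endpoint.
interface KeyEventRecord {
  key: string;
  type: "keydown" | "keyup";
  timeMs: number;
}

const eventLog: KeyEventRecord[] = [];

for (const type of ["keydown", "keyup"] as const) {
  document.addEventListener(type, (e) => {
    eventLog.push({ key: (e as KeyboardEvent).key, type, timeMs: performance.now() });
  });
}

setInterval(() => {
  if (eventLog.length === 0) return;
  navigator.sendBeacon("/api/usage-log", JSON.stringify(eventLog.splice(0)));
}, 10_000);
```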