The fake and the real in Chance the Rapper’s “All We Got”

Every semester in Intro to Music Tech, we have Kanye West Day, when we listen analytically to some of Ye’s most sonically adventurous tracks (there are many to choose from). The past few semesters, Kanye West Day has centered on “Ultralight Beam,” especially Chance the Rapper’s devastating verse. That has naturally led to a look at Chance’s “All We Got.”

All the themes of the class are here: the creative process in the studio, “fake” versus “real” sounds, structure versus improvisation, predictability versus surprise, and the way that soundscape and groove do much more expressive work than melody or harmony.


The Vocoder, Auto-Tune, Pitch Standardization and Vocal Virtuosity

Writing assignment for History of Science and Technology class with Myles Jackson. See a more informal introduction to the vocoder here.

Casual music listeners know the vocoder best as the robotic voice effect popular in disco and early hip-hop. Anyone who has heard pop music of the last two decades has heard Auto-Tune. The two effects are frequently mistaken for one another, and for good reason—they share the same mathematical and technological basis. Auto-Tune has become ubiquitous in recording studios, in two very different incarnations. There is its intended use, as an expedient way to correct out-of-tune notes, replacing various tedious and labor-intensive manual methods. Pop, hip-hop and electronic dance music producers have also found an unintended use for Auto-Tune, as a special effect that quantizes pitches to a conspicuously excessive degree, giving the voice a synthetic, otherworldly quality. In this paper, I discuss the history of the vocoder and Auto-Tune, in the context of broader efforts to use science and technology to mathematically analyze and standardize music. I also explore how such technologies problematize our ideas of virtuosity.
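The pitch quantization at the heart of the effect can be sketched in a few lines. This is only an illustration of the principle, not Auto-Tune's actual algorithm, which also involves pitch detection, scale selection, and an adjustable retune speed; the function name and the A4 = 440 Hz reference are my assumptions.

```python
import math

A4 = 440.0  # reference tuning; Auto-Tune lets you change this

def quantize_pitch(freq_hz):
    """Snap a detected frequency to the nearest 12-tone equal-tempered
    pitch. Hard snapping like this (instant "retune speed") is what
    produces the conspicuous robotic effect; gentle correction glides
    toward the target pitch instead."""
    semitones = 12 * math.log2(freq_hz / A4)   # distance from A4 in semitones
    return A4 * 2 ** (round(semitones) / 12)   # round, then convert back to Hz
```

A voice a few cents flat of A440 snaps back to exactly 440 Hz; a voice halfway between two semitones jumps audibly to one of them, which is the "excessive" quantization described above.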

Ableton vocoder


The vocoder and Auto-Tune

This post documents a presentation I’m giving in my History of Science and Technology class with Myles Jackson. See also a more formal history of the vocoder.

The vocoder is one of those mysterious technologies that’s far more widely used than understood. Here I explain what it is, how it works, and why you should care.

Casual music listeners know the vocoder best as a way to make the robot voice effect that Daft Punk uses all the time.

Here’s Huston Singletary demonstrating the vocoder in Ableton Live.

You may be surprised to learn that you use a vocoder every time you talk on your cell phone. Also, the vocoder gave rise to Auto-Tune, which, love it or hate it, is the defining sound of contemporary popular music. Let’s dive in!
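As a taste of what follows, the channel vocoder idea can be sketched in code: measure the modulator's (the voice's) energy in each frequency band, then impose those band envelopes on a carrier (a synth, or noise). This is a simplified FFT-based toy assuming NumPy; real vocoders, including Ableton's device and the speech codecs in phones, are engineered quite differently.

```python
import numpy as np

def channel_vocoder(modulator, carrier, n_bands=16, frame=512):
    """Toy channel vocoder: impose the modulator's per-band energy
    envelope onto the carrier, frame by frame (no overlap, for clarity)."""
    n = min(len(modulator), len(carrier)) // frame * frame
    out = np.zeros(n)
    # log-spaced band edges over the FFT bins, like analog filterbanks
    edges = np.unique(np.geomspace(1, frame // 2, n_bands + 1).astype(int))
    for start in range(0, n, frame):
        M = np.fft.rfft(modulator[start:start + frame])
        C = np.fft.rfft(carrier[start:start + frame])
        Y = np.zeros_like(C)
        for lo, hi in zip(edges[:-1], edges[1:]):
            m_env = np.sqrt(np.mean(np.abs(M[lo:hi]) ** 2))          # band envelope
            c_energy = np.sqrt(np.mean(np.abs(C[lo:hi]) ** 2)) + 1e-12
            Y[lo:hi] = C[lo:hi] * (m_env / c_energy)  # rescale carrier band
        out[start:start + frame] = np.fft.irfft(Y, frame)
    return out
```

When the modulator is silent, every band envelope is zero and the output is silent too, which is why a vocoder "speaks" only when you do.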


My music technology syllabus

I use variations on this project list for all of my courses. In Advanced Digital Audio Production at Montclair State University, students do all of these assignments. Students in Music Technology 101 do all of them except the ones marked Advanced. My syllabus for the NYU Music Education Technology Practicum has an additional recording studio project in place of the final project. Here’s the project list in Google Spreadsheet format.

Music Ed Tech Practicum image

I talk very little about microphone technology or technique in my classes, because that information is only useful in the context of actual recording studio work, and my classes do not have regular access to a studio. I do spend one class period on home recording with the SM58 and SM57, and talk a bit about mic technique for singers. I encourage students who want to go deeper into audio recording to take a class specifically on that subject, or to read something like the Moylan book.

My project-based approach is informed strongly by Matt McLean and Alex Ruthmann. Read more about their methods here.

I do not require any text. However, for education majors, I strongly recommend Teaching Music Through Composition by Barbara Freedman and Music Technology and Education: Amplifying Musicality by Andrew Brown.


The aQWERTYon pitch wheels and the future of music theory visualization

Try a very early alpha of the scale wheel visualization here

The MusEDLab will soon be launching a revamped version of the aQWERTYon with some enhancements to its visual design, including a new scale picker. Beyond our desire to make our stuff look cooler, the scale picker represents a challenge that we’ve struggled with since the earliest days of aQW development. On the one hand, we want to offer users a wide variety of intriguing and exotic scales to play with. On the other hand, our audience of beginner and intermediate musicians is likely to be horrified by a list of terms like “Lydian dominant mode.” I recently had the idea to represent all the scales as colorful icons, like so:

Read more about the rationale and process behind this change here. In this post, I’ll explain what the icons mean, and how they can someday become the basis for a set of new interactive music theory visualizations.
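To make the wheel idea concrete: a scale icon is essentially a pitch-class set drawn at clock positions around a circle, 30 degrees per semitone. Here is a minimal sketch of that geometry; the scale dictionary and function names are illustrative, not the aQWERTYon's actual code.

```python
import math

SCALES = {  # illustrative subset; the sets are standard music theory
    "major": {0, 2, 4, 5, 7, 9, 11},
    "lydian dominant": {0, 2, 4, 6, 7, 9, 10},
}

def wheel_points(scale_name, radius=1.0):
    """(x, y) position for each pitch class in the scale, with pitch
    class 0 at twelve o'clock and semitones running clockwise."""
    pts = []
    for pc in sorted(SCALES[scale_name]):
        theta = math.radians(90 - 30 * pc)  # 30 degrees per semitone, clockwise
        pts.append((radius * math.cos(theta), radius * math.sin(theta)))
    return pts
```

Drawing the seven points of the major scale and the seven of Lydian dominant makes their one-note difference (the raised fourth) visible at a glance, which is the whole point of replacing scary mode names with icons.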


Designing a more welcoming aQWERTYon experience

This post documents my final project for User Experience Design with June Ahn

The best aQWERTYon screencap

Overview of the problem

The aQWERTYon is a web-based music performance and theory learning interface designed by the NYU Music Experience Design Lab. The name is a play on “QWERTY accordion.” The aQWERTYon invites novices to improvise and compose using a variety of scales and chords normally available only to advanced musicians. Notes map onto the computer keyboard such that the rows play scales and the columns play chords. The user cannot play any wrong notes, which encourages free and playful exploration. The aQWERTYon has a variety of instrument sounds to choose from, and it can also act as a standard MIDI controller for digital audio workstations (DAWs) like GarageBand, Logic, and Ableton Live. As of this writing, there have been 32,000 aQWERTYon sessions.
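One plausible reading of the row/column mapping can be sketched as follows. This is a simplified illustration, not the aQWERTYon's actual key layout: I assume each step along a row advances one scale degree, and each step down a column adds a scale-third (two degrees), so that a column stacks up a seventh chord.

```python
MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale

def note(row, col, root=60, scale=MAJOR):
    """MIDI note for the key at (row, col). Moving along a row advances
    one scale degree; moving down a column adds a third (two degrees),
    wrapping into the next octave as needed."""
    degree = col + 2 * row
    octave, step = divmod(degree, len(scale))
    return root + 12 * octave + scale[step]

# one column forms a stacked-thirds chord: C-E-G-B from column 0 of C major
chord = [note(row, 0) for row in range(4)]
```

Because every key is drawn from the chosen scale, any combination of keys is consonant within it, which is how the interface guarantees there are no wrong notes.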


The Groove Pizza now exports MIDI

Since its launch, you’ve been able to export your Groove Pizza beats as WAV files, or continue working on them in Soundtrap. Now, thanks to MusEDLab developer Jordana Bombi, you can save your beats as MIDI files as well.

Groove Pizza MIDI export

You can bring these MIDI files into your music production software tool of choice: Ableton Live, Logic, Pro Tools, whatever. How cool is that?

There are a few limitations at the moment. Your beats will be rendered in 4/4 time, regardless of how many slices your pizza has; you can always set the right time signature after you bring the MIDI into your production software. Also, your grooves will export with no swing, so you’ll need to reinstate that in your software as well.
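If your software exposes raw MIDI ticks, reinstating swing amounts to delaying every offbeat eighth note. Here is a minimal sketch of that arithmetic, assuming 480 ticks per quarter note (real MIDI files declare their own resolution in the file header) and a hypothetical `swing_tick` helper of my own naming:

```python
TPQ = 480           # ticks per quarter note (an assumed resolution)
EIGHTH = TPQ // 2   # 240 ticks

def swing_tick(tick, amount=0.5):
    """Shift offbeat eighth notes later. amount=0.0 is straight time;
    amount=1.0 moves the offbeat from halfway (240 ticks) to two-thirds
    of the beat (320 ticks), i.e. full triplet swing."""
    beat, pos = divmod(tick, TPQ)             # position within the quarter note
    if pos == EIGHTH:                         # exactly on the offbeat eighth
        pos = EIGHTH + int(amount * EIGHTH / 3)
    return beat * TPQ + pos
```

Most DAWs do this for you with a swing or groove-quantize control, so in practice you would just dial that in rather than edit ticks by hand.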

We have some more enhancements in the pipeline, aside from fixing the limitations just mentioned. We’re working on a “continue in Noteflight” feature, real-time MIDI input and output, and live performance using the QWERTY keyboard. I’ll keep you posted.

Measurement in games for learning research

Note-taking for Research on Games and Simulations with Jan Plass

Flow

Kiili, K., & Lainema, T. (2008). Foundation for Measuring Engagement in Educational Games. Journal of Interactive Learning Research, 19(3), 469–488.

The authors’ purpose here is to assess flow in educational games, to “operationalize the dimensions of the flow experience.” A flow state involves deep concentration, time distortion, autotelic (self-motivating) experience, a loss of self-consciousness, and a sense of control.
