The fake and the real in Chance the Rapper’s “All We Got”

Every semester in Intro to Music Tech, we have Kanye West Day, when we listen analytically to some of Ye’s most sonically adventurous tracks (there are many to choose from). For the past few semesters, Kanye West Day has centered on “Ultralight Beam,” especially Chance the Rapper’s devastating verse. That has naturally led to a look at Chance’s “All We Got.”

All the themes of the class are here: the creative process in the studio, “fake” versus “real” sounds, structure versus improvisation, predictability versus surprise, and the way that soundscape and groove do much more expressive work than melody or harmony.


The Vocoder, Auto-Tune, Pitch Standardization and Vocal Virtuosity

A writing assignment for my History of Science and Technology class with Myles Jackson. See a more informal introduction to the vocoder here.

Casual music listeners know the vocoder best as the robotic voice effect popular in disco and early hip-hop. Anyone who has heard pop music of the last two decades has heard Auto-Tune. The two effects are frequently mistaken for one another, and for good reason—they share the same mathematical and technological basis. Auto-Tune has become ubiquitous in recording studios, in two very different incarnations. There is its intended use, as an expedient way to correct out-of-tune notes, replacing various tedious and labor-intensive manual methods. Pop, hip-hop and electronic dance music producers have also found an unintended use for Auto-Tune, as a special effect that quantizes pitches to a conspicuously excessive degree, giving the voice a synthetic, otherworldly quality. In this paper, I discuss the history of the vocoder and Auto-Tune, in the context of broader efforts to use science and technology to mathematically analyze and standardize music. I also explore how such technologies problematize our ideas of virtuosity.
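To make that “conspicuously excessive” quantization concrete, here is a minimal Python sketch of the core move (an illustration of the general idea, not Antares’s actual algorithm, which also involves real-time pitch detection and smoothing). A detected frequency is converted to a fractional MIDI note number, pulled toward the nearest equal-tempered semitone, and converted back to Hz. The strength parameter is my own hypothetical knob: at 1.0 you get the hard snap of the exaggerated effect, and lower values approximate transparent correction.

```python
import numpy as np

def quantize_pitch(freq_hz, strength=1.0):
    """Snap a detected frequency toward the nearest equal-tempered semitone.

    strength=1.0 is the hard, exaggerated snap; smaller values
    correct more gently, closer to 'transparent' tuning.
    """
    # Convert frequency to a fractional MIDI note number (A440 reference).
    midi = 69 + 12 * np.log2(freq_hz / 440.0)
    target = np.round(midi)                       # nearest semitone
    corrected = midi + strength * (target - midi)
    # Convert the corrected note number back to Hz.
    return 440.0 * 2 ** ((corrected - 69) / 12)

# A slightly flat A4 (435 Hz), pulled fully onto pitch and half-corrected:
print(quantize_pitch(435.0))         # ~440.0
print(quantize_pitch(435.0, 0.5))    # ~437.5
```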

[Image: the vocoder in Ableton Live]


The vocoder and Auto-Tune

This post documents a presentation I’m giving in my History of Science and Technology class with Myles Jackson. See also a more formal history of the vocoder.

The vocoder is one of those mysterious technologies that’s far more widely used than understood. Here I explain what it is, how it works, and why you should care. A casual music listener knows the vocoder best as a way to make that robot voice effect that Daft Punk uses all the time.

Here’s Huston Singletary demonstrating the vocoder in Ableton Live.

This is a nifty effect, but why should you care? For one thing, you use this technology every time you talk on your cell phone. For another, this effect gave rise to Auto-Tune, which, love it or hate it, is the defining sound of contemporary popular music. Let’s dive in!
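For the curious, here is a rough Python sketch of a classic channel vocoder, assuming NumPy and SciPy (this illustrates the general technique, not how Ableton’s device is implemented, and the band count and filter settings are arbitrary choices of mine). Both the modulator (the voice) and the carrier (a synth) are split into matching frequency bands; an envelope follower tracks the loudness of each modulator band, and those envelopes shape the corresponding carrier bands.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def channel_vocoder(modulator, carrier, sr, n_bands=16):
    """Impose the modulator's spectral envelope onto the carrier."""
    n = min(len(modulator), len(carrier))
    modulator, carrier = modulator[:n], carrier[:n]
    # Logarithmically spaced band edges covering the vocal range.
    edges = np.geomspace(80.0, min(8000.0, 0.45 * sr), n_bands + 1)
    env_sos = butter(2, 50.0, btype="lowpass", fs=sr, output="sos")
    out = np.zeros(n)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
        mod_band = sosfilt(sos, modulator)
        car_band = sosfilt(sos, carrier)
        # Envelope follower: rectify, then low-pass to track loudness.
        envelope = sosfilt(env_sos, np.abs(mod_band))
        out += car_band * envelope
    return out / np.max(np.abs(out))  # normalize to +/-1

# Toy demo: a sawtooth carrier shaped by pulsing noise that stands in
# for speech syllables.
sr = 22050
t = np.arange(2 * sr) / sr
carrier = 2 * (110.0 * t % 1.0) - 1.0                 # 110 Hz sawtooth
voice = np.random.randn(len(t)) * np.sin(2 * np.pi * 3 * t) ** 2
robot = channel_vocoder(voice, carrier, sr)
```

The cell phone connection comes from the same insight: speech codecs in the CELP family transmit a compact description of the vocal tract’s spectral envelope rather than the raw waveform.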


My music technology syllabus

I use variations on this project list for all of my courses. In Advanced Digital Audio Production at Montclair State University, students do all of these assignments. Students in Music Technology 101 do all of them except the ones marked Advanced. My syllabus for the NYU Music Education Technology Practicum has an additional recording studio project in place of the final project. Here’s the project list in Google Spreadsheet format.


I talk very little about microphone technology or technique in my classes, because that information is only useful in the context of actual recording studio work, and my classes do not have regular access to a studio. I do spend one class period on home recording with the SM58 and SM57, and talk a bit about mic technique for singers. I encourage students who want to go deeper into audio recording to take a class devoted specifically to that subject, or to read something like the Moylan book.

My project-based approach is strongly informed by Matt McLean and Alex Ruthmann. Read more about their methods here.

I do not require a textbook. However, for education majors, I strongly recommend Teaching Music Through Composition by Barbara Freedman and Music Technology and Education: Amplifying Musicality by Andrew Brown.


The aQWERTYon pitch wheels and the future of music theory visualization

The MusEDLab will soon be launching a revamped version of the aQWERTYon with some enhancements to its visual design, including a new scale picker. Beyond our desire to make our stuff look cooler, the scale picker represents a challenge that we’ve struggled with since the earliest days of aQW development. On the one hand, we want to offer users a wide variety of intriguing and exotic scales to play with. On the other hand, our audience of beginner and intermediate musicians is likely to be horrified by a list of terms like “Lydian dominant mode.” I recently had the idea to represent all the scales as colorful icons, like so:

Read more about the rationale and process behind this change here. In this post, I’ll explain what the icons mean, and how they can someday become the basis for a set of new interactive music theory visualizations.
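To give a flavor of what those icons encode, here is a small Python sketch of my own (purely illustrative; the aQWERTYon is a web app, and this is not its actual code). Every scale boils down to a set of pitch classes on the twelve-tone chromatic circle, which is exactly the information a pitch-wheel icon has to display. The interval patterns below are the standard ones, and the text “wheel” stands in for the colorful graphic.

```python
# Each scale is defined by its step pattern in semitones; walking the
# pattern from a root traces out the scale's pitch classes mod 12.
SCALE_STEPS = {
    "major":           [2, 2, 1, 2, 2, 2, 1],
    "natural minor":   [2, 1, 2, 2, 1, 2, 2],
    "lydian dominant": [2, 2, 2, 1, 2, 1, 2],
}

def pitch_classes(root, steps):
    pcs, pc = [], root % 12
    for step in steps:
        pcs.append(pc)
        pc = (pc + step) % 12
    return pcs

def wheel(root, steps):
    """A text 'pitch wheel': mark each of the 12 chromatic slots in or out."""
    pcs = set(pitch_classes(root, steps))
    return "".join("#" if i in pcs else "." for i in range(12))

for name, steps in SCALE_STEPS.items():
    print(f"{name:16} {wheel(0, steps)}")   # all built on C (pitch class 0)
# e.g. major        #.#.##.#.#.#
```

The appeal of this representation is that a beginner never has to parse the phrase “Lydian dominant”: two icons with similar dot patterns simply look, and sound, like close cousins.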


My first Musicto playlist

I have started working with a startup called Musicto, which creates playlists curated by humans around particular themes. For example: music to grieve to, music to clean house to, music to fight evil. My first playlist is music to sing your hipster baby to sleep.


These are songs I have been singing to my kids, and that I recommend you sing to yours. It isn’t just a playlist, though. Each track is accompanied by a short blog post explaining what’s so special about it. New tracks will be added regularly in the coming weeks. If you’d like, you can follow the playlist on Twitter. If this sounds like the kind of thing you might enjoy putting together, the company is seeking more curators.

Noteflight as a DAW

The good people at Noteflight have started doing weekly challenges. I love constraint-based music prompts, like the ones in the Disquiet Junto, so I thought I would try this one: compose a piece of music using only four notes.

[Image: the Noteflight weekly challenge]

The music side of this wasn’t hard. My material tends not to use that many pitches anyway. If you really want to challenge me, tell me I can’t use any rhythmic subdivisions finer than a quarter note.

Before you listen to my piece, though, let’s talk about this word, “compose.” When you write using notation, the presumption is that you’re creating a set of instructions for a human performer. However, actually getting your composition performed is a challenge unless you have a band or ensemble at your disposal. I work in two music schools, and I would have a hard time making it happen. (When I have had my music performed, the musicians either used a prose score, learned the piece by ear from a recording, or just improvised.) Noteflight’s target audience of kids in school is vanishingly unlikely to ever hear their work performed, or at least performed well. Matt McLean formed the Young Composers and Improvisers Workshop to address this problem, and he’s doing amazing work, but most Noteflight compositions will only ever exist inside the computer.

Given this fact, I wanted to create a piece of music that would actually sound good when played back within Noteflight. This constraint turned out to be a significantly greater challenge than using only four notes. I started with the Recycled Percussion instrument, and chose the notes B, E, F, and G, because they produce the coolest sounds. Then I layered in other sounds, chosen because they play back reasonably well. Here’s what I came up with: