I’m working on a paper about music education and hip-hop, and I’m going to use this post to work out some thoughts.
My wife and I spent our rare date night going to see Black Panther at BAM. It was uplifting. Many (most?) black audience members came dressed in full Afrofuturistic splendor. A group of women in our section were especially decked out.
I was admiring their outfits and talking about how I wasn’t expecting such an emotional response to the movie. One of the women said it was as big a deal for them as the election of Barack Obama in 2008. I know representation is important, but this seems like it’s more than just seeing black faces on the movie screen. Black Twitter is talking about how this movie is different because it isn’t about overcoming historical pain or present-day hardship; it’s about showing black people as powerful, rich, technologically advanced, and above all, serenely confident.
Black Panther is heavily overdetermined, like all superhero movies. But I’m especially interested in the way we could read it as a metaphor for music, with the Wakandans representing African musical traditions and Erik Killmonger representing the global rise of hip-hop. I see Killmonger this way not only because he’s American, but because so many of his qualities and mannerisms remind me of the role of hip-hop in the public imagination. He’s stylish, effortlessly charismatic, and seemingly indifferent to anyone else’s approval. He’s funny, too, not in the warm and good-natured way that Shuri is, but in a more aggressive and sarcastic way. He’s both arrogant and vulnerable, using implacable cool to conceal deep hurt. And he wants to remake the world by fomenting black revolution, by any means necessary. The Wakandans, meanwhile, are uncomplicatedly strong, self-possessed, and at ease with their own power. But they are also withdrawn from the world, fearing that getting involved in other people’s struggles will destroy what makes their culture so unique and beautiful.
You can put all recorded music techniques and gestures into three categories: realist, hyperrealist, and surrealist. These categories have soft boundaries that broadly overlap. Nevertheless, I find them to be a useful way to organize my thinking about the aesthetics of recordings.
Every semester in Intro to Music Tech, we have Kanye West Day, when we listen analytically to some of Ye’s most sonically adventurous tracks (there are many to choose from). The past few semesters, Kanye West Day has centered on “Ultralight Beam,” especially Chance The Rapper’s devastating verse. That has naturally led to a look at Chance’s “All We Got.”
All the themes of the class are here: the creative process in the studio, “fake” versus “real” sounds, structure versus improvisation, predictability versus surprise, and the way that soundscape and groove do much more expressive work than melody or harmony.
Writing assignment for History of Science and Technology class with Myles Jackson. See a more informal introduction to the vocoder here.
Casual music listeners know the vocoder best as the robotic voice effect popular in disco and early hip-hop. Anyone who has heard pop music of the last two decades has heard Auto-Tune. The two effects are frequently mistaken for one another, and for good reason—they share the same mathematical and technological basis. Auto-Tune has become ubiquitous in recording studios, in two very different incarnations. There is its intended use, as an expedient way to correct out-of-tune notes, replacing various tedious and labor-intensive manual methods. Pop, hip-hop and electronic dance music producers have also found an unintended use for Auto-Tune, as a special effect that quantizes pitches to a conspicuously excessive degree, giving the voice a synthetic, otherworldly quality. In this paper, I discuss the history of the vocoder and Auto-Tune, in the context of broader efforts to use science and technology to mathematically analyze and standardize music. I also explore how such technologies problematize our ideas of virtuosity.
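The core of that “conspicuously excessive” pitch quantization is easy to illustrate. Here’s a minimal Python sketch of the idea: convert a frequency to its (fractional) MIDI note number, round to the nearest semitone, and convert back. To be clear, this is just the quantization principle, not Antares’ actual algorithm; real Auto-Tune also has to detect the pitch of a complex vocal signal and resynthesize it smoothly over time, and the retune-speed setting controls how abruptly the snap happens.

```python
import math

def snap_to_semitone(freq_hz):
    """Snap a frequency to the nearest 12-tone equal-tempered pitch.

    Illustrative only: this is the quantization step at the heart of
    pitch correction, not the full Auto-Tune pitch-tracking pipeline.
    """
    midi = 69 + 12 * math.log2(freq_hz / 440.0)  # frequency -> MIDI note number
    nearest = round(midi)                        # quantize to the nearest semitone
    return 440.0 * 2 ** ((nearest - 69) / 12)    # MIDI note number -> frequency

# A slightly sharp A4 (450 Hz) snaps back to 440 Hz:
snap_to_semitone(450.0)  # -> 440.0
```

When the correction is applied gently, listeners hear an in-tune singer; when it is applied instantly and to every note, the pitch jumps between semitones with none of the continuous sliding of a natural voice, which is exactly the synthetic, otherworldly effect described above.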
This post documents a presentation I’m giving in my History of Science and Technology class with Myles Jackson. See also a more formal history of the vocoder.
The vocoder is one of those mysterious technologies that’s far more widely used than understood. Here I explain what it is, how it works, and why you should care.
Casual music listeners know the vocoder best as a way to make the robot voice effect that Daft Punk uses all the time.
Here’s Huston Singletary demonstrating the vocoder in Ableton Live.
You may be surprised to learn that you use a vocoder every time you talk on your cell phone. Also, the vocoder gave rise to Auto-Tune, which, love it or hate it, is the defining sound of contemporary popular music. Let’s dive in!
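The basic mechanism is a filter bank: the vocoder measures how the energy of the modulator (usually a voice) is distributed across frequency bands, then imposes that band-by-band envelope on a carrier (usually a synth). Here’s a toy numpy sketch of that idea for a single analysis frame. This is my own simplified illustration, not how any real vocoder is implemented: a real one uses a bank of band-pass filters and envelope followers running continuously, repeating this measurement many times per second so that the carrier “talks.”

```python
import numpy as np

def toy_vocoder_frame(modulator, carrier, n_bands=16):
    """Impose the modulator's spectral envelope on the carrier.

    A single-frame toy illustration of the channel-vocoder idea:
    split the spectrum into bands, measure the modulator's energy
    in each band, and rescale the carrier's bands to match.
    """
    n = min(len(modulator), len(carrier))
    M = np.fft.rfft(modulator[:n])               # modulator spectrum
    C = np.fft.rfft(carrier[:n])                 # carrier spectrum
    edges = np.linspace(0, len(M), n_bands + 1, dtype=int)
    out = np.zeros_like(C)
    for lo, hi in zip(edges[:-1], edges[1:]):
        # RMS energy of each band (small constant avoids divide-by-zero)
        mod_energy = np.sqrt(np.mean(np.abs(M[lo:hi]) ** 2) + 1e-12)
        car_energy = np.sqrt(np.mean(np.abs(C[lo:hi]) ** 2) + 1e-12)
        out[lo:hi] = C[lo:hi] * (mod_energy / car_energy)
    return np.fft.irfft(out, n)
```

Feed it a spoken vowel as the modulator and a buzzy sawtooth or square wave as the carrier, and the output takes on the vowel’s coloration while keeping the carrier’s pitch, which is why the result sounds like a talking robot.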
Note: I refer to mentors by their real names, and to participants by pseudonyms.
Ed Sullivan Fellows (ESF) is a mentorship and artist development program run by the NYU Steinhardt Music Experience Design Lab. It came about by a combination of happenstances. I had a private music production student named Rob Precht, who had found my blog via a Google search. He and I usually held our lessons in the lab’s office space. Over the course of a few months, Rob met people from the lab and heard about our projects. He found us sufficiently inspiring that he approached us with an idea. He wanted to give us a grant to start a program that would help young people from under-resourced communities get a start in the music industry. He asked us to name it after his grandfather, Ed Sullivan, whose show had been crucial to launching the careers of Elvis, the Beatles, and the Jackson 5. While Rob’s initial idea had been to work with refugees who had relocated to New York, we agreed to shift the focus to native New York City residents, since our connections and competencies were stronger there.
I’m delighted to announce that my new online music theory collaboration with Soundfly is live. It’s called Unlocking the Emotional Power of Chords, and it gives you a practical guide to harmony for creators of contemporary pop, R&B, hip-hop, and EDM. We tie all the abstract music theory concepts to real-world musical usages, showing how you can use particular chord combinations to evoke particular feelings. I worked hard with the team at Soundfly on this over the past few months, and we are super jazzed about it.
Like my previous Soundfly courses, the Theory for Producers series, the chords class is a blend of videos, online interactives and composition/production challenges. The musical examples are songs by people like Adele, Chance the Rapper, and Frank Ocean. You can download the MIDI files for each example, stick them in your DAW, and dive right into hands-on music making.
One of the best guest verses in the history of hip-hop is the one that Chance The Rapper does on Kanye West’s beautiful “Ultralight Beam.”
The song is built around an eight-bar loop. (See this post for an analysis of the chord progression.) Chance’s verse goes through the loop five times, for a total of forty bars. It’s not at all typical for a rap song to include a one-and-a-half-minute guest verse–it’s almost enough material to make a whole separate song. By giving up so much space in his album opener, Kanye is giving Chance the strongest endorsement possible, and Chance makes the most of his moment.