Public-facing note-taking on Music Matters by David Elliott and Marissa Silverman for my Philosophy of Music Education class.
This chapter addresses musical meaning and how it emerges out of context. More accurately, it addresses how every musical experience has many meanings that emerge from many contexts. Elliott and Silverman begin with the meanings of performance before moving on to the meanings of composition, listening, and so on. They insist that performance is not an activity limited to an elite cadre of “talented” people, but that it is within reach of anyone who has the proper support.
We propose that people’s capacities for and enactments of an intrinsic motivation to engage in different kinds of musicing and listening are extremely widespread phenomena, restricted only by lack of musical opportunities, or ineffective and indifferent music teaching. Indeed, developing a love for and devotion to musicing and listening is not unusual when students are fortunate enough to learn from musically and educationally excellent teachers and [community music] facilitators, and when they encounter inspiring models of musicing in contexts of welcoming, sustaining, and educative musical settings, including home and community contexts (240).
Update: I’ve turned this post into an academic article. Here’s a draft.
The title of this post is also the title of a tutorial I’m giving at ISMIR 2016 with Jan Van Balen and Dan Brown. Here are the slides:
The conference is organized by the International Society for Music Information Retrieval, and it’s the fanciest of its kind. You may well be wondering what Music Information Retrieval is. MIR is a specialized field in computer science devoted to teaching computers to understand music, so they can transcribe it, organize it, find connections and similarities, and, maybe, eventually, create it.
So why are we going to talk to the MIR community about hip-hop? So far, the field has mostly studied music using the tools of Western classical music theory, which emphasizes melody and harmony. Hip-hop songs don’t tend to have much going on in either of those areas, which makes the genre seem like it’s either too difficult to study, or just too boring. But the MIR community needs to find ways to engage with this music, if for no other reason than the fact that hip-hop is the most listened-to genre in the world, at least among Spotify listeners.
Hip-hop has been getting plenty of scholarly attention lately, but most of it has been coming from cultural studies. Which is fine! Hip-hop is culturally interesting. When humanities people do engage with hip-hop as an art form, they tend to focus entirely on the lyrics, treating them as a subgenre of African-American literature that just happens to be performed over beats. And again, that’s cool! Hip-hop lyrics have significant literary interest. (If you’re interested in the lyrical side, we recommend this video analyzing the rhyming techniques of several iconic emcees.) But what we want to discuss is why hip-hop is musically interesting, a subject which academics have given approximately zero attention to.
This summer, I’m teaching Cultural Significance of Rap and Rock at Montclair State University. It’s my first time teaching it, and it’s also the first time anyone has taught it completely online. The course is cross-listed under music and African-American studies. Here’s a draft of my syllabus, omitting details of the grading and such. I welcome your questions, comments and criticism.
I’m currently working with the Ed Sullivan Fellows program, an initiative of the NYU MusEDLab where we mentor up-and-coming rappers and producers. Many of them are working with beats they got from YouTube or SoundCloud. That’s fine for working out ideas, but to get to the next level, the Fellows need to be making their own beats. Partially this is for intellectual property reasons, and partially it’s because the quality of mp3s you get from YouTube is not so good. Here’s a collection of resources and ideas I collected for them, and that you might find useful too.
Let’s just get Vanilla Ice out of the way first. White people and hip-hop, oy.
“Under Pressure” by Queen and David Bowie is a testament to the power of a great bass groove. The song itself is pretty weak sauce; it emerged out of studio jam sessions, and it doesn’t sound like it was ever really finished. But what a bass groove!
I’m working with Soundfly on the next installment of Theory For Producers, our ultra-futuristic online music theory course. The first unit covered the black keys of the piano and the pentatonic scales. The next one will talk about the white keys and the diatonic modes. We were gathering examples, and we needed to find a well-known pop song that uses Lydian mode. My usual go-to example for Lydian is “Possibly Maybe” by Björk. But the course already uses a Björk tune for a different example, and the Soundfly guys quite reasonably wanted something a little more millennial-friendly anyway. We decided to use Katy Perry’s “Teenage Dream” instead.
A couple of years ago, Slate ran an analysis of this tune by Owen Pallett. It’s an okay explanation, but it doesn’t delve too deep. We thought we could do better.
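As a quick refresher on what Lydian actually is in pitch terms: it’s the major scale with a raised fourth degree, built from the whole/half-step pattern W W W H W W H. A few lines of Python (my own illustrative sketch, not anything from the Slate piece or the Soundfly course) make the interval pattern concrete:

```python
# Lydian mode: like major, but with a raised 4th (W W W H W W H).
LYDIAN_STEPS = [2, 2, 2, 1, 2, 2, 1]  # semitones between scale degrees
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def lydian_scale(root: int) -> list[str]:
    """Return the note names of the Lydian mode starting on `root` (0 = C)."""
    pitches = [root]
    for step in LYDIAN_STEPS[:-1]:  # last step just returns to the octave
        pitches.append((pitches[-1] + step) % 12)
    return [NOTE_NAMES[p] for p in pitches]

print(lydian_scale(0))  # C Lydian: C D E F# G A B
```

The raised fourth (F# against a C root) is the note that gives Lydian its floaty, dreamlike character.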
I’m delighted to announce the launch of a new interactive online music course called Theory for Producers. It’s a joint effort by Soundfly and the NYU MusEDLab, representing the culmination of several years worth of design and programming. We’re super proud of it.
The course makes the abstractions of music theory concrete by presenting them in the form of actual songs you’re likely to already know. You can play and improvise along with the examples right in the web browser using the aQWERTYon, which turns your computer keyboard into an easily playable instrument. You can also bring the examples into programs like Ableton Live or Logic for further hands-on experimentation. We’ve spiced up the content with videos and animations, along with some entertaining digressions into the Stone Age and the auditory processing abilities of frogs.
If you’ve ever wondered what it is that a music producer does exactly, David Bowie’s “Space Oddity” is a crystal clear example. To put it in a nutshell, a producer turns this:
Here’s what happened in my life as an educator this past semester, and what I have planned for the coming semester.
Montclair State University Intro To Music Technology
I wonder how much longer “music technology” is going to exist as a subject. They don’t teach “piano technology” or “violin technology.” It makes sense to teach specific areas like audio recording or synthesis or signal theory as separate classes. But “music technology” is such a broad term as to be meaningless. The unspoken assumption is that we’re teaching “musical practices involving a computer,” but even that is both too big and too small to structure a one-semester class around. On the one hand, every kind of music involves computers now. On the other hand, to focus just on the computer part is like teaching a word processing class that’s somehow separate from learning how to write.
The newness and vagueness of the field of study gives me and my fellow music tech educators wide latitude to define our subject matter. I see my job as providing an introduction to pop production and songwriting. The tools we use for the job at Montclair are mostly GarageBand and Logic, but I don’t spend a lot of time on the mechanics of the software itself. Instead, I teach music: How do you express yourself creatively using sample libraries, or MIDI, or field recordings, or pre-existing songs? What kinds of rhythms, harmonies, timbres and structures make sense aesthetically when you’re assembling these materials in the DAW? Where do you get ideas? How do you listen to recorded music analytically? Why does Thriller sound so much better than any other album recorded in the eighties? We cover technical concepts as they arise in the natural course of producing and listening. My hope is that they’ll be more relevant and memorable that way.