A draft of my final paper for Philosophy of Music Education with David Elliott – thoughts welcome as I revise it.
Our world is saturated with recorded music. It is effortlessly accessible, and, at times, inescapable. This environment poses new challenges to anyone who aspires to create or perform music. When we come face to face with the ocean of recordings, it is natural to feel helpless. Does recorded music thus inevitably limit most people to passive appreciation? Or can recordings themselves become the impetus for new kinds of active participation and expression? And if so, how do we balance the right of copyright holders to control the use of their work with our right to make new creative use of that work?
In this paper, I use a framework developed by Turino (2008, 2016) to distinguish between “presentational” and “participatory” music. I inquire into the nature of musical participation, and what (if anything) distinguishes interpretation from creation. I then give an overview of sampling as an artistic practice, paying particular attention to the challenges to this practice posed by copyright law and the status of recorded music as a commercial product. Finally, I ask what our ethical obligations are as musicians toward the copyright regime. Must we always operate within the law even if it conflicts with our creative needs, or should we engage in civil disobedience?
Update: I’ve turned this post into an academic article. Here’s a draft.
The title of this post is also the title of a tutorial I’m giving at ISMIR 2016 with Jan Van Balen and Dan Brown. Here are the slides:
The conference is organized by the International Society for Music Information Retrieval, and it’s the fanciest of its kind. You may well be wondering what Music Information Retrieval is. MIR is a specialized field in computer science devoted to teaching computers to understand music, so they can transcribe it, organize it, find connections and similarities, and, maybe, eventually, create it.
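To make "teaching computers to understand music" concrete: one classic MIR task is tempo estimation, often done by autocorrelating an onset-strength signal and picking the lag with the strongest correlation. Here is a minimal pure-Python sketch of that idea; the synthetic onset envelope and frame rate are invented for illustration, and real systems use dedicated libraries like librosa:

```python
def estimate_beat_period(onset_env, min_lag=2):
    """Return the lag (in frames) with the strongest autocorrelation."""
    n = len(onset_env)
    best_lag, best_score = min_lag, float("-inf")
    for lag in range(min_lag, n // 2):
        # Correlate the signal with a copy of itself shifted by `lag` frames.
        score = sum(onset_env[i] * onset_env[i - lag] for i in range(lag, n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic onset envelope: a pulse every 8 frames.
# At a (hypothetical) 16 frames per second, an 8-frame period is 120 BPM.
env = [1.0 if i % 8 == 0 else 0.0 for i in range(64)]
print(estimate_beat_period(env))  # → 8
```

The same autocorrelation trick underlies much fancier beat trackers; the hard part in practice is computing a good onset envelope from messy real audio.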
So why are we going to talk to the MIR community about hip-hop? So far, the field has mostly studied music using the tools of Western classical music theory, which emphasizes melody and harmony. Hip-hop songs don’t tend to have much going on in either of those areas, which makes the genre seem like it’s either too difficult to study, or just too boring. But the MIR community needs to find ways to engage with this music, if for no other reason than the fact that hip-hop is the most listened-to genre in the world, at least among Spotify listeners.
Hip-hop has been getting plenty of scholarly attention lately, but most of it has been coming from cultural studies. Which is fine! Hip-hop is culturally interesting. When humanities people do engage with hip-hop as an art form, they tend to focus entirely on the lyrics, treating them as a subgenre of African-American literature that just happens to be performed over beats. And again, that’s cool! Hip-hop lyrics have significant literary interest. (If you’re interested in the lyrical side, we recommend this video analyzing the rhyming techniques of several iconic emcees.) But what we want to discuss is why hip-hop is musically interesting, a subject to which academics have given approximately zero attention.
The hippest music teachers help their students create original music. But what exactly does that mean? What even is composition? In this post, I take a look at two innovators in music education and try to arrive at an answer.
Matt McLean is the founder of the amazing Young Composers and Improvisers Workshop. He teaches his students composition using a combination of Noteflight, an online notation editor, and the MusEDLab’s own aQWERTYon, a web app that turns your regular computer keyboard into an intuitive musical interface.
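The core idea of a tool like the aQWERTYon, constraining the typing keyboard to a musical scale, can be sketched as a key-to-MIDI-note mapping. The layout below is hypothetical, not aQWERTYon's actual mapping: each successive home-row key gets the next degree of a C major scale, starting at middle C:

```python
# Semitone offsets of the major scale degrees above the root.
C_MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]

def build_key_map(keys="asdfghjk", root=60):
    """Map each key to a MIDI note, walking up the scale and wrapping up an octave."""
    key_map = {}
    for i, key in enumerate(keys):
        octave, degree = divmod(i, len(C_MAJOR_STEPS))
        key_map[key] = root + 12 * octave + C_MAJOR_STEPS[degree]
    return key_map

print(build_key_map())
# {'a': 60, 's': 62, 'd': 64, 'f': 65, 'g': 67, 'h': 69, 'j': 71, 'k': 72}
```

Because every key lands on a scale tone, there are no "wrong notes," which is exactly what makes this kind of interface so approachable for beginners.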
The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.
Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.
Here’s a strange and interesting thing that happened to me. The assignment for Disquiet Junto project 233 was to remix three tracks. The assignment for Junto project 234 was to metaremix one of the remixes from project 233. One of the people whose remix I metaremixed was listening to my track and accidentally had it playing in two different browser tabs simultaneously. He liked how it sounded, so he did a metametaremix with two copies of my metaremix offset by a few beats. It came out amazing!
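The happy accident above, two copies of the same track playing offset from each other, is easy to reproduce deliberately: mix a signal with a delayed copy of itself. A minimal sketch over a list of samples (the delay length and the toy signal here are arbitrary, just to show the mechanism):

```python
def mix_with_delayed_copy(samples, delay, gain=1.0):
    """Mix a signal with a copy of itself delayed by `delay` samples."""
    out = []
    for i in range(len(samples) + delay):
        dry = samples[i] if i < len(samples) else 0.0          # original copy
        wet = gain * samples[i - delay] if i >= delay else 0.0  # delayed copy
        out.append(dry + wet)
    return out

print(mix_with_delayed_copy([1.0, 0.0, 0.0, 1.0], delay=2))
# → [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]
```

With a delay of a few beats rather than a few milliseconds, the result is less an echo effect and more a canon, which is presumably why the accidental double playback sounded so good.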
This summer, I’m teaching Cultural Significance of Rap and Rock at Montclair State University. It’s my first time teaching it, and it’s also the first time anyone has taught it completely online. The course is cross-listed under music and African-American studies. Here’s a draft of my syllabus, omitting details of the grading and such. I welcome your questions, comments and criticism.
My youngest private music production student is a kid named Ilan. He makes moody trip-hop and deep house using Ableton Live. For our session today, Ilan came in with a downtempo, jazzy hip-hop instrumental. I helped him refine and polish it, and then we talked about his ideas for what kind of vocal might work on top. He wanted an emcee to flow over it, so I gave him my folder of hip-hop acapellas I’ve collected. The first one he tried was “Fu-Gee-La [Refugee Camp Remix]” by the Fugees.
I had it all warped out already, so all he had to do was drag and drop it into his session and press play. It sounded great, so he ran with it. Here’s what he ended up with:
I believe that music education should engage with the music that’s meaningful to students. The field is coming to agree with me. School music programs have been gradually embracing rock, for example via Modern Band. Which is great! Unfortunately, rock stopped being the driver of our musical culture sometime in the early 1990s. The kids currently in school are more about computer-generated dance music: hip-hop, techno, and their various pop derivatives. We live in an Afrofuturist world.
Here’s what happened in my life as an educator this past semester, and what I have planned for the coming semester.
Montclair State University Intro To Music Technology
I wonder how much longer “music technology” is going to exist as a subject. They don’t teach “piano technology” or “violin technology.” It makes sense to teach specific areas like audio recording or synthesis or signal theory as separate classes. But “music technology” is such a broad term as to be meaningless. The unspoken assumption is that we’re teaching “musical practices involving a computer,” but even that is both too big and too small to structure a one-semester class around. On the one hand, every kind of music involves computers now. On the other hand, to focus just on the computer part is like teaching a word processing class that’s somehow separate from learning how to write.
The newness and vagueness of the field of study gives me and my fellow music tech educators wide latitude to define our subject matter. I see my job as providing an introduction to pop production and songwriting. The tools we use for the job at Montclair are mostly GarageBand and Logic, but I don’t spend a lot of time on the mechanics of the software itself. Instead, I teach music: How do you express yourself creatively using sample libraries, or MIDI, or field recordings, or pre-existing songs? What kinds of rhythms, harmonies, timbres and structures make sense aesthetically when you’re assembling these materials in the DAW? Where do you get ideas? How do you listen to recorded music analytically? Why does Thriller sound so much better than any other album recorded in the eighties? We cover technical concepts as they arise in the natural course of producing and listening. My hope is that they’ll be more relevant and memorable that way.
The simplest and most effective optical illusion ever is the Necker cube. Which side is in front? The answer is both and neither. Very Zen.
In the process of gathering musical simples, I found a P-Funk loop with a similar effect. It’s a keyboard lick from “Do That Stuff” from The Clones of Dr. Funkenstein.