I use variations on this project list for all of my courses. In Advanced Digital Audio Production at Montclair State University, students do all of these assignments. Students in Music Technology 101 do all of them except the ones marked Advanced. My syllabus for the NYU Music Education Technology Practicum has an additional recording studio project in place of the final project. Here’s the project list in Google Spreadsheet format.
I talk very little about microphone technology or technique in my classes. This is because I find this information useful only in the context of actual recording studio work, and my classes do not have regular access to a studio. I do spend one class period on home recording with the SM58 and SM57, and talk a bit about mic technique for singers. I encourage students who want to go deeper into audio recording to take a class specifically on that subject, or to read something like the Moylan book.
My project-based approach is informed strongly by Matt McLean and Alex Ruthmann. Read more about their methods here.
I do not require any text. However, for education majors, I strongly recommend Teaching Music Through Composition by Barbara Freedman and Music Technology and Education: Amplifying Musicality by Andrew Brown.
Final paper for Approaches To Qualitative Inquiry with Colleen Larson
Section 1: Reflections on Received View of Research
I was raised by two medical researchers and a former astrophysicist, surrounded by stacks of quantitative journals. I rarely questioned the assumption that quantitative empirical research is the gold standard of truth, and that while subjective accounts are interesting and illuminating, they are not ultimately reliable. From scientists I learned that stories belong to mythology, while facts do not necessarily organize themselves in ways that can be apprehended so easily. Creation myths tell the story of a human-scale world in which humans are the most important element. Astrophysicists tell us that the universe is unfathomably vast and incomprehensibly old, and that we are insignificant in the grand scheme of things, while evolution teaches that we are more like mushrooms or daisies than unlike them. It is axiomatic for scientists that reality is empirically knowable, and while social and emotional considerations are a fact of life, they are noise to be filtered out.
Update: I’ve turned this post into an academic article. Here’s a draft.
The title of this post is also the title of a tutorial I’m giving at ISMIR 2016 with Jan Van Balen and Dan Brown. Here are the slides:
The conference is organized by the International Society for Music Information Retrieval, and it’s the fanciest of its kind. You may well be wondering what Music Information Retrieval is. MIR is a specialized field in computer science devoted to teaching computers to understand music, so they can transcribe it, organize it, find connections and similarities, and, maybe, eventually, create it.
So why are we going to talk to the MIR community about hip-hop? So far, the field has mostly studied music using the tools of Western classical music theory, which emphasizes melody and harmony. Hip-hop songs don’t tend to have much going on in either of those areas, which makes the genre seem like it’s either too difficult to study, or just too boring. But the MIR community needs to find ways to engage this music, if for no other reason than the fact that hip-hop is the most listened-to genre in the world, at least among Spotify listeners.
Hip-hop has been getting plenty of scholarly attention lately, but most of it has been coming from cultural studies. Which is fine! Hip-hop is culturally interesting. When humanities people do engage with hip-hop as an art form, they tend to focus entirely on the lyrics, treating them as a subgenre of African-American literature that just happens to be performed over beats. And again, that’s cool! Hip-hop lyrics have significant literary interest. (If you’re interested in the lyrical side, we recommend this video analyzing the rhyming techniques of several iconic emcees.) But what we want to discuss is why hip-hop is musically interesting, a subject which academics have given approximately zero attention to.
The hippest music teachers help their students create original music. But what exactly does that mean? What even is composition? In this post, I take a look at two innovators in music education and try to arrive at an answer.
Matt McLean is the founder of the amazing Young Composers and Improvisers Workshop. He teaches his students composition using a combination of Noteflight, an online notation editor, and the MusEDLab‘s own aQWERTYon, a web app that turns your regular computer keyboard into an intuitive musical interface.
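The core idea behind a tool like the aQWERTYon — laying a scale across a row of computer keys so that any key you press produces an in-scale note — can be sketched in a few lines of code. To be clear, the key layout, scale, and note numbers below are illustrative assumptions of mine, not the aQWERTYon’s actual mapping.

```python
# A minimal sketch of a QWERTY-to-pitch mapping in the spirit of the
# aQWERTYon: each key in a keyboard row plays the next degree of a scale.
# The specific row, root note, and scale here are illustrative choices,
# not the app's real implementation.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def build_keymap(row="asdfghjkl", root=60, scale=C_MAJOR):
    """Map each character in a keyboard row to a MIDI note number.

    Degrees past the end of the scale wrap up to the next octave,
    so a nine-key row covers a little more than one octave.
    """
    keymap = {}
    for i, key in enumerate(row):
        octave, degree = divmod(i, len(scale))
        keymap[key] = root + 12 * octave + scale[degree]
    return keymap

keymap = build_keymap()
print(keymap["a"])  # 60: middle C
print(keymap["k"])  # 72: the C an octave up, after the scale wraps
```

The point of a design like this is that wrong notes become impossible: whatever the player mashes, every key lands on a scale tone, which is what makes the instrument intuitive for beginners.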
The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper as models for speaking social and emotional truths through music. It’s a brave and difficult job they’ve taken on.
Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.
Here’s a strange and interesting thing that happened to me. The assignment for Disquiet Junto project 233 was to remix three tracks. The assignment for Junto project 234 was to metaremix one of the remixes from project 233. One of the people whose remix I metaremixed was listening to my track and accidentally had it playing in two different browser tabs simultaneously. He liked how it sounded, so he did a metametaremix with two copies of my metaremix offset by a few beats. It came out amazing!
This summer, I’m teaching Cultural Significance of Rap and Rock at Montclair State University. It’s my first time teaching it, and it’s also the first time anyone has taught it completely online. The course is cross-listed under music and African-American studies. Here’s a draft of my syllabus, omitting details of the grading and such. I welcome your questions, comments and criticism.
My youngest private music production student is a kid named Ilan. He makes moody trip-hop and deep house using Ableton Live. For our session today, Ilan came in with a downtempo, jazzy hip-hop instrumental. I helped him refine and polish it, and then we talked about his ideas for what kind of vocal might work on top. He wanted an emcee to flow over it, so I gave him my folder of hip-hop acapellas I’ve collected. The first one he tried was “Fu-Gee-La [Refugee Camp Remix]” by the Fugees.
I had it all warped out already, so all he had to do was drag and drop it into his session and press play. It sounded great, so he ran with it. Here’s what he ended up with:
Väkevä, L. (2010). “Garage band or GarageBand®? Remixing musical futures.” British Journal of Music Education, 27(1), 59.
I believe that music education should engage with the music that’s meaningful to students. The field is coming to agree with me. School music programs have been gradually embracing rock, for example via Modern Band. Which is great! Unfortunately, rock stopped being the driver of our musical culture sometime in the early 1990s. The kids currently in school are more about computer-generated dance music: hip-hop, techno, and their various pop derivatives. We live in an Afrofuturist world.
Here’s what happened in my life as an educator this past semester, and what I have planned for the coming semester.
Montclair State University Intro To Music Technology
I wonder how much longer “music technology” is going to exist as a subject. They don’t teach “piano technology” or “violin technology.” It makes sense to teach specific areas like audio recording or synthesis or signal theory as separate classes. But “music technology” is such a broad term as to be meaningless. The unspoken assumption is that we’re teaching “musical practices involving a computer,” but even that is both too big and too small to structure a one-semester class around. On the one hand, every kind of music involves computers now. On the other hand, to focus just on the computer part is like teaching a word processing class that’s somehow separate from learning how to write.
The newness and vagueness of the field of study gives me and my fellow music tech educators wide latitude to define our subject matter. I see my job as providing an introduction to pop production and songwriting. The tools we use for the job at Montclair are mostly GarageBand and Logic, but I don’t spend a lot of time on the mechanics of the software itself. Instead, I teach music: How do you express yourself creatively using sample libraries, or MIDI, or field recordings, or pre-existing songs? What kinds of rhythms, harmonies, timbres and structures make sense aesthetically when you’re assembling these materials in the DAW? Where do you get ideas? How do you listen to recorded music analytically? Why does Thriller sound so much better than any other album recorded in the eighties? We cover technical concepts as they arise in the natural course of producing and listening. My hope is that they’ll be more relevant and memorable that way.