I use variations on this project list for all of my courses. In Advanced Digital Audio Production at Montclair State University, students do all of these assignments. Students in Music Technology 101 do all of them except the ones marked Advanced. My syllabus for the NYU Music Education Technology Practicum has an additional recording studio project in place of the final project. Here’s the project list in Google Spreadsheet format.
I talk very little about microphone technology or technique in my classes, because that information is only useful in the context of actual recording studio work, and my classes do not have regular access to a studio. I do spend one class period on home recording with the SM58 and SM57, and talk a bit about mic technique for singers. I encourage students who want to go deeper into audio recording to take a class specifically on that subject, or to read something like the Moylan book.
My project-based approach is strongly informed by Matt McLean and Alex Ruthmann. Read more about their methods here.
I do not require any text. However, for education majors, I strongly recommend Teaching Music Through Composition by Barbara Freedman and Music Technology and Education: Amplifying Musicality by Andrew Brown.
Since George Michael died, I’ve been enjoying all of his hits, but none of them more than this one. Listening to it now, it’s painfully obvious how much it’s about George Michael’s struggles with his sexual orientation. I wonder whether he was being deliberately coy in the lyrics, or if he just wasn’t yet fully in touch with his identity. Being gay in the eighties must have been a nightmare.
This is the funkiest song that George Michael ever wrote, which is saying something. Was he the funkiest white British guy in history? Quite possibly.
Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research on a new rhythm interface for blind and low-vision novice musicians
I propose a new web-based accessible rhythm instrument called QWERTYBeats.
Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.
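To make one of those design considerations concrete: a rhythm instrument for novices can be forgiving by snapping each keypress to the nearest slot on a step-sequencer-style grid. Here's a minimal Python sketch of that idea; the function name and grid parameters are my own illustration, not part of the actual QWERTYBeats design.

```python
def quantize_hit(t, bpm=120, subdivision=4):
    """Snap a keypress time (in seconds) to the nearest grid slot.

    subdivision=4 means four slots per beat (sixteenth notes in 4/4).
    Returns the snapped time in seconds.
    """
    slot = 60.0 / bpm / subdivision  # duration of one grid slot
    return round(t / slot) * slot

# A slightly late keypress lands back on the grid:
quantize_hit(0.13)  # snaps to 0.125, the nearest sixteenth at 120 bpm
```

With quantization like this, a player who presses keys roughly in time still hears a perfectly steady rhythm, which lowers the barrier to making something that sounds good.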
Update: I’ve turned this post into an academic article. Here’s a draft.
The title of this post is also the title of a tutorial I’m giving at ISMIR 2016 with Jan Van Balen and Dan Brown. Here are the slides:
The conference is organized by the International Society for Music Information Retrieval, and it’s the fanciest of its kind. You may well be wondering what Music Information Retrieval is. MIR is a specialized field in computer science devoted to teaching computers to understand music, so they can transcribe it, organize it, find connections and similarities, and, maybe, eventually, create it.
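As a toy example of the kind of problem MIR tackles, here's a sketch of tempo estimation from a list of detected onset times. Real beat trackers are far more sophisticated than this; the function and the onset values are just my illustration of the idea.

```python
from statistics import median

def estimate_bpm(onsets):
    """Estimate tempo from onset times (in seconds) by taking the
    median inter-onset interval and converting it to beats per minute."""
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    return 60.0 / median(intervals)

# Onsets roughly half a second apart -> 120 bpm
estimate_bpm([0.0, 0.5, 1.0, 1.52, 2.0])
```

Using the median makes the estimate robust to a single rushed or dragged onset, which is exactly the kind of messiness real recordings have.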
So why are we going to talk to the MIR community about hip-hop? So far, the field has mostly studied music using the tools of Western classical music theory, which emphasizes melody and harmony. Hip-hop songs don’t tend to have much going on in either of those areas, which makes the genre seem either too difficult to study or just too boring. But the MIR community needs to find ways to engage this music, if for no other reason than that hip-hop is the most-listened-to genre in the world, at least among Spotify listeners.
Hip-hop has been getting plenty of scholarly attention lately, but most of it has been coming from cultural studies. Which is fine! Hip-hop is culturally interesting. When humanities people do engage with hip-hop as an art form, they tend to focus entirely on the lyrics, treating them as a subgenre of African-American literature that just happens to be performed over beats. And again, that’s cool! Hip-hop lyrics have significant literary interest. (If you’re interested in the lyrical side, we recommend this video analyzing the rhyming techniques of several iconic emcees.) But what we want to discuss is why hip-hop is musically interesting, a subject which academics have given approximately zero attention to.
The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.
Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.
The same Disquiet Junto project that spawned this wildly recursive remix also involved a few more people remixing my remix. Here’s a family tree of the three first-generation source tracks, the seven second-generation remixes of those tracks, and the three third-generation remixes of the second-generation remixes.
You can hear the three third-generation metaremixes below.
Here’s a strange and interesting thing that happened to me. The assignment for Disquiet Junto project 233 was to remix three tracks. The assignment for Junto project 234 was to metaremix one of the remixes from project 233. One of the people whose remix I metaremixed was listening to my track and accidentally had it playing in two different browser tabs simultaneously. He liked how it sounded, so he did a metametaremix with two copies of my metaremix offset by a few beats. It came out amazing!
This summer, I’m teaching Cultural Significance of Rap and Rock at Montclair State University. It’s my first time teaching it, and it’s also the first time anyone has taught it completely online. The course is cross-listed under music and African-American studies. Here’s a draft of my syllabus, omitting details of the grading and such. I welcome your questions, comments and criticism.
My youngest private music production student is a kid named Ilan. He makes moody trip-hop and deep house using Ableton Live. For our session today, Ilan came in with a downtempo, jazzy hip-hop instrumental. I helped him refine and polish it, and then we talked about his ideas for what kind of vocal might work on top. He wanted an emcee to flow over it, so I gave him my folder of hip-hop acapellas I’ve collected. The first one he tried was “Fu-Gee-La [Refugee Camp Remix]” by the Fugees.
I had it all warped out already, so all he had to do was drag and drop it into his session and press play. It sounded great, so he ran with it. Here’s what he ended up with:
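For the curious, the warping that made this drag-and-drop possible boils down to time-stretching the acapella by the ratio of the session tempo to the clip's original tempo. Live's warp engine is of course much more involved; this sketch and its example tempos are purely hypothetical.

```python
def stretch_ratio(source_bpm, session_bpm):
    """Playback-rate multiplier that makes a clip recorded at
    source_bpm land on the grid of a session at session_bpm."""
    return session_bpm / source_bpm

# A 95 bpm acapella dropped into an 85.5 bpm session
# plays back at 0.9x speed to stay on the grid:
stretch_ratio(95, 85.5)
```

Once the clip's original tempo is marked, the DAW can recompute this ratio automatically whenever the session tempo changes, which is why a pre-warped acapella "just works" when you drop it in.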
I use a project-based approach to teaching music technology. Technical concepts stick with you better if you learn them in the course of making actual music. Here’s the list of projects I assign to my college classes and private students. I’ve arranged them from easiest to hardest. The first five projects are suitable for a beginner-level class using any DAW; my beginners use GarageBand. The last two projects are more advanced and require a DAW with sophisticated editing tools and effects, like Ableton Live. If you’re a teacher, feel free to use these (and let me know if you do). Same goes for all you bedroom producers and self-teachers.
The projects are agnostic as to musical content, style or genre. However, the computer is best suited to making electronic music, and most of these projects work best in the pop/hip-hop/techno sphere. Experimental, ambient or film music approaches also work well. Many of them draw on the Disquiet Junto. Enjoy.