Last week I was Ableton’s guest for Loop, their delightful “summit for music makers.” I was on a panel about technology in music education, and I got to meet a lot of amazing people and hear some good music too. Here’s my live Twitter feed from the event if you want a fine-grained accounting. Otherwise, read on for some high points.
Writing assignment for History of Science and Technology class with Myles Jackson. See a more informal introduction to the vocoder here.
Casual music listeners know the vocoder best as the robotic voice effect popular in disco and early hip-hop. Anyone who has heard pop music of the last two decades has heard Auto-Tune. The two effects are frequently mistaken for one another, and for good reason—they share the same mathematical and technological basis. Auto-Tune has become ubiquitous in recording studios, in two very different incarnations. There is its intended use, as an expedient way to correct out-of-tune notes, replacing various tedious and labor-intensive manual methods. Pop, hip-hop and electronic dance music producers have also found an unintended use for Auto-Tune, as a special effect that quantizes pitches to a conspicuously excessive degree, giving the voice a synthetic, otherworldly quality. In this paper, I discuss the history of the vocoder and Auto-Tune, in the context of broader efforts to use science and technology to mathematically analyze and standardize music. I also explore how such technologies problematize our ideas of virtuosity.
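The "quantizing pitches" idea in the abstract is easy to make concrete. Here's a toy sketch (my own illustration, not Auto-Tune's actual algorithm) of the core move: convert a detected frequency to a fractional MIDI note number, round to the nearest equal-tempered semitone, and convert back.

```python
import math

def snap_to_semitone(freq_hz, a4=440.0):
    """Snap a detected pitch to the nearest equal-tempered semitone.

    This is the basic gesture behind pitch correction: with a fast
    (or instant) correction speed, every note lands exactly on the
    grid, producing the conspicuous "Auto-Tune effect."
    """
    # 69 is the MIDI note number for A4
    midi = 69 + 12 * math.log2(freq_hz / a4)
    quantized = round(midi)
    return a4 * 2 ** ((quantized - 69) / 12)
```

A slightly flat A (say 437 Hz) snaps back to 440 Hz; a real corrector would also control how quickly the pitch glides to the target, which is the knob that separates transparent correction from the robotic effect.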
This post documents a presentation I’m giving in my History of Science and Technology class with Myles Jackson. See also a more formal history of the vocoder.
The vocoder is one of those mysterious technologies that’s far more widely used than understood. Here I explain what it is, how it works, and why you should care.
Casual music listeners know the vocoder best as a way to make the robot voice effect that Daft Punk uses all the time.
Here’s Huston Singletary demonstrating the vocoder in Ableton Live.
You may be surprised to learn that you use a vocoder every time you talk on your cell phone. Also, the vocoder gave rise to Auto-Tune, which, love it or hate it, is the defining sound of contemporary popular music. Let’s dive in!
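For the curious, the classic channel vocoder can be sketched in a few lines: split both the modulator (usually a voice) and the carrier (usually a synth) into frequency bands, measure the modulator's loudness envelope in each band, and use those envelopes to scale the matching carrier bands. This is a rough NumPy-only illustration of the idea (function and parameter names are mine, and real implementations work frame-by-frame rather than on the whole signal at once):

```python
import numpy as np

def channel_vocoder(modulator, carrier, n_bands=8):
    """Impose the modulator's per-band envelopes onto the carrier."""
    n = min(len(modulator), len(carrier))
    M = np.fft.rfft(modulator[:n])
    C = np.fft.rfft(carrier[:n])
    # split the spectrum into equal-width bands
    edges = np.linspace(0, len(M), n_bands + 1, dtype=int)
    out = np.zeros(n)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = np.zeros(len(M))
        mask[lo:hi] = 1.0
        mod_band = np.fft.irfft(M * mask, n)
        car_band = np.fft.irfft(C * mask, n)
        # envelope follower: rectify, then smooth with a moving average
        env = np.convolve(np.abs(mod_band), np.ones(256) / 256, mode="same")
        out += car_band * env
    return out
```

Feed it speech as the modulator and a sawtooth pad as the carrier and you get the familiar talking-synth sound; cell-phone codecs use a much more sophisticated cousin of the same analysis/resynthesis idea.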
Note-taking for User Experience Design with June Ahn
Don Norman discusses affordances and constraints in The Design of Everyday Things, Chapter Four: Knowing What To Do.
User experience design is easy in situations where there’s only one thing the user can possibly do. But as the possibilities multiply, so do the challenges. We can deal with new things using information from our prior experiences, or by being instructed. The best-designed things include the instructions for their own use, like video games whose first level acts as a tutorial, or doors with handles that communicate by their shape and placement how you should operate them.
Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research about a new rhythm interface for blind and low-vision novice musicians
I propose a new web-based accessible rhythm instrument called QWERTYBeats.
Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.
The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.
Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.
For his birthday, Milo got a book called Welcome to the Symphony by Carolyn Sloan. We finally got around to showing it to him recently, and now he’s totally obsessed.
The book has buttons along the side which you can press to hear little audio samples. They include each orchestra instrument playing a short Beethoven riff. All of the string instruments play the same “bum-bum-bum-BUMMM” so you can compare the sounds easily. All the winds play a different little phrase, and the brass another. The book itself is fine and all, but the thing that really hooked Milo is triggering the riffs one after another, Ableton-style, and singing merrily along.
The MusEDLab and Soundfly just launched Theory For Producers, an interactive music theory course. The centerpiece of the interactive component is a MusEDLab tool called the aQWERTYon. You can try it by clicking the image below. (You need to use Chrome.)
In this post, I’ll talk about why and how we developed the aQWERTYon.
This semester, I had the pleasure of leading an independent study for two music students at Montclair State University. One was Matt Skouras, a grad student who wants to become the music tech teacher in a high school. First of all, let me just say that if you’re hiring for such a position in New Jersey, you should go right ahead and hire Matt; he’s an exceptionally serious and well-versed musician and technologist. But the reason for this post is a question that Matt asked me after our last meeting yesterday: what should he be studying in order to teach music tech?
Matt is a good example of a would-be music tech teacher. He’s a classical trumpet player by training who has found little opportunity to use that skill after college. Wanting to keep his life as a musician moving forward, he started learning guitar, and, in his independent study with me, has been producing adventurous laptop music with Ableton Live. Matt is a broad-minded listener and a skilled audio engineer, but his exposure to non-classical music is limited in the way typical of people who came up through the classical pipeline. It was at Matt’s request that I put together this electronic music tasting menu.
So. How to answer Matt’s question? How does one go about learning to teach music technology? My first impulse was to say, I don’t know, but if you find out, please tell me. The answer I gave him was less flip: that the field is still taking shape, and it evolves rapidly as the technology does. Music tech is a broad and sprawling subject, and you could approach it from any number of different philosophical and technical angles. I’ll list a few of them here.
My first set of attempts at controllerism used samples of the Beatles and Michael Jackson. For the next round, I thought it would be good to try to create something completely from scratch. So this is my first piece of music created specifically with controllerism in mind.
The APC40 has forty trigger pads. You can use more than forty loops, but it’s a pain. I created eight loops that fit well together, and then made four additional variations of each one. That gave me a set of loops that fit tidily onto the APC40 grid. The instruments are 808 drum machine, latin percussion, wood blocks, blown tube, synth bass, bells, arpeggiated synth and an ambient pad.