What is the difference between analog and digital recording?

All microphones are analog. They convert pressure waves in the air into electricity. Pressure waves vibrate a thin diaphragm, and that motion generates a fluctuating electrical signal. Different kinds of mics have different specific ways of doing this. In dynamic mics, the diaphragm is attached to a coil of wire surrounding a magnet; as the coil moves through the magnetic field, it induces a current in the wire. In condenser mics, the diaphragm is one plate of a capacitor that is held at a constant charge; as the plate moves, the capacitance changes, and that produces a fluctuating voltage. There are other kinds of mics with other physical setups, but they all do the same thing: they send out an electrical signal whose fluctuations match (are an analog for) the fluctuations of air pressure.
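
To make the contrast with digital recording concrete, here is a minimal Python sketch (my own illustration, not something from the post) of what an analog-to-digital converter does to that fluctuating signal: it samples the voltage at regular intervals and rounds each measurement to a number it can store. The sample rate and bit depth here are absurdly low just so the printout stays readable; real audio uses 44,100 samples per second or more at 16 or 24 bits.

```python
# Illustrative sketch: "digitizing" an analog signal by sampling and quantizing it.
import numpy as np

sample_rate = 8  # samples per second -- tiny, purely for readability
duration = 1.0   # seconds
bit_depth = 4    # 4-bit quantization for illustration

# Stand-in for the analog signal: a 1 Hz sine wave, one pressure cycle per second
times = np.arange(0, duration, 1 / sample_rate)
analog_voltage = np.sin(2 * np.pi * 1.0 * times)

# Quantize: map the -1..1 voltage range onto 2**bit_depth integer steps
levels = 2 ** bit_depth
samples = np.round((analog_voltage + 1) / 2 * (levels - 1)).astype(int)

for t, v, s in zip(times, analog_voltage, samples):
    print(f"t={t:.3f}s  voltage={v:+.3f}  stored sample={s}")
```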

Continue reading

Scratching “This Is America”

One of my projects for this summer is to realize my decades-old ambition to learn how to scratch. I borrowed a Korg Kaoss DJ controller from a friend, downloaded Serato, and have been fumbling with it for a week now. The Kaoss DJ leaves much to be desired. The built-in Kaoss Pad is cool, but otherwise it’s too small and finicky. I will definitely want to upgrade to something with big chunky buttons and more haptic feedback in general. Still, the Kaoss DJ is enough to get started with.

For my first serious remix, I thought I would take on Childish Gambino’s “This Is America”: I have the acapella and the instrumental, and it feels like a timely song. I put the instrumental on one deck and the acapella on the other, and did my best to improvise a mix in real time. If you want to hear the result, email me.

I mostly approached this as “soloing” with the acapella, using the instrumental as my “rhythm section.” But I did some improvising with the instrumental too, by looping, and by jumping around between cue points. I don’t consider this to be a polished work of art or anything, but I discovered some pretty cool sounds even at my basic skill level. So I’m excited to see where this leads.

Continue reading

The vocoder and Auto-Tune

The vocoder is one of those mysterious technologies that’s far more widely used than understood. Here I explain what it is, how it works, and why you should care.

Casual music listeners know the vocoder best as a way to make the robot voice effect that Daft Punk uses all the time.

Here’s Huston Singletary demonstrating the vocoder in Ableton Live.

You may be surprised to learn that you use a vocoder every time you talk on your cell phone. Also, the vocoder gave rise to Auto-Tune, which, love it or hate it, is the defining sound of contemporary popular music. Let’s dive in!
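
The full explanation is behind the link, but here is a rough Python sketch of the classic channel-vocoder idea as I understand it (my own illustration, not code from the post): split the voice and a synth carrier into matching frequency bands, measure the loudness envelope of each voice band, and use it to scale the corresponding carrier band. The band edges, filter settings, and file names are all assumptions for illustration.

```python
# Rough channel vocoder sketch: impose the voice's band-by-band loudness onto a carrier.
import numpy as np
from scipy.signal import butter, sosfilt

def bandpass(signal, low_hz, high_hz, sample_rate):
    """Band-pass filter one channel of audio."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=sample_rate, output="sos")
    return sosfilt(sos, signal)

def envelope(signal, sample_rate, cutoff_hz=30.0):
    """Amplitude envelope: rectify, then smooth with a low-pass filter."""
    sos = butter(2, cutoff_hz, btype="lowpass", fs=sample_rate, output="sos")
    return sosfilt(sos, np.abs(signal))

def channel_vocoder(voice, carrier, sample_rate, n_bands=16):
    """Scale each carrier band by the matching voice band's envelope, then sum."""
    # Band edges spaced logarithmically from 80 Hz up toward the Nyquist frequency
    edges = np.geomspace(80, sample_rate / 2 * 0.9, n_bands + 1)
    output = np.zeros(len(carrier))
    for low, high in zip(edges[:-1], edges[1:]):
        voice_band = bandpass(voice, low, high, sample_rate)
        carrier_band = bandpass(carrier, low, high, sample_rate)
        output += carrier_band * envelope(voice_band, sample_rate)
    return output / np.max(np.abs(output))  # normalize

# Hypothetical usage (file names are placeholders):
#   voice, sr = soundfile.read("acapella.wav")
#   synth, _ = soundfile.read("saw_chord.wav")
#   robot = channel_vocoder(voice[:len(synth)], synth[:len(voice)], sr)
```

Because the carrier supplies the pitch and the voice only supplies the shifting band levels, the output talks in the synth’s monotone: that’s the robot voice.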

Continue reading

Affordances and Constraints

Note-taking for User Experience Design with June Ahn

Don Norman discusses affordances and constraints in The Design of Everyday Things, Chapter Four: Knowing What To Do.

Don Norman - The Design of Everyday Things

User experience design is easy in situations where there’s only one thing the user can possibly do. But as the possibilities multiply, so do the challenges. We can deal with new things by drawing on information from our prior experiences, or by being instructed. The best-designed things include the instructions for their own use, like video games whose first levels act as tutorials, or doors with handles whose shape and placement communicate how you should operate them.

Continue reading

QWERTYBeats research

Writing assignment for Design For The Real World with Claire Kearney-Volpe and Diana Castro – research about a new rhythm interface for blind and low-vision novice musicians

Definition

I propose a new web-based accessible rhythm instrument called QWERTYBeats.

QWERTYBeats logo

Traditional instruments are highly accessible to blind and low-vision musicians. Electronic music production tools are not. I look at the history of accessible instruments and software interfaces, give an overview of current electronic music hardware and software, and discuss the design considerations underlying my project.

Continue reading

Rohan lays beats

The Ed Sullivan Fellows program is an initiative by the NYU MusEDLab connecting up-and-coming hip-hop musicians to mentors, studio time, and creative and technical guidance. Our session this past Saturday got off to an intense start, talking about the role of young musicians of color in a world of police brutality and Black Lives Matter. The Fellows are looking to Kendrick Lamar and Chance The Rapper to speak social and emotional truths through music. It’s a brave and difficult job they’ve taken on.

Eventually, we moved from heavy conversation into working on the Fellows’ projects, which this week involved branding and image. I was at kind of a loose end in this context, so I set up the MusEDLab’s Push controller and started playing around with it. Rohan, one of the Fellows, immediately gravitated to it, and understandably so.

Indigo lays beats

Continue reading

Milo meets Beethoven

For his birthday, Milo got a book called Welcome to the Symphony by Carolyn Sloan. We finally got around to showing it to him recently, and now he’s totally obsessed.

Welcome To The Symphony by Carolyn Sloan

The book has buttons along the side which you can press to hear little audio samples. They include each orchestra instrument playing a short Beethoven riff. All of the string instruments play the same “bum-bum-bum-BUMMM” so you can compare the sounds easily. All the winds play a different little phrase, and the brass another. The book itself is fine and all, but the thing that really hooked Milo is triggering the riffs one after another, Ableton-style, and singing merrily along.

Continue reading

Inside the aQWERTYon

Update: try the Theory aQWERTYon!

The MusEDLab and Soundfly just launched Theory For Producers, an interactive music theory course. The centerpiece of the interactive component is a MusEDLab tool called the aQWERTYon. You can try it by clicking the image below. (You need to use Chrome.)

aQWERTYon screencap

In this post, I’ll talk about why and how we developed the aQWERTYon.

Continue reading

How should we be teaching music technology?

This semester, I had the pleasure of leading an independent study for two music students at Montclair State University. One was Matt Skouras, a grad student who wants to become a high school music tech teacher. First of all, let me just say that if you’re hiring for such a position in New Jersey, you should go right ahead and hire Matt; he’s an exceptionally serious and well-versed musician and technologist. But the reason for this post is a question that Matt asked me after our last meeting yesterday: what should he be studying in order to teach music tech?

Matt is a good example of a would-be music tech teacher. He’s a classical trumpet player by training who has found little opportunity to use that skill after college. Wanting to keep his life as a musician moving forward, he started learning guitar, and, in his independent study with me, has been producing adventurous laptop music with Ableton Live. Matt is a broad-minded listener and a skilled audio engineer, but his exposure to non-classical music is limited in the way typical of people who came up through the classical pipeline. It was at Matt’s request that I put together this electronic music tasting menu.

So. How to answer Matt’s question? How does one go about learning to teach music technology? My first impulse was to say, I don’t know, but if you find out, please tell me. The answer I gave him was less flip: that the field is still taking shape, and it evolves rapidly as the technology does. Music tech is a broad and sprawling subject, and you could approach it from any number of different philosophical and technical angles. I’ll list a few of them here.

Continue reading