The simplest and most effective optical illusion ever is the Necker cube. Which side is in front? The answer is both and neither. Very Zen.
I have a thing for circular rhythm visualizations. So I was naturally pretty excited to learn that Meara O’Reilly and Sam Tarakajian were making an app inspired by the circular drum pattern analyses of Godfried Toussaint, who helped me understand mathematically why son clave is so awesome. The app is called Rhythm Necklace, and I got to beta test it for a few weeks before it came out. As you can see from the screencaps below, it is super futuristic.
The app is delightful by itself, but it really gets to be miraculous when you use it as a wireless MIDI controller for Ableton. Here’s some music I’ve made that way.
I was expecting to use this thing as a way to sequence drums. Instead, its real value turns out to be that it’s a way to perform melodies in real time.
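For the curious, the core idea behind Toussaint’s circular rhythm analyses can be sketched in a few lines of code: distribute some number of onsets as evenly as possible around a circular timeline, and many traditional world rhythms fall out. This floor-based formula is one simple way to generate such “Euclidean” rhythms; it’s my own illustration, not code from the Rhythm Necklace app.

```python
def euclidean_rhythm(onsets, steps):
    """Return a list of 1s (hits) and 0s (rests) of length `steps`,
    with the hits spread as evenly as possible around the circle."""
    return [1 if (i * onsets) % steps < onsets else 0 for i in range(steps)]

# Three onsets over eight steps yields the tresillo, the pattern
# underlying the first half of son clave:
print(euclidean_rhythm(3, 8))  # → [1, 0, 0, 1, 0, 0, 1, 0]
```

Rotating these necklaces gives you still more familiar patterns, which is a big part of what makes the circular representation so illuminating.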
Bennett, J. (2011). Collaborative songwriting – the ontology of negotiated creativity in popular music studio practice. Journal on the Art of Record Production, (5), online.
My professional life at the moment mostly consists of teaching classical and jazz musicians how to write pop songs. While every American is intuitively familiar with the norms of pop music, few of us think about them explicitly, even trained musicians. It’s worth considering them, though. While individual pop songs might be musically uninteresting, in the aggregate they’re a rich source of information about the way our culture evolves. Bennett describes popular song as an “unsubsidized populist art form,” like Hollywood movies and video games. The marketplace exerts strong Darwinian pressures on songwriters and producers, polishing pop conventions like pebbles being tumbled in a river.
I was reading this super valuable post by Rob Walker listing different strategies for how to pay attention. Deep attention makes the difference between looking at something and actually seeing it. Rob is talking mostly to visual artists and designers, but his methods work well for musicians too–seeing is to looking as listening is to hearing. Paying attention is the most basic skill an artist needs in any medium, and one of the most basic skills a person needs in life. Not only does artistic practice require attention, it also helps you learn it. When you look critically at a painting or listen critically to a song, you’re disciplining your attentional system.
Being able to focus deeply has its obvious practical benefits, but it’s also an invaluable tool for making your emotional life more manageable. It’s significant to me that the image below appears in two different Wikipedia articles: attention and flow.
When people ask why we should study the arts, the attention argument is the best answer. The variety of deep attention known as mindfulness is a powerful antidepressant. Teaching the arts isn’t just about cultural preservation and transmission; it’s also a cost-effective public health measure. Music isn’t the only method for practicing your attention, but it’s one of the best. This post will address my preferred method for focusing my musical attention: the infinite loop.
My students are currently hard at work writing pop songs, many of them for the first time. For their benefit, and for yours, I thought I’d write out a beginner’s guide to contemporary songwriting. First, some points of clarification:
- This post only talks about the instrumental portion of the song, known as the track. I don’t deal with vocal parts or lyric writing here.
- This is not a guide to writing a great pop song. It’s a guide to writing an adequate one. Your sense of what makes a song good will probably differ from mine, whereas most of us can agree on what makes a song adequate. To make a good song, you’ll probably need to pump out a bunch of bad ones first to get the hang of the process.
- This is not a guide to writing a hit pop song. I have no idea how to do that. If you’re aiming for the charts, I refer you to the wise words of the KLF.
- You’ll notice that I seem to be talking a lot here about production, and that I never mention actual writing. This is because in 2014, songwriting and production are the same creative act. There is no such thing as a “demo” anymore. The world expects your song to sound finished. Also, most of the creativity in contemporary pop styles lies in rhythm, timbre and arrangement. Complex chord progressions and intricate melodies are neither necessary nor even desirable. It’s all in the beats and grooves.
To make a track, you’ll need a digital audio workstation (DAW) and a loop library. I’ll be using GarageBand, but you can use the same methods in Ableton Live, Logic, Reason, Pro Tools, etc. I produced this track for illustration purposes, and will be referring to it throughout the post:
We usually think of “recorded” and “live” as two distinct and opposed forms of music. But technology has been steadily eroding the distinction between the two. Controllerism is a performance method using specialized control surfaces to trigger sample playback and manipulate effects parameters with the full fluidity and expressiveness of a conventional instrument. Such performance can take place on stage or in the studio.
Controllerism is attractive to me because I came to music through improvisation: blues, jazz, jam bands. I spent years improvising electronic music with Babsy Singer, though she did the beats and loops, not me. My life as a producer, meanwhile, has involved very little improvisation. Making music with the computer has been more like carefully writing scores. Improvisation and composition are really the same thing, but the timescales are different. Improvisation has an immediacy that composing on paper doesn’t. The computer shortens the loop from thought to music, but there’s still a lot of obligatory clicking around.
It’s certainly possible to improvise on the computer with MIDI controllers, either the usual keyboard variety or the wackier and more exotic ones. Improvising with MIDI and then cleaning up the results more meticulously is pretty satisfying, though my lack of piano skills makes it almost as slow and tedious an input method as the mouse. Jamming on iPhone and iPad apps like Animoog or GarageBand is better. What they lack in screen real estate, they make up for with form factor. Making music on the computer comes to feel like office work after a while. But you can use the phone or the tablet while lying in bed or on the ground, or while pacing around, or basically anywhere. Multitouch also restores some of the immediacy of playing instruments.
There’s also the option of recording a lot of vocal or instrumental improvisation, and then sorting out all the audio afterwards. This is the most satisfying strategy I’ve found so far for infusing electronic music with improvisation. You get all the free-flowing, body-centered immediacy of live jamming, with no pressure whatsoever to be flawless. However, then you have to do the editing. It’s easier now than it was five or ten years ago, but it’s still labor-intensive. It can take an hour of work to sculpt a few minutes of improv into musical shape.
All this time, I’ve had severe DJ envy, since DJ gear is designed for immediacy and improvisation. It’s a lame DJ indeed who meticulously stitches together a set ahead of time in an audio editor. However, DJ tools operate at the level of entire songs. It’s not easy to use Serato to write a new track. I’ve been wanting a tool that gives me the same sense of play, but at the scale of individual samples rather than entire songs.
Enter the APC40. The form factor resembles an MPC, and you can use it that way, triggering one-shot samples like drum hits or chord stabs. But the intended use case is Ableton’s Session View: starting and stopping the playback of loops. By default, loop playback is quantized to the bar, so whenever you hit a pad, the loop begins playing cleanly on the next downbeat. (You can set the quantization interval to be as wide or narrow as you want, or disable it completely.) Playing your loops live makes happy accidents more likely. Of course, unhappy accidents are more likely too, but those are easy to fix in Arrangement View. When I discovered that NYU has a little-used APC, I signed it out and started teaching myself controllerism. Here’s a picture of it.
It seems complex, and it is. The Starship Enterprise quality appeals to my tech nerd side. Creating an Ableton session for APC playing is like inventing a new musical instrument, every time. After you design your instrument, then you have to learn how to play it. On the other hand, if you design your instrument right, the actual playing of it can be fun and easy. When I set up the APC with some Michael Jackson samples and let Milo try it, he figured out the concept immediately.
My students at NYU and Montclair State are beginning to venture into producing their own tracks. There are two challenges facing them, the small one and the big one. The small challenge is learning the tools: remembering where the menus are and which key you hold down to turn the mouse pointer into a pencil, learning to conceive of notes and beats as rectangles on the piano roll, troubleshooting when you play notes on the MIDI keyboard and no sound comes out. The big challenge is option paralysis. Even a lightweight tool like GarageBand comes with a staggeringly large collection of software instruments, loops and effects, even before you start dealing with recording your own sounds. Where do you even begin?
The solution I’m using with my classes is the shared-sample project. Students are challenged to build a track out of a particular sound, or set of sounds. The easy version requires that they use the given sound, along with any additional sounds they see fit to include. The hard version, and for me the really interesting one, requires that they use the given sound(s) and absolutely nothing else. I was inspired in creating these assignments by the many Disquiet Junto shared-sample projects I’ve had the pleasure of participating in. I’m trying out my own project ideas on MSU advanced audio production independent study students Dan Bui and Matt Skouras, and will soon be giving shared-sample projects to my beginner-level classes as well.
The first assignment I gave Dan and Matt was to use eight GarageBand factory loops to build a track. They were free to do whatever processing they wanted, but they could not use other sounds. Also, they only had an hour to put their tracks together. Here are the loops:
My music-making life has revolved heavily around Ableton Live for the past few years, and now the same thing is happening to my music-teaching life. I’m teaching Live at NYU’s IMPACT program this summer, and am going to find ways to work it into my future classes as well. My larger ambition is to develop an all-around electronic music composition/improvisation/performance curriculum centered around Live.
While the people at Ableton have done a wonderful job documenting their software, they mostly presume that users already know what they want to accomplish and just don’t know how to get there. But my experience of beginner Ableton users (and newbie producers generally) is that they don’t even know what the possibilities are, what the workflow looks like, or how to get a foothold. My goal is to fill that vacuum, and I’ll be documenting the process extensively here on the blog.
Later this week I’m doing a teaching demo for a music technology professor job. The students are classical music types who don’t have a lot of music tech background, and the task is to blow their minds. I’m told that a lot of them are singers working on Verdi’s Requiem. My plan, then, is to walk the class through the process of remixing a section of the Requiem with Ableton Live. This post is basically the script for my lecture.