I’ve been asked enough times for mobile music app recommendations that I decided to collect all of them here. The iOS apps are ones that I’ve personally used and enjoyed. I haven’t tried most of the Android ones, but they were recommended by people whose opinions I trust. If you have suggestions, please add them in the comments.
I want to expand my private teaching and speaking practice. If you were to book me for a workshop or seminar, what would you want it to be about? Music production? Intellectual property and authorship? Music and math? Music and science? Music pedagogy? Improvisation and flow, both in music and in life generally? Something else?
I’d be happy to visit your music classroom, non-music classroom, company, co-working space, or community organization. Here are some instructional videos of mine to give you a sense of my style.
I do traditional music teaching and production too, but I’m pitching here to people who don’t consider themselves to be “musicians” (spoiler alert: everybody is a musician; you just might not have found your instrument yet). Group improvisation on iOS devices or laptops is always a good time, and it’s easier than you would think to attain musical-sounding results. Instrument design with the Makey Makey is a fun one too. If you have Ableton Live and are wondering what to do with it, a remix and mashup workshop would be just the thing. All of the above activities are revelatory windows into user interface and experience design. Group music-making is an excellent team-building exercise, and is just generally a spa treatment for the soul. Get in touch with your suggestions, requests and questions.
Music theory is hard. But we make it harder by holding on to naming and notational conventions that are hundreds of years old, and that were designed to describe very different music than what we’re playing now. Here are some fantasies for how note naming might be improved.
Right now, the “default setting” for western diatonic harmony is the C major scale. It’s the One True Scale, from which all else is derived by adding sharps and flats. Why do we use the C major scale for this purpose? Why not the A major scale? Wouldn’t it make more sense if ground zero for our whole harmonic system was the sequence ABCDEFG? I know there are historical reasons why the unmodified first seven letters of the alphabet denote the natural minor scale, but so what? How is a person supposed to make sense of the fact that scale degree one falls on the third letter of the alphabet?
Furthermore, I question whether the major scale really is the one we should consider to be the most basic. I’d prefer that we use mixolydian instead. The crucial pitches in mixo are close to the natural overtone series, for one thing. For another, Americans hear flat seven as being as “natural” as natural seven, if not more so. While the leading tone is common inside chords, it’s rare to hear it in a popular melody. Flat seven is ubiquitous in the music most of us listen to, and in plenty of other world cultures besides.
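The overtone-series argument for mixolydian is easy to check numerically. Here’s a quick sketch (my own illustration, not anything from a theory text) that compares each equal-tempered mixolydian scale degree to the nearest harmonic partial of the tonic. Note how closely the third partial matches the fifth, and the seventh partial the flat seven, while the major seventh has no nearby low partial at all.

```python
import math

# Mixolydian scale degrees in equal-tempered semitones above the tonic.
MIXOLYDIAN = {"1": 0, "2": 2, "3": 4, "4": 5, "5": 7, "6": 9, "b7": 10}

def harmonic_cents(partial):
    """Pitch of a harmonic partial, in cents above the fundamental,
    folded down into a single octave."""
    return (1200 * math.log2(partial)) % 1200

for degree, semitones in MIXOLYDIAN.items():
    tempered = semitones * 100  # equal temperament: 100 cents per semitone
    # Find the partial (up to the 16th) that lands nearest this degree.
    partial = min(range(1, 17), key=lambda n: abs(harmonic_cents(n) - tempered))
    print(f"degree {degree}: {tempered} cents; "
          f"nearest partial {partial} at {harmonic_cents(partial):.0f} cents")
```

Running this shows the flat seven sitting about 31 cents from the seventh partial (969 cents), far closer than any simple partial comes to the leading tone at 1100 cents.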
My first set of attempts at controllerism used samples of the Beatles and Michael Jackson. For the next round, I thought it would be good to try to create something completely from scratch. So this is my first piece of music created specifically with controllerism in mind.
The APC40 has forty trigger pads. You can use more than forty loops, but it’s a pain. I created eight loops that fit well together, and then made four additional variations of each one. That gave me a set of loops that fit tidily onto the APC40 grid. The instruments are 808 drum machine, Latin percussion, wood blocks, blown tube, synth bass, bells, arpeggiated synth and an ambient pad.
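The arithmetic works out neatly: eight instruments, each with an original loop plus four variations, is exactly one clip per pad on the APC40’s 8-column by 5-row grid. A tiny sketch (the clip names here are made up for illustration):

```python
# Eight instrument loops, five versions each, fill the APC40's
# 8-column x 5-row clip grid exactly: one clip per trigger pad.
INSTRUMENTS = ["808 drums", "latin perc", "wood blocks", "blown tube",
               "synth bass", "bells", "arp synth", "ambient pad"]

# Each row of the grid is one "scene": the same variation across all instruments.
grid = [[f"{inst} v{scene + 1}" for inst in INSTRUMENTS] for scene in range(5)]

assert sum(len(row) for row in grid) == 40  # one clip per pad
```

Laying the variations out in rows like this means a whole scene can be launched with one button press, or individual clips mixed and matched pad by pad.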
We usually think of “recorded” and “live” as two distinct and opposed forms of music. But technology has been steadily eroding the distinction between the two. Controllerism is a performance method using specialized control surfaces to trigger sample playback and manipulate effects parameters with the full fluidity and expressiveness of a conventional instrument. Such performance can take place on stage or in the studio.
Controllerism is attractive to me because I came to music through improvisation: blues, jazz, jam bands. I spent years improvising electronic music with Babsy Singer, though she did the beats and loops, not me. My life as a producer, meanwhile, has involved very little improvisation. Making music with the computer has been more like carefully writing scores. Improvisation and composition are really the same thing, but the timescales are different. Improvisation has an immediacy that composing on paper doesn’t. The computer shortens the loop from thought to music, but there’s still a lot of obligatory clicking around.
It’s certainly possible to improvise on the computer with MIDI controllers, either the usual keyboard variety or the wackier and more exotic ones. Improvising with MIDI and then cleaning up the results more meticulously is pretty satisfying, though my lack of piano skills makes it almost as slow and tedious an input system as the mouse. Jamming on iPhone and iPad apps like Animoog or GarageBand is better. What they lack in screen real estate, they make up for with form factor. Making music on the computer comes to feel like office work after a while. But you can use the phone or the tablet while lying in bed or on the ground, or while pacing around, or basically anywhere. Multitouch also restores some of the immediacy of playing instruments.
There’s also the option of recording a lot of vocal or instrumental improvisation, and then sorting out all the audio afterwards. This is the most satisfying strategy for infusing electronic music with improvisation that I’ve found so far. You get all the free-flowing body-centered immediacy of live jamming, with no pressure whatsoever to be flawless. However, then you have to do the editing. It’s easier now than it was five or ten years ago, but it’s still labor-intensive. It can take an hour of work to whip a few minutes of improv into musical shape.
All of this time, I’ve had severe DJ envy, since DJ gear is designed for immediacy and improvisation. It’s a lame DJ indeed who meticulously stitches together a set ahead of time in an audio editor. However, DJ tools operate at the level of entire songs. It’s not easy to use Serato to write a new track. I’ve been wanting a tool that gives me the same sense of play, but at the scale of individual samples rather than entire songs.
Enter the APC40. The form factor resembles an MPC, and you can use it that way, to trigger one-shot samples like drum hits or chord stabs. But the intended use case is for Ableton session view, starting and stopping the playback of loops. By default, loop playback is quantized to the bar, so whenever you hit a pad, the loop begins playing cleanly on the next downbeat. (You can set the quantization interval to be as wide or narrow as you want, or disable it completely.) Playing your loops live makes happy accidents more likely. Of course, unhappy accidents are more likely too. But those are easy to fix in Arrange view. When I discovered that NYU has a little-used APC, I signed it out and started teaching myself controllerism. Here’s a picture of it.
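The launch-quantization idea is simple enough to express in a few lines. This is my own sketch of the concept, not Ableton’s actual implementation: when you press a pad mid-bar, the clip doesn’t fire until the next quantization boundary.

```python
import math

def next_launch_time(now_beats, quantize_beats=4.0):
    """Return the beat at which a clip triggered at `now_beats` will start.
    With quantize_beats=4.0 (one bar of 4/4), a pad pressed mid-bar fires
    the clip on the next downbeat. A value of 0 disables quantization,
    so the clip launches the instant the pad is hit."""
    if quantize_beats <= 0:
        return now_beats  # unquantized: launch immediately
    return math.ceil(now_beats / quantize_beats) * quantize_beats

# Pressing a pad at beat 9.3 launches the loop at beat 12, the next downbeat.
print(next_launch_time(9.3))
```

Narrower quantization values (a beat, an eighth note) trade safety for responsiveness, which is exactly the knob you want when balancing happy accidents against unhappy ones.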
It seems complex, and it is. The Starship Enterprise quality appeals to my tech nerd side. Creating an Ableton session for APC playing is like inventing a new musical instrument, every time. After you design your instrument, then you have to learn how to play it. On the other hand, if you design your instrument right, the actual playing of it can be fun and easy. When I set up the APC with some Michael Jackson samples and let Milo try it, he figured out the concept immediately.
I contributed a chapter to a soon-to-be-released book, Learning, Education and Games (Volume One): Curricular and Design Considerations. I wrote about the potential value of video games in music education. The book will be out in October 2014. Here’s the table of contents.
We’re having a launch party on October 9th at the NYU Game Center, with a panel on games, featuring the contributors to the series. In addition to myself, the panelists will include Elena Bertozzi and Gabriela Richard. The book’s editor, Karen Schrier, will be moderating.
Update: here’s a drawing of Elena, Gabriela, Karen and myself by Jay Boucher.
I’m working on a long paper right now with my colleague at Montclair State University, Adam Bell. The premise is this: In the past, metaphors came from hardware, which software emulated. In the future, metaphors will come from software, which hardware will emulate.
The first generation of digital audio workstations has taken its metaphors from multitrack tape, the mixing desk, keyboards, analog synths, printed scores, and so on. Even the purely digital audio waveforms and MIDI clips behave like segments of tape. Sometimes the metaphors are graphically abstracted, as they are in Pro Tools. Sometimes the graphics are more literal, as in Logic. Propellerhead Reason is the most skeuomorphic software of them all. This image from the Propellerhead web site makes the intent of the designers crystal clear; the original analog synths dominate the image.
In Ableton Live, by contrast, hardware follows software. The metaphor behind Ableton’s Session View is a spreadsheet. Many of the instruments and effects have no hardware predecessor.
My music-making life has revolved heavily around Ableton Live for the past few years, and now the same thing is happening to my music-teaching life. I’m teaching Live at NYU’s IMPACT program this summer, and am going to find ways to work it into my future classes as well. My larger ambition is to develop an all-around electronic music composition/improvisation/performance curriculum centered around Live.
While the people at Ableton have done a wonderful job documenting their software, they mostly presume that users know what they want to accomplish and just don’t know how to get there. But my experience of beginner Ableton users (and newbie producers generally) is that they don’t even know what the possibilities are, what the workflow looks like, or how to get a foothold. My goal is to fill that vacuum, and I’ll be documenting the process extensively here on the blog.
Here’s the presentation I’ll be giving of my master’s thesis next week. Enjoy.
This is the fifth in a series of posts documenting the development of Play With Your Music, a music production MOOC jointly presented by P2PU, NYU and MIT. See also the first, second, third and fourth posts.
Soundation uses the same basic interface paradigm as other audio recording and editing programs like Pro Tools and Logic. Your song consists of a list of tracks, each of which can contain a particular sound. The tracks all play back at the same time, so you can use them to blend together sounds as you see fit. You can either record your own sounds, or use the loops included in Soundation, or both. The image below shows six tracks. The first two contain loops of audio; the other four contain MIDI, which I’ll explain later in the post.
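The core of that multitrack paradigm is that the final mix is just the sample-by-sample sum of every track, each scaled by its own level. Here’s a minimal sketch of the idea in code (my own illustration; the track names and gain values are hypothetical, not anything from Soundation):

```python
# Minimal multitrack mixing sketch: all tracks play back simultaneously,
# and the output is their sample-by-sample sum, each scaled by a gain.
def mix(tracks, gains):
    """Sum equal-length lists of audio samples, each scaled by its gain."""
    n = len(tracks[0])
    assert all(len(t) == n for t in tracks), "tracks must be the same length"
    return [sum(g * t[i] for t, g in zip(tracks, gains)) for i in range(n)]

# Two tiny hypothetical "tracks" of raw sample values:
drums = [0.5, -0.5]
bass = [0.25, 0.25]
print(mix([drums, bass], [1.0, 1.0]))  # -> [0.75, -0.25]
```

Every DAW built on this paradigm, from Pro Tools to Soundation, is doing some more sophisticated version of exactly this addition under the hood, with the faders on each track supplying the gains.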