The good people at Noteflight have started doing weekly challenges. I love constraint-based music prompts, like the ones in the Disquiet Junto, so I thought I would try this one: compose a piece of music using only four notes.
The music side of this wasn’t hard. My material tends not to use that many pitches anyway. If you really want to challenge me, tell me I can’t use any rhythmic subdivisions finer than a quarter note. Before you listen to my piece, though, let’s talk about this word, “compose.” When you write using notation, the presumption is that you’re creating a set of instructions for a human performer. However, actually getting your composition performed is a challenge, unless you have a band or ensemble at your disposal. I work in two music schools, and I would have a hard time making it happen. (When I have had my music performed, the musicians either used a prose score, learned by ear from a recording, or just improvised.) The kids in school who are Noteflight’s target audience are vanishingly unlikely to ever hear their work performed, or at least performed well. Matt McLean formed the Young Composers and Improvisers Workshop to address this problem, and he’s doing amazing work, but most Noteflight compositions will only ever exist within the computer.
Given this fact, I wanted to create a piece of music that would actually sound good when played back within Noteflight. This constraint turned out to be a significantly greater challenge than using four notes. I started with the Recycled Percussion instrument, and chose the notes B, E, F, and G, because they produce the coolest sounds. Then I layered in other sounds, chosen because they sound reasonably good. Here’s what I came up with:
Everyone can agree that the term “classical music” is silly, unless we’re specifically talking about European music of the Classical period.
It’s incorrect to call Baroque or Romantic or modernist music “classical,” even though we all colloquially do, to the annoyance of the classical tribe. It makes even less sense to call the music of Steve Reich or Julia Wolfe “classical.” So what should we call it?
I’ve been transcribing a lot of beats for the MusEDLab’s forthcoming music theory learning tool. Many of those beats require swing, and that has been giving me a headache. In trying to figure out why, I stumbled on a pretty interesting shift in America’s grooves over the past sixty or so years. To understand what I’m talking about, you first need to know what swing is. Here’s a piece of music that does not use swing:
Here’s a piece of music that uses a lot of swing:
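In timing terms, swing is easy to state even if it is hard to notate: straight eighth notes split each beat in half, while swung eighths delay every offbeat, classically to the two-thirds point of the beat. Here is a minimal sketch of that idea (my own illustration; the function name and the 2:1 triplet ratio are assumptions, not anything from the post):

```python
def eighth_note_onsets(n_beats, swing_ratio=0.5):
    """Onset times, in beats, for a bar of eighth notes.

    swing_ratio=0.5 places offbeats exactly halfway (straight feel);
    swing_ratio=2/3 is the classic triplet swing.
    """
    onsets = []
    for beat in range(n_beats):
        onsets.append(float(beat))         # on-the-beat eighth stays put
        onsets.append(beat + swing_ratio)  # offbeat eighth gets delayed
    return onsets

straight = eighth_note_onsets(2)       # offbeats at 0.5 and 1.5
swung = eighth_note_onsets(2, 2 / 3)   # offbeats pushed to ~0.67 and ~1.67
```

Drum machines and DAWs typically expose this as a continuous "swing amount" knob, sweeping the ratio from 0.5 (straight) toward 2/3 (triplet swing) and beyond, which is part of why transcribing swung beats into conventional notation is such a headache.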
Music theory is hard. But we make it harder by holding on to naming and notational conventions that are hundreds of years old, and that were designed to describe music very different from what we play now. Here are some fantasies for how note naming might be improved.
Right now, the “default setting” for western diatonic harmony is the C major scale. It’s the One True Scale, from which all else is derived by adding sharps and flats. Why do we use the C major scale for this purpose? Why not the A major scale? Wouldn’t it make more sense if ground zero for our whole harmonic system was the sequence ABCDEFG? I know there are historical reasons why the unmodified first seven letters of the alphabet denote the natural minor scale, but so what? How is a person supposed to make sense of the fact that scale degree one falls on the third letter of the alphabet?
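To make the naming oddity concrete, here is a hypothetical sketch that derives a major scale by walking the whole/half-step pattern (W W H W W W H) around the twelve chromatic notes. Starting on C yields the seven plain letters, while starting on A requires three sharps, which is exactly the historical accident being complained about here:

```python
# My own illustration; note spellings are simplified to sharps only.
NOTES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole and half steps, in semitones

def major_scale(root):
    """Walk the major-scale step pattern starting from the given root."""
    i = NOTES.index(root)
    scale = []
    for step in MAJOR_STEPS:
        scale.append(NOTES[i])
        i = (i + step) % 12  # wrap around the octave
    return scale

major_scale('C')  # the seven unmodified letters: C D E F G A B
major_scale('A')  # needs accidentals: A B C# D E F# G#
```

The same seven plain letters, read from A, give the A natural minor scale, which is why the unmodified alphabet "belongs" to minor rather than major.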
Furthermore, I question whether the major scale really is the one we should consider to be the most basic. I’d prefer that we use mixolydian instead. The crucial pitches in mixo are close to the natural overtone series, for one thing. For another, Americans hear flat seven as being as “natural” as natural seven, if not more so. While the leading tone is common inside chords, it’s rare to hear it in a popular melody. Flat seven is ubiquitous in the music most of us listen to, and in plenty of other world cultures besides.
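The overtone-series claim can be checked with a little arithmetic: convert each harmonic's frequency ratio to cents and fold it into one octave. The seventh harmonic lands near the equal-tempered flat seven, while nothing in the low harmonics sits near the leading tone at 1100 cents. A quick sketch (the function name is my own):

```python
import math

def harmonic_cents(n):
    """Cents from the fundamental to harmonic n, folded into one octave."""
    return (1200 * math.log2(n)) % 1200

harmonic_cents(2)  # 0.0: the octave
harmonic_cents(3)  # ~702: nearly the equal-tempered perfect fifth (700)
harmonic_cents(7)  # ~969: near the equal-tempered flat seven (1000)
```

The nearest harmonic to the leading tone is the fifteenth (about 1088 cents), far higher and weaker in most timbres than the seventh, which is one way to hear why flat seven can feel like the more "natural" scale degree.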
As we continue to flesh out the video content for Play With Your Music, I put together this series on rhythm.
Update: I now have a functioning prototype of my app. If you’d like to try it, get in touch.
My NYU master’s thesis is a drum programming tutorial system for beginner musicians. It uses a novel circular interface for displaying the drum patterns. This presentation explains the project’s goals, motivations and scholarly background.
If you prefer, see it on Slideshare.
Matthew D. Thibeault. Wisdom for Music Education From the Recording Studio. General Music Today, 20 October 2011.
Stuart Wise, Janinka Greenwood and Niki Davis. Teachers’ Use of Digital Technology in Secondary Music Education: Illustrations of Changing Classrooms. British Journal of Music Education, Volume 28, Issue 2, July 2011, pp. 117–134.
Digital recording studios in schools are becoming more common as the price of the required hardware and software falls. Matthew Thibeault urges music teachers to think of the studio not just as a collection of gear that can be used to document the “real” performance, but as a musical instrument in its own right, carrying with it an entire philosophy of music-making. Digital studio techniques have collapsed composition, recording and editing into a single act. Since most of the music we encounter in the world is recorded, and most of that digitally, any music program needs to include the recording, sequencing and editing process as part of the core curriculum.
For my grad school thesis, I’m designing an intro-level music education app. I’m operating within the techno/hip-hop paradigm, with an Afrocentric rhythm-oriented approach. Electronic dance music production software has brought me much joy over the years, joy that I’m eager to spread to more people. I firmly believe that everyone is a potential musician, and that the right interface can draw beginners in and motivate them. So as I ponder this project, I’m naturally giving a lot of thought to electronic music interfaces, both software and hardware. And because all interfaces on a screen necessarily involve some music visualization, I’ve been exploring that too. For example, here’s a particularly attractive music interface/visualization, the pitch correction program Melodyne:
Earlier this summer I took Advanced Computer Music Composition, which included a lot of history of the twentieth century avant-gardists. While these people have had plenty of not-so-wonderful ideas about music, they have also done some fascinating experiments with novel interfaces.
Update: check out my master’s thesis, a radial drum machine. Specifically, see the section on visualizing rhythm. See also a more scholarly review of the literature on visualization and music education. And here’s a post on the value of video games in music education.
Computer-based music production and composition involves the eyes as much as the ears. The representations in audio editors like Pro Tools and Ableton Live are purely informational, waveforms and grids and linear graphs. Some visualization systems are purely decorative, like the psychedelic semi-random graphics produced by iTunes. Some systems lie in between. I see rich potential in these graphical systems for better understanding of how music works, and for new compositional methods. Here’s a sampling of the most interesting music visualization systems I’ve come across.
Western music notation is a venerable method of visualizing music. It’s a very neat and compact system, unambiguous and digital, and not too difficult to learn. Programs like Sibelius can effortlessly translate notation to and from MIDI data, too.
But western notation has some limitations, especially for contemporary music. It doesn’t handle microtones well. It has limited ability to convey performative nuance — after a hundred years of jazz, there’s no good way to notate swing other than to just write the word “swing” at the top of the score. The key signature system works fine for major keys, but is less helpful for minor keys and modal music and is pretty much worthless for the blues.
Here’s a suggestion for how notation could improve in the future. It’s a visualization by Jon Snydal of John Coltrane’s solo in Miles Davis’ “All Blues” (I edited it a little to be easier on the eyes.)
Snydal’s visualization is more analog than digital — it shows the exact nuances of Coltrane’s performance, with subtle shadings of pitch, timing and dynamics.
Turntablists use record players to play records in ways they weren’t meant to be played. By speeding up, slowing down and reversing the record under the needle, a whole universe of new sounds becomes possible. The record player as musical instrument is still in its early stages of development. DJs already invented the instrumental sound of hip-hop. I wonder what else they have coming.