June 5, 2014
I HAVE GRACED THE SILVER SCREEN with portrayals of karate masters and hardcore rappers, and I’ve watched a lot of videos on the Internet. But that’s about the extent of my experience with video. I’ve spent the bulk of my artistic life as a musician: performing, writing, and recording. So when I started working at Fancy Rhino editing video, naturally I thought to myself, “I can do this, I think.”
I was familiar with music editing software and had dabbled in Final Cut. The technical aspect of the job didn’t scare me too much. After all, you’d be surprised how often professional editors look up how to do stuff on YouTube. What terrified me was that I had no idea how to go about cutting images and linking them together in a coherent way, a way that somehow gets people to cry and laugh and think and stuff. Imagine my surprise when I started editing, without any concrete idea of what to do, and things seemed to be working. With a lot of “That looks about right” and “I feel like I should cut it there,” the video began to take shape. The images and sounds took on larger meaning within the whole. I felt like I had nothing solid to latch onto, but it made sense.
As a musician, I was intrigued by the level of abstractness at which I was concretely creating. Surely music is more abstract, right? What is music anyway besides a collection of sounds we call notes that make up harmonies and rhythms? But that’s just it: music has harmony. It has rhythm. What’s more, these musical constructs can be further defined by meter and key. When I’m working on a song, I can write the third of a C major chord so that it sounds on the second beat of the fifth measure. I can write a piece in 4/4, knowing that each measure will have four beats and that most of the sections will be structured with symmetrical phrasing. This is a simplification of music theory and history, but it’s the norm for most of us. When a musician breaks these conventions, it affects the listener. If a song skips a beat or a drummer loses count, it’s like, “Say WHAT?!” And when a friend sings out of key during your favorite song, you want them to cram it. These elements of “offness” are apparent, and because of the many years we’ve spent analyzing music theory, you can also point to exactly what’s “off.” We might experience the same types of feelings while watching a video, but the source isn’t always as easy to identify. Walter Murch, editor of The Godfather, Apocalypse Now, and other modern film staples, thinks this may have to do with the absence of a written system that represents what is happening abstractly.
“I like to think cinema is stumbling around in the ‘pre-notation’ phase of its history…Not that we haven’t made wonderful things. But if you compare music in the twelfth century with music in the eighteenth century, you can clearly sense a difference of several orders of magnitude in technical and emotional development, and this was all made possible by the ability to write music on paper.” – The Conversations: Walter Murch and the Art of Editing Film, p. 51
Sure, we can zoom out of our Final Cut timeline and see the wider view, but what if we could scan through a written sentence, phrase, or collection of shapes and see the harmonies created by story, visuals, and sound? In music I create according to rhythmic and harmonic structures that I can see and hear, but when I’m editing film I find myself saying, “Cut iiiiitttttt……..now!” Perhaps this “abstractness” is actually a result of the medium’s complexity. Voiceovers, music, mistakes, dialogue, objects in the background, and even the constraints of project length can all affect the editing process, but somehow all of these variables boil down to the milliseconds in which we can collectively sense a video’s harmony and rhythm. Something about it is clear, even if we can’t yet write it out on paper. Maybe one day, students will read a holographic manuscript of the symphony that is Apocalypse Now or The Godfather. Until then, keep feeling it out.