Sunday, 24 August 2008

Videogames and Digital Musicians

Before I proceed, I must confess that this steps on the toes of my thesis a tiny bit – but then, anything I'd write at this stage probably would. Maybe bits of this will find their way into the thesis itself, which will make it worthwhile.

I've been meaning to write this blog post for nigh on three months now. Obviously I've been busy writing other things (I just completed a first draft of my thesis - quite an accomplishment in itself), but now that I'm taking a couple of days to catch my breath I wanted to jot down some thoughts, which might also incidentally help explain how and/or why I, a Bachelor of Music student, came to be writing a thesis on music in videogames. Because there is a connection.

My major was 'digital music', a kind of catch-all subject for anything music-related that was technical (Nyquist frequency, anyone?) or that used a computer. Suffice it to say that I love the subject. I hadn't actually realised until recently that most of my listening habits have arisen from the desire to find people making the kind of music I felt I could make myself. Primarily that means computer- and effects-based music, with artists like The Knife, Röyksopp, and Junior Boys inspiring me with their synth tones and crisp TR-808 samples. Anyway, one of the bigger parts of the 'Digital Musics' subject stream was live electronic music performance, and it's here that I want to make an observation – videogame designers could stand to learn a lot by looking at live electronic musicians.

Artists like those who perform at the NIME (New Interfaces for Musical Expression) conference every year, not least my own honours supervisor, the brilliant Dr. Garth Paine, have been making interesting and engaging live music with computers for decades. A central concern in these performances is keeping the audience engaged and making the performer look like they're doing more than just checking email. They accomplish this mainly through interesting and engaging performance interfaces like Wacom tablets, and by connecting swathes of transducers (anything that takes a real-world input and converts it to an electronic signal) to their bodies.

So what the heck does that have to do with videogame music? Well, as hardware gets more and more powerful, I believe we are going to see more and more music generated 'on the fly' in videogames. When we do, videogame developers would do extremely well to pay attention to the musicians who have already experimented with making music on the go, and glean what they can: what works, what doesn't, how to keep an audience engaged, and (most importantly for this discussion) how to meaningfully connect audio to visuals.

Significantly, videogames don't have to worry about keeping an audience visually entertained – the game's visuals should have that aspect covered. But what if the videogame's visuals, much like the electronic musician's performance gestures, were similarly connected to the music? Can you see the potential? This is getting at the heart of what my thesis is about, so I don't want to give it all away, but I am incredibly excited by the potential for music in games to be generated by game elements on a level that goes beyond the approach Tommy Tallarico recently described. In an interview with Tom Kim on the Gamasutra podcast, he summarised the process of implementing music in videogames:

You have to let them [the music programmers] know that you want this version to play when there are a hundred guys on horseback and this other version when there's only one.[1]
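The kind of rule Tallarico describes boils down to a lookup from game state to one of a handful of finished recordings. A minimal sketch of that logic might look like this – the function name, filenames, and thresholds are all my own invention for illustration, not anything from the interview:

```python
# A sketch of state-based music switching: pre-rendered versions of a
# cue, chosen by a single game parameter. All names and thresholds here
# are hypothetical.

def pick_cue_version(riders_on_screen: int) -> str:
    """Return which pre-rendered version of the cue should play."""
    if riders_on_screen >= 100:
        return "battle_theme_full.ogg"   # "a hundred guys on horseback"
    elif riders_on_screen > 1:
        return "battle_theme_mid.ogg"
    return "battle_theme_solo.ogg"       # "when there's only one"
```

That really is the extent of the 'interactivity': the music itself is fixed, and the game merely decides which recording to play.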

Seriously, guys (and most composers are guys, by the way - an issue that's also planned for the thesis), we've had multi-core CPUs and multiple gigabytes of RAM since the start of this generation of videogame consoles, but the very best we can come up with is "play this song now, and this other one later"? Because that's essentially what it boils down to. Game composer Richard Jacques said in a Gamasutra interview about his own 'interactive' scores that

the way I want to do [music] interactively is have a musical transition, or rise or whatever that sounds very musical... rather than hard-cutting two tracks, because I think it can be done better than that.[2]
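For context, the standard baseline alternative to hard-cutting two tracks is a crossfade, often an 'equal-power' one so the combined loudness stays roughly constant through the transition. This is a generic audio technique, sketched below – not a description of Jacques's actual implementation:

```python
import math

# Equal-power crossfade gains: at every point in the transition,
# out_gain**2 + in_gain**2 == 1, so perceived loudness stays steady.
# A generic technique, not any particular composer's system.

def crossfade_gains(t: float) -> tuple[float, float]:
    """Gains for the outgoing and incoming tracks at progress t in [0, 1]."""
    t = min(max(t, 0.0), 1.0)
    out_gain = math.cos(t * math.pi / 2)
    in_gain = math.sin(t * math.pi / 2)
    return out_gain, in_gain
```

A 'musical' transition of the kind Jacques describes would go further – waiting for a bar line, or inserting a composed rise or fill – but the crossfade is the floor he is arguing we should be above.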

Okay, that's great, but what about doing something a bit more than that? With all due respect to Mr Jacques, I think the scope for 'interactivity' is just a little broader than making transitions sound smooth. He goes on to say later in the interview that the original OutRun, from way back in the '80s, had a faux-interactive score that seems to pre-empt current videogame musical interactivity: the music was composed so that, roughly around the time a player reached the branching section at the end of a track, the song would change to signify it.

It was actually composed so the music would change where an average player would do the branching at the end.

I mean, it wasn't interactive, but they timed the music so that when you went down the end of one course and you branched into the next course, the music would go into a different chorus or something like that.[3]

So if OutRun, twenty-odd years ago, could produce music that reflected what was happening on screen (in its own pre-determined and constricted way) and effected a change in the music on a structural level within the song – please explain how the best we can do today is make pleasant transitions between tracks?

My answer? I’ll have it for you in writing in, oh, say seven weeks. Actually, add a little bit longer to that for it to be marked as I don’t think they’d appreciate me putting it out there before the markers had a chance to read it. But I think I will have some answers. Also if any of the composers I mentioned end up reading this (hey it happened over on Cruise Elroy!) I’d love to hear their thoughts on the issue – although I have a sneaking suspicion I know what they’d say.

[1] "Video Game Music Strikes a Bigger Chord."

[2] "Gamasutra - Staying In Tune: Richard Jacques On Game Music's Past, Present, And Future."

[3] Ibid.

Thursday, 14 August 2008

Thesis Update

[The guy on the far left is me :D]

It's been a while since I last posted, and although apologising for not posting more frequently smacks of disingenuousness (if you were really worried about it, you'd do something about it), I'm going to do so anyway. To anyone who cares: sorry - but I've been busy writing my thesis.

Yes, that's right: I'm actually up to the writing stage, despite the fact that I have 8 weeks and 3 days to go. I've also just passed the rough 10,000-word mark, which is pretty spiffy if I do say so myself - it's a milestone on the way to completion and an order of magnitude larger than anything I've ever written on one subject before.

I would really like to share some parts of it with my readers, but I'm also aware that I maybe shouldn't, as my final marker could potentially end up reading it and get the wrong impression. It will probably undergo some serious revision before it reaches its final incarnation.

Suffice it to say, I've been writing about the games Guitar Hero, Audiosurf and Everyday Shooter, which are all must-play games for anyone interested in what the future of music in videogames looks like. [Or could look like, if everyone reads my thesis ;)]

So I'll leave it at that for a while and hope it satisfies the hunger of those desperately desiring an update on my work.


Monday, 4 August 2008

Kountry Gentleman

This song has been in my head for the better part of four days and I just can't seem to get it out.

Maybe sharing it with you will help get it out. Pity about the compressed sound quality - oh well, that's YouTube for ya.

And if you didn't like that one, this one is possibly even better.

Too much? Naw...