Sword & Sworcery EP

I’m looking forward to the new Superbrothers game, Sword & Sworcery EP. The graphics are essentially old-fashioned, but despite this they don’t come across as retro. Part of this is the murky colour palette and the approach to character design. The player character looks lanky and grizzled, like a reanimated corpse. Something about the limbs reminds me of the skeletons from Jason and the Argonauts. That there are a few smooth-looking things that aren’t pixellated also helps the game break away from the past. I will wait to play the game before making a full judgement, of course, but graphics-wise it’s a nice example of how to use outdated or limited technology while avoiding the dead hand of retro.

Implement-world

In his WWII diary, Jean-Paul Sartre noted that in the world of the war “the meaning of things has changed” and everything becomes only its usefulness for war: “[L]ong before a bomb destroys a man-made object, the human meaning of the object is already destroyed. In wartime we wander through an implement-world.”

What does this remind you of? It reminds me of architecture in videogames, the more “realist” kind especially. Think about exploring, say, a GTA game and coming across false doors that are just decoration on the wall of a building that has nothing inside it but a void. It also makes me think of the City of London at the weekend. Everything is shut, even the coffee shops and sandwich bars; the buildings may as well be impenetrable sheets of glass. Having only one purpose makes these places empty. In a war everything becomes its usage for war. In a videogame, if it’s not in the script then it doesn’t exist. In the City of London the buildings exist only for office workers.

Do I expect every door to be a door in videogames? No. Facade-doors often exist in games that reach for realism. Locked, inaccessible doors are found in reality more often than unlocked ones, and it would be a chore to have to investigate what was behind every door in a city level. Nevertheless, when walking into solid, door-like walls I tend to feel disappointed. When walking through a realist FPS videogame city with only my disembodied gun-arm for company, being limited, like in real life, to the street, I feel bored. Oh look, some graffiti, like some real graffiti, and nothing happening, like on a real street. Like Facebook, realist videogames are a rather mundane sort of hyperreality.

Remembering for the first time

Originally published 7th March, 2010 @ http://voorface.posterous.com/remembering-for-the-first-time

There was an article in the Guardian’s smug section yesterday where the journalist said he didn’t like videogames and had only ever played one videogame in his life (and would actually prefer to read Martin Amis; a damning indictment, indeed). Then he decided that he did like videogames because the history of videogames is like that of cinema, and indie games exist now, so we are at an interesting moment and videogames are worth playing.

I assume that anyone reading this is familiar with – and by now tired of – this argument. The idea that “videogames = cinema” has been widely discredited and I’m glad that people aren’t talking as much about Citizen Kane as they were six months ago. I think most people are aware that the medium is the message and one can’t just plonk the history of one medium onto another and expect it to make sense. But even though the history of videogames is completely different to the history of cinema, not to mention the differences between what cinema and videogames can do, I do think there are some interesting parallels to be made, some comparisons that show what videogames don’t do or do best.

Comparing videogames to cinema is usually done to legitimise the former. Of course, this is a waste of time. Cinema was considered both an exciting opportunity for new artistic endeavours and a coarse corrupter of society from the very beginning. The moment something has complete social acceptance is the moment no one cares.

So, briefly, let’s look at the history of these two media. Films, we know, started off silent and were a lot shorter than they are now. After a few experiments, the first things to be filmed tended to be either plays or technological advancements like the steam train. In a way, this period can be put in parallel with videogames. The first videogames were often simple simulations of tennis or versions of board games like draughts. So the new technology is used to simulate what is already being done elsewhere. The point where silent films make way for talkies is where, I think, the comparison between film and videogames begins to fall apart. I want to stay with silent films for a bit and look at why this happens.

The most obvious correlation between the two media in their early periods was that they were both “silent”. Silent of human voices, that is. Silent films would most often be shown in a cinema that had a live orchestra playing along, and early videogames didn’t have the processing power to mimic the human voice. Mario’s Italian accent, for example, wasn’t featured in a game until Super Mario 64. [1] During the silent era of cinema, and for most of videogame history until the PlayStation, actions had to speak louder than words (except when words – real text – were used, of course. You see how quickly these comparisons become muddled). Look at the YouTube video above. Buster Keaton and Mario jumping around. The similarities are obvious. The fun of watching/playing comes from the movement of the actor/player character through the environment. There is a key difference, though. In movie action sequences the audience wants the character to die and is pleased when he doesn’t. The possibility of the character dying is exciting. Deep down, it’s what they want to see happen and it rarely does. In videogames the most frustrating thing in the world can be your character dying. It’s the last thing you want to happen and it happens often.

Despite this key difference, an action sequence from a Buster Keaton silent film has something in common with a Mario game that it doesn’t have in common with a “talkie” movie. To explain this I have to offer you my pet theory. People, when talking about movies or videogames or whatever, often evoke dreams as a comparison, a metaphor. Hollywood is the “dream factory”, after all. I think there is an analogy to be made between entertainment media and dreams, but I’m not sure that it’s appropriate for movies. Firstly, let me repeat what I said earlier, that “the medium is the message”. Ultimately films are about films, fiction is about fiction and videogames are about videogames. They’re about other things too, of course, but mainly they are about all the things they could have done, but didn’t do, in their medium. People say “that’s great cinema” or “this is great music” or “this is just bad poetry” or “this is a bad videogame”.

That doesn’t mean we can’t say that one medium reminds us of dreams. Of course we can. We can say it reminds us of oranges if it does. But movies are, to me, more like false memories than dreams.

Memory is a notoriously inaccurate record of events. It is not static. It is constantly changing and is very unreliable. Every time you access a memory you change it. It’s a living thing. That doesn’t sound very much like a movie, does it? Nevertheless, I think movies are like memory. In videogames the audience is a player who controls movement in some way. It can be as minor as clicking a “next” button, but the point is that it is up to the player when to click “next”. In a movie, the “player” is onscreen and the audience watches passively. In a dream the dreamer sees things through her own eyes and can often control what is happening. If she doesn’t like something in the dream she may try to “run away”. With memory, even if the memory is inaccurate, there is one “true” version of events. Even if we are unsure, we try to come to a decision about what exactly happened. This is how false memories are created. We are compelled to make a decision and in doing so we create something new and – compared to the reality we experienced – something inaccurate. Movies are as static as we want our memories to be. Many people tend to get very annoyed – irrationally annoyed – if a movie turns out to be all “just a dream” or if the version of events a movie presents to them is deliberately unreliable.

Another thing about memory is that we often remember ourselves in the third person. We don’t tend to see our memories through our own eyes. In a dream both points of view can occur, and more often than not we do see things in a dream as if we were seeing them right now. Movie audiences very often identify with the protagonist, who is almost always shown in the third person. The reason I say that films are like false memories, rather than just memories, is tense. Movies are always in the present tense. Memories are usually in the past tense: “I did this” or “Then that happened”. To use an extreme example of the creation of a false memory, think about someone in therapy being coerced into accessing a “past life”. They might say something like “I’m in an opulent throne room and I’m wearing beautiful clothes. I’m the Queen of Sheba and I’m very important”. No, you’re not. But the point is that it’s in the present tense. They “remember” it like a movie.

This is why films aren’t like dreams. Dreams often mutate, they can change at any moment and can be changed by the dreamer. They feel like they are really happening. Everything becomes more important when it is happening to you.

But isn’t that how people describe watching a film? That they forget their surroundings and feel like they are in the movie? The reason it’s exciting when Neo jumps an impossible distance isn’t just that you’re right there with him; it’s that you are him. Of course, you don’t truly believe you are him, just like you never truly forget that you are in a cinema or sitting on your sofa, but, if you can suspend your disbelief, it does feel like it. But wait a moment. If you are him, then why can’t you control him? I think perhaps it is more accurate to say that you were him. The events have happened, they are unchangeable, but you are remembering them for the first time. [2] This is the paradox of cinema.

How does this apply to my comparison between silent films and early videogames? [3] Well, remember that I said that silent films have something in common with “silent” videogames that they don’t have with “talkies”? I think this is because silent films are, although still like memories, more like dreams than (non-silent) movies are. Without the human voice, and when words are only captions, the present tense of films becomes heightened, movement is foregrounded and events become dream-like. That videogames need a player makes them very much like dreams and completely different from (non-silent) movies. In this sense silent films exist somewhere between videogames and (non-silent) movies. They are “set in stone” like memory, but the role of the audience as the “player” is more pronounced. Movies right now – especially Hollywood movies – are all about the plot. [4] The plot is considered so important that any possible enjoyment of the film will be spoiled if any details of the plot are known by the viewer before seeing the movie. This doesn’t say very much for the quality of Hollywood cinema if the pleasure of watching can be so easily destroyed by such a tiny thing. It seems that people think that if you get rid of the surprise then there’s nothing left. [5] Silent films are, on the other hand, mostly about action; a certain type of action that demands audience participation. The audience have to invent much of the plot – or, rather, fill in the blanks – themselves. There are many films after the silent era that allow room in the plot for the audience to move, but this is becoming less and less popular, especially in Hollywood. [6] Filmmakers who understand that cinema is like memory – like Alain Robbe-Grillet or Jean Rollin – allow for ambiguities in the plot and demand that the audience become involved in the creation of the film. They recognise that a film without an audience is nothing and they make it possible for the audience to become a participant rather than a passive viewer.

Silent films treat the audience more like a player than later cinema does. This probably isn’t surprising, as much of later cinema didn’t – doesn’t – take place in the cinema. Films used to be shown exclusively to groups of people watching together, and they would cheer, boo or laugh along with the action onscreen, behaviour that is actively discouraged at most cinemas today. The change in audience participation with movies is, strangely, most obvious in the films that were made during the transition from silent to talkie. Think about a Marx Brothers film. There are those big silences after gags that make the film seem half asleep. But that’s because you’re watching it on your own. They were designed to be seen by a cinema audience, an audience that would fill the gaps with laughter.

The potential for letting the audience fill in the gaps (whether the gaps in the plot or the gaps after jokes) in films is why I say they are like false memories. Memory and dreams are both creative acts; it’s just that memory doesn’t feel creative. Cinema loses out to videogames when it forgets that an audience can be a player, that the audience’s imagination can be the most important storytelling tool. This difference between the two media – that one is like memory and the other is like a dream – is one of the many, many reasons that definitive comparisons between the two can be so futile. Saying, like Jacques Peretti does in the Guardian article I mentioned at the beginning, that videogames are in their “John Cassavetes period” is meaningless. You may as well say that videogames are in their Kevin Keegan period. It’s a tortured metaphor that doesn’t help us better understand either medium. What we need is an understanding of how different the history and application of videogames is from that of other media like film. What I think we need to acknowledge is that the category “videogames” is far too large to make any meaningful generalisations about, and that making smaller subcategories only compounds the problem. Six months ago I told you that I couldn’t care less if videogames are considered art. In an upcoming post I will lay out why that term is unhelpful when used to describe videogames in particular.

  1. Well, actually Mario had a voice in an educational game before Super Mario 64. But I don’t care.
  2. Or for the 100th time. Remembering isn’t “spoiled” by knowing the outcome of events.
  3. Earlier. Broadly speaking I mean pre-fifth generation.
  4. A rather large generalisation, I know, but who would deny it?
  5. Of course, people complain about “spoilers” for videogames too now, but I think this is just a meme that will – hopefully – go away.
  6. A by-product of this is the growing length of all Hollywood films, even popcorn blockbusters. Look at the marathon length of one of the most popular films right now, Avatar. It feels like it needs to explain everything, but why? We’re used, in the post-modern era, to treating everything as appropriate for critical analysis. And it’s true; nothing is “unworthy” of analysis. Even dumb films like Avatar have a “deeper meaning”; of course they do. Avatar, in fact, is designed to have one. It’s just that the deeper meaning isn’t very deep.

The videos in this post are from a series called Between Silent Film and New Media by Manuel Garin. In August, after I did my first post here, I intended to do a post focused solely on these videos, but it fell by the wayside. To see the rest, visit: http://www.gameplaygag.com/


Are Videogames Art? I Couldn’t Care Less

Originally published 26th August, 2009 @ http://voorface.posterous.com/2618370

Videogames are outselling movies and have been for a while. With Nintendo’s marketing of the Wii and the DS, videogames are now considered, despite spurious controversies over violence, a universal form of entertainment. Now that videogames are more popular and more lucrative than their entertainment competitors, there is even more demand for videogames to be thought of – like film, painting, sculpture, music – as Art.

Why do gamers want videogames to be called Art? Videogames as we understand them today are around 40 years old. Films are over a hundred. Art was invented in the 18th century. If we look at the timeline of film, we can see that it went from a novelty, to crass entertainment, to being explored as a form of Art in no time at all. It is cinema history that people have in mind when they talk about videogames as Art. Music doesn’t have the same trajectory. It started being considered Art at the invention of Art, and that music is classical music. Folk music doesn’t count because it is the music of the lower classes. Pop and jazz had the same position in the 20th century. Too popular, too poor; not Art. We now consider pop music and jazz to be Art. Well, some of it. The Beatles are called Art, Miles Davis is called Art. Is Britney Spears Art? No, probably not. Not unless an artist uses her image to make Art, like Andy Warhol did with Marilyn Monroe. This has already happened to videogames in the art world, where videogames are used as a medium by people like Cory Arcangel, so aren’t videogames already Art? No. The definition of Art is that whatever the artist calls Art is Art. This is not a new idea. In fact, it is older than videogames. What gamers want is Citizen Kane. They want popular prestige. They have a romantic view of Art that is stuck in the 19th century. Art as a higher calling. Art as the fullest expression of life. They want videogames to be entered into the pantheon.

But why do they want this? If they really knew what Art was, would they want videogames to be Art? Do they want the inflated prices and exclusivity of the Art market? I doubt it. They think that Art doesn’t have to be about that, that it’s not just about the market. They’re wrong. Art is inseparable from the market because it was created for the market. Before the invention of Art, the cultural activities that are now associated with it – painting, sculpture etc. – were not considered separate from what is now called “crafts”: pottery, embroidery etc. The creation of Art made painting and sculpture special. Objects made more valuable merely because they were created by a “genius”. A replacement for religion reserved for the ruling classes. Art still, give or take, occupies that position today. Despite many museums and galleries being free and open to the public, the attendance is still predominantly middle class. The only people who can afford to buy Art are the rich.

Even though videogames and the computers that play them are expensive, they are nowhere near as expensive as works of Art. Again we see the affinity with film. Hollywood blockbusters cost obscene amounts of money and the big videogame developers are catching up with them in terms of budget, but the end product is (while being overpriced) affordable*. Although videogames have become more populist since the Wii, it is still a mostly middle-class world. Indie games, by being cheaper or even free, broaden things, but of course they are still limited to the most privileged group in the world: those who can afford computers.

There are some ways that the Art world is similar to the videogame industry. Both are largely misunderstood by the general public and both have been the subject of tabloid trolling. Unlike Hollywood, the Art and videogame businesses have been – and still sometimes are – incorrectly described as “recession-proof”.

Despite this, the videogame industry is in much the same position as the film industry. High-profile films and games are expensive to make and are easily pirated. There are games that are as formulaic and trite as any Hollywood blockbuster and there are games as self-indulgent and obtuse as any arty flick.

Because the videogame industry is similar in structure to the movie industry, gamers feel that videogames should be treated with the same respect that “classic” films are. This is what they mean by “art”. They are using the term “art” to loosely refer to artistic practices, as in the old, pre-capitalist sense of the word, but they’re keeping the silly ideas about importance, prestige and genius that exist in the modern definition of Art. The problem is that the concept of genius is as stupid as the concept of Art as a higher calling. Gamers should avoid making the mistake of aficionados of other media: pushing pop music/jazz/film/etc. into “art” and finding it an empty room.

Look at comics. In the past, comics were demonised in exactly the same way as videogames are now: called juvenile, a waste of time and potential, and accused of corrupting people the ruling classes consider inferior, like the working class or children. In the 80s a number of comic artists wanted to change that reputation. The comic that represents that effort is Watchmen. This comic was an attempt to make people realise that comics can be “art”, and “art” in this instance means “Citizen Kane”. It wanted to show that comics aren’t limited to being only about “kids’ stuff”. What it spawned was less sophisticated. Instead of making people open to the idea that comics can be limitless in their potential for expression, it ended up being the daddy to a load of copycats, each more goonishly “grim” and “dark” than the last.

It’s obvious that gamers don’t mean “Art” when they say they want videogames to be treated as Art. But the other, wishy-washy definition is no better. Thankfully, videogames will never be Art. They’re just too popular. I don’t doubt that videogames will continue to be used as media in Art works, but just as comics are not Art, just as anything that is not called Art by an artist is not Art, so videogames will remain not-Art.

Videogames reaching for the Serious Business prestige of Citizen Kane is more worrying. Have you seen that film? Did you think it was that great? Did you wonder why it is constantly described by so many people as “the best film of all time”? The reason it has so much prestige is that it was crowned king and the king is unquestionable. Needless to say, this kind of attitude does not encourage creativity. Canon selection is arbitrary and not a good judge of quality. Surely F For Fake is a better film than Citizen Kane? But that’s the point. With canons, it is not up for discussion. The winners are decided from on high by our cultural leaders and we must accept their superiority. The prestige that gamers want for videogames comes with this stifling attitude. Better to let the canonisers believe that videogames are beneath them and leave the rest of us to enjoy their creativity and potential.


1st

I decided to move from Posterous to a more customisable host. I’m going to add a few posts from there to this blog and keep the Posterous going as a surveillance of my activities.