Amidst all the brouhaha in recent months over Roger Ebert's denunciation of the possibility of video games being considered as art forms, few seem to have asked why the eminent Chicagoan film critic might wish to decry the aesthetic status of works he admits to having never engaged with. Most seem content to assume that it is Mr Ebert's age that is the barrier - what Techland calls the "old man theory". I would argue, however, that were this the case, it should be seen as more of a problem for the possible acceptance of video games as art than for Ebert.
The question remains why Ebert felt the need to wade into a zone he is obviously unfamiliar with. The answer, I suspect, comes from the subjective feeling that his own zone of expertise is being encroached upon by video games. Over the course of the last few decades, whether video games are becoming art or not, there have certainly been numerous instances of art - and perhaps particularly film - becoming video games.
The first time I felt particularly struck by this was in the cinema watching the second offering from the Lord of the Rings franchise (I hadn't bothered with the first, but I'd read the books as a child, so the story, at least, was familiar). Between recoiling from the filthy looks I received from others in the audience every time I failed to stop myself from laughing out loud at Legolas's lines ("A red sun rises, blood has been spilled this night" and so on), I developed the uncanny impression that I was watching someone else play a computer game. Someone else was having the fun here, and there seemed little point in even attempting to enjoy it vicariously. My presence, I felt, was unnecessary. Odd, perhaps, that the reminiscence of a more 'active' pastime in the midst of a supposedly 'passive' one should end up inducing a greater sense of passivity than ever before.
The first use of CGI in a feature film came neither in The Two Towers nor its predecessor, The Fellowship of the Ring, but in Westworld, from 1973 - the same year that the first ever coin-operated arcade game, Computer Space (itself an adaptation of Spacewar!, the game designed by students at MIT), was featured in the film Soylent Green. However, after the relative commercial failure of Tron (1982) and The Last Starfighter (1984), the first films to make extensive use of solid 3D CGI, its use was considerably restricted for another decade, until Jurassic Park (1993) proved that computer-generated monsters were now capable of generating serious box office revenue.
It is only in the last decade that CGI has become the norm for special effects in major Hollywood blockbusters. Ironically, though CGI allows complicated-looking effects to be created effectively on a home computer, the period between 1995 and 2005, which saw its rise to dominance of the effects world, also witnessed a leap in the average effects budget of a "wide-release" feature film from $5 million to $40 million. In the words of Hutch Parker, president of production at 20th Century Fox, the effects have grown to the stature of "a character in the movie" in their own right.
In The Lord of the Rings trilogy, Parker's statement is realised quite literally, as several characters - from battle scene extras to major figures - are computer-generated and animated using, for the first time, artificial intelligence. The computer has quite literally become (several) character(s), and in the battle scenes we are watching a repeat of those demo animations from console games in which the computer plays against itself. In the same year as The Fellowship of the Ring, the first Final Fantasy feature film to be adapted from Hironobu Sakaguchi's successful games franchise was celebrated by some for its attempt to create photo-realistic CGI humans, while others were put off by these characters' tendency to fall into the 'uncanny valley'. The film was not a success.
It is not just the characters in The Lord of the Rings that might provoke this sense of the uncanny, however, and nor does the film rely exclusively on CGI. Large numbers of old-style miniature models (and, owing to the need for a lot of false-perspective shots, even 'bigature' ones) were made, and vast numbers of extras and real-life animals were employed. As much as it opened the gates to the fully CGI-oriented films of today, it was at the same time one of the last of the old-style Cecil B. DeMille-type affairs of cinema as mass troop warfare. The way these elements were handled is key, though.
Of the backdrops and landscapes, it is frequently nigh-on impossible to tell which have been computer-generated and which are but the glorious vistas of the New Zealand countryside - and yet this is not so much a victory for the machines as it might seem. The CGI still looks as ray-traced and colour-mapped as ever, but, due to the way cinematographer Andrew Lesnie used digital grading on all the images in post-production, it is more the case that now all of the backdrops - even those that were not in fact computer-generated - have the look and feel of CGI. Nature itself falls into its own uncanny valley.
Similarly, though several hundred horses were used to film the battle scenes, fear of the horses getting hurt led the producers to film each horse individually performing battle-type movements in a closed studio and then to composite these together in a digitally created environment. What we have, then, in The Lord of the Rings is digitally created creatures and characters moving in a digitally created environment - or at least creatures, characters and environments for whom the very distinction between the real and the fake has collapsed on the side of the fake. Yet not, as in Toy Story, in the manner of a cartoon. Signatures of the real, little memento mori, crop up frequently in order to signify the realness of the image presented to us, though in fact they belie that very realness by interrupting the smooth weightlessness of the fake.
There is of course no problem with 'the fake' per se. I see no reason to insist that films - or any art - should slavishly attempt to imitate the real, as Aristotle once demanded. And many would argue that CGI looks a lot more realistic than older methods: actors in monster suits, flying saucers on strings, even the fairly sophisticated stop-motion model animation of Ray Harryhausen. What all of the latter have in common, however, even if for a brief flash you might see the thread or the actor's legs or whatever, is a certain sense of groundedness. They have weight, and make some kind of real impact on their surroundings. CGI, on the other hand, offers a curious analogue for the self-presentation of 'weightless' postmodern capital: capable of doing anything, assuming any shape, without once touching the ground or making its impact felt upon the commodities it abstractly trades in - until the crisis, that is, which would be the equivalent, perhaps, of those little moments, such as the one in Minority Report where for a split second Tom Cruise's head separates from his body, revealing the computer-generated nature of the image.
If it were just the appearance of computer games that was infecting the cinema, however, there would remain little cause for alarm. It may be no coincidence, though, that in the same year as Jurassic Park created effective jump scares with its CGI dinosaurs, Super Mario Bros. kick-started a Hollywood trend for films adapted from video games which continues up to this year's Prince of Persia and Resident Evil: Afterlife. It has also become standard practice (albeit one witnessing a slight decline in the last couple of years) for new films to spawn a tie-in video game. As a consequence of these trends, and of an increasing number of people from the games world getting involved in film production, the arcade is coming to influence far more than just the look of certain films.
Reviewing this year's DreamWorks family blockbuster, How to Train Your Dragon, for Screen Jabber, I was repeatedly struck by the way certain scenes seemed to have been written in order to facilitate a smooth transition to the console adaptation: "Every setpiece recalls its arcade double, from the training sequence in the ring, familiar from beat 'em ups like Mortal Kombat, to the final showdown with the big 'end of level' baddy." Sure enough, the release of How to Train Your Dragon was accompanied by that of a home computer game, packaged to cash in on the big marketing push allocated to the movie.
Of course, Hollywood's family blockbusters have long depended on their merchandising opportunities, and since at least Return of the Jedi's Ewoks have even tailored elements of the film in advance to suit them. But now even films that seem to have no plans for such tie-in opportunities appear to have been infected by video game fever. Even a relatively low-budget, independent, adult film from Australia, such as Steven Kastrissios's The Horseman (2008), lifts its entire structure from an arcade beat 'em up: it dispenses with its minimal plot (which, in any case, is largely lifted from Schrader's Hardcore) in a quick flashback early on, and then presents us with a series of fights that lead the protagonist inexorably from one to the next and finally towards the final boss of the title for an even bigger fight.
Under such circumstances we might begin to understand why Ebert, a critic who started out at a time when it was still sometimes necessary to defend the claim that cinema is itself an art form, might hesitate before embracing video games, for all their growing sophistication. And this need not be seen as the elitist pose of a supposedly established, now-'high' art repudiating incursion from 'low' culture. The influence of video games on cinema today is comparable to the influence of the theatre after the introduction of sync sound in the '20s and '30s, and may be just as nefarious. Whether video games are 'art' or not, they are certainly not helping this 'art' to become more 'artistic', however we might wish to define art (something that seems generally to be done with extraordinary woolliness by those wishing to defend the artistic status of the video game).