The Witcher 2 seems to be everywhere at the moment. It sounds like a hell of a game, and I wish I had the hardware to play it. But alongside how great everyone says the game is to play, I keep hearing how good it looks. And that makes me at once strangely wistful and nostalgic, and yet filled with excitement and anticipation. Because I remember that feeling, way back down in the dim, distant, murky parts of my gaming history, that feeling of being blown away by the visual impact of a game. And I really thought it had gone forever.
I was never a console boy. My first encounters with video games were on the 8-bit computing platforms of the ’80s, from the awful graphics of the ZX Spectrum, with its eight-shade palette and colour bleed, to the much more advanced Commodore 64, the graphical powerhouse of its day. The games were new, thrilling, breathlessly exciting things to my virgin generation, unused to technological toys, and for the most part they looked like shit. But we didn’t care: this was the birth of the home gaming movement, and we were too busy being joyously carried along on the crest of a new wave to think about the future.
But the future came, nevertheless. It came in the form of 16-bit computing. I can still remember, with extraordinary vividness for a day more than 20 years ago, coming home from school to have my parents ask me, in a forced manner that seemed odd, to quickly run an important errand; noticing behind them, on the kitchen table, a large box swathed with towels in a futile attempt at disguise; and knowing, knowing for certain, that my Amiga had arrived. I ran my errand as quickly as I could and spent the rest of the day totally absorbed in video games, barely pausing even to eat, as they knew I would. It was on the Amiga that I first discovered the extraordinary shock value of updated graphics and sound, the day I shoved Shadow of the Beast into the hungry maw of my machine’s disk drive.
By most objective standards Shadow of the Beast was an awful game. A side-scrolling beat-’em-up/platformer hybrid, it was tediously derivative, stupidly difficult, repetitive and driven by an incomprehensible, meaningless plot. But in spite of this, and an eye-wateringly high price tag for the time of £35, the game received critical acclaim and sold by the bucketload. It managed this feat purely because of its graphics and sound. With an enormous colour palette, crisp, fluid sprites and an unheard-of 12 levels of parallax scrolling powering a stunning piece of visual and audio design, Shadow of the Beast looked better than anything else in home computing, like something that should be running on a Cray supercomputer and not the little grey box in your living room. Like almost every other gamer I broke my teeth on its difficulty level and resorted to cheat codes, enduring the dull gameplay for hour after hour just to feast my eyes and ears on the smorgasbord of delights the game offered. It was wonderful, the attainment of a nirvana that my fifteen-year-old self had never dreamed existed.
I can also vividly remember, for entirely different reasons, a conversation I had with some friends around this time about the quality of graphics in video games. We discussed, and agreed, that further advancement in graphical technology would be nice but was hardly necessary, because 16-bit games looked so good, and that it wouldn’t be much longer before we had video-quality graphics beyond which any improvement was impossible. I remember it because of the way later years demonstrated what a grandiose, naive, ignorant and stupidly arrogant statement it was to make. But if you can’t make statements like that when you’re 15, when else can you do it?
And over the coming years, as hardware was upgraded and replaced, that statement was proved hollow time and time again. On my first PC, the game that floored me with its visuals was Ultima Underworld. On the next rig, a 486, it was Doom. On my first Pentium machine it was Quake. But each time there was something of a law of diminishing returns: each time the impact was a little bit less, my reaction a little bit more jaded with experience and weighted with the cynicism of the passing years.
All that changed with the next upgrade though. When Quake II came out I bought myself a brand new PC with a hot graphics card just so I could play that particular game. The guy who built it for me slung a copy of a game I’d never heard of, Unreal, into the box for me to boot up when I’d got the machine installed. And this I duly did, and such was my astonishment that I called my non-gaming wife in from the living room to share the moment with me, and she, normally totally uninterested in my hobby, sat in open-mouthed wonder, desultorily poking at the mouse from time to time just to make the viewpoint change. I was so overawed by this, my first ever experience of a game properly rendered in 3D polygons with full lighting effects, that I spent that whole first evening just wandering in circles round the lake in the opening scene of the game, looking at the crystals on the ground, the water in the pool, the stars in the sky, discharging my weapon into the distance just to watch the bolts fade into obscurity.
Of course, I eventually got round to doing the proper thing and venturing deeper into the environs of the game to kick some scaly alien buttock, but there were repeated occasions when I’d be absorbed so totally by the visual design that some enemy or other would walk right up and blow me away without my noticing until it was too late. It was wonderful to have that feeling again, dragging me right back to those first moments in front of Shadow of the Beast, the ultimate digital nostalgia trip.
But that was the last time.
Bigger PCs with beefier graphics cards didn’t reproduce it, nor did the first console I ever owned, the Xbox. Halo and Half-Life 2 are probably the most graphically advanced games I’ve played extensively, and even though I took the time to sit back and note the resolution and the detail in those games and nod in satisfaction, appreciating the effort that went into the design and development, that wow factor seemed to have gone forever. Why? Partly, and at the risk of replicating my teenage hubris, I feel that while photorealistic graphics are still some way off in video games, once you’ve got to the point of realistic physics and lighting effects, all there is left to do is increase the resolution and add detail. And while that helps things look pretty, it’s not the sort of earth-shattering advance in visuals that we saw in earlier generations of hardware.
I suspect this may also be part of the reason why Sony got trumped in the current generation by the Xbox 360. The previous console generation may well have been the last one where there was a genuine quantum leap in graphical processing power, and because of that gamers were still drawn towards the superior hardware of the original Xbox; some people bought one over the PS2 for that reason, and that reason alone. Sony must have known this, so for the next generation they pulled out all the stops to deliver the beast of a machine that is the PS3, not realising that in this generation graphical power was no longer going to be the hot selling point it had been in the past, because we’re in a place now where all designers and developers can do is tweak the resolution and the detail that’s on offer.
Another culprit in the declining appreciation of video game graphics might be the advent of genuinely photorealistic computer graphics in films. We’re still a step away from truly lifelike movement and expressions, but it’s hard to admire the visuals of a computer game when the computer effects in Hollywood products, visible every day on your TV, have become so common and so detailed that you barely notice them anymore.
It made me sad to think that those moments, those few precious moments of wonder that I’d shared with my computer games as we’d grown up together, were something that nascent gamers, born into a world where visualising dreams had become commonplace, might never experience with their own PCs and consoles. And now we have The Witcher 2, and it’s the first time in a long, long time that I’ve really noticed games journalists writing about the graphics in a game with anything like that childish tinge of astonishment and appreciation. It’ll be a while before I get the chance to play The Witcher 2, and then there’s a new hardware generation to think about, but it seems as though there’s a little spark of hope that I, and millions of others, might not have seen our last “wow” moments after all.