The illusion of control – all or nothing
Developers are striving for a more immersive game, a cinematic experience that’ll cram briefcases with cash and break into mainstream culture. While there’s no doubt that blockbuster games are making more than enough money, the attempts to make these titles more immersive are being killed by the insistence on removing control while simultaneously injecting interaction.
QTEs and on-screen commands work to remove or force control, to ensure the player witnesses a particular event exactly how the director wanted, or does exactly what they’re told. They take an interactive medium and then force it down a narrow pipe; it’s a bottleneck in design. There are differences between the two, though: one will suddenly ask for a specific command, while the other acts as a reminder of what the player needs to do to continue. It feels like video games are at some awkward halfway point, unsure of whether to completely remove control at times or not, and suffering from a lack of trust in the player.
“They take an interactive medium and then force it down a narrow pipe”

The finest first-person shooter I’ve played so far this year is F.E.A.R. 3. Not once in the entire campaign was there a quick time event. This is no coincidence. The excessive use of QTEs as an attempt to force players to watch precisely what the game director wants is deplorable. Never once have I thought to myself ‘wow, that QTE was fantastically written and I’m really glad I got to press A then’.
When watching a film, a message doesn’t pop up stating ‘press 1 on your remote control now or it’ll skip back to the beginning of this chapter’ to ensure you’re paying attention. If it’s good enough to watch, I’ll bloody well watch it. Films are immersive because the viewer is passive, allowing them to believe in what is unfolding before their eyes. A quote from the BBFC regarding the classification of games shows how interaction with a video game breaks the realism and belief in the unfolding events:
“The element of interactivity in games carries some weight when we are considering a video game. We were particularly interested to see that this research suggests that, far from having a potentially negative impact on the reaction of the player, the very fact that they have to interact with the game seems to keep them more firmly rooted in reality.”
Now that we have a few franchises running a monopoly on the yearly top-game lists, innovation will slow down. Developers will want to mimic those achievements, and in turn conform to even the worst design decisions made. QTEs are a product of this. ‘They want a QTE bit!’, the director cries, and we then get showered with the resulting brown fluids that spray from every on-screen command. Thanks for that.
The reason Duke Nukem 3D is cited for its interactivity is that it never informed the player that something could be used. You just tried it out, without a prompt for whatever specific action it happened to involve. The game never broke the fourth wall by flashing up the action needed for the forty-eighth time in the story. Please make note – gamers are not stupid. We are fully capable of remembering which button opens a door.
The film industry has lately been struggling with the conviction that mainstream cinemagoers are troglodytes, lower beings who have only recently learnt to shuffle along without scraping their knuckles on the ground. Anything and everything had to be explained: who he is, why she just said that, and a second (terrible) joke to spell out what the first joke really meant – it was insulting.
Then Inception was released, a film that, while not overly complicated, didn’t patronise its audience in the same way. It revelled in letting the audience decide what was happening, and whether or not it was a dream. It made a few dollars at the box office too, becoming quite the success.
Games need to adopt the same approach. Gamers are not idiots. We’ve spent a long time with the controller and know where the buttons are without looking. So control blueprints need to be drawn in a way that doesn’t necessitate unrelenting reminders, continually breaking the fourth wall (a no-no in cinema, the exact medium games are trying to emulate with an interactive twist). Old games managed it fine, so we can do it now too – hell, they often didn’t even offer a tutorial for the controls.
But now every time an item you’re meant to use or pick up shines, or a ‘press X’ appears, it reminds you that you’re playing a game. If a door has to shimmer to let the player know that this is the one among the twenty they’ve passed that actually opens, perhaps you should be reflecting on why you need to point this out. The answer’s simple – your environment has been built in an unintelligent way. Left 4 Dead is an example of getting this right, building a race from A to B which manufactures the illusion that the player is choosing their own route, that they’re in control.
On-screen commands and highlighted items take you out of the game world and inform you that you’re on a sofa holding a pad, and that you must now press what it demands to continue, because you’re not in command. Like some bullied-child-turned-dictator, this obsession with ruling our actions is overtly apparent. Considering video games are about control, it’s sad to see that key purpose curtailed in an attempt to entertain us.
“On-screen commands and highlighted items take you out of the game world”

If you need to constantly remind the player what they should be pressing, then there is a communication and training failure on the developers’ behalf. You become more concerned with following the reaction-time commands on-screen than with what’s actually happening. A good cutscene – devoid of sudden player interaction – can add to character development, providing information in a way that might be lost during the game. But it needs to be relevant, not added as a bridge between ‘levels’ or overused to the point of being contrived.
Games are, in essence, pre-written experiences: there may be variations in when we perform certain actions, but the beginning, middle and end are the same. The best games hide this, constructing the belief that the journey we’re taking is unique to us – that’s immersion. The Mass Effect series is a prime example.
See, if it really were an immersive experience, you’d already know what to press; it would make sense in the situation. Why, Crysis 2, are there specific commands to crawl along the floor during a cutscene? Just let me push forward like I would in the game. You’ve successfully killed the moment with clichéd crap in the name of ‘an experience’.
Bastion got this right, and while it’s guilty of often reminding you what buttons to press, you can get back up from being knocked down by pressing up. It feels a natural thing to press, and the Kid reacts by rising to his feet. Developers should look more closely during QA at which controls and responses feel innate to players.
Developers’ laziness with QTEs and on-screen commands never ceases to amaze me. Forcing players to watch your directorial work by throwing in a ‘press this or die now’ moment is pathetic; it’s bone idle, frustrating and, worst of all, plain arrogant. The new Tomb Raider looks good on paper, but it’ll no doubt turn into a series of ‘press X now to make them wobble a bit’ moments. God forbid that they present the player with absolute control.
Developers, if your cutscenes really were that first-class, then I’d watch them in wide-eyed awe, enjoying the pleasure of witnessing character development or colossal explosions on-screen – not in fear that you might throw some random button commands at me to make sure I’m paying attention. And if your control scheme were logical and considered, we wouldn’t need the reminders, would we?
Some games get it right, but they are few and far between. It’s attempting a middle ground that grates on me: no more in-between, no more mediocrity. Please, give up the ghost.