It's easy to become cynical about new tech in the games industry. Ray tracing is a prime example. Touted as a technological leap perhaps matched only by the adoption of super-fast Solid State Drives, it's icing on the PS5 and Xbox Series X cakes. However, I understand feeling sceptical about how much it'll tangibly change things.
That's what I thought when booting up Watch Dogs Legion, a game saddled with crossing the generational divide. Despite being rather handsome on our Nvidia RTX 2080 Ti-powered testing PC, the UK-based threequel doesn't seem like a big step forward – not at first, anyway. Then I had my character look down. My God.
Despite taking place in catacombs that stretch under the House of Commons, even the prologue offers a strangely beautiful scene. Water pooling on the muddy ground reflects my surroundings with eerily lifelike clarity, adding a subtle depth that increases Legion's sense of place. That's especially true when you step into London itself: light shimmers off buses and taxi-cabs as they trundle around the capital, and the glow of neon ads blooms across rain-slicked sidewalks. Roads after a downpour are absurdly realistic too, and glass buildings are made so much more believable by an incredibly complex mirroring of the street around them. This is the power of ray tracing and (what was) one of the best graphics cards in clear-as-day action.
Honestly, words don't do justice to the effect – a better example would be this Reddit video posted by Niklasgunner (via Screenrant). While playing, the effect is subtle enough that you might not notice it at first. But put a ray-traced location next to one without it? The latter becomes flat and strangely two-dimensional, a landscape version of the uncanny valley where things aren't quite right.
That's just the beginning of what ray tracing is capable of. Yes, it's an advanced system that provides more believable reflections, but it's also much more than that. As we explained in our feature on whether ray tracing is the future of gaming, it simulates the way light works in the real world by bouncing rays off surfaces – you can see things that aren't actually in frame, like an explosion from around the corner reflected on a car. It also changes how shadows are cast when a scene has multiple light sources.
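To make the "bouncing" idea concrete, here's a deliberately tiny sketch of the core recursion – not Legion's renderer or any real engine, just an invented one-surface scene in Python. A camera ray hits a mirror floor, is reflected, and picks up light from an emitter it was never pointed at, which is exactly how off-screen objects end up visible in ray-traced reflections.

```python
def reflect(d, n):
    """Reflect direction d about surface normal n (both 3-tuples)."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

def trace(origin, direction, depth=0, max_depth=2):
    """Return a grey-scale radiance value for one ray (toy scene)."""
    if depth > max_depth:
        return 0.0
    # Mirror floor at y = 0 with an upward-facing normal.
    if direction[1] < 0:                      # ray is heading downward
        t = -origin[1] / direction[1]         # parametric distance to the floor
        hit = tuple(o + t * d for o, d in zip(origin, direction))
        bounced = reflect(direction, (0.0, 1.0, 0.0))
        # 80% reflective floor: recurse along the bounced ray.
        return 0.8 * trace(hit, bounced, depth + 1, max_depth)
    # Any upward ray reaches a bright "sky" emitter – including rays that
    # bounced off the floor, which is how off-screen light gets into shot.
    return 1.0

# Camera at (0, 1, 0) looking down and forward: it sees the sky via the mirror.
print(trace((0.0, 1.0, 0.0), (0.0, -1.0, 1.0)))  # → 0.8
```

Real renderers fire millions of these rays per frame and average many bounces per pixel, but the recursive structure is the same.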
No going back
More importantly, it can influence a scene in real time. In the past, developers had to spend hundreds of hours faking a realistically lit shot with numerous tricks and sleights of hand. Now ray tracing can drastically alter the lighting – and mood – of a scene in a way that's so much more genuine, partly thanks to AI systems working in the background to ensure things look as good as possible, even if you're not on the absolute best gaming monitor going.
You could argue that this is all set-dressing, and you'd be correct. Yet set-dressing matters. In both single-player and multiplayer games, it results in a more immersive experience that draws you deeper into a world. I can only imagine what this will do for the likes of a Bethesda RPG (such as Starfield) or more immediate releases including Spider-Man: Miles Morales. Frankly, it'll be hard to go back.
Not that we'll have to. With next-gen consoles hitting shelves in just a few weeks and the RTX 30-series graphics cards (including the RTX 3070) being so much more affordable than their predecessors, it looks like this will become the new normal.
If I wasn't excited about the possibilities of ray tracing before, I am now.