Why 60 frames per second should be the standard for next-gen

But why is 60fps the holy grail for graphics hogs? Why not 70? Well, it's to do with TV sets. The difference between 30 frames per second and 60 frames per second sounds huge, but it's actually rather subtle. American and Japanese CRT TVs (the NTSC standard) refresh their picture 30 times every second. But, crucially, each of those frames is constructed in two sweeps, known as fields.

That's where 60fps comes in. Each sweep alternates between the odd and even lines of pixels, and each one can carry a new, up-to-date version of the image. For PAL TVs, the rates are 25fps with two scans, totalling 50 sweeps per second (or, put another way, 50Hz). So when everyone in the UK talked about 60fps in the '90s, it was actually just 50fps. But the result was the same: every fresh scan brought a fresh update to the image, making it look better.
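The arithmetic behind those numbers is simple enough to sketch. This is purely illustrative (the names here are mine, not from any broadcast spec), but it shows why 30 interlaced frames means 60 distinct sweeps:

```python
# Interlaced CRT TVs draw two fields (one sweep of odd lines, one of
# even lines) for every full frame of the picture.
NTSC_FRAME_RATE = 30  # full frames per second (US/Japan; nominally 29.97)
PAL_FRAME_RATE = 25   # full frames per second (UK/Europe)
FIELDS_PER_FRAME = 2  # odd-line sweep + even-line sweep

def field_rate(frame_rate, fields_per_frame=FIELDS_PER_FRAME):
    """Sweeps (fields) drawn per second for an interlaced display."""
    return frame_rate * fields_per_frame

print(field_rate(NTSC_FRAME_RATE))  # 60 sweeps per second
print(field_rate(PAL_FRAME_RATE))   # 50 sweeps per second
```

If a game updates the image for every sweep rather than every full frame, it hits that 60 (or 50) figure.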

Modern TVs don't construct their picture in the same way as a CRT, but the difference between 30fps and 60fps is still easy to describe. In layman's terms, 60fps looks mega-smooth. I would show you a video of it, but YouTube itself is capped at 30fps, so you wouldn't see any difference. But load up any 60fps game like Call of Duty: Black Ops II (Wii U version notwithstanding), Dead or Alive 5, or Bayonetta on Xbox 360 and you'll immediately see how smoothly they move.

In fact, Bayonetta is the perfect example because it runs at around 30fps on PS3 and (mostly) 60fps on Xbox 360. The difference this makes to the experience is massive, especially if you come to one version after playing the other for a decent length of time. This video from DigitalFoundry illustrates the difference in frame-rate well:

It's worth noting that we still gave Bayonetta 10/10 at review, even though we only received the PS3 version prior to release. A great game is a great game. But a great game running at 60fps will always be better. And, crucially, once you've seen how it's supposed to be, you'll never go back.

So I say why not have next-gen machines just deliver this premium experience all the time? Let's not forget what id's John Carmack said, though: "[Next-gen] will let us do everything we want to do now, with the knobs turned up. If you take a current game like Halo which is a 30 hertz game at 720p; if you run that at 1080p, 60 frames with high dynamic frame buffers, all of a sudden you've sucked up all the power you have in the next-generation. It will be what we already have, but a lot better."

I'm actually OK with that. Because for me (and perhaps this goes back to those dreadful 32-bit days), a rock-solid 60fps demonstrates a developer's complete mastery of a console's hardware. It means a game always looks 100% assured in its movement and responds more quickly to control inputs. It ages better. But it would also convince the masses that the new generation of consoles is awesome, because they'd be purring along like a cheetah on catnip. I can picture the adverts now...

Of course, anything left over should go into everything else. The eminently quotable industry analyst Michael Pachter has gone on record to say he believes the PS4 will be able to render games at 240fps. That's madness, at least in terms of frame-rate. For instance, the physics engine of GRID 2 is said to run at 1,000 hertz. That's 1,000 cycles per second, or around 33 samples of the car's tyres against the track for every one frame of the 30fps visuals shown to the gamer. The car drives and handles more smoothly as a result. That's where progress should be made. There's no need for a 240fps graphical refresh rate, because your eyes simply can't see the difference, and few TVs can even display images that quickly.
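GRID 2's actual engine obviously isn't public, but the general pattern described here, physics ticking at a fixed high rate while rendering runs much slower, is the classic fixed-timestep game loop. A minimal sketch, with all names my own rather than from any real engine:

```python
# Fixed-timestep loop: physics steps at 1000 Hz no matter how fast
# the game renders. Names are illustrative, not from a real engine.
PHYSICS_HZ = 1000
RENDER_FPS = 30
PHYSICS_DT = 1.0 / PHYSICS_HZ

def run_frame(accumulated_time):
    """Consume accumulated wall-clock time in fixed physics steps,
    then render once. Returns (steps_taken, leftover_time)."""
    steps = 0
    while accumulated_time >= PHYSICS_DT:
        # step_physics(PHYSICS_DT)  # e.g. sample tyres against track
        accumulated_time -= PHYSICS_DT
        steps += 1
    # render()  # one frame of visuals for the player
    return steps, accumulated_time

# One rendered frame at 30fps hands 1/30 of a second to the simulation:
steps, leftover = run_frame(1.0 / RENDER_FPS)
print(steps)  # 33 physics samples per rendered frame
```

The leftover fraction of a step is carried into the next frame rather than dropped, which is what keeps the simulation rate steady even when the render rate wobbles.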