Just as the film and TV industries have their own standards when it comes to the presentation of their media, so too does the video games industry. Since the industry's inception, console games have typically targeted one of two performance levels: 30fps or 60fps. That said, with the release of the PlayStation 5 and Xbox Series X|S, developers have been able to push both visuals and performance further than ever, reaching as high as 120fps. Interestingly, this leap in GPU/CPU headroom has also enabled a best-of-both-worlds scenario, making 30fps largely redundant in favour of the similar yet markedly superior 40 frames per second.
To explain just how transformative 40fps can be, we must first go back a bit, to the earlier days of the video games industry. Before the adoption of the HDMI standard and LCD/LED screens, video game consoles used analogue connections to produce an image on (then) CRT screens. As such, in PAL regions (like the UK, for example), whose electrical systems run at 50Hz, consoles would output games at either 25Hz or 50Hz – aka 25 or 50 frames per second. Of course, with the advent and standardisation of HDMI following the rise of the high-definition era, the NTSC-region figures became the new global standard, meaning 30 and 60fps.
Since then, developers – on the console side – have stuck to these metrics, be it due to the capabilities of the consoles or their now-archaic connection ports. Throughout the 7th and 8th generations of consoles (PS3/PS4), while some 60fps titles did exist, the majority of console releases opted for the lower 30 frames per second target. This is where the industry stood for well over a decade, until the arrival of the 9th generation.
Before advancing further, I want to touch on the benefits of 30fps, giving it its flowers and explaining why, in my opinion, the lower framerate can be the superior choice over 60fps in gaming.
Of course, by far the easiest argument for 30fps is that it frees up GPU/CPU resources to be utilised elsewhere, facilitating bigger, more detailed worlds at greater resolutions. The PS4 generation in particular took advantage of this. Even at launch, the PS4 and Xbox One were underpowered, especially when it came to their CPUs. In order to convey a strong sense of a generational leap over the PS3 / Xbox 360, developers opted to focus on improving resolutions, texture quality, geometric density and so forth. In fact, many remasters of late-gen PS3 titles were still capped at 30fps – though this did allow the PS4 versions to boost graphics drastically, using the likes of higher resolutions and improved anti-aliasing to deliver a much clearer and more visually consistent experience.
Beyond the obvious, I genuinely feel that in certain games and circumstances, capping a game at 30fps makes for a more impactful experience – and no game is able to convey this better than Insomniac Games’ Marvel’s Spider-Man.
Released initially in 2018 for the PlayStation 4, the open-world superhero title ran at a smooth, locked 30fps. And I do mean smooth. The game's clear presentation and clean graphics, in combination with silky animations and high-quality motion blur, meant that despite the lower framerate, Spider-Man's sense of speed, power and momentum was maintained throughout. In fact, it was not only maintained, but enhanced by this framerate limitation.
With the release of Spider-Man Remastered for the PS5, Insomniac Games not only upgraded the visuals with the likes of ray-traced reflections but also the framerate, offering both 30 and 60fps modes. Though many selected the performance option and never looked back, it took me but a few minutes to realise that the 60fps mode felt in many ways less satisfying to play.
While Insomniac Games made plenty of use of motion blur in both modes, the added smear and streaks which emanated as I swung through the city or took a deep swan dive from the top of a skyscraper at 30fps aided significantly in the sense of speed and wind resistance, and ultimately gave a stronger indication of the G-forces Spider-Man would be put under during these superhuman feats of speed and momentum.
Similarly, the animation timing, pacing and ever-so-subtle delay of the game's combat made each successive hit, kick or flip feel more impactful. In fact, to me, the game's Spider-Verse-inspired even-lower-framerate mode made such combat manoeuvres feel that much more severe. Of course, the rest of the world still operated at 30fps (including the camera), but my point is simply to illustrate that the lower framerate did, in my opinion, make for more impactful combat and traversal overall. In a way, it's not too dissimilar to what films do with their own framerate techniques. Speaking of, I must address the elephant in the room: yes, 30fps does make a game feel more 'cinematic'.
While there was a time when filming at 24fps was a necessity, it's been decades since this limitation was lifted, with filmmakers able to produce projects at virtually any frame rate they wish. And yes, though there have been a few notable instances in which filmmakers have tried to smooth things out with HFR (high frame rates) – such as The Hobbit's 48fps release and Gemini Man's 120fps presentation – reception to the move has never been positive, with audiences largely agreeing that 24fps just feels right for such an experience.
Of course, video games are an interactive medium, meaning control is a vital aspect of every title – and with lower framerates directly correlating to worse input latency – it makes sense to push for as smooth an experience as possible. That said, it's been proven since the very start of video games that 30fps is a viable framerate, especially if implemented correctly with consistent frame pacing, appropriate camera speed and motion blur. For a game which is trying to be more cinematic than, say, a multiplayer shooter, going for 30fps can not only aid in the realisation of the team's vision, but can in fact enhance it.
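To make the latency point concrete, here is a toy model of why framerate correlates with input lag. It assumes a simple pipeline that buffers a fixed number of frames between input sampling and display – the three-frame depth is an illustrative assumption for this sketch, not any console's measured figure.

```python
# Rough model: with a pipeline that buffers a fixed number of frames,
# input-to-display delay scales directly with frame time.
# The 3-frame pipeline depth is a hypothetical figure for illustration.

def pipeline_latency_ms(fps: float, frames_in_flight: int = 3) -> float:
    """Approximate input-to-photon latency for a buffered render pipeline."""
    return frames_in_flight * (1000.0 / fps)

for fps in (30, 40, 60):
    # 30fps -> ~100 ms, 40fps -> 75 ms, 60fps -> 50 ms under this model
    print(f"{fps}fps: ~{pipeline_latency_ms(fps):.0f} ms")
```

Whatever the real pipeline depth, the ratio holds: halving the framerate roughly doubles this component of input lag, which is why responsiveness is the main cost of a 30fps cap.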
Even so, the choices of 30 or 60fps were made based on what the screens of the time could output, and as the refresh rates and bandwidth of displays have improved, so has the number of options available for developers to make their vision come to fruition.
Of course, the PC market has had high refresh rate monitors alongside DisplayPort connections for decades, but thanks to the introduction of the PS5 / Series X|S and their HDMI 2.1 ports, console developers were freed to explore many more possibilities for presenting their projects.
With console games no longer needing to stick to either 30 or 60fps, developers could now experiment with ways to push the hardware further. Be it through the use of dynamic resolution, AI upscaling, 120Hz support or VRR (variable refresh rate), each game's potential was now unlocked, making higher framerates possible across the board.
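Of those techniques, dynamic resolution is perhaps the easiest to picture. A minimal sketch, assuming a simple step controller: drop the internal render resolution when frame times blow the budget, and creep it back up when there is headroom. The function name, step size and thresholds here are all hypothetical, not any engine's actual API.

```python
# Illustrative dynamic resolution scaling: render fewer pixels when over
# the frame budget, sharpen back up when comfortably under it.
# All names and thresholds are hypothetical for this sketch.

TARGET_MS = 25.0              # frame budget for a 40fps target (1000/40)
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def adjust_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the resolution scale toward keeping frames within budget."""
    if last_frame_ms > TARGET_MS:          # over budget: lower resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.9:  # clear headroom: raise resolution
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a GPU spike pushes the frame to 30ms, so the scale steps down.
scale = adjust_scale(1.0, 30.0)
print(scale)
```

Real engines use far more sophisticated controllers, but the principle is the same: trade pixels for frame time on the fly so the framerate target holds.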
For some time, developers stayed away from this new flexible model, instead sticking to the standard offering of 30fps quality modes or 60fps performance modes. Some games with plenty of headroom did also push for 120fps for maximum responsiveness – but it was in PlayStation's first-party portfolio where we began to witness the birth of a new performance target: the titular 40fps mode.
So, what are the benefits of having a 40fps mode in a game, and why 40fps in particular? Well, though it may seem odd, 40 frames per second sits exactly halfway between 30fps and 60fps in terms of frame times and responsiveness: a 40fps frame lasts 25ms, the precise midpoint between 33.3ms (30fps) and 16.7ms (60fps). It also divides evenly into a 120Hz output, with each frame held for exactly three refreshes. What this means is that developers can allow for a smoother sense of control while maintaining almost all, if not all, of the visuals from the 30fps mode, given the relative ease of squeezing out an extra 10 frames per second. As mentioned, pretty much all first-party PlayStation 5 games which offer a 30fps mode now also include a 40fps option at essentially equivalent settings, including the previously-discussed Marvel's Spider-Man and Miles Morales.
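The "halfway" claim is easy to verify with a few lines of frame-time arithmetic:

```python
# Frame-time arithmetic behind the 40fps "halfway" claim.

def frame_time_ms(fps: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

t30 = frame_time_ms(30)   # ~33.33 ms
t60 = frame_time_ms(60)   # ~16.67 ms
t40 = frame_time_ms(40)   # 25.00 ms

# 40fps sits exactly at the midpoint of the 30fps and 60fps frame times...
midpoint = (t30 + t60) / 2   # 25.0 ms

# ...and divides evenly into a 120Hz refresh (three refreshes per frame),
# which is why console 40fps modes require a 120Hz-capable display.
refreshes_per_frame = 120 / 40   # 3.0

print(f"30fps: {t30:.2f} ms | 60fps: {t60:.2f} ms | 40fps: {t40:.2f} ms")
print(f"Midpoint of 30/60fps frame times: {midpoint:.2f} ms")
```

In other words, going from 30fps to 40fps shaves off 8.3ms per frame – the same responsiveness gain as going from 40fps to 60fps – which is why an extra 10 frames buys a disproportionately large improvement in feel.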
Shortly following the release of the PS5 remaster, Insomniac Games pushed an update which added a 40fps mode alongside the prior 30 and 60fps options. This has since become my default for almost all Sony games, as it functions as an upgrade on all fronts – serving as the Goldilocks framerate for video games.
In Spider-Man Remastered / Miles Morales specifically, the 40fps mode maintained all of the sense of impact, animation delay and momentum of the 30fps option while giving me that extra measure of visual clarity, retaining the crispness of a pre-rendered cinematic during gameplay.
Now, I must admit that I have gone to bat for lower framerates in the past, having typically selected the graphics mode in most instances regardless. That said, thanks to the addition of 40fps, I rarely need to decide, as a mode now exists which barely compromises on either visuals or framerate. It really is more than the sum of its parts.
So, what does the future look like for framerates in video games? Well, in my own perfect world, 40fps would become the new console standard, setting itself apart from the film and TV industries with its own unique sense of cinematic personality thanks to this underutilised framerate. Of course, for those who do prefer as much performance as possible, I do believe devs should continue to offer 60fps modes and beyond. That said, as a complete convert to the 40fps club, its numerous benefits and relative lack of drawbacks mean that in my eyes, the 'next-gen experience' is the one offered at 40 frames per second. As Dr Dre said in the unreleased track 'syllables': “They said 30′s the new 20. Funny, must mean 40′s the new 30 – Interesting.”
KitGuru says: Are you a believer in 40fps modes? Can your display correctly support 40Hz/fps? What do you believe is the best blend between visuals and framerates? Let us know down below.