Next-gen has to mass-transition to 60fps

Never going to happen, because of devs' lust for giving you the highest-fidelity textures and polygon models. The best you can hope for is for devs to make it standard to offer a performance mode and a graphics mode in each game they release. Let the gamer choose what they want.
 
Nah. Last gen was a 20 FPS gen, this is a 30 FPS gen with plenty of 60 FPS games as well, and next gen can use the extra CPU and GPU power to bring the 60 FPS target to even more games.
Graphics won't be getting a whole lot better because there isn't a breakthrough in rendering models coming up in the future (Beyond real-time ray-tracing, and that's kinda far away and being marketed as more of a back-end tool) and the budget of games cannot increase anymore.
More games having a 60 fps mode, or being 60 fps outright, is a possibility, sure, but a standard? You can forget it right now. There is a lot you can do at 30 fps, not just graphics: physics, game logic, etc. under the hood. And graphics will improve a lot in multiple ways, budget or not. Software houses already have high-quality assets ready, scaled down for this generation.
 
A hardcore gaming enthusiast will play a game at whatever fps the developer decides upon during development, because a hardcore gaming enthusiast wouldn't let something as petty as framerate stop them from enjoying an incredible experience. Same goes for controller preference and console preference. Any hardcore gaming enthusiast would shut up and enjoy the game as the artist intended.

Sony didn't make the pro to stop people going to PC. They made it because they know there is a demand for better graphics, plain and simple.
Sony said themselves:
https://www.pcgamer.com/sony-says-the-pc-and-not-the-xbox-pushed-it-to-create-the-ps4-pro/

Also lol at "what the developer decides". It's not that they "decide", it's that they have to compromise. Ideally devs would want 60fps in all their games, but it's not always possible on consoles thanks to hardware limitations, and this gen in particular the very weak CPU held them back. The devs of Destiny, Planetside and a bunch of other games have confirmed in interviews that there were points where they wanted to go 60fps on console but the hardware couldn't handle it, so they had to drop to 30fps and raise graphics settings instead, which also appeals more to casual gamers through screenshots. They had no choice.

Next gen no doubt we will see more 60fps games than this gen
 

EMGESP

There will be a lot more games that target 60fps or at least have a performance mode come next-gen, but there should not be a mandate for it to be the standard. Developers should be able to use the power however they see fit.
 
People don't seem to understand a basic thing. People are always demanding better graphics, and that means that 30 FPS will always be the target for many developers. It doesn't matter if they put a 16-core processor or GTX Titan/Pascal-level graphics in the next consoles, developers will still target 30 FPS. The only difference is that games will look better. The lack of 60 FPS this gen is not due to hardware limitations, it's due to developers' targets. You want a game the size of The Witcher 3 with all its complex world interactions? 30 FPS. You want a game the size of MGS5, with largely empty land and relatively basic world interactions? 60 FPS. The last thing I want is to play another MGS5 over The Witcher 3 because people are demanding 60 FPS. Developer vision should not be constrained in any way.
Exactly!
 
Re-posting for the new page
Surely this is good enough fidelity

Until we get realtime raytracing and digital humans indistinguishable from photographs/film, there will always be a segment of the industry dedicating resources toward making the most impressive, most photorealistic images at lower frame rates.

And on the other hand, why limit it to 60? 120hz+ displays are becoming more popular with the PC crowd and feel so smooth that it's hard to go back to just 60fps. 90hz is the bare minimum for VR, so why stop at 60?

Different devs, different games, different priorities. There's never a one-size-fits-all in software development.
 
I've said it before, and I'll say it again and face the inevitable load of people disagreeing with me.
If you're that bothered about framerate and graphics, you need to be gaming on PC
 
I'm probably the only one in the gaming industry who likes his third-person games to be 30fps, because of the 'cinematic' feel.

Yeah, I said it. But that's just my opinion. I played GOW 2018 at both 30FPS and 60FPS. I preferred playing the main story at 30FPS for that cinematic feel with enhanced visuals, and played the challenges in Muspelheim and Niflheim at 60FPS.

Again, I'm surely in a small minority on this, but going forward I would prefer video games to offer such options, for people who like 60FPS and those who like 30FPS alike.
 
It wouldn't surprise me if devs push harder for 4k 60fps to limit the amount of additional asset work they will be doing. Gotta try to keep costs and dev times down when possible.
 
Can't remember if I've posted in this thread before given how old it is, but unfortunately it won't happen. Developers will always chase graphical fidelity first and foremost outside of a few genres.

I've said it before, and I'll say it again and face the inevitable load of people disagreeing with me.
If you're that bothered about framerate and graphics, you need to be gaming on PC
Yes, this is the only surefire way.
 
Let the people choose the framerate. That's it. What's so wrong with this? Amp up the graphics, at the cost of a lower framerate, for your commercials, trailers, etc., but when it comes to actually playing, let the people make the choice.
 
It won't happen, as much as I would love it... maybe at first but not for long. I'm hoping system-wide Freesync and per-game framerate unlocks become standard though. I can deal with stutter-free 45fps once they actually start using the CPU next gen
 
I wonder how many TVs will support HDMI 2.1 by the time PS5/X4 come out. Hopefully at least 20%.

Devs can target 45-60 FPS with dips to 30 FPS. People with VRR can game around 45-50 FPS most of the time, and it'll feel almost 60FPS. When they drop to 30 FPS, it shouldn't be for too long.

People without VRR can have locked 30FPS to avoid screen tearing.

This seems more realistic than locked 60FPS for everybody.
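For anyone curious, the frame-time arithmetic behind the VRR argument above is simple to sketch (plain math, no engine or display APIs assumed):

```python
# Frame-time budgets at common framerate targets, and the gap a VRR display
# has to absorb when a 60 fps target dips to 45 fps.
def frame_time_ms(fps: float) -> float:
    """Time budget per frame at a given framerate, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 45, 50, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# Dropping from 60 to 45 fps stretches each frame by about 5.6 ms; a VRR
# display simply refreshes on that cadence instead of tearing or juddering.
dip_ms = frame_time_ms(45) - frame_time_ms(60)
print(f"60 -> 45 fps adds {dip_ms:.1f} ms per frame")
```

The takeaway: 45 fps sits much closer to 60 (22.2 ms vs 16.7 ms per frame) than to 30 (33.3 ms), which is why a VRR-smoothed 45-50 fps "feels almost 60".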
 
People say last gen was really a 20fps generation, but the same was true of the PS1/N64 era as well (one could even call that a 15fps generation). I think MAYBE the PS2 era had a higher proportion of 60fps games because it didn't involve a massive resolution jump. True, games were generally going from 240p to 480i/p, but it wasn't this massive thing dragging down game performance. Maybe another reason is that anti-aliasing wasn't huge back then.

And the guy talking about ray-tracing is right. There are still major graphics milestones consumer video games have yet to reach. And we're not even talking about gameplay features that might depend heavily on CPUs. What if a developer finds they can have more cool stuff going on with AI if they go down to 30fps? There are games that even most PCs can't run at 60fps because of CPU implementation.
 
Not gonna happen. Best case scenario is a mandatory 60fps mode. I believe that's the best solution since it would not only please the "FPS first" crowd but also make sure the mid-gen refresh machines (if they happen) would automatically keep the 60fps even if the CPU doesn't get a big upgrade with the possible addition of better resolution and/or graphics.
 
Nah. Last gen was a 20 FPS gen, this is a 30 FPS gen with plenty of 60 FPS games as well, and next gen can use the extra CPU and GPU power to bring the 60 FPS target to even more games.
Graphics won't be getting a whole lot better because there isn't a breakthrough in rendering models coming up in the future (Beyond real-time ray-tracing, and that's kinda far away and being marketed as more of a back-end tool) and the budget of games cannot increase anymore.
Raytracing's cousin, pathtracing, already works in real-time; further improvements would increase the sample density.
 
Next gen is going to be so bad with everyone pushing for 4K and the textures involved, so expect framerates to drop like crazy, and then wait until Generation 10 to get the best remasters (i.e. how it should've been in the first place).
 
Based on what we've seen, I truly feel we would have seen this if the consoles had had better CPUs. I mean, heck, we already see a huge rise, especially in a lot of the more well-known franchises. And the presence of a lot of higher-framerate remasters/remakes also helps make it more common.

That actually makes me very glad because at the beginning of this Gen I was very cynical that any and all power would just be put towards fidelity and maybe some cool previously-impossible mechanics if we were lucky. The fact that we saw this huge uptick in 60 fps despite the underpowered netbook CPUs we got makes me very happy.

Hearing Phil Spencer say in that interview that they for sure messed up with the CPU side of things helps me be a lot more confident that both Sony and Microsoft will be making much more balanced systems this time around, and hopefully that 60 fps bug can spread more.

The demand for higher framerates also isn't huge, but at the same time, what are they actually going to do next gen?
4KKKKKKKK

That said, we've already seen Microsoft get Forza 7 to 4K 60 fps, and it ain't a bad-looking game. My best guess is we'll start seeing a great deal of pseudo-4K/60 fps titles, especially shooters. I feel like that's a pretty reasonable prediction.

EDIT: and like some posts above are saying, things like ray tracing are probably going to make some big moves. I wonder if consoles will get dedicated raytracing hardware.
 
Next gen is going to be so bad with everyone pushing for 4K and the textures involved, so expect framerates to drop like crazy, and then wait until Generation 10 to get the best remasters (i.e. how it should've been in the first place).
Basically like it has always been.

60 fps isn't needed in tons and tons of games. Would be nice to have them, but better graphics, more on screen, more effects, more everything is also nice.
 
If this were to happen almost every high end PC would be built around HFR and the cycle would continue.

30fps is becoming hard for me to enjoy these days.
 
We could have had 60 fps next gen, but then "someone" thought 4k was the next thing to chase. So you can all have 4k / 30 fps and like it.

I would have preferred 1080p / 60 fps so much more. If given the choice over resolution* and frame rate I'd pick frame rate any day. (* 1080p minimum).

And when we finally have acceptable frame rates at 4k the shit will repeat all over with 8k because "gotta sell new TVs with new buzzzzzwordzzz".
 
We could have had 60 fps next gen, but then "someone" thought 4k was the next thing to chase. So you can all have 4k / 30 fps and like it.

I would have preferred 1080p / 60 fps so much more. If given the choice over resolution* and frame rate I'd pick frame rate any day. (* 1080p minimum).

And when we finally have acceptable frame rates at 4k the shit will repeat all over with 8k because "gotta sell new TVs with new buzzzzzwordzzz".
Next gen will be powerful enough for 4k/60 in most games. The most demanding games could use 4k checkerboard. 8k isn't worth it, the diminishing returns are real. You'd need a 100 inch TV to see a real difference.
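The pixel math behind the checkerboard claim is worth spelling out. This is a rough sketch assuming the common checkerboard scheme that shades half the samples each frame and reconstructs the rest from the previous frame:

```python
# Rough pixel-count math for native 4K vs checkerboard 4K vs 1080p.
def pixels(width: int, height: int) -> int:
    """Total pixel count of a framebuffer."""
    return width * height

native_4k = pixels(3840, 2160)      # 8,294,400 pixels
native_1080p = pixels(1920, 1080)   # 2,073,600 pixels
checkerboard_4k = native_4k // 2    # ~4.1M shaded samples per frame

print(f"Native 4K:       {native_4k:,} px")
print(f"Checkerboard 4K: {checkerboard_4k:,} shaded px/frame")
print(f"4K is {native_4k / native_1080p:.0f}x the pixels of 1080p")
```

So native 4K is 4x the shading work of 1080p, while checkerboarding halves that per frame, which is why it's a tempting middle ground for the most demanding games.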
 
Basically like it has always been.

60 fps isn't needed in tons and tons of games. Would be nice to have them, but better graphics, more on screen, more effects, more everything is also nice.
This one is going to be the worst though; watch and see when those N64 framerates come back with reinforcements. Personally I'd rather have fewer effects and more smoothness, and if not 60FPS then at least a fixed 30FPS at all times, because some titles are pretty much flipbooks when things get hectic on screen. Yet people put up with it because they want to play bullshots, and it seems they don't care about the important things.
 
This one is going to be the worst though; watch and see when those N64 framerates come back with reinforcements. Personally I'd rather have fewer effects and more smoothness, and if not 60FPS then at least a fixed 30FPS at all times, because some titles are pretty much flipbooks when things get hectic on screen. Yet people put up with it because they want to play bullshots, and it seems they don't care about the important things.
Framerates have been far more stable on PS4/XB1 than they were on PS3/360. Digital Foundry have pointed this out many times. The trend has been towards better framerates over the years, not worse.
 
Next gen will be powerful enough for 4k/60 in most games.
Only if games stay with current technologies and fidelity. And we all know that will not happen. Developers will no doubt use the extra power for the graphics arms race, as has always happened: better, more precise global illumination; more complex and precise shaders; raytraced reflections and shadows, which are the new hot thing; etc. They'll gobble up all the power and deliver 30 fps at the target resolution.

I'm not a graphics luddite by any stretch (perhaps a resolution luddite), but I'd take all of the above delivered at 1080p / 60 fps over 4k / 30 fps which is what we'll be getting.
 

X1 Two

Next gen will be powerful enough for 4k/60 in most games. The most demanding games could use 4k checkerboard. 8k isn't worth it, the diminishing returns are real. You'd need a 100 inch TV to see a real difference.
I have been reading the same nonsense ever since SVGA was introduced. Resolution matters. Resolution is detail. You may not see individual pixels at 4K, but you will instantly notice the increase in clarity and detail at 8K. Just as it doesn't stop mattering at 30 fps (even though your eye can't make out individual frames anymore), you still notice a difference going from 120 fps to 144 fps. And no, you don't need a 100" TV.
 
I sure hope it'll be; every time I play something at 30 I think "I sure wish this was 60".
Unfortunately, as long as graphics sell, publishers are going to go graphics over performance.
 
Can't wait to play the 4k, 60FPS remasters of The Last Of Us 2, Ghost Of Tsushima, Death Stranding.
I would also love to see Horizon Zero Dawn and God Of War running at 4k, 60.

That said, I am also a graphics whore and can't wait to see what these studios will be able to get out of the Ps5 and I don't mind 30FPS in most games, so I am not sure if I want developers to not crank up the visuals all the way in order to be able to hit 60FPS.

As long as you have fixed hardware, it's always a trade-off between spending the finite amount of horsepower on visuals or on performance.
We always have this discussion at generational transitions, but the issue with 60FPS isn't that consoles don't have enough power, it's that they have finite power. This won't change next gen either, so while I expect more games to run at 60FPS, I think most games will still opt for 30 and push visuals.
 
I can't play below 60 fps on PC but with consoles I don't mind it if it's stable with proper framepacing.

Some games don't NEED 60 fps, they are better with it, no doubt, but it's not a dealbreaker in my book.
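"Stable with proper frame pacing" is the key phrase there. A minimal sketch of what frame pacing at a locked 30 fps means, presenting on a steady 33.3 ms cadence instead of whenever a frame happens to finish (timings are illustrative, not from any real engine):

```python
import time

TARGET_MS = 1000.0 / 30.0  # 33.3 ms budget per frame at a 30 fps lock

def paced_loop(render_frame, num_frames: int) -> list:
    """Run num_frames, sleeping out the remainder of each frame's budget.
    Returns the measured frame-to-frame intervals in milliseconds."""
    intervals = []
    last = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                      # simulate the frame's work
        now = time.perf_counter()
        elapsed_ms = (now - last) * 1000.0
        if elapsed_ms < TARGET_MS:          # pad short frames to the budget
            time.sleep((TARGET_MS - elapsed_ms) / 1000.0)
        now = time.perf_counter()
        intervals.append((now - last) * 1000.0)
        last = now
    return intervals

# A frame that finishes early (10 ms of work) still presents on the 33 ms beat,
# which is what makes 30 fps feel smooth rather than juddery.
times = paced_loop(lambda: time.sleep(0.010), 5)
print([f"{t:.0f}" for t in times])
```

Bad frame pacing is the opposite: a nominal 30 fps that alternates short and long frame intervals, which reads as stutter even though the average framerate is fine.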
 
This current 8th generation of consoles is ultimately CPU bottle-necked due to Jaguar being the only viable choice.
Despite these limitations, however, there has been, I feel, an uptick in the push for 60fps, as we can see from Metal Gear Solid 5 to Resident Evil 7 to Halo 5 & many more. There is wider recognition that games simply play better at a smoother framerate (a likely wake-up call being the massive success of COD last gen, leap-frogging its genre competitors). However, some developers have stuck with 30fps due to certain ambitions; there's no way you could, for example, drastically raise the bar for human visuals as seen in TLoU2 while doing 60. Not with a Jaguar baseline & a mere handful of TFLOPS, at least. Having reached that level, however, I think it's time to put the focus back on performance come next-gen, and these are the reasons why:
  • Silicon-based processor technology has reached a plateau, & we've seen this in the PC landscape for a handful of years now. Next-gen consoles will catch up in good fashion in 2019/2020 with a massive CPU jump thanks to Ryzen, & likely the Raven Ridge APU being the new baseline that defines the 9th generation. However, Moore's law has ended, and there are no free rides on the horizon after this, making it crucial to make the transition now!
  • Since the middle of last gen, many battle-tested software approaches to maintaining framerate have appeared, such as dynamic resolution & more advanced LOD techniques. These have now become common practice & more readily available.
  • HDMI 2.1, arriving in 2018, will bring variable-refresh-rate sync to the mass market. Missing that 60fps target occasionally will no longer be a cause for concern, as your display adapts accordingly with no tearing in sight. Developers can now dare to push higher instead of locking it down to 30. Input lag will also be kept to a bare minimum.
  • 4K will remain good enough for a long while. At 60fps, checkerboard 4K and its temporal resolution is arguably more than sufficient. Despite the intervals between resolution jumps in screen uptake statistically shortening, my impression is that there's not much, if any, developer buy-in for going past a 4K render target in the next decade, as GPU resources are better spent elsewhere. Hollywood movies, for example, are still being mastered in 2K-4K.
  • With a higher framerate target, having additional VR support will become drastically easier. Double up like Resident Evil 7 did by offering both a great 2D & VR game in the same package.
  • Again: VR & AR and all the relevant technology pushes & innovations.
  • The TV & Movie industry has been stuck in the 24hz rut for decades despite a few valiant efforts and it's hard to push out of it due to the current ecosystem. Don't let this happen to the interactive medium!
  • AAA game asset-creation costs are already outrageous.
  • Most importantly: Games just downright play better. If you want players to hang onto your GaaS extravaganza then this is one of your most important tickets. I also think the general uninformed public definitely DO appreciate 60fps through "this game is awesome and I want to play more" without being able to point it out directly while still spreading positive word of mouth.
What say you?
I'm not saying a strict 60fps mandate, but just a general heavy PR & cultural push.

Surely this is good enough fidelity
Or what?
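The dynamic resolution point in the list above is easy to sketch. A minimal feedback loop that shrinks the render scale when the last frame blew its budget and grows it back when there's headroom; the thresholds and step size here are illustrative, not from any real engine:

```python
# Minimal dynamic-resolution-scaling sketch for a 60 fps target.
TARGET_MS = 1000.0 / 60.0   # 16.7 ms budget per frame
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_scale(scale: float, last_frame_ms: float) -> float:
    """Return the render-resolution scale to use for the next frame."""
    if last_frame_ms > TARGET_MS * 1.05:      # over budget: drop resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.85:    # plenty of headroom: raise it
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A spike to 25 ms pulls the scale down; easy 12 ms frames push it back up.
scale = 1.0
for ms in (25.0, 25.0, 12.0, 12.0):
    scale = adjust_scale(scale, ms)
    print(f"frame took {ms:.0f} ms -> next render scale {scale:.2f}")
```

The point of the technique is exactly what the post argues: the framerate target stays fixed and resolution becomes the variable, instead of the other way around.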
 