Starfield's performance is locked on Xbox, Todd Howard says, causing worry about the space sim on PC, but a God of War Ragnarok dev comes to Bethesda’s defense.
Computers (including consoles) have limited resources; at some point you have to make tradeoffs. Do you prioritize graphics quality, or do you prioritize FPS? Do you want/need more resources available for the physics engine? That eats into the maximum possible FPS. Do you want real-time procedural generation? Do you want to use the GPU to run some kind of AI? These are all design considerations, and there's no one-size-fits-all prioritization for every videogame. Clearly the people working on Starfield believe that, for their intended game experience, graphical fidelity is more important than FPS, and that's a perfectly valid design choice even if you don't agree with it.
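The tradeoff being described is ultimately frame-budget arithmetic: every subsystem spends milliseconds out of a fixed per-frame budget. A quick sketch (the per-subsystem costs here are made-up illustrative numbers, not actual Starfield measurements):

```python
# Frame-time budgets at common target frame rates.
def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to produce one frame at the target rate."""
    return 1000.0 / fps

# Hypothetical per-frame subsystem costs in milliseconds (invented figures).
costs = {"rendering": 14.0, "physics": 6.0, "AI": 4.0, "procedural gen": 5.0}
total = sum(costs.values())  # 29.0 ms of work per frame

for fps in (30, 60):
    budget = frame_budget_ms(fps)
    verdict = "fits" if total <= budget else "over budget"
    print(f"{fps} fps budget: {budget:.1f} ms -> {verdict}")
```

With ~29 ms of work per frame, this hypothetical game fits comfortably inside a 30fps budget (33.3 ms) but blows well past a 60fps budget (16.7 ms) — which is exactly the kind of prioritization decision the comment is talking about.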
It’s a matter of optimization, and Bethesda games have all been pretty poorly optimized. They could get it running at a higher framerate, but there’s no need, because people will buy it even if it runs at 30fps.
What’s so revolutionary or ambitious about Starfield that it couldn’t be optimized to hit an “acceptable” framerate? Pretty much everything Starfield does has been done before, and the Creation Engine isn’t some visual marvel that would burn down graphics cards. So where is the performance going?
What is the absolute most important thing about every video game? They all have it in common: there has never been a video game, ever, where this isn’t the single most important thing.
The answer is: being able to play it. Is a game that crashes to desktop every time you move the camera a good game? No. And if I’m comfortable judging whether a game is any good by that single metric, I’m even more comfortable extending it to “being able to see it without motion sickness and eye strain.” Wanting your game to be properly optimized and not a juddery slideshow isn’t entitlement; it’s the bare minimum of functionality.
Just because you’re okay with 30FPS doesn’t make it “fine” or “good” either. Higher FPS is objectively better. Period. That means 30FPS is bad when the other option is 60FPS (or higher, given that the console is being DIRECTLY MARKETED to consumers as a 60FPS–120FPS machine).
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran at a solid 60 FPS.
If it were only a matter of optimization, we would all still be playing games on the original NES.
Every video game and every TV program for DECADES ran at 30fps. 29.97, actually. Nobody was motion sick or got eye strain.
Most games of the NES, Genesis, and SNES era ran at 240p, 60fps (in the NTSC regions).
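On the “29.97, actually” point above: NTSC doesn’t run at exactly 30 (or 60) frames per second, because the color standard slows the clock by a factor of 1.001. The exact rates fall out of a one-line calculation:

```python
from fractions import Fraction

# NTSC color timing divides the nominal rate by 1.001,
# i.e. multiplies by 1000/1001.
frame_rate = Fraction(30000, 1001)  # interlaced frames per second, ~29.97
field_rate = Fraction(60000, 1001)  # fields per second, ~59.94

print(float(frame_rate))  # 29.97002997...
print(float(field_rate))  # 59.94005994...
```

So the consoles of that era driving a 60Hz-class signal were effectively updating at ~59.94 times per second in NTSC regions.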
Wow, I didn’t realize you could speak on behalf of everyone’s personal reaction to FPS
The difference is that TV and movies have a consistent delay between frames. That is often not the case with video games.