
gamegpu: Everybody's Gone to the Rapture Benchmarks

Why? It would seem to me that a game should run at around 30 fps minimum on the very max settings on the top-end card. Otherwise you're kind of making the same claim that some people make about speedometers: that they should only read 0-85 mph because that's the range most people ever use.
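Aside, since "min fps" gets thrown around a lot in these threads: the minimum is usually just the worst frame time over a run converted to a rate (some sites report 1% lows instead). A rough Python sketch, with made-up frame times:

```python
# Rough sketch of how average vs. minimum fps fall out of frame times.
# The frame times below are made up purely for illustration.
frame_times_ms = [16.7, 17.1, 33.9, 16.9, 18.2, 35.5, 16.8]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame

print(f"avg: {avg_fps:.1f} fps, min: {min_fps:.1f} fps")
```

A game can average well above 30 fps and still dip below it on the worst frames, which is why minimums matter more than averages here.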

Agreed, all games should have the ability to ramp past what contemporary cards can do comfortably. People play games after they're released, after all.
 
The trick is not to play everything on Ultra. Adjust some settings a notch or two down and enjoy a smooth framerate while maintaining virtually the same graphical fidelity.

But I know, I know: our egos won't allow that.

I think any game released should be able to run at full speed on ultra at 1080p on the best card available. That rewards people who spend the most on their cards, but also leaves room for improvement in future playthroughs for the majority of people who don't have elite cards. Especially now that SLI is almost dead, the ceiling is the best card out there.

The Crysis situation, where the game couldn't be run on the best card you could buy, didn't help Crysis and didn't help gaming.
 

As long as the game is programmed properly (i.e. the performance is commensurate with the IQ) and the graphics settings are sufficiently granular, there is really no reason* why developers should limit the IQ level, and thus no limit to how demanding the game should be.

Anything else, and you're just sacrificing IQ that future hardware setups could have used.

*other than budgetary ones obviously, which is of course the main limitation for developers. That, plus the misguided idea among some gamers that if they're not running the game on max then it's pointless, even though the medium/high settings may still look better than anything else out there (as was the case with Crysis).
 
The Crysis situation, where the game couldn't be run on the best card you could buy, didn't help Crysis and didn't help gaming.

Context: When Crysis 1 came out, it was impossible to max it out at 1080p on two flagship 8800 GTXs (the Titan X SLI of its day). Then GTX 280 SLI still couldn't max it (think Big Pascal SLI). I'd have to look up the benches, but I think it took overclocked GTX 480 SLI to do it (think Big Volta SLI). The difference, though: Crysis and Metro 2033 were easily the best-looking games of 2007-2012. When Crysis 1 came out, I had to force it down to 1280x1024 medium, and it still looked better than any game I had running at 1920x1080 maxed.

This game (or ARK: Survival Evolved) should not be compared to Crysis 1. It doesn't blow any game out of the water, and no way will it be the title every AAA game is compared to for the next 5 years. Nor will people suddenly start upgrading just to max it out. All of those things will forever make Crysis 1 a ground-breaking title.
 

It just doesn't make good business sense. Crysis 1 is the perfect example: by the time we had cards that could handle it, we weren't concerned with whether our machines could run Crysis 1; we were concerned that they couldn't handle Crysis 2's tessellated hidden water.

I see the argument for something like an Ashes of the Singularity, something that comes out at the dawn of a new framework and shows its potential from the start. But a game that is otherwise just too demanding for its era will still have old technology inside by the time a GPU can finally run it right.

I would rather the resources go to having real 4K textures and real 4K support. That is your "value" out of a future replay: not running the game the way it should run today (60 fps at 1080p), but running it at 4K 60 fps on that future monster GPU.
 

The problem is that the reason it doesn't make good business sense is that (some) consumers are irrational.

Crysis is the perfect example of this. Some people arguably held off on buying it because they couldn't run it maxed out. However, those very same people could probably have run it just fine on medium settings, and even on medium settings Crysis looked better than anything else out there at the time.

As such it's a false dilemma. The question shouldn't be how the game runs on your hardware at max settings, but rather how the game looks once you've optimized the settings to get the performance you desire. But as I said, consumers don't think this way (at least some of them don't).
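To put that concretely, "optimizing the settings" is basically a greedy loop: keep lowering whichever setting buys you the most performance until you hit your target. A toy sketch; the setting names, fps gains, and starting fps below are invented, not from any real game:

```python
# Toy sketch of tuning settings to a target framerate. Settings, gains,
# and starting fps are invented for illustration, not from a real game.
target_fps = 60.0
current_fps = 42.0

# (setting name, estimated fps gained per notch lowered, notches available)
settings = [("shadows", 7.0, 2), ("ambient occlusion", 5.0, 1), ("post fx", 3.0, 2)]

for name, gain, notches in settings:
    while current_fps < target_fps and notches > 0:
        current_fps += gain
        notches -= 1
        print(f"lowered {name} a notch -> ~{current_fps:.0f} fps")

print("done" if current_fps >= target_fps else "still short of target")
```

The point being: a couple of notches on the most expensive settings usually gets you there, with barely any visible IQ loss.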
 

I think it never makes sense outside of an Ashes-style technology demo. I mean, if your technology is that far ahead, pick a cut-off point at the max of current hardware and save the rest for a sequel.

I do agree that Crysis may have inspired people to get better hardware just to run it, but I would argue some of the enthusiasm for that game was that it took PC gaming (which for a while was behind the then next-gen consoles) and gave the master race a banner to rally under. That isn't really a concern today, when every AAA game is basically a console game on roids.
 
So basically milk the consumer?

You must be new to the industry; almost all current major AAA games are sequels. Even worse, gamers WANT sequels and judge platforms on which ones can give them Halo 5 vs Uncharted 4 vs GTA 7 (note: those numbers are made up). If we look at Steam sales from last year:

http://www.gamespot.com/articles/steams-20-best-selling-pc-games-of-2015-made-650m-/1100-6433510/

All of the top 5 selling games were sequels. The first original game on the list is Rocket League, which isn't the kind of game that sets a new high-water mark with its technology.

The market has spoken: gamers want sequels, and they want games that run 60 fps on Ultra on their 980 Tis to make themselves feel better about the expenditure. Either you cater to your audience, or someone else will.

Maybe VR will change the priorities of the average gamer, but I doubt it.
 

And pray tell, which one of those sequels had IQ features that were cut from their prequels? You know, the actual topic at hand.

Besides, I'm not saying that it's not in the publisher's best interest to deliberately downgrade IQ to make future titles look better in comparison; I'm simply saying that it certainly isn't in the consumer's best interest. And seeing as consumers were what we were talking about here, I don't know why you're suddenly so concerned about what is or isn't in the publisher's best interest.
 

The best example I can think of offhand is Far Cry Primal. Far Cry 4 used GameWorks technology when it was released, while the sequel has new technology built just for it. And when a game like The Division, The Witcher 3, or Watch Dogs is scaled down from the initial demo to what we actually get, that fancy technology doesn't just go to waste; it will certainly be used at some point in a sequel, when the hardware can handle it.

In fact those games are perfect examples of developers limiting what the game does to match what consumer machines can do; it's just that the consumer machines in question are consoles.

And seeing as consumers were what we were talking about here, I don't know why you're suddenly so concerned about what is or isn't in the publisher's best interest.

It is in my best interest as a consumer for the companies that are actually pushing the bar technology-wise to make smart decisions so they CAN stay in the market and make things like the sequels/engines that I want to play. It would be way more awesome if Crysis or its sequels had been such a massive hit that EVERY AAA game pretty much HAD TO use the engine; we would be so much further ahead technology-wise.

Also, as a consumer it is helpful to know why companies make the decisions they do, so my expectations are not out of line with reality, which can lead to disappointment.
 

So you honestly think Ubisoft used HairWorks in Far Cry 4 because it was less demanding, and then switched to the more visually impressive TressFX in Far Cry Primal? The truth is exactly the opposite: HairWorks is significantly more demanding than TressFX, and TressFX doesn't look any better than HairWorks.

As for the remaining games you mentioned, none of them were downgraded to save features for sequels (which is what we were talking about); they were downgraded to create parity between consoles and PC, a completely different motive.


Again, I'm not disagreeing that doing this is in the best interest of the gaming companies, but that was not what was being discussed. You previously claimed that games should be able to run at top speed on the best cards available, since this is what rewards the consumer the most; however, I disagree with that.

Imagine that you have two games, A and B. Games A and B are completely identical gameplay-wise, performance-wise and graphics-wise, except game B has an additional graphics setting which looks significantly better than the standard setting (the standard setting being identical between games A and B) but is also significantly more demanding (to the point that the best available GPU cannot run it at 60 FPS). Which game would you prefer game companies released to us?
 
It doesn't matter what the freakin' settings say. There is no reason a 980 Ti *has to* run anything at any given settings, as long as the IQ is good enough to match the requirements.

If the average gamer weren't 11 years old and worried about 'maxed' settings, they might have bought Crysis at launch and been rewarded with by far the best-looking game on the market, even though it was only on 'medium.'

I WISH a Crysis 4 would come out that crushed 980 Ti SLI, if it had the same graphical jump that Crysis 1 did. No, upping the resolution and AA doesn't always count.
 
Agreed, all games should have the ability to ramp past what contemporary cards can do comfortably. People play games after they're released, after all.

Depends on whether or not they are giving you back the IQ to justify it. What all games should do is run well on all hardware on release day.
 

That is an entirely separate issue. Even if the game delivers subpar IQ, when playing it in the future you should have the ability to turn the settings up to a future-oriented level and get slightly less subpar IQ.

Yes, games should deliver good IQ now and better IQ in the future, but even when that isn't the case, a setting intended for future tech should be there; the ultra-setting addiction of enthusiasts is only harming games.
 

It should have better IQ now if it needs more resources now.
 
I just don't think a game lacking the scene complexity present in all modern FPS games should have such crap performance. It's a walking simulator on par with Unity or UE4 indie stuff, so there's no excuse for it to look like a last-gen game while tanking GPUs like a next-gen game.

The interesting result here, though, is that despite the 780 Ti being slower than the 970 (which typically only happens in modern console ports that are GCN-optimized), GCN runs this game very poorly.
 
The CPU performance here is an absolute joke. 60 FPS minimums are apparently unattainable without overclocking. What is the CPU even being used for in this "game"?
 
I think the 55 fps minimums on the CPU chart are due to the GPU bottleneck. If they'd overclocked the 980 Ti, you'd see multiple CPUs reach 60+ minimums.
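That follows from how frame times compose: each frame effectively takes the longer of the CPU time and the GPU time, so a GPU-bound run pins every sufficiently fast CPU to the same minimum. A minimal sketch; all the numbers and CPU labels below are invented for illustration:

```python
# Minimal sketch of a GPU bottleneck flattening a CPU chart: each frame
# takes roughly max(cpu_time, gpu_time). All numbers here are invented.
gpu_worst_frame_ms = 18.2  # stock GPU worst case -> ~55 fps

cpu_worst_frame_ms = {
    "fast CPU, overclocked": 13.0,
    "fast CPU, stock": 15.5,
    "slow CPU": 19.5,
}

for cpu, cpu_ms in cpu_worst_frame_ms.items():
    effective_ms = max(cpu_ms, gpu_worst_frame_ms)
    print(f"{cpu}: ~{1000.0 / effective_ms:.0f} fps minimum")
```

Both of the faster CPUs land on the same ~55 fps wall; only speeding up the GPU (shrinking gpu_worst_frame_ms) would let them separate.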
 
Not to mention that increased resolution and AA is probably the least efficient way to improve IQ.
What? You think GTA 5 at 800x480 looks impressive? I would like to disagree. As someone who owns a 110 dpi screen and has seen a 140 dpi screen side by side with it, I'll say that we're far from the point of diminishing returns in dpi on desktop screens.

Or did I just misunderstand you, and you meant increasing resolution while maintaining the same AA setting? Because that is indeed silly.
If you go from ~90 dpi (1080p 24") to 110 dpi (1440p 27") or 140 dpi (4K 32"), you get a natural reduction in aliasing artifacts. Applying the same amount of AA to a 90 dpi and a 140 dpi screen is just not going to yield anywhere near the same improvement in IQ.
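For reference, those dpi figures follow straight from resolution and diagonal (dpi = diagonal pixels / diagonal inches); a quick check:

```python
# Quick check of the dpi figures above: dpi = diagonal pixels / diagonal inches.
import math

screens = [(1920, 1080, 24), (2560, 1440, 27), (3840, 2160, 32)]
for w, h, diag_in in screens:
    dpi = math.hypot(w, h) / diag_in
    print(f'{w}x{h} @ {diag_in}": ~{dpi:.0f} dpi')
```

That prints roughly 92, 109 and 138 dpi, matching the ~90/110/140 figures above.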
 
It should have better IQ now if it needs more resources now.

Yes. And the only relation that has to a game having a settings bracket intended to give future GPUs a way to increase IQ is that some people get unreasonably mad when they can't run the biggest, baddest settings right now, without even looking at the IQ. I'll say it again, because apparently saying it outright repeatedly hasn't been enough: I don't dispute that performance requirements should be matched by IQ, but the idea that games shouldn't have better options just because it gives people who spent a bunch of money on their card the sads is ludicrous.
 