I parroted the claim that all modern games are GPU-limited, but real-world experience makes me think it's false


v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Alright, one more reboot. Back to 1.8 GHz, actually playing the Lost Coast demo.

There was absolutely no hint of stuttering, pausing, hitching or anything hinting at unplayability at any time. Frame rates were solid with no strange dips. The lowest I saw was 72 frames/sec when running/jumping/spazzing at the insertion point. It played exactly as I remember HL2 playing on my single core 2.4 ghz amd64, except much higher res, with no jaggies and with prettier lighting. I could target any pixel on my screen as quickly as I wanted.

What sound card are you using? Is it sharing any physical or logical IRQs with anything else?
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
Originally posted by: v8envy
Alright, one more reboot. Back to 1.8 GHz, actually playing the Lost Coast demo.

There was absolutely no hint of stuttering, pausing, hitching or anything hinting at unplayability at any time. Frame rates were solid with no strange dips. The lowest I saw was 72 frames/sec when running/jumping/spazzing at the insertion point. It played exactly as I remember HL2 playing on my single core 2.4 ghz amd64, except much higher res, with no jaggies and with prettier lighting. I could target any pixel on my screen as quickly as I wanted.
You shouldn't notice any difference in any game when the FPS is typically above 60 (or even above 30 in some games), so it's pointless to use a high-FPS game to demonstrate a lack of noticeable differences between CPUs.

However, take a tough game like Crysis, where the difference is between 25 and 30 FPS, and you will most likely see differences in playability and smoothness.
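The frame-time arithmetic behind this point is worth spelling out: because frame rate is a reciprocal, a 5 FPS gap at the bottom of the scale buys far more per-frame headroom than a much larger gap at the top. A minimal sketch (the FPS pairs are just illustrative numbers):

```python
# Why a 25-to-30 FPS jump is perceptible but a 72-to-90 one often isn't:
# frame rate is a reciprocal, so gains at the low end dominate frame time.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for low, high in [(25, 30), (72, 90)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} FPS: each frame arrives {saved:.1f} ms sooner")

# 25 -> 30 FPS: each frame arrives 6.7 ms sooner
# 72 -> 90 FPS: each frame arrives 2.8 ms sooner
```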
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
if they both do it, it is not cheating ... it is called "optimization" ..
If they only improve performance in the benchmark but not in actual gameplay they are both cheating.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
if they both do it, it is not cheating ... it is called "optimization" ..
If they only improve performance in the benchmark but not in actual gameplay they are both cheating.

Do you really think that they are ONLY improving the Crysis benchmark while ignoring the REST of the game?
:roll:

IF *you* were on either the AMD or nvidia driver team and you noted that there was a *chug* at frame number '323', wouldn't you consider fixing it a priority?
:confused:
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: amenx


However, take a tough game like Crysis, where the difference is between 25 and 30 FPS, and you will most likely see differences in playability and smoothness.

Ok, absolutely NO hardware on this planet is capable of manhandling games 'like' Crysis at high res with everything cranked. Overclocked E8400s paired with multi-GPU setups come close, but still no cigar. Maybe Skulltrail with both quads OC'd to over 4 GHz and 3x 8800GTX or 2x 3870X2 is enough, but I'll believe it when I see benchmarks.

My point is that for the vast majority of games out right now, a budget or low-powered CPU like the 3800X2 is good enough. Taltamir's original statement was that HL2 was unplayable with a 3800X2. I remember having a blast with a single-core 3200+ at 2.4 GHz, so I gave it another shot, underclocking my CPU to roughly half its normal clock for some tests to confirm. For Crysis, a low-end CPU or low-end GPU means a poor experience. If you want to play Crysis today (as opposed to a year from now), be prepared to fork over for the best of the best of the best.

Remember, this thread is about bang for buck. If you can only choose one of: upper mid range GPU or upper mid range CPU which do you go for? I maintain the conventional wisdom of 'always get a better GPU' is still valid.

edit: spelling
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
My point is that for the vast majority of games out right now, a budget or low-powered CPU like the 3800X2 is good enough.
but it isn't good enough for the vast majority of us ... a stock 3800X2 is way too slow for even a GTS-class card :p
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: apoppin
My point is that for the vast majority of games out right now, a budget or low-powered CPU like the 3800X2 is good enough.
but it isn't good enough for the vast majority of us ... a stock 3800X2 is way too slow for even a GTS-class card :p

Especially in non-multithreaded scenarios, it would be about as fast as an Athlon 64 3000+. A good single-core CPU needs to run faster than 2.6 GHz to reduce its bottleneck, and a dual-core CPU needs to run at 2.2 GHz or higher. As an example, I have a friend with a Pentium 2160 at its stock 1.6 GHz who only scores 436 KB/s in WinRAR with multithreading enabled, while his friend with a lowly Pentium 4 Prescott OC'd to 3.5 GHz manages up to 518 KB/s, and I can do 634 KB/s.
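Those WinRAR throughput figures translate into rough relative speedups; a minimal sketch of the arithmetic, with labels taken from the systems as described above:

```python
# Relative speedups implied by the WinRAR KB/s figures quoted above.

results_kbs = {
    "Pentium 2160 @ 1.6 GHz (dual core)": 436,
    "Pentium 4 Prescott @ 3.5 GHz": 518,
    "evolucion8's CPU": 634,
}

baseline = results_kbs["Pentium 2160 @ 1.6 GHz (dual core)"]
for cpu, kbs in results_kbs.items():
    print(f"{cpu}: {kbs} KB/s ({kbs / baseline:.2f}x baseline)")

# The 3.5 GHz single core edges out the 1.6 GHz dual core by ~19%,
# which is the poster's point: clock speed still matters.
```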
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Do you really think that they are ONLY improving the Crysis benchmark while ignoring the REST of the game?
Well, according to [K]yle, ATi's newer 3870 X2 drivers provided big gains in the GPU benchmark, but when he actually played the same section in normal gameplay he got marginal gains at best.

If that's true then ATi are definitely cheating since those optimizations do not translate into benefits to actual gameplay.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
Do you really think that they are ONLY improving the Crysis benchmark while ignoring the REST of the game?
Well, according to [K]yle, ATi's newer 3870 X2 drivers provided big gains in the GPU benchmark, but when he actually played the same section in normal gameplay he got marginal gains at best.

If that's true then ATi are definitely cheating since those optimizations do not translate into benefits to actual gameplay.

You mean HardFUD?:p

i can do *exactly* the same thing as Mr B and show nvidia cheating ... or that my own Crossfire can beat ANY card ... just do your "run" with the PoV 'off' by one or two degrees ... with a little "practice" you can prove anything with "real fake" world benchmarking

it is *impossible* to actually "play" the same section in normal gameplay as the canned run
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
But it is possible to play through a level and eyeball the FRAPS numbers in the upper left. If the problem sections still stay at the exact same rates they always have, you know the improvements are fairly bogus. Average frame rates run over run mean squat, but you can still get an idea re: performance or lack thereof.

A fairly subjective measurement, yes. But a possible one.
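The same check can be made less subjective by logging per-frame times instead of watching the overlay. A minimal sketch, assuming a FRAPS-style frametimes CSV with a header row and one cumulative millisecond timestamp per frame (the exact file layout may differ):

```python
# Compute average, worst-frame, and worst-1% FPS from a frametimes log,
# so "the problem sections stayed the same" becomes a number, not a hunch.
import csv

def fps_stats(path: str) -> dict:
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader]
    # Cumulative timestamps -> individual frame times in ms, sorted ascending.
    frame_times = sorted(b - a for a, b in zip(stamps, stamps[1:]))
    avg_fps = 1000.0 * len(frame_times) / (stamps[-1] - stamps[0])
    worst_1pct_ms = frame_times[int(len(frame_times) * 0.99)]
    return {
        "avg_fps": round(avg_fps, 1),
        "min_fps": round(1000.0 / frame_times[-1], 1),  # slowest single frame
        "1%_low_fps": round(1000.0 / worst_1pct_ms, 1),
    }

# Compare the same run on old vs new drivers:
# print(fps_stats("old_driver frametimes.csv"))
# print(fps_stats("new_driver frametimes.csv"))
```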
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: apoppin
Originally posted by: BFG10K
Do you really think that they are ONLY improving the Crysis benchmark while ignoring the REST of the game?
Well, according to [K]yle, ATi's newer 3870 X2 drivers provided big gains in the GPU benchmark, but when he actually played the same section in normal gameplay he got marginal gains at best.

If that's true then ATi are definitely cheating since those optimizations do not translate into benefits to actual gameplay.

You mean HardFUD?:p

i can do *exactly* the same thing as Mr B and show nvidia cheating ... or that my own Crossfire can beat ANY card ... just do your "run" with the PoV 'off' by one or two degrees ... with a little "practice" you can prove anything with "real fake" world benchmarking

it is *impossible* to actually "play" the same section in normal gameplay as the canned run

I would normally discount anything HardOCP says, but Bit-tech has reported the same thing.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CP5670
Originally posted by: apoppin
Originally posted by: BFG10K
Do you really think that they are ONLY improving the Crysis benchmark while ignoring the REST of the game?
Well, according to [K]yle, ATi's newer 3870 X2 drivers provided big gains in the GPU benchmark, but when he actually played the same section in normal gameplay he got marginal gains at best.

If that's true then ATi are definitely cheating since those optimizations do not translate into benefits to actual gameplay.

You mean HardFUD?:p

i can do *exactly* the same thing as Mr B and show nvidia cheating ... or that my own Crossfire can beat ANY card ... just do your "run" with the PoV 'off' by one or two degrees ... with a little "practice" you can prove anything with "real fake" world benchmarking

it is *impossible* to actually "play" the same section in normal gameplay as the canned run

I would normally discount anything HardOCP says, but Bit-tech has reported the same thing.

not the *same thing*

We tested the game under both DirectX 9.0 and DirectX 10 with the 1.1 patch applied. We used a custom timedemo recorded on the Island map which is more representative of gameplay than the built-in benchmark that renders things much faster than you're going to experience in game. We found that around 25-30 fps in our timedemo was sufficient enough to obtain a playable frame rate through the game. It's a little different to other games in that the low frame rates still appear to be quite smooth.

For our testing under DirectX 9.0, we set all of the in-game settings to high, while we set shader quality and water quality to 'very high' for DirectX 10 -- all other settings remained the same as what they were set to for our DX9 performance testing. Because of how intense the game is, we tested with both anti-aliasing and anisotropic filtering disabled at resolutions above 1280x1024 for the time being. There is currently no support for anisotropic filtering in the game, but you can still force it from the driver control panel.

They did their OWN custom CANNED timedemo ... with variable settings
... imo .. the "official" Crysis benchmarks are dev tools that were made into a pretty tech demo ... that is ALL .... no, it is not terribly representative of the ENTIRE game .. no two or three timedemos ever are.

now IF you were on the nvidia or AMD driver team and you kept getting 01 FPS at frame 3256, wouldn't you fix it? or ignore it?
:confused:

as long as we relate to the *same* benchmark - realizing that the game will run slower than the Crysis tech demo "benchmarks" - we can judge RELATIVE performance.

We need MORE "custom canned Crysis benchmarks" ... then we will get a better picture - still very IMPERFECT until the CryTek devs "polish" the game ... it has a LONG way to go ... according to their "roadmap" ... next year it should finally be optimized for Vista 64

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
i can do *exactly* the same thing as Mr B and show nvidia cheating ... or that my own Crossfire can beat ANY card ... just do your "run" with the PoV 'off' by one or two degrees ... with a little "practice" you can prove anything with "real fake" world benchmarking
Let's ignore the source for now and concentrate entirely on logic and facts.

He's saying ATi's performance significantly increases in the GPU benchmark but when playing the actual level in similar areas there are marginal performance gains at best.

If that is true then ATi's likely cheating since genuine optimizations would benefit similar spots in the same level, regardless of whether the level is run from a benchmark or from actual gameplay.

Or to put it another way, ATi's benchmark score increase tells potential customers nothing about what they will experience if they play the same section of the game.

Note that it's entirely possible nVidia's doing the same thing, in which case they would be cheating as well.

It is *impossible* to actually "play" the same section in normal gameplay as the canned run
No it isn't. Have you seen the benchmark? It runs along the beach, along the road and then rotates around the hill a few times before hitting the road again.

Now granted you can't always be up in the air as much as the benchmark is, but you should still be seeing some performance gains if you stick to said areas.

So if you're only seeing a performance gain along the exact rails of the benchmark then it's likely cheating akin to nVidia's 3DMark clip planes.
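The test BFG10K describes reduces to comparing two deltas: the driver-to-driver gain in the canned run versus the gain in a hand-played pass through the same area. A minimal sketch of that comparison; all the numbers here are hypothetical:

```python
# Flag a driver "optimization" that only shows up on the benchmark's rails.

def pct_gain(old_fps: float, new_fps: float) -> float:
    """Percentage FPS improvement from an old driver to a new one."""
    return 100.0 * (new_fps - old_fps) / old_fps

bench_gain = pct_gain(30.0, 39.0)     # canned GPU benchmark run
gameplay_gain = pct_gain(28.0, 29.0)  # same area, played by hand

# If the canned gain dwarfs the gameplay gain, the optimization tells
# customers nothing about what they'll actually experience in game.
if bench_gain > 2 * gameplay_gain + 5:
    print(f"suspect: +{bench_gain:.0f}% in benchmark, "
          f"+{gameplay_gain:.1f}% in gameplay")
```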
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Didn't HardOCP specifically say BOTH are cheating on the canned benchmark for Crysis? Getting results that are way above those of real gameplay, as well as mentioning how certain hotfixes raised real FPS by 0.X fps but the canned FPS by a sizeable percentage?

Anyways, I tried doing some benchmarking of Company of Heroes, which BFG might recall from the 9600GT thread about the value of stream processors...
Min FPS: 13, average FPS 16.7, max FPS 42... actual gameplay at those settings? A solid 4 fps; in action scenes it will dip to 1, and on occasion it will spike to 7... both happened for less than a second.

Here is a review where they tested same video card:
http://www.legitreviews.com/article/648/7/

Check out their 1920x1200 highest-quality score... I tested mine at 1920x1200 with highest+ settings (set highest, then go into advanced and enable 3 additional options that are disabled even at highest) and 16xQ CSAA, where they used 8x... I used a weaker CPU with background programs running (antivirus and such) and got 16.7... but the game is a slideshow at 4 fps in actual gameplay.

That resolution is completely unplayable unless I completely disable AA (which, according to reviews, should have given me a nice solid 60 fps... not a barely playable frame rate with stutter here and there).

If I had bought this $300 card to play CoH based on that review, I would have been pissed.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Didn't HardOCP specifically say BOTH are cheating on the canned benchmark for Crysis?
Like I said earlier it's quite possible and in fact quite probable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
i can do *exactly* the same thing as Mr B and show nvidia cheating ... or that my own Crossfire can beat ANY card ... just do your "run" with the PoV 'off' by one or two degrees ... with a little "practice" you can prove anything with "real fake" world benchmarking
Let's ignore the source for now and concentrate entirely on logic and facts.

He's saying ATi's performance significantly increases in the GPU benchmark but when playing the actual level in similar areas there are marginal performance gains at best.

If that is true then ATi's likely cheating since genuine optimizations would benefit similar spots in the same level, regardless of whether the level is run from a benchmark or from actual gameplay.

Or to put it another way, ATi's benchmark score increase tells potential customers nothing about what they will experience if they play the same section of the game.

Note that it's entirely possible nVidia's doing the same thing, in which case they would be cheating as well.

It is *impossible* to actually "play" the same section in normal gameplay as the canned run
No it isn't. Have you seen the benchmark? It runs along the beach, along the road and then rotates around the hill a few times before hitting the road again.

Now granted you can't always be up in the air as much as the benchmark is, but you should still be seeing some performance gains if you stick to said areas.

So if you're only seeing a performance gain along the exact rails of the benchmark then it's likely cheating akin to nVidia's 3DMark clip planes.

Have you seen the benchmark?


RotFL ... only 1,000,000 times
[would you believe 100,000 times ...?]:p

i have only one further thing to say to you

Try it for yourself a dozen times and report your results back
- let us know your results - and especially - how close each run is to each other ;)