I parroted the claim that all modern games are GPU limited, but real experience tells me it's false


taltamir

Lifer
Mar 21, 2004
13,576
6
76
He didn't test it with the games I did.
Download Lost Coast through Steam (free to anyone who owns an AMD or nVidia video card) and try it yourself.

Tell me how many FPS you get on the built-in test at 1280x resolutions with everything maxed, with and without AA.
Try the same on the same CPU but with a much faster card...
Try 1920x1200 resolution...
Try it with a faster CPU, with the older and with the newer card...

"I remember it being completely playable at low settings" is too vague, sorry.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I didn't read this whole thread, but generally speaking, GPU upgrades will almost always give you a bigger performance increase for your money.

When you game at 1920x1200 with the highest possible settings, you could buy a $1000 CPU and see ZERO FPS improvement if your GPU is the limiting factor.

Compare that wasted $1000 to a $200 8800GT: depending on what GPU you are upgrading from, you could theoretically see a 50x performance jump.

Bottom line, if it comes down to a GPU or CPU upgrade, the decision has got to be GPU almost every time.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Will do. I already have access to every product Valve makes or will make, but the download may take a few days. I'm too lazy to swap cards, so I'll limit it to 12x10 and 16x12 (my ancient monitor won't do 100Hz refresh any higher) at 2 and 3 GHz. I'll be testing the fast card + slow CPU = fail theory.

In fact, to debunk the 'unplayable on a 3800 X2' claim I only have to run at 16x12 at various quality settings with the CPU at 6x333. If the frame rates stay above 30 I'd pronounce it 'eminently playable.' If the frame rate is too low, only then would I bother with a lower-resolution test. I presume the demo displays an FPS counter, so I can get an idea when and if frame rates drop below 20?
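For the 'playable' verdict I'll script it rather than eyeball it. A minimal sketch of what I mean, assuming I dump one FPS reading per second from the demo's counter into a text file (the file name and the 30/20 cutoffs are my own choices, nothing the demo provides):

```python
# Sketch: classify a benchmark run from per-second FPS samples.
# Assumes fps_samples.txt holds one FPS reading per line.
def classify(path="fps_samples.txt"):
    fps = [float(line) for line in open(path) if line.strip()]
    low, avg = min(fps), sum(fps) / len(fps)
    if low >= 30:
        verdict = "eminently playable"
    elif low >= 20:
        verdict = "playable with dips"
    else:
        verdict = "retest at a lower resolution"
    print(f"avg {avg:.1f} fps, min {low:.1f} fps -> {verdict}")

classify()
```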

Keep in mind my testing will be on Linux. On Windows the results will almost certainly look better.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Why do people keep mentioning $1000 CPUs? They are what... 20% faster than a $200 CPU, at most?
How much more relative performance do you get from paying $2400 for 3x 8800 Ultra cards? (Plus extra for the PSU and mobo, of course.)
Who cares, really? We are talking in the reasonable upgrade range here... $200-300 for either a video card or a CPU (maybe CPU+mobo).
For best results, upgrade both.

I am talking about replacing an old CPU with a new $200 one, or an old GPU with a new $200 one. Keep a realistic perspective, people...
I am not even saying the GPU is not the better choice in many cases. I am just saying that sometimes, especially for certain games, a fast CPU gives you much more.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: taltamir
Why do people keep mentioning $1000 CPUs? They are what... 20% faster than a $200 CPU, at most?
How much more relative performance do you get from paying $2400 for 3x 8800 Ultra cards? (Plus extra for the PSU and mobo, of course.)
Who cares, really? We are talking in the reasonable upgrade range here... $200-300 for either a video card or a CPU (maybe CPU+mobo).
For best results, upgrade both.

I am talking about replacing an old CPU with a new $200 one, or an old GPU with a new $200 one. Keep a realistic perspective, people...
I am not even saying the GPU is not the better choice in many cases. I am just saying that sometimes, especially for certain games, a fast CPU gives you much more.

We keep mentioning $1000 CPUs because that is a pretty realistic price for a top-of-the-line CPU. Our point is that 8 out of 10 games are going to see a massive FPS increase from spending $500 on a top-of-the-line GPU, compared to a minimal FPS gain from spending $1000 on a top-of-the-line CPU.

Basically, you are right: in some games upgrading the CPU will net you far more FPS than the GPU. However, such situations are going to be limited to old games, low resolutions, flight sims, and some RTS games.

So, when a gamer who has $400 total to spend on an upgrade comes to this crossroads, which is he going to choose? Probably the path that is going to net him the most gain in the most games... AKA the GPU.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Everything I wanted to play is in a game that came out last year.

Sci-fi setting, a galaxy to explore, create your own story, have sexual relations, shoot some aliens, play a badass space peace enforcer with no remorse for his actions.

Mass Effect : )

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: SickBeast
Some games are indeed CPU limited, but from what I've seen, a CPU limited game will get 100FPS instead of 200FPS. Generally when you're GPU limited you'll only get 10FPS or something.

I'm pretty sure that even my Opteron 165 isn't pegged at 100% CPU usage in a few games.

Ideally a system should be balanced, but there are times when it makes more sense to just upgrade the GPU (or vice-versa).

I couldn't say it better. CPU limited games usually run well beyond 100fps. GPU limited games, which most modern games are (Crysis, for example), run well below 30fps. I saw a lot of CPU scaling tests around the net, and a Core 2 Duo running at 2.2GHz and above shows minimal performance gains; below that, the performance drops linearly with the clock speed. There's no CPU that can increase the performance of a game that is running in single digits, unless it runs entirely in software mode.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Are you parroting the general consensus or did you actually test it?

The CPU limited games mentioned here run at 20-30fps at 1280x resolutions... and are a slide show above that. Upgrading the GPU does nothing. Upgrading the CPU allows good fps at 1920x1200.

Sure, there are really low-requirement games that will get 60+ fps on a 2-year-old CPU and video card... and those will more than likely be CPU limited at their 100+ fps. But those are not the games discussed here.

Sure, in MOST games a GPU upgrade will be better, but not all.

If someone had $400 I would tell him to upgrade both, for $200 each.

If someone had only $200 I would usually tell them to upgrade the GPU, or, depending on their favorite games, the CPU. But the GPU more often than not.

If someone asked "will my 2-year-old CPU hold back my brand new 8800GTS" I will counter anyone who says "absolutely not, all newer games are GPU limited anyway so there is no reason for a gamer to upgrade the CPU"
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
Are you parroting the general consensus or did you actually test it?

The CPU limited games mentioned here run at 20-30fps at 1280x resolutions... and are a slide show above that. Upgrading the GPU does nothing. Upgrading the CPU allows good fps at 1920x1200.

Sure, there are really low-requirement games that will get 60+ fps on a 2-year-old CPU and video card... and those will more than likely be CPU limited at their 100+ fps. But those are not the games discussed here.

Sure, in MOST games a GPU upgrade will be better, but not all.

If someone had $400 I would tell him to upgrade both, for $200 each.

If someone had only $200 I would usually tell them to upgrade the GPU, or, depending on their favorite games, the CPU. But the GPU more often than not.

If someone asked "will my 2-year-old CPU hold back my brand new 8800GTS" I will counter anyone who says "absolutely not, all newer games are GPU limited anyway so there is no reason for a gamer to upgrade the CPU"

Actually we DID test it ... with a P4 all the way to a C2D ... do you want me to find it for you?

it is a *balance* ... generally upgrading the GPU has the most satisfying bang-for-buck results for gamers ... but I found many situations where a slow-ass CPU and a fast GPU simply won't cut it [period]
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: taltamir
Are you parroting the general consensus or did you actually test it?

The CPU limited games mentioned here run at 20-30fps at 1280x resolutions... and are a slide show above that. Upgrading the GPU does nothing. Upgrading the CPU allows good fps at 1920x1200.

Sure, there are really low-requirement games that will get 60+ fps on a 2-year-old CPU and video card... and those will more than likely be CPU limited at their 100+ fps. But those are not the games discussed here.

Sure, in MOST games a GPU upgrade will be better, but not all.

If someone had $400 I would tell him to upgrade both, for $200 each.

If someone had only $200 I would usually tell them to upgrade the GPU, or, depending on their favorite games, the CPU. But the GPU more often than not.

If someone asked "will my 2-year-old CPU hold back my brand new 8800GTS" I will counter anyone who says "absolutely not, all newer games are GPU limited anyway so there is no reason for a gamer to upgrade the CPU"

Any resolution beyond 1280x1024 will be more GPU bound, not CPU bound. The CPU only does scripts, A.I., collision detection, vertex indexing, etc.; the GPU is what renders the graphics on screen. Even my P4 EE 3.4GHz saw great improvements when I upgraded from an X800XT PE to an X1950XT; even though the FPS gain wasn't that great, I was able to use FSAA, higher levels of detail, etc. And any Pentium 4 is a bottleneck for a card beyond the X800XT PE.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: taltamir
He didn't test it with the games I did.
Download Lost Coast through Steam (free to anyone who owns an AMD or nVidia video card) and try it yourself.

Tell me how many FPS you get on the built-in test at 1280x resolutions with everything maxed, with and without AA.
Try the same on the same CPU but with a much faster card...
Try 1920x1200 resolution...
Try it with a faster CPU, with the older and with the newer card...

"I remember it being completely playable at low settings" is too vague, sorry.

OK, test 1 completed. 3 runs:

E2180 at 2.66 GHz. I feel this models a typical OC of a low-end CPU available today.

1600x1200, everything high, no FSAA, no wait for vsync. Average 116 frames/sec.
1600x1200, everything high, 4xFSAA, no wait for vsync. Average 115 frames/sec.
1600x1200, everything high, 6xFSAA, no wait for vsync. Average 114 frames/sec.

At no time did I experience slowdowns, lack of smoothness, jerkiness or any other *anything* which looked like 'unplayable' frame rates.

I will now do the same test at 9x200, which should give performance similar to a stock or slightly overclocked 3800 X2. Let's see what happens.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
2.66 isn't exactly low end :p Pretty much high/midrange for a non-overclocker

2.0 is low end
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
And that's why I just ran one more test:

Same settings as last test. 1600x1200, everything high, 6xFSAA, no vsync.

CPU is the same E2180, but this time at 9x200, or 1.8 GHz.

Ready for this? 94.1 average frames/sec. I could not tell the difference between this and 116. I felt no need to repeat the tests with other (lower) quality settings, since I'd bet money the results would be 94.1 average frames/sec, ±1 fps.
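Worth spelling out how sub-linear that is. A quick back-of-the-envelope on the two runs above (just arithmetic, no new data):

```python
# Back-of-the-envelope: how much of the clock cut shows up in fps?
clk_hi, clk_lo = 2.66, 1.8       # GHz, the two test clocks
fps_hi, fps_lo = 116.0, 94.1     # average frames/sec from the runs above

clk_drop = 1 - clk_lo / clk_hi   # ~32% less clock
fps_drop = 1 - fps_lo / fps_hi   # ~19% fewer frames
print(f"clock down {clk_drop:.0%}, fps down {fps_drop:.0%}, "
      f"scaling factor {fps_drop / clk_drop:.2f}")
# -> scaling factor ~0.58: partially CPU bound, far from 1:1
```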

I noticed something else -- only one core seemed to be fully loaded. Which once again confirms my theory that HL2 was targeted at a ~3GHz single-core P4, and any modern CPU -- even a low-end one -- is all you need to experience it in full glory.
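If anyone wants to check the per-core load claim with something more repeatable than eyeballing a task monitor, a script like this works (psutil is my choice for the sketch, not anything the game ships with):

```python
# Sketch: sample per-core CPU load while the timedemo runs.
# If one core sits near 100% while the rest idle, the game is
# effectively single-threaded.
import psutil

samples = [psutil.cpu_percent(interval=1.0, percpu=True) for _ in range(10)]
for core, loads in enumerate(zip(*samples)):
    print(f"core {core}: avg {sum(loads) / len(loads):.0f}% load")
```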

A craptastic video card won't let you run at 16x12 with 6xFSAA and all details high. My data point confirms the original party line: get a *decent* CPU, and the best graphics card you can afford, if your aim is to game with full-on eye candy.

This is not to say that newer games won't be CPU hungry. Flight Sim X sure is. UT3 sure is. Supreme Commander likewise. But the majority of games are targeted at midrange CPUs, and high-end video cards let you play the same game with all the eye candy cranked. While I couldn't tell the difference between 94 and 116 frames/sec, I could definitely tell the difference between no AA and 4xAA -- once again, better video providing a much better experience.

There was something hideously wrong with your old 3800 X2 box if an 8800GTS gave you no improvement over a 7900GS. 94 fps is half the frame rate BFG got with his setup, but it's still a world away from your claim of under 30 and unplayable.

BTW, I just timed NWN loading the shadow Mulsantir or whatever it is in the expansion campaign: 11 seconds from touching the portal and getting a loading screen to seeing my characters again.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Originally posted by: n7
In general, most newer games are CPU limited, but with UE3 games, you'll see CPU limitations in a hurry if you run an older CPU.
Which games?

Ever since the birth of 3D acceleration and hardware T&L, the direction of rendering has always been to off-load graphics work from the CPU. Since dual-core became the norm (and, to be frank, with the biggest impact coming from the consoles), devs started making use of more than one core. But they still don't let the CPU render a frame, because it would take forever...
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Originally posted by: BassBomb
2.66 isnt exactly low end :p Pretty much high/midrange for a non overclocker

2.0 is low end
Yeah, and imagine what kind of performance a low-end GPU (or some creepy mid-range card like the 8600 GTS) will provide.
 

JayDeeJohn

Junior Member
Aug 1, 2006
8
0
0
A while ago, almost 2 years now, they did a bench test on Athlons, using an FX-60 and ATI and nVidia cards. Using the adjustable multiplier, they found that at 2.2GHz and below, the CPU limited the graphics cards of the day, but only the high end. The high end was the X1900 XTX and the 7900 GTX. Anything above 2.4GHz was considered good enough for no bottlenecking from the CPU. The newer cards are much faster, and thus need a better CPU, though what the current cutoff point is... well, I guess we need to keep testing, or find an article.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Originally posted by: lopri
Originally posted by: BassBomb
2.66 isn't exactly low end :p Pretty much high/midrange for a non-overclocker

2.0 is low end
Yeah, and imagine what kind of performance a low-end GPU (or some creepy mid-range card like the 8600 GTS) will provide.

Don't worry, I know. When the Orange Box dropped I beat it on a 6600GT, trying to play at 1680x1050.
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
Mini-test. Crysis. E6400 @ 2.13GHz, then 3GHz. 1680x1050 res.

2.13GHz = around 23.4 FPS avg; 3GHz = around 28 FPS avg.

That's 4.6 FPS, or around 20%, just by OC'ing my CPU to 3GHz.

Actually surprised me at that res; I would've expected that at something like 1024x768. It would vary from game to game of course, with some likely to show little gain at that res or higher.
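To put numbers on it: the OC raised the clock about 41% for that ~20% FPS gain, so roughly half the extra clock showed up as frames. Quick sanity check:

```python
# Sanity check on the Crysis mini-test numbers above.
clk_gain = 3.0 / 2.13 - 1    # ~41% higher clock
fps_gain = 28.0 / 23.4 - 1   # ~20% higher average FPS
print(f"clock +{clk_gain:.0%}, fps +{fps_gain:.0%}, "
      f"ratio {fps_gain / clk_gain:.2f}")  # ~0.49: partly CPU bound even at 1680x1050
```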
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
OK, I managed to stumble across a drive with XP Pro on it. This let me run FRAPS and a few more tests. Under Windows, with the CPU at 2.66 GHz:

1600x1200, 6xFSAA, max settings: 120.4 fps.
1600x1200, 6xFSAA, max settings + FRAPS: 116 fps.
1280x1024, 6xFSAA, max settings + FRAPS: 116 fps.

So it's pretty obvious that FRAPS has an overhead of its own, roughly equivalent to the ~4% performance loss caused by running under Linux+Cedega. But the impact is not huge.

I did find something which may explain why the OP's machine was unable to show any gains with the old CPU:

I reformatted after getting the E8400 and have used two cards since...

So really this isn't an apples-to-apples comparison. There may have been a StarForce infection or any number of problems with the old OS install used to test the AMD chips which are no longer present with the new Intel CPU. It's running on a fresh install of Windows.

BTW, my frame rate performance is identical to yours. First frames on the beach are 47, the rate climbs to 57 within a second or so, stays there through the tunnel, and quickly climbs into the 200s by the time you get to the top of the hill. So for this test, at least, I'd say a budget CPU is all you need.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Crysis does indeed claim to be CPU limited.

Those are some very high numbers for your Lost Coast test.
Maybe... mmm, I wonder how much difference there is between XP Pro and Vista 64 on those tests (all mine were Vista 64; I doubt it matters much since there is a 64-bit exe).
The OS before was pretty clean, I reformat regularly... but maybe there was something. I doubt it.

Both of them were running NOD32 in the background, a RealVNC server, RivaTuner, Daemon Tools, and Second Copy. Nothing was doing "activity"... (i.e., NOD wasn't scanning and Second Copy wasn't backing up files...)

Anyways, you say you experience similar frame rates... "First frames on the beach are 47, the rate climbs to 57 within a second or so, stays there through the tunnel and quickly climbs into the 200s"... this is indeed similar to what I get with the E8400.
A few notes, though.
What video card are you using? I have been informed that 6x MSAA doesn't exist on nVidia, so if you're using that setting with an nVidia card it's just running without AA...
2.66GHz? What CPU are you using? Is it the E6750 or E6700? That one is twice as fast as an X2 3800+... and it costs $190 on the egg NOW (I bought my E8400 for $210)... the X2 3800+ costs $55 (I bought it for $130 at Fry's over a year ago... overpriced due to B&M, but I needed a working computer that day, and never bothered ordering something online and returning it)
http://www23.tomshardware.com/...2&model2=873&chart=434
http://www23.tomshardware.com/...2&model2=873&chart=420
http://www23.tomshardware.com/...2&model2=873&chart=422

E6750 vs E8400:
http://www23.tomshardware.com/...6&model2=873&chart=434
http://www23.tomshardware.com/...6&model2=873&chart=421
http://www23.tomshardware.com/...6&model2=873&chart=422


I chose those benchmarks because:
1. WinRAR is a decompression test, perfect for gauging loading time on heavily compressed games. (Lightly compressed ones will be HDD limited, but that is rare.)
2. UT and a Warhammer game are the only two games they list there, and both are at resolutions that are CPU limited, so the idea here is to see how those CPUs compare in a CPU limited case...

We see almost double the performance going from an X2 to an E6750 (mmm... I would say a 120% increase in one test and 80 percent in the other two; I am not really in the mood to do the exact math...), and about a 25% increase going to an E8400... so your CPU is much closer to my newer one than to my older one... And again, $190 today isn't budget; I wouldn't recommend it either, it's leftovers from older tech and you should get an E8xxx instead if you build a comp today... much more bang per buck... Or go for a $100 C2D.
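Since I said I'm not in the mood for the exact math, here is the lazy way to do it; the scores below are placeholders, NOT the real Tom's Hardware numbers, so plug in the chart values you want to compare:

```python
# Lazy exact math for relative CPU performance from benchmark scores.
# Placeholder scores only -- substitute the actual chart numbers.
scores = {"X2 3800+": 100.0, "E6750": 190.0, "E8400": 240.0}

base = scores["X2 3800+"]
for cpu, score in scores.items():
    print(f"{cpu}: {score / base - 1:+.0%} vs X2 3800+")
```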

EDIT: OK, I see now what you have; I didn't notice there were about 10 posts above that last one that I missed. So you have an overclocked E2180...
What was the Intel budget chip a year and a half ago? Think that would work fine?
Anyways, try it with that chip underclocked to 1.9GHz... or at its stock 2.0GHz... it's still a more expensive, better chip than the X2 3800+ Windsor F2... but I am curious about the results.

Try it at 1280xsomething and also please take note of the min framerate. When I said it was barely playable I was referring to the min frame rate dipping enough to stutter. You should see what happens when you actually try to play the game and break a box... The framerate in the flyby doesn't involve any box breaking, and the FPS really takes a nosedive when it calculates the collisions for the fragments.

EDIT: OK, I just ran some timedemos; with 4x I got 112, with 6x 142. So obviously with 6x the AA really isn't on.
The previous video card didn't even offer 6x, but this one does.
I noticed that the first run after I load the game will dip into the single digits when it goes near the rusty ship skeleton on the first beach sequence. Exiting the timedemo at that point and running it again, it will stay at 30+... Restarting the game, again the first run dips into single digits and subsequent runs are fine.

I played through; unlike with the 3800, breaking a box did nothing.
I noticed that there is occasional stutter... My FPS is around 80 in most spots while playing, but the game stutters. Due to the locations where it happens I think it might be a case where new data is being loaded... I am not certain.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: taltamir

What video card are you using? I have been informed that 6x MSAA on nvidia doesn't exist, so if using that with an nvidia card its just without AA...

The video card is an 8800GT 512M, 650 MHz core, 1900 mem, 1625 shaders.

I didn't look closely at the 6xFSAA runs (I was staring at the FRAPS counter), but the 4x runs *definitely* did AA on the first batch of tests, and frame rates were identical with or without AA. It's entirely possible that my last batch was entirely without FSAA -- but it's the video card, not the CPU, that does the FSAA work.

2.66GHz? What CPU are you using? Is it the E6750 or E6700? That one is twice as fast as an X2 3800+... and it costs $190 on the egg NOW (I bought my E8400 for $210)... the X2 3800+ costs $55 (I bought it for $130 at Fry's over a year ago... overpriced due to B&M, but I needed a working computer that day, and never bothered ordering something online and returning it)

Look above where I did just that. There was absolutely no difference in frame rates between 1280x1024 and 1600x1200, or at 1600x1200 with or without FSAA. Which tells me I'm CPU limited, but the limit is so high it doesn't matter. In fact, with 4xFSAA at 16x12 our game experience should be identical. I'd even go out on a limb and say that without FSAA the difference between 140 average fps and 100 is not perceptible.

I also did one run with the CPU clocked at 1.8 GHz -- which should match a stock 3800 X2. Same thing: started off at ~40 fps on the very first frame, climbed to the mid 50s within a second or so, and just went up from there.

EDIT: there is no stutter; the entire run is glass smooth. The lowest frame rate is seen only in the first second; it's a rock solid 57 panning to the rusty ship and going through it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
Crysis does indeed claim to be CPU limited.

Those are some very high numbers for your Lost Coast test.
Maybe... mmm, I wonder how much difference there is between XP Pro and Vista 64 on those tests (all mine were Vista 64; I doubt it matters much since there is a 64-bit exe).
The OS before was pretty clean, I reformat regularly... but maybe there was something. I doubt it.

Crysis--XP32 vs. Vista 64--Why the Performance Difference??
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I meant stutter during actual gameplay. There is 0 stutter in the timedemo. But I still have a bit of stutter while playing on my E8400 + 8800GTS 512...

Notable locations are:
1. After the chopper, looking down.
2. When getting close to the man on the docks for the first time.
3. Various other locations.

I made a FRAPS movie showing high FPS combined with stutter... (a bit exacerbated by FRAPS itself, but still similar to what I saw playing) but I haven't gotten around to putting it on YouTube or anything.
The unplayability with my old processor was due to constant stutter; I found it impossible to aim.
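If anyone wants the stutter in numbers rather than a movie: FRAPS can log frame times during a benchmark run, and a short script picks out the spikes even when the average FPS looks fine. Just a sketch; I'm assuming the log is a two-column CSV of frame number and frame end time in ms, so check what your FRAPS version actually writes:

```python
# Sketch: find stutter spikes in a FRAPS frametimes log.
# Assumes two columns per row: frame number, frame end time (ms).
import csv

with open("lostcoast frametimes.csv") as f:  # hypothetical file name
    rows = csv.reader(f)
    next(rows)  # skip the header row
    times = [float(row[1]) for row in rows]

deltas = [b - a for a, b in zip(times, times[1:])]
avg_fps = 1000 * len(deltas) / (times[-1] - times[0])
spikes = [d for d in deltas if d > 50]  # frames slower than 20 fps
print(f"avg {avg_fps:.1f} fps, worst frame {max(deltas):.1f} ms, "
      f"{len(spikes)} frames over 50 ms")
```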

Stutter is greatly reduced with the new processor.

And God, the tearing without vsync in this game is atrocious.

@apoppin, I said it *claims*; I never tested that claim. The developers said the number one limiter is the CPU...
Interestingly enough, supposedly there is lots of cheating from both nVidia and ATI in the Crysis timedemo...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
I meant stutter during actual gameplay. There is 0 stutter in the timedemo. But I still have a bit of stutter while playing on my E8400 + 8800GTS 512...

Notable locations are:
1. After the chopper, looking down.
2. When getting close to the man on the docks for the first time.
3. Various other locations.

I made a FRAPS movie showing high FPS combined with stutter... (a bit exacerbated by FRAPS itself, but still similar to what I saw playing) but I haven't gotten around to putting it on YouTube or anything.
The unplayability with my old processor was due to constant stutter; I found it impossible to aim.

Stutter is greatly reduced with the new processor.

And God, the tearing without vsync in this game is atrocious.

@apoppin, I said it *claims*; I never tested that claim. The developers said the number one limiter is the CPU...
Interestingly enough, supposedly there is lots of cheating from both nVidia and ATI in the Crysis timedemo...

If they both do it, it is not cheating ... it is called "optimization". They are both RACING each other to optimize Crysis at the same time the devs are patching it. It makes SENSE that the CryTek devs are doing what they promised: first patching and refining it for their biggest fan base - DX9 and XP - then patching it for Vista 32, and finally getting it to run perfectly on Vista 64.

Come back to me in a year when Crysis is *fixed* and then we will talk about 'cheating' :p
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I am talking about "optimizations" that help the timedemo but not the actual game...
Regular optimizations that improve performance across the board are not cheating, just improving.

And I did say "supposedly".