I parroted the claim that all modern games are GPU limited, but real-world experience tells me it's false

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I have been parroting the claim that all new games are GPU limited...
But I recently got the chance to run Half-Life 2 on an X2 3800+ with a 7900GS and with an 8800GTS 512MB, and on an E8400 with the same 7900GS and 8800GTS 512MB.

At max settings with no AA/AF at 1280x800, I was getting 20-30 FPS on the 3800+ CPU with both video cards... (less than 5 FPS faster on the GTS)
On the E8400 with the 7900GS I was getting ~30 FPS at 1920x1200...
On the E8400 + 8800GTS 512MB at max settings, WITH 6x MSAA and 16x anisotropic filtering (which I had on trilinear before), I am getting 35-55 FPS in the early beach scene, and then 70-150 FPS elsewhere... the early beach part of the test varied by only about 10 FPS across all the other configurations...
I use fraps to measure... with forceware beta 169.28 for all tests.

I tried it with and without vsync; it gets the same FPS either way (35-55, 40 FPS average) in the early beach part. On my first try I got a solid 55-58 FPS at the beach, but several repeated tests now show the exact same results with or without vsync... 40 FPS, drops to 35, goes up to 55... drops to 40... then climbs into the hundreds (or 60 with vsync on) when the timedemo gets away from the beach and onto the dock...
I have now tested all the configurations multiple times... and except for that one vsync pass that got abnormally high FPS at the beach, they were all the same. I think I know what it is... there is one configuration field that shows up blank (color correction). I tried changing it, but the drop-down window for it gets clipped, so I couldn't see what I was doing... that was after the first vsync pass, and before the passes without vsync and the extra passes with it. So I must have changed color correction without realizing it, which dropped my early FPS. It's very annoying that it does that (clipping the color correction field and showing it blank) in ALL the HL2 games I tried (Lost Coast, Deathmatch, HL2 original, and Portal)... I reformatted after getting the E8400 and have used two cards since... and no matter what, it always gives me that invisible color correction setting in all HL2 games...
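Since the numbers above came from eyeballing the fraps counter, per-run stats are easier to compare from fraps' frametimes log. A minimal sketch, assuming the log layout fraps' benchmark logging produces (a header row, then one row per frame with a cumulative millisecond timestamp in the second column); the filename is whatever your run produced:

```python
import csv

def fps_stats(path):
    """min/avg/max FPS from a cumulative-millisecond frametimes CSV."""
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader]
    # consecutive timestamps -> per-frame milliseconds
    deltas = [b - a for a, b in zip(times, times[1:])]
    fps = [1000.0 / d for d in deltas if d > 0]
    return min(fps), sum(fps) / len(fps), max(fps)
```

Point it at the frametimes CSV from a benchmark run and you get min/avg/max for the whole pass instead of whatever the on-screen counter happened to show at the moment you looked.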



Back to topic, this is hardly what I would call GPU limited...

Also, my load times for Neverwinter Nights 2 levels went down from over a minute to around 10 seconds. (I also tested with an X2 6400+ downclocked to 6000+ speeds, because I did NOT know it doesn't come with a fan and was running it with the stock fan from the 3800+... anyway, it was 20 seconds or so compared to over a minute, despite being clocked about 50% faster in MHz... go figure.)

Now I am not saying the CPU is everything; obviously there is a huge benefit from the 8800GTS... but only when paired with a capable CPU. There has to be a balance... and the constant claim I hear on the forums and elsewhere, which I have parroted myself, is obviously incorrect. A modern CPU is definitely needed to get the benefit of a modern GPU.


For those that are having trouble following the upgrade path I took...
1. Started with a year-old X2 3800+ and a 7900GS.
2. Bought an X2 6400+ Black Edition.
3. Found out that it does NOT have an unlocked multiplier like the Brisbane Black Edition; it just ships without a fan... underclocked it to 6000+ speeds using the 3800+'s stock cooler... bought an expensive cooler... it then worked at stock, but wasn't perfectly stable, and I wasn't satisfied with the speed, so I returned the CPU for a refund and kept the cooler.
4. Bought an eVGA 8800GTS at Fry's... the sticker melted right off the video card and it got damaged. Returned it for a refund.
5. Bought a second eVGA 8800GTS at Fry's (this one was 15-20C cooler at idle with the same stock fan speed of 29%), modded the config file in RivaTuner to recognize it, and upped the fan speed from 29% to 52%...
6. Bought an E8400 and a new mobo to go with it.
7. Saw that I could buy it on Newegg for $90 less (I know Fry's gouges, but I shopped around and Newegg was the cheapest of 10 online stores! go figure). Figured the Fry's convenience wasn't worth THAT much... returned it for a refund and ordered an XFX card online for its transferable warranty, so it will sell better on eBay when I decide to upgrade (too many woes with the eVGA Step-Up, and it might take too long until the next card I want is released)... Spent a week using the 7900GS on the E8400 system until the 8800GTS 512MB from the online order arrived...

So there you have it... I managed to test some interesting combinations of CPUs and GPUs thanks to all this, and that is how I got all those results.

I noticed identical performance between Lost Coast, the HL2 core game, and Portal... so obviously sharing the same engine counts... with NVIDIA now offering HL2: Lost Coast free to all NVIDIA card owners (and ATI/AMD having done the same for a while now), there is not a single person who can NOT get that game and benchmark it... :p so go for it.

PS. It says "64bit mode" when I run the game... but it will not let me choose DX10 in the options... does anyone know how to get that to work?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The Source engine is quite dated so it's not really ideal to test GPUs.

See the tests in my sig.

|
|
|
v
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
A game that is well over 3 years old, and whose tech is effectively even older thanks to its frequent release date setbacks, is not what I'd call a "modern" game.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
DiRT is heavily CPU limited. Source games have always been CPU limited, at least since the GF7/X1800 series.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Taltamir, all modern games become GPU limited "eventually". I think modern CPUs have more than enough juice these days to avoid becoming the limiting factor in most games.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,046
549
136
"Also my load times for neverwinter nights 2 levels went down from over a minute to around 10 seconds"

I was under the impression that load times were always more CPU than GPU dependent.
Q. What part of the system is responsible for decompressing the data and textures and placing them into memory?
A. The CPU.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Obviously loading times are CPU dependent; what I meant was that FPS isn't the only measure of the gameplay experience... faster load times might be even more important in some cases...

Also, if an OLDER game is limited by a year-old CPU... basically, it depends on the game. Saying GPU, GPU, GPU is a bit unfounded, as it is a combination of both.
Upgrading from a 7900GS to the second fastest GPU in the world should have increased performance... but it only did so after the CPU was upgraded to a high-end part released a month ago.
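That pattern is exactly what a simple frame-cost model predicts. As a rough sketch (the per-frame millisecond costs below are invented for illustration, not measurements), treat each frame as costing the slower of the CPU's and GPU's work:

```python
# Toy bottleneck model: a frame takes max(cpu_ms, gpu_ms), so a faster
# GPU only raises FPS once the CPU side stops dominating the max().
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

slow_cpu, fast_cpu = 40.0, 12.0   # hypothetical ms of CPU work per frame
old_gpu, new_gpu = 35.0, 8.0      # hypothetical ms of GPU work per frame

print(fps(slow_cpu, old_gpu))  # 25.0 -> old pairing
print(fps(slow_cpu, new_gpu))  # 25.0 -> GPU upgrade alone changes nothing
print(fps(fast_cpu, new_gpu))  # ~83.3 -> CPU upgrade unlocks the new card
```

With a slow CPU, swapping the GPU term changes nothing because the max() is still the CPU; that matches the barely-5-FPS difference the GTS made on the 3800+ above.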
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I don't think NWN2 has anywhere near the load times for me that it does for you... I'll check tonight if I get a chance and report back.

*edit - Game is installed but I can't find the disc, so I didn't get to check. :(
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
If you are continuing to bench, try some newer games. Hell, even Oblivion is newer than HL2, and at 1920x1200 you will notice the 8800GTS is a massive improvement over the 7900GS.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
with an X2 3800?

the GTS DOES show a massive improvement over the 7900GS with an E8400 in HL2... it doesn't show any improvement with a 3800+, because it's just too CPU limited at that point. That is why a balance is needed. Saying that it's only the GPU is misleading; it's both.

With NWN2 you should know that load times are greatly affected by your quality settings... with everything on lowest, but at 1920x1200 resolution, I was getting 20-second load times on the 3800+... turn the quality up, though, and suddenly the load times tank.

Lost Coast also takes about 15 seconds to start now, instead of a minute or so.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Actually, we did quite a few tests on this about a year ago with P4s and X1950 Pros, and even a 2900 XT and an 8800GTS 640.

If anyone doubts that a CPU can *choke* a graphics card no matter the detail settings, try a 2.4GHz P4... when we did our own tests, an overclocked P4 [mine was an "EE" @ 3.74GHz] did a heck of a lot better than a stock one with the X1950 Pro, never mind the 2900 series.

Generally, an X2 3800+ is too slow to take full advantage of a 2900-series or old 8800GTS-series card unless it is really overclocked. Even at maxed settings with [in-game] AA/16xAF, the minimum FPS would often lag [in STALKER, for example] on my P4 where it would fly on my C2D with the same GPU.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: taltamir
Obviously loading times are CPU dependent; what I meant was that FPS isn't the only measure of the gameplay experience... faster load times might be even more important in some cases...

Faster load times are more dependent on hard drive performance than anything else, I think... up to a certain point, obviously. For example, a while back I remember getting smoother performance in Far Cry from a 74GB Raptor than from a 40GB SATA hard drive (the Raptor is much faster).
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
BFG pretty much ended this thread with his post. Of course a really slow CPU will hold you back, but the fact remains that I won't see much of a difference (if any) playing games at 4GHz on my E8400 vs. its stock 3GHz, when my 8800GT is going to hold me back.

The whole point behind "modern games are GPU limited" is that the fastest CPUs available today are NOT the bottleneck of a gaming system's performance. If you knew what that statement meant, you'd understand it doesn't mean you can just buy the fastest GPU and throw it into any old system no matter the processor.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: taltamir
with an X2 3800?

the GTS DOES show a massive improvement over the 7900GS with an E8400 in HL2... it doesn't show any improvement with a 3800+, because it's just too CPU limited at that point. That is why a balance is needed. Saying that it's only the GPU is misleading; it's both.

With NWN2 you should know that load times are greatly affected by your quality settings... with everything on lowest, but at 1920x1200 resolution, I was getting 20-second load times on the 3800+... turn the quality up, though, and suddenly the load times tank.

Lost Coast also takes about 15 seconds to start now, instead of a minute or so.

You play NWN2 at 19x12? What quality settings do you use for NWN2 now with the 8800GTS 512? How are your FPS?
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: taltamir
with an X2 3800?

The X2 3800+ is slow enough to become the bottleneck in most new games, especially ones that aren't multithreaded. If a game is only using one core, you pretty much have a 2GHz A64, which by now is 4-5 year old tech.

It's especially bad with a 38xx, because they drop to 2D clock speeds when they aren't being stressed enough, like when you're CPU bottlenecked, which makes the framerate really bad :(
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I can run virtually any game at 1920x1200, many with 2x or 4x AA, on my 8800GTS 320MB with an Opteron 165 @ 2500MHz.

I'd say that the 3800+ is the lower limit at this point, but once overclocked it's a decent CPU with some advantages over a more modern C2D (memory latency, thanks to the on-die memory controller).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: bunnyfubbles
BFG pretty much ended this thread with his post. Of course a really slow CPU will hold you back, but the fact remains that I won't see much of a difference (if any) playing games at 4GHz on my E8400 vs. its stock 3GHz, when my 8800GT is going to hold me back.

The whole point behind "modern games are GPU limited" is that the fastest CPUs available today are NOT the bottleneck of a gaming system's performance. If you knew what that statement meant, you'd understand it doesn't mean you can just buy the fastest GPU and throw it into any old system no matter the processor.

That's not how everyone is using it. Again and again I see people giving the advice to throw a new 3870X2 or 8800GTS into an older X2 or E2xxx system, telling people that the CPU practically doesn't matter.

Heck, even reviews give the impression that games are largely GPU limited...

Not that games are largely GPU limited ONLY with the fastest CPUs on the market.

For all we know, quad CrossFire and tri-SLI performance is CPU limited (at least partially), and we can't tell until some faster CPU comes along.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: taltamir
That's not how everyone is using it. Again and again I see people giving the advice to throw a new 3870X2 or 8800GTS into an older X2 or E2xxx system, telling people that the CPU practically doesn't matter.

That's because those people are asking about gaming upgrades and talking about getting an 8600GT or 2600XT.

If you have a so-called "mid-range" card (8600/2600 or older), you are pretty much guaranteed to be GPU limited. That's why the standard recommendation is to move to a higher-range graphics card (e.g. 8800GT/3800). Sure, at some point a graphics upgrade will leave the CPU as the limiting factor; only in an ideal case would GPU and CPU be perfectly matched.

Heck, even reviews give the impression that games are largely GPU limited...

Well, most reviews use Core 2 Extreme processors at ultra res/detail because they are trying to create GPU-limited scenarios that are stressful enough to highlight differences between high-end cards.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Well, Flight Simulator X is CPU limited; add a quad core and it makes a world of difference. It all depends on what kind of data is being processed in a thread.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
All the more reason it shouldn't be CPU limited... as a pretty old game it should be limited by NOTHING and give me 60+ FPS on everything... the fact that it doesn't just further proves my point...
 

the unknown

Senior member
Dec 22, 2007
374
4
81
Older games and lower resolutions will always be "CPU bound" because they're just not using the GPU to its full power. You're kind of missing the point, though. This is what always happens on forums when people ask about upgrades: "I want more fps, blah blah, what should I upgrade with a budget of $2xx?" That's where people will always recommend a new GPU over a CPU. Even if they only have a C2D E2xxx or some X2 3800+ or lower, it's always better to recommend a new GPU for best results. They will see a huge jump in frames and image quality; upgrading the CPU would only increase the frames by single digits.

There are exceptions of course, but it's probably only when you're already getting 80+ fps, or in a game that's really heavily CPU dependent.

And if you check charts of CPUs paired with the same GPU, you'll see small increments of fps regardless of how much you overclock. It's clear then that the game is GPU bound. That's the case with all "modern" game engines, especially ones that aren't as refined as the HL2 engine. HL2 is the exception, not the rule.
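That chart-reading rule can be written down as a quick sketch. The 10% threshold and the sample clock/FPS pairs below are my own assumptions, not from any published chart: scale the CPU clock with the same GPU and check whether FPS follows.

```python
# If FPS barely moves as the CPU clock scales (same GPU), the game is
# GPU-bound; if FPS tracks the clock gain, it's CPU-bound.
def bottleneck(clock_fps_pairs, threshold=0.10):
    (c0, f0), (c1, f1) = min(clock_fps_pairs), max(clock_fps_pairs)
    clock_gain = c1 / c0 - 1.0   # fractional CPU clock increase
    fps_gain = f1 / f0 - 1.0     # fractional FPS increase
    # FPS gain far below the clock gain means the GPU is the limiter.
    return "GPU-bound" if fps_gain < threshold * clock_gain else "CPU-bound"

print(bottleneck([(2.0, 58), (3.0, 60)]))   # prints "GPU-bound"
print(bottleneck([(2.0, 30), (3.0, 44)]))   # prints "CPU-bound"
```

The first pair of data points mimics an overclocking chart where a 50% clock bump buys almost nothing; the second mimics one where frames scale with the clock.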
 

AzN

Banned
Nov 26, 2001
4,112
2
0
It depends entirely on the game... how low a resolution you play at and how powerful a card you have factor in as well.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
I noticed that when playing rFactor, my GPU barely gets above idle temps even at 1920x1200, whereas in The Witcher it gets about 25C above idle... another example of a CPU bottleneck, or no?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: the unknown
Older games and lower resolutions will always be "CPU bound" because they're just not using the GPU to its full power. You're kind of missing the point, though. This is what always happens on forums when people ask about upgrades: "I want more fps, blah blah, what should I upgrade with a budget of $2xx?" That's where people will always recommend a new GPU over a CPU. Even if they only have a C2D E2xxx or some X2 3800+ or lower, it's always better to recommend a new GPU for best results. They will see a huge jump in frames and image quality; upgrading the CPU would only increase the frames by single digits.

There are exceptions of course, but it's probably only when you're already getting 80+ fps, or in a game that's really heavily CPU dependent.

And if you check charts of CPUs paired with the same GPU, you'll see small increments of fps regardless of how much you overclock. It's clear then that the game is GPU bound. That's the case with all "modern" game engines, especially ones that aren't as refined as the HL2 engine. HL2 is the exception, not the rule.

And it's the game in which a $300 upgrade to an 8800GTS 512MB over a 7900GS showed no benefit...
In fact, HL2 is just the game where I had benchmarks to back it up. I noticed there was only a marginal improvement with the GTS over the 7900GS in many other games, like The Witcher etc... and everyone told me I was on crack, because it is supposedly a card that is several times faster...

The CPU was the limiting factor.
If someone says "I have $300 to upgrade an X2 3800+ with a GF79xx," everyone will parrot "get a video card, the benefit is amazing compared to upgrading the CPU"... and they will be wrong.
The benefit of upgrading the CPU far outweighs it... and the CPU gives you a speedup in non-gaming situations too (which is added value, but does not change the recommendation).

I have both the benchmarks I made and the personal usage experience to back that up.