They had Starcraft 2 and it was showing clear GPU bottlenecking, and they didn't even use AA. If they'd dropped a GTX480 into that system they would've gotten a large performance gain, especially if they'd been using AA.

But if you play games like Civ5, SC2, Dragon Age: Origins, Far Cry 2, World in Conflict, Arma 2, or RE5 with a Q6600 + GTX480, then a Core i7 860/920 + GTX470 system will destroy it.
The point was that a GTX460 is a bottleneck even with a dual-core CPU, and that something like a GTX480 would show a much higher framerate.

They tested AvP at 1920x1080 8AF and Anno 1404 at 1920x1080 8AA! on a GTX460 768MB and then measured CPU limitation on a Core i5 @ 4.0GHz, lol! Are they nuts? A 768MB GTX460 can't play AvP with tessellation smoothly regardless of what CPU it's paired with. That would be the same as using an 8800GT at 1920x1080 Enthusiast settings in Crysis and then arguing that changing CPU speed from 4.0GHz to 2.0GHz made no difference - we already know the conclusion before testing!
A minimum framerate means squat if it doesn't have a benchmark graph putting it into perspective. Here's an example from your own sources:

Plus, no minimum framerates, which makes any CPU-limitation article worthless.
It doesn't always work like that, as Tom's article showed. If there's a significant load on the GPU you can still see higher performance even if it isn't utilized 95% or more. The same applies to the CPU, except it's the graphics card that does this far more often.

OK, now let's say I add a GTX480 to the two systems. On the C2Q @ 3.6GHz I can increase AA and resolution and still get the same 38 fps average if I want to, since I'll be transferring the load to the GPU; my CPU can still support a 38 fps average. Therefore, I'll be able to increase image quality and still maintain decent playability, or I'll get framerates faster than 38 fps on a faster CPU at the same image quality (if I don't have a CPU limitation). But if I add a GTX480 to the E6850 rig, it's still choppy and unplayable: I'm already at under 30 fps without AA, with minimums at 21!
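The reasoning above boils down to a simple min() bottleneck model: delivered framerate is capped by whichever component is slower. A minimal sketch, using the post's figures (38 fps CPU cap for the C2Q, ~27 fps for the E6850) and made-up GPU caps purely for illustration:

```python
# Toy bottleneck model: you see the framerate of the slower component.
# All FPS caps below are illustrative assumptions, not measurements.

def delivered_fps(cpu_cap, gpu_cap):
    """Delivered FPS is limited by whichever component is slower."""
    return min(cpu_cap, gpu_cap)

# C2Q @ 3.6GHz can feed ~38 fps. A GTX480 has headroom to spare, so
# cranking AA (which lowers the GPU cap) changes nothing until the
# GPU cap drops below the CPU cap:
assert delivered_fps(cpu_cap=38, gpu_cap=60) == 38   # no AA
assert delivered_fps(cpu_cap=38, gpu_cap=45) == 38   # AA added, same result

# The E6850 rig stays choppy no matter how fast the GPU is:
assert delivered_fps(cpu_cap=27, gpu_cap=60) == 27   # still CPU-limited
```

This is of course a simplification (real CPU/GPU load interacts per frame), but it captures why adding AA on the quad was "free" while a GPU upgrade couldn't rescue the E6850.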
I'm glad you're such an astute student of my benchmarking methods, Russian, LOL!

Quake 1-4
Doom 3
Far Cry 1
Serious Sam 2
Fear 1
Quake Wars
HL2
Unreal Tournament 99, 2004
Call of Duty 1, 2
Return to Castle Wolfenstein
Medal of Honor Pacific Assault
Next I dropped a GTX480 into my system, and at the same settings it gave a 27.43% performance gain (57.60 FPS).
Next I re-tested my GTX480 with an i7 870 (14% clock advantage over an i5 750) and I got zero performance gain.
And the GTX480 has provided a far larger performance gain than all of my CPU upgrades combined.
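For what it's worth, those two quoted numbers can be sanity-checked against each other: a 27.43% gain landing at 57.60 FPS implies a baseline of roughly 45.2 FPS on the previous card. A quick back-of-the-envelope check (the baseline is derived, not a figure from the post):

```python
# Sanity-check the quoted benchmark numbers.
upgraded_fps = 57.60   # GTX480 result quoted above
gain = 0.2743          # 27.43% performance gain

# Implied framerate before the upgrade:
baseline_fps = upgraded_fps / (1 + gain)
print(round(baseline_fps, 1))  # → 45.2

# Round-trip: applying the gain to the baseline recovers the GTX480 number.
assert abs(baseline_fps * (1 + gain) - upgraded_fps) < 1e-6
```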
I'm sorry if this has already been mentioned, but I'm not reading this thread. I saw they were testing a 768mb card at 1920x1200@8xAA and stopped reading. This isn't testing CPU/GPU bottlenecks, they are testing CPU/VRAM bottlenecks. Most useless article ever.
They aren't testing with the wrong graphics card at the resolution they are using.
Actually their test is very sensible. Take a graphics card positioned at a certain resolution, and test it at that resolution.
http://www.techpowerup.com/img/10-09-06/27b.jpg
They paired it with a sensibly priced CPU, the sort you might see in a GTX460 system, which is the easiest way to see which component you should think of spending more on (~$200 GPU vs ~$200 CPU).
The article doesn't really help anyone who plans on upgrading, imo. Most people who are looking to upgrade the CPU are already using dual cores (the C2D 1.86-3.0GHz variety, E6300 to E8400) or low-end quads at stock speeds (such as the Q6600/Q6700). These people are wondering whether upgrading the GPU from their 4850/4870/GTX260 is worth it, or whether they'll be bottlenecked by their slower CPUs. In other words, it would be completely wasteful to get a GTX480 for a stock E6600/Q6600, as such systems would produce almost identical framerates with a slower GTX460/5850 videocard.
From that perspective, the article did little to help these users decide what to upgrade. It would have been far better to see various systems, such as a C2D 1.86, a C2D 3.0GHz, a Core i3/i5 @ 4.0GHz, and Athlon X4s paired with a 4850, compared against the same CPUs with a GTX460/480/5870, and then SLI/CF setups tested too. Then we would have seen which GPUs are wasteful for which CPUs, and what the minimum modern CPU clock speed/core count is for modern games (not just the FPS variety, either).
Plus, you can't compare an i5 dual-core processor to a dual-core Phenom or C2D due to the differences in performance per clock and the effects of the shared 8MB cache. And like I said, they didn't include minimums in most of their graphs - the CPU plays a large role in minimum framerates.
This article would have been great if Xbitlabs, LegionHardware, PCGamesHardware and TechSpot hadn't already produced far superior CPU/GPU articles. However, since the results from those websites constantly contradict the predominant view on our forum that CPU speed is not important, I only see Toyota, myself and a handful of others linking to them (with BFG on many occasions ignoring results from all four of those websites, because they show both CPU-frequency and core-count dependence in a large variety of games, and because they focus on minimum framerates - a metric BFG largely dismisses as 'inaccurate').
So we have 4 independent sources which continue to show that CPU speed is important and 1 source that shows that it isn't (on top of that using a $130 videocard paired with a $200 CPU to prove their point). It's almost the same as Wreckage trying to find 1-2 outlier benchmarks where a stock GTX460 beat an HD5870 and then claiming that GTX460 is as fast as an HD5870. Bottom line is, every game is different. The games one plays should be tested separately in order for us to answer if CPU or GPU is more important for a particular game.
Again, what is so surprising about a GTX460 768MB being the bottleneck of a Core i5 @ 3.0GHz system in games tested at 1920x1080 with 4/8AA? They mysteriously omitted testing a GTX480/5970, which would have shown the importance of the CPU.
:$: my i7 920 is now clamoring for a hotter girlfriend. thanks a lot.
The game was a poorly coded POS, basically filled with memory leaks.
It still choked every 5 seconds, since you have to change camera angles and perspectives all the time in the environment and in combat.
I am not so concerned with the data as I am with the dangerous advice that can come out of it. I just don't want to see someone looking to build a system to last 2-3 years go out and buy a Core i3-560 3.33GHz ($150) + GTX460 1GB ($200) over a Core i5 750/1055T ($200) + GTX460 768MB ($150) because of that article. This is exactly the type of advice I fear, because it will result in the person upgrading both the GPU and the CPU 2 years from now. The i3 system will be basically worthless, while a 3.8GHz+ 750/1055T will still be usable 2 years from now. You know what I mean? We have seen firsthand single-core A64 owners suffer this fate, and soon we will see all dual-core owners suffer the same one.
I can't even count how many times I have seen people who initially bought an E6600/E6850 upgrade to a Q9550 to ride this generation out with new GPUs. On the AMD side, at least you can swap in a new CPU, since you can get an X4 940 for $100, for example. However, with the price gap between a decent dual-core and a quad-core being only about $50-70 now, it is too much of a gamble to get a dual-core when a lot of games already benefit from quads (Starcraft 2, DA:O, Supreme Commander 2, World in Conflict, Civ5, GTA IV, Resident Evil 5, Mass Effect 2). Just my 2 cents on the matter.
I'm sorry if this has already been mentioned, but I'm not reading this thread. I saw they were testing a 768mb card at 1920x1200@8xAA and stopped reading. This isn't testing CPU/GPU bottlenecks, they are testing CPU/VRAM bottlenecks. Most useless article ever.
Yep, this is what I saw. I didn't see anywhere in the conclusion of the article where a dual-core processor was being recommended over a 3- or 4-core one, so I'm not sure what all the fuss is about. They recommended that the optimum number of cores was 2.75, so anybody building a rig today would reasonably get 3+ cores, right? The only thing they should have clarified is that HT makes little to no difference in gaming, so don't assume that 2 cores + HT ≈ 3 cores.
I think you went a bit overboard comparing bfg to wreckage. BFG has his opinion. for once I actually disagree with him, but that was like comparing keys to rollo b/c they're both focus group members.
yeah, I moved away from that level of CPU 2 years ago this month. it certainly won't get the job done when running decent settings in many modern games. even in games where it is perfectly playable, it will still noticeably limit any modern mid-range or better GPU.

I've been finding myself CPU limited lately to the point that I can no longer run several current games. I've got an Opteron 165 overclocked to 2.6ghz.
I've been finding myself CPU limited lately to the point that I can no longer run several current games. I've got an Opteron 165 overclocked to 2.6ghz.
yeah, the Opty at 2.6 would be about like an E6400.

Was that a dual core? IIRC it has a 25% IPC deficit to the original Core 2 Duo, which puts you between an E6300 and an E6400 Conroe. Yep, I'd say you need to upgrade your CPU. But you were able to get quite a bit of life out of it... That ah heck is darn near 4? years old.
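The clock-for-clock comparison works out like this. Note the 25% IPC deficit is the poster's recollection ("IIRC"), so treat the result as a rough estimate:

```python
# Rough Core 2 "equivalent clock" for an Opteron 165 @ 2.6GHz,
# assuming the quoted ~25% IPC deficit versus Conroe (an assumption,
# not a measured figure).
opty_clock_ghz = 2.6
ipc_deficit = 0.25

core2_equiv_ghz = opty_clock_ghz * (1 - ipc_deficit)
print(round(core2_equiv_ghz, 2))  # → 1.95
```

1.95GHz lands between the E6300 (1.86GHz) and E6400 (2.13GHz), consistent with the comparison above.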
I think you went a bit overboard comparing bfg to wreckage. BFG has his opinion. for once I actually disagree with him, but that was like comparing keys to rollo b/c they're both focus group members.
Yeah, that is totally not cool.
Nah, I guess I didn't make myself clear. I wasn't comparing BFG to Wreckage. I was comparing Tom's reviewer using extreme AA settings in his game testing to predictably "prove" that most games are GPU limited to Wreckage's extreme examples of a GTX460 beating an HD5870. Tom's review automatically assumed that people will slap on high AA + tessellation whenever possible (as in Metro 2033, AvP, Just Cause 2), even if it means sacrificing all playability in those games (50-60 fps).
Tom's never contemplated that some gamers may want 60+ fps at reduced or no AA. In those gaming situations CPU speed does matter, because a slower-clocked CPU often can't get you 60 fps regardless of whether you disable all the filters. Also, I am pretty sure the $130-150 GTX460 768MB card is not targeted at 1920x1200 8AA.
Yeah, the problem is I would have to buy new memory and everything, so I'm holding out.

Was that a dual core? IIRC it has a 25% IPC deficit to the original Core 2 Duo, which puts you between an E6300 and an E6400 Conroe. Yep, I'd say you need to upgrade your CPU. But you were able to get quite a bit of life out of it... That ah heck is darn near 4? years old.
Yeah, the problem is I would have to buy new memory and everything, so I'm holding out.
My GPU is outdated as well. I'm waiting a bit, and then I'm going to get an entirely new computer.
This is true.

Gaming gets old though... For the first time in my life, I don't really care about upgrading my computer. I have no plans to upgrade my GTX 280. Heck, I don't even have plans to set up my desktop after the move. This laptop I use, with an SU7300, handles what I need too, even King's Bounty: Crossworlds for when I decide that gaming is not so boring, only to find out that it is boring again.
Gaming gets old though...