Originally posted by: Scholzpdx
Absolutely. I liked reading that article.
But then I look at my rig and think my buying choices are justifiable.
Originally posted by: Scoop
"Overclock results with a single card configuration provided little benefit in most cases. Simply overclocking the video cards resulted in better performance numbers than overclocking the processors with this configuration."
They tested with reasonable settings and look at the conclusion. I hope this once and for all kills the myth of improving framerates with CPUs in a single-GPU system. I'm not saying that a 2GHz dual core does the same job as a 2.66GHz Core i7, but it's pretty obvious now what you need to make your system GPU bound in games.
Originally posted by: chizow
Originally posted by: Scoop
"Overclock results with a single card configuration provided little benefit in most cases. Simply overclocking the video cards resulted in better performance numbers than overclocking the processors with this configuration."
They tested with reasonable settings and look at the conclusion. I hope this once and for all kills the myth of improving framerates with CPUs in a single-GPU system. I'm not saying that a 2GHz dual core does the same job as a 2.66GHz Core i7, but it's pretty obvious now what you need to make your system GPU bound in games.
There are plenty of benchmarks showing that a single GPU is still very dependent on CPU speed:
GTA4 - 13 CPU round-up
COD4 + GRiD - Intel CPU Clock for Clock Comparison @ 2GHz
COD5 - 12 Intel and AMD CPUs
Far Cry 2 - various speeds
Left 4 Dead - various speeds
There are even more results showing multi-GPU solutions based on the fastest single-GPUs are even more CPU bottlenecked, even up to 2560 with AA in some titles. This is only going to get worse with the next generation, as it's been clear for generations that GPU performance is accelerating at a faster rate than CPU performance or game requirements.
The next-generation GT300 and RV870 should come close to doubling their current-gen counterparts, meaning the multi-GPU versions will be even further ahead of today's cards, but they're going to show smaller returns unless we get faster CPUs or much more demanding games on the level of Crysis.
I don't think you understand the point of testing at 1280; it's to show that if 30FPS on an Athlon X2 or slower C2D isn't enough, your FPS aren't going to get any better at higher resolutions or with AA, they're only going to get worse. It also shows that if you're getting 60-70FPS at a low resolution with a single GPU, and you see many reviews showing similar FPS with a multi-GPU set-up with the same CPU, you may not see more performance in the form of higher FPS, only in the form of more AA or higher resolutions at similar FPS.
Originally posted by: Scoop
Like I said, AT tested with reasonable settings. Those tests you provided were all done @ 1280x1024 without AA/AF so of course a GPU like GTX 280 is going to be left hungry. I seriously doubt many people game at that resolution with a GTX 280. Or if they do, they have problems.
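A quick way to see what the two posters above are arguing about is a minimal sketch of the frame-time model implied here, assuming a frame can finish no faster than the slower of the CPU's and GPU's per-frame work. Real drivers pipeline and buffer frames, and every millisecond figure below is made up purely for illustration:

```python
# Toy frame-time model: a frame can't finish faster than the slower of the
# CPU's and GPU's per-frame work. All millisecond figures are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the CPU needs cpu_ms and the GPU needs gpu_ms per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 16.0                        # assumed CPU cost per frame (~60 FPS ceiling)
GPU_MS = {
    "1280x1024, no AA": 8.0,         # assumed light GPU load at low resolution
    "2560x1600, 4xAA": 30.0,         # assumed heavy GPU load at high resolution
}

for setting, gpu_ms in GPU_MS.items():
    base = fps(CPU_MS, gpu_ms)
    gpu_oc = fps(CPU_MS, gpu_ms / 1.2)   # 20% faster GPU
    cpu_oc = fps(CPU_MS / 1.2, gpu_ms)   # 20% faster CPU
    print(f"{setting}: {base:.0f} FPS stock, {gpu_oc:.0f} with +20% GPU, "
          f"{cpu_oc:.0f} with +20% CPU")

# At 1280x1024 the CPU's 16 ms is the ceiling, so only the CPU overclock moves
# the number (62 to 75 FPS); that is what low-resolution testing is meant to expose.
# At 2560x1600 with AA the GPU dominates, so only the GPU overclock helps (33 to 40 FPS).
```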
Originally posted by: chizow
I don't think you understand the point of testing at 1280
Originally posted by: Scoop
Like I said, AT tested with reasonable settings. Those tests you provided were all done @ 1280x1024 without AA/AF so of course a GPU like GTX 280 is going to be left hungry. I seriously doubt many people game at that resolution with a GTX 280. Or if they do, they have problems.
Again, what do you think happens when you throw another monster of a GPU, like the GTX 285, into the mix? It stops scaling because it's just meant to fool people? The "reality" of it is, faster GPUs do require faster CPUs, otherwise there's very little difference between a fast single-GPU and multi-GPU. In order to show any benefit from multi-GPU you'd need a faster CPU to increase frame rates, a higher resolution/more AA to push GPU limits, or more demanding games to test.
Originally posted by: Scoop
True, I don't. Not with a monster of a GPU like GTX 280. It's not reality. It just fools people into thinking it makes a difference. Seriously, many people just look at the results and go 'WOW, man I gotta OC my system to 5GHz so I can play Crysis maxed out'. That's why I don't like it. It's misleading.
Props to AT once again on the Gigabyte MB tests for going with reality instead of trying to get some differences out of the boards just for the sake of it.
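The multi-GPU half of the exchange fits the same toy model. Assuming (optimistically) that a second GPU halves GPU frame time, and again using made-up per-frame costs, the gain collapses onto the CPU ceiling unless the CPU also gets faster:

```python
# Same toy model, now comparing one GPU vs. two GPUs with and without a faster CPU.
# Assumes ideal 2x multi-GPU scaling and hypothetical per-frame costs.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 16.0    # assumed CPU cost per frame (~60 FPS ceiling)
GPU_MS = 20.0    # assumed single-GPU cost at the tested resolution/AA

single = fps(CPU_MS, GPU_MS)
dual = fps(CPU_MS, GPU_MS / 2)                  # second GPU, perfect scaling
dual_fast_cpu = fps(CPU_MS / 1.3, GPU_MS / 2)   # second GPU plus a 30% faster CPU

print(f"one GPU: {single:.0f} FPS, two GPUs: {dual:.0f} FPS, "
      f"two GPUs + faster CPU: {dual_fast_cpu:.0f} FPS")

# 50 to 62 to 81 FPS: the second GPU only raises FPS up to the CPU ceiling.
# Past that point the extra GPU headroom has to go into more AA or a higher
# resolution at roughly the same FPS, unless the CPU also gets faster.
```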
Originally posted by: nRollo
The main problem facing these new Phenoms is the lack of an installed base of compatible motherboards for people who might buy them as upgrades.
If a person has an LGA775 motherboard (most people do), this pricing and these numbers won't convince them to pull that motherboard and get a Phenom board and Phenom.
If a person needs a new motherboard, the majority will buy Intel on name alone. The high-end people, or people thinking long term, will buy i7, as AMD has no answer.
CrossFire provides no incentive, as Intel boards offer it as well.
So AMD is left with the same problem they've always had with Phenom: a very, very narrow market of people who will buy their products for any reason other than "I want to support AMD".
This is a big problem for AMD. These are the chips they needed to launch as the original Phenoms to succeed in my opinion. At this stage of the game I think it's too little, too late.
I have had no problems with my Phenom 9850BE, and think Phenoms are good processors. Hopefully I'm wrong about Phenom II's chances, because a market without Phenoms will likely spell higher prices across the board on CPUs.
For the record, I've supported AMD with my own purchases, and by convincing friends to try them, since the 486 days.
Originally posted by: thilan29
Originally posted by: nRollo
The main problem facing these new Phenoms is the lack of an installed base of compatible motherboards for people who might buy them as upgrades.
As I've said before, for desktops yes, but for servers there's already a large installed base of Opterons, and the new ones are drop-in replacements and very competitive against anything Intel is offering.
Originally posted by: thilan29
http://www.anandtech.com/mb/showdoc.aspx?i=3506
Seems the Phenom is fairly competitive.
Originally posted by: Phynaz
When was the last time you heard of any company with significant server purchasing power doing CPU upgrades on commodity x86 equipment?
Originally posted by: taltamir
Competitive with WHAT? Prices change all the time. Don't tell me it's a "good buy", tell me which Intel CPU you think it equates to.
Originally posted by: thilan29
Originally posted by: Phynaz
When was the last time you heard of any company with significant server purchasing power doing CPU upgrades on commodity x86 equipment?
I obviously don't know which companies would and wouldn't buy equipment; I was simply stating that nRollo's statement doesn't necessarily apply to servers.
Originally posted by: taltamir
Competitive with WHAT? Prices change all the time. Don't tell me it's a "good buy", tell me which Intel CPU you think it equates to.
Competitive with whichever CPUs they were comparing it to (in multi-GPU gaming)...what else could I have meant? I'm not telling you to buy anything. If it fits your budget and makes sense, then buy it. At the time I bought mine, it was cheaper than a comparable 775 system would have been. Don't think that just because I have one I'm telling you to go buy one...that's nowhere near what I said.
Originally posted by: nRollo
Intel has 72% of the server market in Q4 2008
AMD having a 28% installed base in a market that is by definition much smaller does not change my position on the subject.
Originally posted by: thilan29
Originally posted by: nRollo
Intel has 72% of the server market in Q4 2008
AMD having a 28% installed base in a market that is by definition much smaller does not change my position on the subject.
That 72% is Intel's share of only Dell and HP servers. Also it says right in the article:
"It should be noted that J.P. Morgan Securities does not precisely track corporate PC suppliers, such as Lenovo Group in the U.S., as well as channel vendors."