
THG FINAL stress test update...

With the AMD platform you've only got one option, which fortunately is stable: combining it with a motherboard based on NVIDIA's nForce4 SLI chipset.

Heh, I first read "unfortunately is stable" lol.

When multiple applications are running, the clear conclusion is that the Intel Pentium 840 Extreme Edition is superior to the AMD Athlon 64 X2 4800+.

We all know that "multiple applications" means only 4 threads, one with a low priority. Running 2-3 applications, 4 applications with equal priority, or 5 or more applications at the same time apparently isn't called "multiple applications", so they didn't test any of those situations...
 
A few comments...

1. HP has announced that they will have X2-based workstations very soon (weeks). They have also announced new Turion laptops and budget Opteron servers. Sun has followed suit with their soon-to-be-announced $895 Opteron server... IBM isn't far behind (they have already announced their new Opteron blade products).

2. The current X2s are low-end workstation chips and surpass the dual-Xeon workstations in performance/price/power. The Pentium D series are budget desktop chips. Comparing the prices of the two is like comparing the price of a Sempron and a P4EE...

3. THG chose exactly 4 apps (as opposed to 1-3 or 5+) based on Intel's "suggestions", I am sure. This is the configuration where HT-enabled chips are at their very best (i.e. any other configuration wouldn't have even been THIS close).

4. Intel is promoting platforms, but THG did not allow the AMD chip to use its best platform. It seems to me that both systems should have been allowed to run at their best.

5. Tom Pabst hasn't run THG for years... it has been run by the self-proclaimed Intel fanboi Omid Rahmat. In addition, most of the old controversial articles (the Intel 1.13 GHz scandal, the Rambus problems, etc.) were actually written in whole or in part by Van Smith.
 
1. With Intel you get better support... because you'll need support, given the instability of the Intel dual cores (sarcasm).

2. HP and Sun offer dual-core Opterons, which are essentially X2s (with more HT links), so I think availability is not an issue.

3. And yes, only people at THG would call the Pentium XE w/HT vs. X2 4800+ with 4 apps running a victory for Intel... actually it was a tie: AMD wins 1, Intel wins 1, and they are very close in the other two (for non-SLI)... with SLI, AMD wins 3 of 4, or if you're an Intel fanboi, AMD wins 2, ties 1, and loses 1 (DivX).

4. And is HT a good thing??? We would have found out if they had run 3, or 5 or more, apps/threads. But they didn't.

That's THG.
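Points 3 and 4 above can be made concrete with some arithmetic: with two physical cores plus Hyper-Threading, the 840 XE presents four logical CPUs to Windows, so four equal-priority CPU-bound apps fill it exactly one-to-one, while the X2's two cores are already double-booked at that load. A toy sketch of that arithmetic (the core/thread counts match the chips' public specs; the "oversubscription" ratio itself is just an illustration):

```python
# Toy model: how many CPU-bound apps fit onto each chip's logical CPUs.
# Core/thread counts are the chips' public specs; the rest is illustrative.

CHIPS = {
    "Pentium XE 840": {"cores": 2, "threads_per_core": 2},      # dual core + HT
    "Athlon 64 X2 4800+": {"cores": 2, "threads_per_core": 1},  # dual core, no SMT
}

def oversubscription(chip, n_apps):
    """Apps per logical CPU; above 1.0 the scheduler must time-slice."""
    spec = CHIPS[chip]
    return n_apps / (spec["cores"] * spec["threads_per_core"])

for n in (2, 3, 4, 5):
    xe = oversubscription("Pentium XE 840", n)
    x2 = oversubscription("Athlon 64 X2 4800+", n)
    print(f"{n} apps: XE {xe:.2f} apps/logical CPU, X2 {x2:.2f}")
```

At exactly 4 apps the XE's logical CPUs are filled one-to-one (1.00) while the X2 sits at 2.00; at 3 or 5 apps that neat alignment disappears, which is why the choice of exactly 4 looks so convenient.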
 
Why the hell do you people all assume it's the fault of the Intel CPU and mobo here? It's not. Tom's Hardware can't get anything working while ALL other review sites can; that alone says plenty about the reliability of their test results. Yet you people decide to believe the Intel part of the story (the crashes and everything), when in reality it's because they can't get it working.
 
Originally posted by: IntelUser2000
Why the hell do you people all assume it's the fault of the Intel CPU and mobo here? It's not. Tom's Hardware can't get anything working while ALL other review sites can; that alone says plenty about the reliability of their test results. Yet you people decide to believe the Intel part of the story (the crashes and everything), when in reality it's because they can't get it working.

Then how come they got the AMD system working without any problems? And how do you explain a motherboard literally baking its voltage regulators due to heat? That's not the fault of the processor? And it's not just one motherboard that failed. Not only that, NVIDIA is not known for making defective chipsets.

I guess the lesson is, if you want an Intel dual-core box, you'd better hire an Intel technician to help you. And forget about SLI while you're at it.
 
Originally posted by: Capt Caveman
Originally posted by: yacoub
Only a retard is going to suggest the Intel system is the better option for businesses to run when the most important thing to businesses - stability/reliability - was the least proven on the Intel platform. Remind me again how many parts and pieces they had to keep switching in and out before they got the system up and running (kinda) stable? No IT department would put up with such a hairy piece of crap. They would buy the solution that's essentially stable out of the box, which in this case was clearly the AMD system. It performed just as well and cost less: added bonuses on top of its stability.

So the conclusion is that Tom is a retard.

My only guess is that you work at home, are in school, or are not involved in IT in a business setting. Most companies buy particular models from particular vendors (i.e. IBM, HP, Dell). They do this b/c of the synergies of supporting a smaller number of models from a reputable vendor, plus product and support contract pricing. None of these companies sell X2s out of the box. And businesses aren't going to be buying X2 out-of-the-box systems from Alienware or Velocity Micro.

Just like how everyone is criticizing the testing done by THG, businesses aren't going to look at it either. It's been widely known that the NVIDIA nF4 Intel chipset isn't mature, and no business is going to use it. From reading other forums, folks don't seem to have these problems with their Dell Pentium 840EE systems or ones using Intel chipset motherboards.

Unless you're a tiny company that would consider building its own X2 systems, the X2 isn't an option yet in the business world.

The problem here is that you discuss the choice of a system based on availability, while THG based theirs on a test. So while you might be right about Intel being more present on the market (remember, Opterons are available from IBM and HP), THG was wrong to give the Intel platform any credit based on that test.
 
The Intel system is the clear winner, cry as much as you want...
Both Intel and AMD released their dual-core products for the multitasking multimedia segment. This is where AMD clearly loses... no one in this world needs a dual-core processor to play Far Cry... any normal buyer would grab the Pentium M 770. The same goes for the single-app benches. If there's only one app to run (which doesn't even support multithreaded processing), why not just buy a cheaper single-core processor?

I agree that THG did some crap before and during the test, but after all, the conclusion isn't far from reality.
 
Originally posted by: Duvie
Also question why the Intel P D 840 has the same TDP as the XE 840... Common logic and truth say it shouldn't, and that is why Intel exceeds the TDP in their white papers with CPU power consumption...
Actually, AMD gives all processors with the same core the same TDP too. I'm not saying that Intel's TDP numbers aren't misleading, but the TDP was never meant to indicate the maximum power used by a specific chip. Rather, it's a design spec to let motherboard and heatsink manufacturers know what they need to provide for. The TDP is there to help people design components that will be compatible with a whole line of processors, so it's not surprising that more than one CPU has the same TDP.
 
A previous poster said it best. Sigh

Every review I have seen of multitasking (heavy multitasking) shows the Pentium 840EE with a slight (slight) lead in performance. When it came to almost all single-threaded programs (games) or multithreaded programs, it was domination by the AMD CPUs. If this review had been "fair" they would have at the very least kept SLI running on the AMD machine, because that was stable. If the Intel machine wasn't stable in SLI, that isn't AMD's fault. It was a ridiculous performance ad campaign disguised as a stability test. Stability = AMD, period, according to this test. Cooler and less electricity as well.

How come people don't take into account electric bills over the course of two years in their CPU purchase? Over 100 watts of difference is a lot, especially if the computer runs continually. That would certainly eat into the price difference, LOL.
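The electric-bill point is easy to put rough numbers on. A quick sketch (the 24/7 duty cycle and the $0.10/kWh rate are illustrative assumptions, not measured figures):

```python
# Rough extra electricity cost of a 100 W power-draw difference.
# Duty cycle and electricity rate are illustrative assumptions.

def extra_cost(delta_watts, hours_per_day, days, rate_per_kwh):
    """Cost of drawing delta_watts more for the given period."""
    kwh = delta_watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

# 100 W difference, machine on 24/7 for two years, at $0.10/kWh:
print(f"${extra_cost(100, 24, 365 * 2, 0.10):.2f}")  # about $175 over two years
```

Even at half-time use that's a meaningful fraction of the price gap between the chips.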
 
Originally posted by: Ycon
The Intel system is the clear winner, cry as much as you want...

No it isn't. Read Anand's custom multitasking test at the end of the X2 multitasking review, located here:

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2410&p=9

If you read that comparison, the 840 Pentium D and 840EE trade places as the winner in different tests (EE wins tests 1, 3, and 4; D wins both segments of 2; X2 wins gaming multitasking).

Given the fact that the X2 makes a strong showing in all of those tests (unlike the D and EE, which trade places a bit), I'd have to say that the X2 was more reliable as a multitasker than the EE (look at it falter in multitasking test 2). The X2 also walks away with all the single-threaded and multithreaded benchmarks, as well as the multitasking synthetics.

Tom's test seems to have been configured to make the EE look good. They had to run 4 CPU-intensive apps at once in order to accomplish this, which is pretty stupid. Even by Tom's own admission, the X2 wins in tests running two intensive apps at once, which is precisely what any AMD or Intel zealot would want from their dual-core CPU. Never mind the fact that the XP scheduler has always favored HT chips...

any normal buyer would grab the Pentium M 770.

Huh? Do you know how few people actually run Pentium Ms outside of the notebook segment? Do you really think a "normal" user would run out and buy a relatively expensive mobile CPU, pop it in a now-dated motherboard with the help of an adapter, and overclock the hell out of it to justify the purchase? I think not. Are you going to tell us that normal users have phase-change cooling too?

I like the Dothan as much as anyone else, but it is not currently a viable desktop CPU for "normal" users.

I agree that THG did some crap before and during the test, but after all, the conclusion isn't far from reality.

The problem that Tom's Hardware has is that they have no real sense of reality. Apparently that problem is not exclusive to Tom's Hardware.
 
Originally posted by: Otter
Originally posted by: Duvie
Also question why the Intel P D 840 has the same TDP as the XE 840... Common logic and truth say it shouldn't, and that is why Intel exceeds the TDP in their white papers with CPU power consumption...
Actually, AMD gives all processors with the same core the same TDP too. I'm not saying that Intel's TDP numbers aren't misleading, but the TDP was never meant to indicate the maximum power used by a specific chip. Rather, it's a design spec to let motherboard and heatsink manufacturers know what they need to provide for. The TDP is there to help people design components that will be compatible with a whole line of processors, so it's not surprising that more than one CPU has the same TDP.

Not true...

The 820 is 115w and the 830 and 840 are 130w, as well as the 840 XE... The XE has HT on, and we all know running with HT on does generate more heat... we have known this for some time... I would not say the XE 840 is the "same" as the other chips...

A big reason Intel's TDP numbers are known to be suspect is that retail coolers, when people do buy them, appear to have "little to no" headroom... often leading to thermal issues if the person has decent to below-average case cooling... very sad IMO...
 
I've been reading the forums over at THG and some of their veteran members are rather angry that THG would post such a biased article. Some of them are disgusted and are talking about not even associating themselves with THG.
 
Originally posted by: Duvie
Originally posted by: Otter
Originally posted by: Duvie
Also question why the Intel P D 840 has the same TDP as the XE 840... Common logic and truth say it shouldn't, and that is why Intel exceeds the TDP in their white papers with CPU power consumption...
Actually, AMD gives all processors with the same core the same TDP too. I'm not saying that Intel's TDP numbers aren't misleading, but the TDP was never meant to indicate the maximum power used by a specific chip. Rather, it's a design spec to let motherboard and heatsink manufacturers know what they need to provide for. The TDP is there to help people design components that will be compatible with a whole line of processors, so it's not surprising that more than one CPU has the same TDP.

Not true...

The 820 is 115w and the 830 and 840 are 130w, as well as the 840 XE... The XE has HT on, and we all know running with HT on does generate more heat... we have known this for some time... I would not say the XE 840 is the "same" as the other chips...

A big reason Intel's TDP numbers are known to be suspect is that retail coolers, when people do buy them, appear to have "little to no" headroom... often leading to thermal issues if the person has decent to below-average case cooling... very sad IMO...

What do you mean "not true"? The numbers you gave support what I said. TDP is a mobo and heatsink spec, and does not necessarily reflect the heat from a specific CPU. It should reflect the power consumed by the hottest CPU the manufacturer intends to release for the spec.

The problem with Intel's TDP numbers is that they are lower than the power consumed by the hottest CPU for the spec by a good margin. While AMD leaves room for Murphy, Intel claims their CPUs are so fast no one will run them at 100% long enough to cause them to overheat (or at least they were a year or so ago when I dug around to find out what the TDP numbers from both companies actually meant). This of course results in thermal throttling if your cooling solution can't remove a lot more heat at reasonable temperatures than the TDP. It may also lead to instability if the mobo manufacturer was naive enough to trust Intel's numbers and the circuit powering the CPU is overloaded.

That the TDP is the same for several Intel processors says nothing about the validity of the number. OTOH, that the processor uses quite a bit more power at load and will overheat without cooling that far exceeds the TDP says everything that matters about the validity of that number.
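The disagreement above boils down to what a TDP-rated cooler can actually do. A cooler is characterized by its thermal resistance (degrees C per watt), and the TDP is the wattage it must dissipate while keeping the die inside its temperature budget; if the chip draws more than the TDP at load, the margin goes negative. A sketch of that check (the temperature limits are illustrative placeholders; 130 W is the published TDP mentioned in the thread and 145 W is the load draw a poster estimates):

```python
# TDP as a heatsink design spec: the cooler's thermal resistance (degC/W)
# must be low enough to dissipate the TDP within the die's thermal budget.
# Temperature figures are illustrative placeholders, not Smithfield specs.

def required_thermal_resistance(t_die_max, t_ambient, tdp_watts):
    """Max cooler thermal resistance (degC/W) that still meets the TDP."""
    return (t_die_max - t_ambient) / tdp_watts

# Cooler designed against a 130 W TDP, 70 degC die limit, 40 degC case air:
r_design = required_thermal_resistance(70, 40, 130)
print(f"cooler must be <= {r_design:.3f} degC/W")  # ~0.231 degC/W

# If the chip actually pulls 145 W at load, that same cooler lets it reach:
t_actual = 40 + r_design * 145
print(f"die reaches about {t_actual:.1f} degC")  # over the 70 degC limit
```

With a TDP set at or above the real full-load draw, as the post argues it should be, the margin stays positive and a TDP-rated cooler is sufficient; understate it and the chip throttles or destabilizes the board's power circuitry.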
 
Originally posted by: Otter
Originally posted by: Duvie
Originally posted by: Otter
Originally posted by: Duvie
Also question why the Intel P D 840 has the same TDP as the XE 840... Common logic and truth say it shouldn't, and that is why Intel exceeds the TDP in their white papers with CPU power consumption...
Actually, AMD gives all processors with the same core the same TDP too. I'm not saying that Intel's TDP numbers aren't misleading, but the TDP was never meant to indicate the maximum power used by a specific chip. Rather, it's a design spec to let motherboard and heatsink manufacturers know what they need to provide for. The TDP is there to help people design components that will be compatible with a whole line of processors, so it's not surprising that more than one CPU has the same TDP.

Not true...

The 820 is 115w and the 830 and 840 are 130w, as well as the 840 XE... The XE has HT on, and we all know running with HT on does generate more heat... we have known this for some time... I would not say the XE 840 is the "same" as the other chips...

A big reason Intel's TDP numbers are known to be suspect is that retail coolers, when people do buy them, appear to have "little to no" headroom... often leading to thermal issues if the person has decent to below-average case cooling... very sad IMO...

What do you mean "not true"? The numbers you gave support what I said. TDP is a mobo and heatsink spec, and does not necessarily reflect the heat from a specific CPU. It should reflect the power consumed by the hottest CPU the manufacturer intends to release for the spec.

The problem with Intel's TDP numbers is that they are lower than the power consumed by the hottest CPU for the spec by a good margin. While AMD leaves room for Murphy, Intel claims their CPUs are so fast no one will run them at 100% long enough to cause them to overheat (or at least they were a year or so ago when I dug around to find out what the TDP numbers from both companies actually meant). This of course results in thermal throttling if your cooling solution can't remove a lot more heat at reasonable temperatures than the TDP. It may also lead to instability if the mobo manufacturer was naive enough to trust Intel's numbers and the circuit powering the CPU is overloaded.

That the TDP is the same for several Intel processors says nothing about the validity of the number. OTOH, that the processor uses quite a bit more power at load and will overheat without cooling that far exceeds the TDP says everything that matters about the validity of that number.

First of all, I didn't say that wasn't the case, and I very well know that, since I had this discussion in PMs with an Intel engineer who works on these about 2 weeks ago...



The numbers I am showing are trying to prove the point that no way should the XE 840 and the 840 D have the same TDP... I am saying the XE and the 840 D are not the same, and anyone with half a brain would assume that having HT enabled for both cores would and should mean a higher TDP... even if all it is is a spec for mobos and HSFs to be designed around... Enabling 2 sets of HT, with their millions of additional transistors generating heat, is a no-brainer... From the results I am seeing, the TDP should have been like 145w, but since they like to be "conservative", at least say 140.

I guess we will just agree that Intel's numbers are misleading...

I agree AMD's numbers are easier to compare since, other than the added cache (which can generate more heat as well), they are relatively identical... or at least more so than the Intel chips...
 
Originally posted by: Duvie
I guess we will just agree that Intel's numbers are misleading...
No question about that.

Personally, it seems to me that Intel could use the same TDP for a CPU with HT, larger cache, higher clockspeed, onboard waffle iron, etc., as long as the TDP was higher than the power used by the hotter CPU at full load. Then the heatsinks would keep the CPUs cool, mobos would not undervolt or overheat, and maybe Tom's would even be able to get their Intel setup to work. If Intel is going to build power-hungry monsters, they should just publish realistic specs so that the other manufacturers can properly support the CPUs. Whether you believe that Intel should have given the XE a different TDP than the D, or you just think Intel refuses to admit how hot their chips run, the real problem is that their TDP is unrealistic. It wouldn't surprise me if it was too low for the 840 D too.

It occurs to me this might have been going on for quite a while. The last Intel CPU I owned was a 486 DX2-50 that came in an NEC laptop. It ran hot enough to burn my legs, and I had to send that computer back to NEC three times. I always blamed NEC for that. I thought they were slow to get their minds around the idea that a laptop might actually need a CPU fan. But maybe they were slow to realize that Intel was lying to them, and when the new CPUs came out, they were stuck with a computer that couldn't easily be modified to take a fan.
 