Socket 939 Sempron found........


porkster

Member
Mar 31, 2004
141
0
0
Well, let's work out each factor, one by one.

* Who agrees that it took Tom correctly setting up the Intel system to see it run crash-free?

This question is aimed at the many readers who think the Intel system has had three reboots with the hardware configuration it is currently running.

.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,264
16,119
136
OK, you are changing the subject again... After they went through 8 motherboards and removed SLI (it won't run stable on any Intel mobo), YES, it's now apparently stable, while putting out a ton of heat at over 130 watts more than the AMD system. So what's your point? We have been over this 100 times! I have to go to bed and get up in the morning and make a living!
 

Aenslead

Golden Member
Sep 9, 2001
1,256
0
0
Originally posted by: porkster
Originally posted by: Aenslead
It's very disappointing to see those power requirements, the failure of 4 motherboards, and the heat dissipation.

Sorry, but you're misguided. The Intel system hasn't crashed once, yes, that's ZERO crashes, since they built the system properly.

The scores on the live pages at THG are tainted values, as he forgot to reset them after rebuilding the Intel system following their mistakes.

.

Misguided? Well... it said there that 4 motherboards have died on them... nVidia nForce4 Intel Edition. Yes? Yes.

Then it said they needed a much better cooler, because the one originally bundled pushed temps up to 69°C. That's a lot, isn't it? You need a *larger, better fan*. Yes? Yes.

They changed the power supply because the other one simply wasn't able to keep up with the power requirements. Yes? Yes.

So why am I misguided? It's OK to like Intel... just don't try to justify it without a basis... and if your argument is that it hasn't crashed since it was "correctly" set up... well, that sucks, because the AMD system got it right on the very first try.

I'm not bashing Intel, nor saying that AMD is waaay better, nor anything of the like, but that Intel system looks like it needs a lot of attention to get right. Or so the page says. Care for some guidance?
 

Aenslead

Golden Member
Sep 9, 2001
1,256
0
0
Originally posted by: porkster
Originally posted by: HDTVMan
THG is biased toward Intel. The rumor is he got a CPU bonus from them. So instead of doing legit benchmarking he skews the result in favor of his buddies at Intel.

Don't be so ridiculous. Germany is the location of the AMD plant making the X2s, and THG has AMD banners all over the site. There are no indicators that he's biased towards Intel; if anything, more so towards AMD.

If he hated AMD so much, he would have finished the stress test properly for all to see the results.

.

Well... I've been around for about 10 years already... and somehow I've always seen/felt that THG has a strong leaning towards Intel processors. They always seem to cast Intel's products in a better light than is warranted.

For example, when the A64 came out, most of the benchmarks on the web showed the A64 winning, but THG added an overclocked Intel P4 so it could stay on top, even if it took an overclock to get there. It's those little things that give that "bias" impression.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: porkster
Originally posted by: ProviaFan
would have to be extremely dense to not comprehend that the reason everyone is sticking with X86 is primarily due to the large installed base of software. To pass this off as a "fault" of AMD would require a level of stupidity not previously seen in mankind.
I was making one point about why the market is restricted by AMD's presence. There are many more.
And you were wrong.
Rambus (was good RAM),
"Ok" RAM, horrible company. Intel dumped them after DDR got fast enough. Explain that one.
And BTX is only "necessary" because Intel can't keep their thermal dissipation under control. I'd like for you to clearly explain how AMD must blindly follow everything Intel does even when it's not in their best interest, or the best interest of all of their customers.
PCIe (AMD getting now),
Both had it at about the same time, case dismissed. Next?
DDR2 (AMD moving towards now).
AMD went from SDR to DDR. Intel has gone from SDR to RAMBUS to DDR to DDR2 already. Not that there's anything wrong with either approach, but I do appreciate the AMD way, since I haven't had to swap RAM each time I've upgraded. I won't expect you to post any proof of how AMD's "failure" (notice the quotation marks, that indicates sarcasm) to adopt DDR2 has hindered anything in the market, because you can't, because such "proof" does not exist.
Enough of this, so don't reply please.
Yeah, we shouldn't reply, because we might ask you to back up some of these baseless statements. Oops. ;)

 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
Don't post about THG anymore. When one site gets different benchmarks than a dozen others, it's apparent the guy sold out to Intel for free hardware. It's a shame; I used to like his site, but now I just don't trust it. He is rumored to be selling out in video cards now and taking video game money too. THG is a sellout. Various sites have posted about his funky testing procedures. He is about as truthful as Apple is in their benchmarks, making his site look more pathetic with each article posted. I'm betting his hits have gone down tremendously, since he is becoming the laughing stock of the tech websites. The sad part is he is misleading people on their potential purchases.
 

Aenslead

Golden Member
Sep 9, 2001
1,256
0
0
Originally posted by: ProviaFan
And BTX is only "necessary" because Intel can't keep their thermal dissipation under control. I'd like for you to clearly explain how AMD must blindly follow everything Intel does even when it's not in their best interest, or the best interest of all of their customers.
PCIe (AMD getting now),

I have high expectations for what a nice, cool BTX setup can do for AMD. If the cooling does get better, we can truly imagine quiet computing for most of us.
 

michal1980

Diamond Member
Mar 7, 2003
8,019
43
91
Originally posted by: porkster
I work for intel, and am an intel brown noser. All I have to say is
AMD=BAD
INTEL=GOOD

and that's the bottom line: no ifs, ands, or buts, take-backs, etc., double-stamped with no stamp-breakers or take-backs

 

michal1980

Diamond Member
Mar 7, 2003
8,019
43
91
Double post. Seriously, porkster, you look stupid. You have no good facts.

On one hand you say Tom's study is the worst ever.

On the other hand you say Tom's study proves AMD sucks and Intel rules.

Can you make up your mind? Is the study good or bad?

And really, the Intel system is not 'great' at all; I have never seen a chip fry so many motherboards. I mean, the voltage regulators on one gave up the ghost. The Intel chip uses twice the power and sucks up way more juice.

Briefly put, if the P4 design were so good, Intel would keep using it. Instead they're switching to the P-M design, because they realize that the P4 is not good. In fact, without HyperThreading, I'm assuming the AMD chips would smoke Intel in every test.

Oh, and guess what: Tom, your god (or devil), is going to run a test comparing an X2 vs. any of the other non-HyperThreading Pentium Ds.

So hmm, let's see: I'm willing to put 100 bucks down that in an Intel Pentium D test just like Tom's, vs. an X2 (let's say one chip below the top, or whatever; there are four of each, so an equivalent level), outside of the one HyperThreading chip, the AMD will win.

And I really don't want your money, just for you to never post again.
 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
By the way, Intel dumped Rambus because no one was buying into Intel's proprietary propaganda. Just like no one bought into IBM and MicroChannel. No one likes to be led into proprietary lock-in, except those who use Sony products for some reason.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: Aenslead
I have high expectations for what a nice, cool BTX setup can do for AMD. If the cooling does get better, we can truly imagine quiet computing for most of us.
I have no problems with BTX; my problem is with those who blame AMD for not jumping on the BTX bandwagon just because it's the "latest and greatest." Compared to the way Intel moved right on to PCI-E and DDR2, it almost seems like AMD is ignoring BTX (they're not, I know, but that's not what I said).

Besides, after purchasing a US$130 case (no PSU included in that price) a few years ago, I'm in no hurry to replace it just to keep up with the latest tech, when my "old" case continues to work splendidly.
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
If he isn't being dishonest, like with everything else he posts, I would say I knew it all along... now if Dothan were only man enough to admit it.....
 
Opteron

Jun 10, 2005
39
0
0
Originally posted by: ProviaFan
my problem is with those who blame AMD for not jumping on the BTX bandwagon just because it's the "latest and greatest."
If a company that has 90% of the PC market can't make BTX a success, then how the hell can AMD make any difference?
It's the OEMs that were not interested.

Those dummies should blame Dell, if anyone.
 

Aenslead

Golden Member
Sep 9, 2001
1,256
0
0
Originally posted by: ProviaFan
Originally posted by: Aenslead
I have high expectations for what a nice, cool BTX setup can do for AMD. If the cooling does get better, we can truly imagine quiet computing for most of us.
I have no problems with BTX; my problem is with those who blame AMD for not jumping on the BTX bandwagon just because it's the "latest and greatest." Compared to the way Intel moved right on to PCI-E and DDR2, it almost seems like AMD is ignoring BTX (they're not, I know, but that's not what I said).

Besides, after purchasing a US$130 case (no PSU included in that price) a few years ago, I'm in no hurry to replace it just to keep up with the latest tech, when my "old" case continues to work splendidly.

I agree.

The true advantage is that we can have something very similar to what Apple has in their PowerMac G5 boxes. The cooling in those machines is simply impressive, and I believe it's the correct approach: many high-density, low-RPM fans cooling the whole system, instead of one large, noisy fan for the CPU/video card.
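A rough way to see why that approach can also be quieter: under the usual fan affinity rules of thumb, airflow scales roughly linearly with fan speed while noise rises much more steeply, so splitting the same total airflow across several slower fans tends to win even after the extra sources are summed. The sketch below is only an illustration under those assumed scaling rules; the 40 dBA baseline, the 50*log10 noise rule, and the even per-fan split are assumptions, not measurements of any real cooler.

# Back-of-envelope comparison: one fast fan vs. several slower fans moving
# the same total airflow, using standard fan affinity rules of thumb:
#   airflow ~ RPM (roughly linear), noise ~ +50*log10(speed ratio) dB,
#   and +10*log10(n) dB when n identical noise sources are combined.
# The 40 dBA single-fan baseline is an assumed, illustrative figure.
import math

def estimated_noise_db(n_fans, single_fan_noise_db=40.0):
    """Total noise when n identical fans split the airflow that one fan
    would otherwise deliver alone at full speed."""
    speed_ratio = 1.0 / n_fans                        # each fan spins slower
    per_fan = single_fan_noise_db + 50 * math.log10(speed_ratio)
    return per_fan + 10 * math.log10(n_fans)          # sum the n sources

if __name__ == "__main__":
    for n in (1, 2, 3, 4):
        print(f"{n} fan(s): ~{estimated_noise_db(n):.1f} dBA for the same total airflow")

By that estimate, three or four slow fans come out dramatically quieter than one screamer for the same airflow, which is roughly the trade-off those G5-style enclosures exploit.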

That's very appealing.
 
Opteron

Jun 10, 2005
39
0
0
Originally posted by: Duvie
http://www.hothardware.com/viewarticle.cfm?articleid=682&cid=1

It appears that this review site has the same TT heatsink with heatpipes supplied to them with the 4800+... It is likely Tom's is fvcking up again... It is getting downright ridiculous now....

It wasn't before this? :D

They can't say anymore that it is difficult to tell those coolers apart; just look at them, HUGE difference.

http://koti.welho.com/pnystro2/somepics/cooling3.jpg

http://koti.welho.com/pnystro2/somepics/testbench.jpg

I honestly believe that they have tried to make X2 crash, seriously.
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: AnandThenMan
Is this the cooler THG is using?

That's the one..... Hahahaha... those fvckers... I borrowed one like that from Mark with my A64... It was nice, but nowhere near what it could be....

The 70mm x 15mm fan is pathetic, and is very much the reason it needs to run in the high 4000s RPM to push enough air; even then it is likely only in the 30 CFM range....

Well, at least they were stupid for both of them....


The real question is: was this the cooler that AMD bundled with all of their test boards??? Or did they get an engineering sample without a cooler and so they ASSumed what it would be.....

I guess we need to ask sites like Anandtech, TechReport, Xbitlabs, etc..... I believe all of them got bundles sent to them from AMD....

I bet the temps with the heatpipe cooler could be in the 49-50°C range, if not lower...
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: Opteron
Originally posted by: Duvie
http://www.hothardware.com/viewarticle.cfm?articleid=682&cid=1

It appears that this review site has the same TT heatsink with heatpipes supplied to them with the 4800+... It is likely Tom's is fvcking up again... It is getting downright ridiculous now....

It wasn't before this? :D

They can't say anymore that it is difficult to tell those coolers apart; just look at them, HUGE difference.

http://www.xtremesystems.org/forums/attachment.php?attachmentid=32389

http://www.hothardware.com/image_popup....e=big_platform_3.JPG&articleid=682&t=a

I honestly believe that they have tried to make X2 crash, seriously.



It is still a lower-profile fan, but likely an 80mm fan and not the very small 70mm....
 
May 30, 2005
142
0
0
The funny thing about Intel's HyperThreading is that, with all the other stuff they did to the NetBurst architecture, they ended up negating most of the advantages of HyperThreading. It would be far more powerful if the chip had been designed to work out the potential chip-level bottlenecks. But no: Intel's long pipelines, combined with branch mispredictions and instruction-prefetch mechanisms that produced errors at the hardware level, forced them to add a "replay" mechanism whose sole job is to correct the code in execution so that it works correctly.

It seems to me that Intel's philosophy with NetBurst was to scale to clock speeds as high as possible, and to use HyperThreading, branch prediction, instruction prefetch and the like to improve the efficiency of the processor; that is, make it do more work while under "full" load. The problem is, their plans completely backfired, and that is what forced Intel to include correcting mechanisms for branch misprediction and erroneous instruction prefetch; correcting mechanisms that increase die size and heat and slow the processor down. As far as I can tell, there's nothing that can fix or speed up a slow hyperthread except an architecture redesign. But that's not all: not only do the Intel processors deal with these issues, but the disadvantages of the correcting mechanisms for branches and prefetching stack up to make HyperThreading even less efficient than it would be otherwise.
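To put a rough number on the pipeline-depth argument: if a mispredicted branch costs roughly a full pipeline refill, the average penalty per instruction grows with pipeline depth, and a deeper-but-faster design can give a good chunk of its headline clock advantage back in flushes. The sketch below is only a toy model; the stage counts, clocks, branch frequency, predictor accuracy and base IPC figures are assumed for illustration and do not describe any real chip.

# Toy model: how branch-misprediction flushes erode the advantage of a
# deep, high-clock pipeline versus a shorter, wider one.  Every number
# below (stages, clocks, branch rate, accuracy, base IPC) is an assumption.
def effective_giga_ips(clock_ghz, pipeline_stages, base_ipc,
                       branch_fraction=0.20, predictor_accuracy=0.95):
    """Billions of instructions per second after charging each mispredicted
    branch roughly one full pipeline refill."""
    flush_cycles_per_instr = branch_fraction * (1 - predictor_accuracy) * pipeline_stages
    cycles_per_instr = 1.0 / base_ipc + flush_cycles_per_instr
    return clock_ghz / cycles_per_instr

if __name__ == "__main__":
    # Hypothetical short/wide design at a lower clock vs. long/narrow at a higher clock
    short_wide = effective_giga_ips(clock_ghz=2.4, pipeline_stages=12, base_ipc=1.5)
    long_narrow = effective_giga_ips(clock_ghz=3.6, pipeline_stages=31, base_ipc=1.0)
    print(f"short, wide pipeline : ~{short_wide:.2f} G instructions/s")
    print(f"long, narrow pipeline: ~{long_narrow:.2f} G instructions/s")

Under those assumed numbers the 50% clock advantage is more than wiped out, which is the same shape of result as those early P3-vs-P4 benchmarks.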

So, it's no wonder that Intel's abandoning NetBurst. The way I see it, Intel had a very good idea with NetBurst, but it didn't match reality. Their simulations were likely faulty if they "verified" that NetBurst would improve the Pentium line. The ironic thing is, they would likely be much more competitive if they had just stayed with the architecture they used in the Pentium 3; remember the early benchies where the slower P3 beat the then-new, faster P4.

Then there's Intel's Itanium, where Intel cleverly thought to market a very expensive chip that would cost their clients millions in code rewriting and ended up giving marginal improvements. All these moves did was give AMD the opening they needed to expand into the market, and AMD used every bit of that opening they could. Case in point: Opterons for servers that run 32-bit and 64-bit code side by side, with lower cooling costs and without costing millions in rewritten code. Then the Linux community seemed to develop a fondness for AMD when they developed their 64-bit OSes last year. But we should remember that AMD moved to 64-bit computing in 2003 specifically to gain market share in the lucrative server market; what's happening now in the personal-computer realm is the result of that push. AMD has also had the foresight to keep multi-core processors in mind for some time now, and those likely wouldn't be here yet if Intel were our only player. They are also

This brings me to the reason why Intel is losing ground, technologically and in the market, while AMD is gaining ground in both. Intel has lost sight of what made them successful: giving the software community what it wanted. Instead of serving the market's desires, Intel is trying to control them by pushing all this stuff that doesn't really add much benefit while at the same time making it more inconvenient to adopt. Case in point: Itanium, a new motherboard for Intel's dual-core, constantly changing RAM support, etc. AMD is becoming more successful because they took exactly the approach Intel abandoned, offering more of what people want while making upgrading relatively painless. It is true that AMD used to be the x86 clone maker, but they've finally broken ground to become truly innovative in their technology and perceptive of market needs and desires. In the meantime, Intel is chasing perfectionist pipe dreams that turn out to be just that. Unless Intel gets their act together, AMD's technology is going to be superior for some time to come, and that technology is also going to be spot-on with what people want. This is true for 32- to 64-bit computing, true for the move to multi-core processors, true for support of memory standards, and true for not forcing developers to waste millions writing code that works only on one proprietary processor for only marginal improvements.

This probably deserves its own thread, but I thought I'd place it here. In fact, I'm going to place it in its own thread so it's easy to find; I find Anandtech's thread-scrolling to be rather dodgy.