How could BD pull AMD up?


Accord99

Platinum Member
Jul 2, 2001

PreferLinux

Senior member
Dec 29, 2010
Intel reported the TDP of Nehalem chips as 130W. But if you actually do the math yourself, TDP = Vcore x amperage = 1.37 x 100 = 137W. So Intel point-blank lied about TDP, and has done so for a long, long time, which is why AMD came up with ACP to counter it. When you're talking about a 140W max TDP on an AMD chip, the chip usually isn't reaching that under normal or heavy load conditions. I tried a lot, but I couldn't find the bloody amperage of the SB chips. My point is: why would you so easily believe the claims of one company and not the other? It's not like you've actually verified internal temperatures by placing your own devices. You're reading off a screen, so really, what makes it believable? As far as I'm concerned, Intel has not been honest about TDP for a while now, and they continued even when they had the better-performing products. Ask why, and you will conclude that it makes them look good, and so they lied.
No, it isn't lying. They are quoting Thermal Design Power (the power level the cooling solution is designed to dissipate over a sustained period), not maximum electrical power. Also, the maximum rated current isn't necessarily drawn at that voltage.
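To make the arithmetic in the quote concrete, here is a minimal sketch. The 1.37 V and 100 A figures are the ones quoted in the post above, not datasheet values, and the output illustrates PreferLinux's point: a rated TDP is a cooling target, so it can legitimately sit below the worst-case electrical product V x I.

```python
# Illustrative only: the voltage/current figures are the ones quoted in
# the post above, not datasheet values.
vcore = 1.37        # core voltage in volts (quoted above)
icc_max = 100.0     # maximum core current in amps (quoted above)
rated_tdp = 130.0   # Intel's stated Nehalem TDP in watts (quoted above)

electrical_max = vcore * icc_max  # worst-case electrical draw: 137 W

print(f"Vcore x Icc = {electrical_max:.0f} W vs rated TDP {rated_tdp:.0f} W")
# The 7 W gap is not by itself proof of lying: TDP is the sustained heat
# load the cooler must handle, not the instantaneous electrical maximum.
```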
 

john3850

Golden Member
Oct 19, 2002
Which is what I was trying to tell those brown-nosed Intel shills... did you not read? I never understand why they are so willing to bend over for one company and not even give the other a chance. They swear by Intel, as if its word were gospel and its chips manna from heaven. The forums here are too blue (Intel-hued).

The forums are changing. From Socket 939 through the X2, right up to the C2D release, this forum had the most AMD people you ever saw. Then people were into the fastest CPU, and the X2 at $400 was hot. Things were never the same here after C2D.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
Personally I think Intel CPUs are faster, and that's why they make more heat. They can also deal with it better because they have better internal designs. This is coming from somebody who has an X4 955 @ 3.8 GHz.

Not to put down AMD though; my CPU is very powerful. I haven't run across a game that it hasn't been able to handle.
 

GaiaHunter

Diamond Member
Jul 13, 2008
Personally I think Intel CPUs are faster, and that's why they make more heat. They can also deal with it better because they have better internal designs. This is coming from somebody who has an X4 955 @ 3.8 GHz.

Not to put down AMD though; my CPU is very powerful. I haven't run across a game that it hasn't been able to handle.

Not really.

AMD CPUs generally (I mean the 45nm Phenom IIs) consume more power than the equivalent, more expensive Core i offerings.

But that is measured in watts, not CPU temperature. CPU and GPU temperatures depend directly on the cooling solution: take a GTX 480 and water-cool it, and the temperature registered on the GPU will fall like a rock, but an air-cooled GTX 480 consumes as much power as a water-cooled one, and both will heat their surroundings by the same amount. Or take the 4850, a card seen as very hot simply because its cooler was tuned for silence instead of maximum cooling.
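A minimal sketch of the point being made here, using the standard steady-state approximation T_die ≈ T_ambient + P × θ (thermal resistance). Both θ values below are invented for illustration; they are not measured specs for any real cooler.

```python
# Steady-state approximation: die temp ~= ambient + power * thermal resistance.
# Both theta values are hypothetical, chosen only to illustrate the argument.
def die_temp_c(ambient_c: float, power_w: float, theta_c_per_w: float) -> float:
    """Approximate steady-state die temperature in Celsius."""
    return ambient_c + power_w * theta_c_per_w

POWER_W = 250.0      # same card, same power draw on air or water
AIR_THETA = 0.25     # hypothetical stock air cooler, C per watt
WATER_THETA = 0.10   # hypothetical water block, C per watt

print(die_temp_c(25.0, POWER_W, AIR_THETA))    # ~87.5 C on air
print(die_temp_c(25.0, POWER_W, WATER_THETA))  # ~50.0 C on water
# The reported GPU temperature drops with the better cooler, but the same
# 250 W ends up heating the room either way.
```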
 

Spikesoldier

Diamond Member
Oct 15, 2001
The forums are changing. From Socket 939 through the X2, right up to the C2D release, this forum had the most AMD people you ever saw. Then people were into the fastest CPU, and the X2 at $400 was hot. Things were never the same here after C2D.

Seconded. I'd say K7/P4 was almost a 50/50 split.

This forum usually follows an ideal of price vs. performance, and you can bet a significant number of the people here are aiming straight for that sweet spot.
 

bryanW1995

Lifer
May 22, 2007
Yeah, AMD caught Nvidia with their pants down that round as far as launch prices go. It's not like AMD dropped the bar on price vs. performance when they had no competition with the 5xxx series; I recall the 5770 launching at a higher price than the 4870 had long been selling for.

I also owned both a 4870X2 and a GTX 280, and the 280 was by far the better card. I remember Morrowind with a few mods running like poo on the X2 while it ran like a champ on the 280, I remember Crysis running smoother on the 280 thanks to microstutter on the X2, and I remember a 65nm 9800 GTX outperforming the X2 in FSX.

You can put whatever slant you want on things.



Like I said in the other thread, that's in select games; on average it is far from the case. I'll say it again: for every Crysis (which artifacts, I have videos) or Metro 2033 where the 6970 is within 5%, there is a GTA4, an NWN2, or a Sims 3 that barely runs. Yeah, Cayman is very good for its price and is the king of multi-monitor support that I'll never use, but you are delusional if you think there are no advantages to Nvidia's 5xx series.

TPU compares a lot of games, and they show the 6970 is only 9% slower overall at 2560x1600. When you ramp up to 8xAA and higher details it drops even further, to 5-6%, so his statement is pretty accurate imho.
http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/29.html
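For anyone who wants the percentage claim made concrete, here is the arithmetic as a tiny sketch. The baseline fps number is invented; only the 9% and 5-6% deltas come from the TPU summary linked above, and treating a GTX 580-class card as the baseline is an assumption.

```python
# Invented baseline fps; only the percentage deltas come from the TPU link.
baseline_fps = 100.0   # hypothetical GTX 580-class result (assumed baseline)
overall_delta = 0.09   # "9% slower overall at 2560x1600"
aa8x_delta = 0.055     # "5-6%" at 8xAA, taking the midpoint

print(baseline_fps * (1 - overall_delta))  # 91.0 fps overall
print(baseline_fps * (1 - aa8x_delta))     # 94.5 fps at 8xAA
```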
 

bryanW1995

Lifer
May 22, 2007
As for the 4850, it sucked even if it was $1, because like I said, Nvidia's IQ is much better. ATi uses some depth-range optimization (lossy Z compression that they call lossless anyway? I don't know, but Nvidia's depth range is longer), and Nvidia has always had better filtering and a better feature set.

I owned a 4850 and a GTX 260. The GTX 260 was better to me because of the dual-slot cooling fan (a huge improvement imho), but to say that the Nvidia design inherently had better IQ is incorrect based on my own personal experience. And filtering? Come on, you read that in an article somewhere. Even BFG10k can't tell the difference in filtering between AMD and Nvidia.

BTW, I still own the GTX 260 and in fact have bought a second one. They are great cards, and I'm not trying to say anything negative about NV here. All I'm saying is that it's stupid to claim that NV is or was intrinsically "better" overall during the past couple of years.
 

bryanW1995

Lifer
May 22, 2007
Yes, given identical coolers. With a cooler like that you'd be seeing around 35-40C on a Phenom II X4.

Given the same cooler, a Phenom II X4 is always 25-30C cooler than a Core i5 as long as they're within normal voltages. Anyone with knowledge on the matter will tell you this.

There's obviously a downside, though: Phenom II X4s become unstable above 60C or so, while a Core i5 will become unstable above 85C.

Yes, TjMax is 25C lower on Phenoms. However, my 1055T @ 3.35 GHz is currently at 75C running SETI 24/7 in my office at work, and I've seen it over 80C many, many times. It does ridiculous amounts of work, and the only time I've had it restart in 4-5 months of usage like this was when I left the AC off a few weeks ago when I went home. No slowdowns, no errors, just good production. I very rarely get over 70C on my i7 920 @ 4.0 GHz, though that system has enormously better cooling. The best rule to go by on a CPU is that if it's throttling, you need to get the heat down; otherwise keep going (within reasonable vcore limits, of course).

I think both of you are exaggerating the cooling ability of both vendors' processors. You want your Phenom II to be cooler than room temperature... that is quite a task for either vendor. 75F is 23-24C, which is not what I would consider a cool room temperature.

Come visit me sometime in the summer... inside my house routinely approaches 80F during the day. I would LOVE to be at 75F consistently. Don't worry, you get used to it, and it's much easier to deal with than your constant cold/rain ;)

I have both an i5 2500 and a Phenom II X4 965, both with the same Hyper 212+ cooler:

i5 2500 stock @ Prime95 = 43C
Phenom II X4 965 @ Prime95 = 48C

Don't know why you're making things up, but given the same cooler and stock frequencies, the i5 is always cooler than the Phenom.

Keep in mind that your personal experience (as well as my own) is anecdotal at best. Even with the exact same cooler, you can easily vary 10-15C at load before and after reseating it. Plus, some cases are much better than others at getting rid of heat. Heck, even my multiple Antec 900s are not consistent. Last year I had a bad run of computer shutdowns on my i7 920 rig over a 10-day period; once I bothered to check it out, I found that my rear exhaust fan wasn't working... But it's still unfair to compare that i7 920, in an Antec 900 with an Ultra-120 in push-pull with both fans on high and five case fans on high, to my work rig with a CNPS 9700 in an Antec 300 with the fans on low.

I completely agree; the distance to TjMax is way more important than the actual temp of the chip. My i7 runs fine at 90C; my old Phenom II started going wonky on me at anything over 60C. So it really depends on the CPU, and on the properties of that exact chip.

This is quite right, though when I got my i7 920 up to 4.2 GHz the temps got close enough to 100C that I backed it off... :)

LOL, nobody knows what the "real" temperature of AMD Phenom chips is. The temperature you get from the various monitoring programs is not the "real" temperature; it's the number reported by the AMD diode. That's why you think your AMD CPU is running cooler.

As I have posted before, if you bother to read AMD's whitepaper on the CPU temperature sensor, you will know:
- There is only one temp sensor.
- The sensor is located in the northbridge area, not in each core.
- Software that shows AMD temperatures is just reading the value given by this sensor and extrapolating it to "guess" each core's temp.
- The readings are on a Celsius scale but are relative: a value of 41 means the chip is 1 degree Celsius hotter than at a value of 40, not that the "real" temperature is 41C.
- The usual maximum sensor value for a Phenom is approximately 62C (it varies by chip).

This is how AMD fools its followers into thinking their CPUs run super cool... some even think their AMD CPUs are air conditioners, since they report temperatures lower than ambient. Intel Core and later CPUs have accurate "real" temperature readings because Intel has released the real max temp for its CPUs, so an accurate reading can be derived from the sensor. If your temp program shows your Intel CPU at 80C, it really is at 80C; if it shows your AMD CPU at 60C, it means the chip is 2C away from shutting down, and the "real" temp could be 70, 75, 80, or 100; nobody knows, because AMD won't release the offset value.
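A sketch of the reading model described above. The raw value is relative, and the offset needed to turn it into an absolute temperature was never published, so the OFFSET_C below is a placeholder (the midpoint of the 10-15C delta quoted from the Core Temp author below), not a real calibration figure.

```python
# Sketch of a K10-style relative reading. OFFSET_C is hypothetical: AMD
# never released the true offset, per the discussion above.
RAW_MAX = 62.0    # approximate maximum sensor value for Phenom (quoted above)
OFFSET_C = 12.5   # placeholder: midpoint of the 10-15 C delta quoted below

def guessed_real_temp_c(raw_reading: float) -> float:
    """Relative sensor value plus an assumed offset = guessed absolute temp."""
    return raw_reading + OFFSET_C

def headroom_c(raw_reading: float) -> float:
    """Sensor headroom left before the approximate shutdown point."""
    return RAW_MAX - raw_reading

raw = 60.0
print(f"guessed temp: {guessed_real_temp_c(raw)} C, headroom: {headroom_c(raw)} C")
```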

If you think what I'm saying is BS, here are some quotes from the author of Core Temp:

"Core Temp displays the temperature reported to it by the CPU.
I've explained it many times, these processors report temperatures which are not absolute. There is usually a 10-15C delta that should be added to the readings to see the real temperature value."
http://www.alcpu.com/forums/viewtopic.php?f=62&t=800

"K10 does not report actual core temperature, it reports a "floating" temp, since without knowing this offset you won't be able to get the real, absolute temp."
http://www.alcpu.com/forums/viewtopic.php?f=88&t=577#orb

Oh crap, so my CPU was supposed to shut down 14C ago??? Somebody needs to tell it that! In my defense, Core Temp reports TjMax on my 1055T as 70C and I'm only at 76C now, but clearly TjMax is meaningless in this instance. How many people would run their Intel CPU at 106 or 113C? No argument here that Intel does a much better job of reporting accurate temps to us, however.
 