4890X2 coming


MrK6

Diamond Member
Aug 9, 2004
Originally posted by: error8
Originally posted by: theAnimal
Originally posted by: error8
Originally posted by: theAnimal
The 4890 uses less power than the 4870, so a 4890X2 should draw less than a 4870X2.

Where did you get that? It uses less power at idle than the 4870, but 30 watts more at load. It has a TDP of 190 W, where the 4870 has 160 W. link

TDP is not power consumption. My numbers are from xbitlabs, which has the 4890 using 10 W less than the 4870 at load.

TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.
Physics 101 fail
 

Rifter

Lifer
Oct 9, 1999
Originally posted by: error8
Originally posted by: theAnimal
Originally posted by: error8
Originally posted by: theAnimal
The 4890 uses less power than the 4870, so a 4890X2 should draw less than a 4870X2.

Where did you get that? It uses less power at idle than the 4870, but 30 watts more at load. It has a TDP of 190 W, where the 4870 has 160 W. link

TDP is not power consumption. My numbers are from xbitlabs, which has the 4890 using 10 W less than the 4870 at load.

TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.

FAIL, TDP is NOT power consumption. It is, by definition, the amount of heat that the cooling system must be able to dissipate. The quote below is taken from Wikipedia:

"The Thermal Design Power (TDP) (sometimes called Thermal Design Point) represents the maximum amount of power the cooling system in a computer is required to dissipate. For example, a laptop's CPU cooling system may be designed for a 20 W TDP, which means that it can dissipate (either via an active cooling method such as a fan, a passive cooling method via natural convection, via heat radiation or all three modes of heat transfer) 20 watts of heat without exceeding the maximum junction temperature for the chip."

See the full TDP page here:

http://en.wikipedia.org/wiki/Thermal_Design_Power
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: error8
TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.

TDP is what it's rated at...not necessarily (exactly) what it consumes.

Originally posted by: MrK6
Physics 101 fail

Although TDP and actual power consumption may be different, what is the electrical energy converted to other than heat?
 

fffblackmage

Platinum Member
Dec 28, 2007
Originally posted by: thilan29
Originally posted by: error8
TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.

TDP is what it's rated at...not necessarily (exactly) what it consumes.

Originally posted by: MrK6
Physics 101 fail

Although TDP and actual power consumption may be different, what is the electrical energy converted to other than heat?

TDP is not an indicator of power consumption, only of heat dissipation.

And the electrical energy is used to turn millions of transistors on and off?
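As a rough, textbook-level illustration of what "turning transistors on and off" costs energetically (the standard first-order CMOS switching-power approximation, not a figure from anyone in this thread):

$$ P_{\text{dynamic}} \approx \alpha \, C \, V_{dd}^{2} \, f $$

where $\alpha$ is the activity factor (the fraction of the switched capacitance toggling per cycle), $C$ is the total switched capacitance, $V_{dd}$ is the supply voltage, and $f$ is the clock frequency. The energy of each charge/discharge cycle is dissipated in the transistors' channel resistance and the wiring, so the switching itself ends up as heat almost immediately.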
 

error8

Diamond Member
Nov 28, 2007
Originally posted by: MrK6
Originally posted by: error8
Originally posted by: theAnimal
Originally posted by: error8
Originally posted by: theAnimal
The 4890 uses less power than the 4870, so a 4890X2 should draw less than a 4870X2.

Where did you get that? It uses less power at idle than the 4870, but 30 watts more at load. It has a TDP of 190 W, where the 4870 has 160 W. link

TDP is not power consumption. My numbers are from xbitlabs, which has the 4890 using 10 W less than the 4870 at load.

TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.
Physics 101 fail

Read this.

TDP, as thermal dissipated power, is the thermal power that a specific component generates. Now, if that component doesn't do any work, and a fricking GPU doesn't move anything, it doesn't have any little electrical motors inside that spin around, then all the electrical power that goes into it gets transformed into heat. Energy doesn't evaporate, it just gets transformed into another kind of energy (second grade physics, man). So, when we see the TDP of a particular card at 190 W, that means this particular card WILL SUCK 190 WATTS FROM THE STUPID PSU. So you and Riferut both fail at physics!
 

error8

Diamond Member
Nov 28, 2007
Originally posted by: thilan29
Originally posted by: error8
TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.

TDP is what it's rated at...not necessarily (exactly) what it consumes.

I agree with you on this. It probably isn't exactly the power consumption of the card, but the real number is around the TDP value. I'd be inclined to say it's a bit over the actual TDP.
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: error8
Originally posted by: MrK6
Originally posted by: error8
Originally posted by: theAnimal
Originally posted by: error8
Originally posted by: theAnimal
The 4890 uses less power than the 4870, so a 4890X2 should draw less than a 4870X2.

Where did you get that? It uses less power at idle than the 4870, but 30 watts more at load. It has a TDP of 190 W, where the 4870 has 160 W. link

TDP is not power consumption. My numbers are from xbitlabs, which has the 4890 using 10 W less than the 4870 at load.

TDP is power consumption, since all the electrical power the card sucks gets transformed into heat.
Physics 101 fail

Read this.

TDP, as thermal dissipated power, is the thermal power that a specific component generates. Now, if that component doesn't do any work, and a fricking GPU doesn't move anything, it doesn't have any little electrical motors inside that spin around, then all the electrical power that goes into it gets transformed into heat. Energy doesn't evaporate, it just gets transformed into another kind of energy (second grade physics, man). So, when we see the TDP of a particular card at 190 W, that means this particular card WILL SUCK 190 WATTS FROM THE STUPID PSU. So you and Riferut both fail at physics!
You don't think there's any work done in a GPU (hint: transistors) *facepalm*

All I was commenting on is the second portion of the statement, "since all the electrical power the card sucks gets transformed into heat." If this were the case, video cards would be nothing more than space heaters (not that they are much more right now :p) and would not produce any graphics. The heat arises from the resistance in the circuitry encountered by the flowing current as it does work in the GPU.
Originally posted by: error8
So, when we see the TDP of a particular card at 190 W, that means this particular card WILL SUCK 190 WATTS FROM THE STUPID PSU. So you and Riferut both fail at physics!
Nope, incorrect again. TDP is just a specification made by the company to ensure that the heatsink used is adequate under "standard" (which isn't standard at all) operating conditions. It isn't the power being drawn by the part, but rather an estimate of the amount of thermal energy that must be dissipated by the heatsink assembly.

No harm no foul, but honestly, you should read more into this so you're sure you aren't the one failing at physics^^
 

error8

Diamond Member
Nov 28, 2007
So in your opinion, how much does a card suck from the wall relative to its rated TDP? And how much of a card's actual power consumption is "work" and how much is thermal energy?
 

Lonyo

Lifer
Aug 10, 2002
Originally posted by: error8
So in your opinion, how much does a card suck from the wall relative to its rated TDP? And how much of a card's actual power consumption is "work" and how much is thermal energy?

TDP is pretty much a worst-case scenario.
How much of the TDP it actually draws from the PSU depends on the card's load level, and it's unlikely that you will hit a heavy enough load, with enough variety, to make the card max out (given that different parts of the GPU do different tasks), ignoring how much of the power being drawn goes to work vs. heat.
 

error8

Diamond Member
Nov 28, 2007
The weird thing is how the people in this link treat the TDP of the card as its actual power consumption. And, using FurMark, just about every card out there seems to get past its rated TDP value: "Our measurements show clearly that the TDP-values shared with the press are exceeded noticeably in many cases while using FurMark as load."

I don't really get the concept of "work" in a GPU, where all you have is electricity going from one transistor to another, and in the end all you get is a signal for the monitor.
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: error8
The weird thing is how the people in this link treat the TDP of the card as its actual power consumption. And, using FurMark, just about every card out there seems to get past its rated TDP value: "Our measurements show clearly that the TDP-values shared with the press are exceeded noticeably in many cases while using FurMark as load."

I don't really get the concept of "work" in a GPU, where all you have is electricity going from one transistor to another, and in the end all you get is a signal for the monitor.
OK, so TDP = Thermal Design Power, the amount of heat (described as power) that the heatsink/cooling system for a given part must be able to cope with. Like I said, the problem with this is that there is NO standard for TDP. You can't compare TDP between different manufacturers, or sometimes even across different parts from the same manufacturer.

Now, in that link, the guys seem to have missed the boat on the concept of TDP. Basically, to put all this in perspective, think of a GPU as a standard car engine. A certain amount of power is put into the engine, and out comes work done by the engine as well as the byproduct heat. In the case of GPUs, power from the PSU is put into the card to do the work of the GPU, and part of that power must overcome the resistance of the circuitry and is lost as heat. TDP, set by the manufacturer, defines a quantitative maximum value for this heat under standard operating conditions, mostly to set a parameter for the capacity of the heatsink used to cool the GPU.
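To put the engine analogy in symbols, this is the instantaneous bookkeeping the post above is using (a sketch of that framing, not a measurement; later posts argue that at steady state essentially all of the input ends up as heat anyway):

$$ P_{\text{in}} = P_{\text{work}} + P_{\text{heat}} $$

where $P_{\text{in}}$ is what the card draws from the PSU, $P_{\text{work}}$ is the share doing switching work at that instant, and $P_{\text{heat}}$ is the share lost directly to resistive heating, which is the quantity the TDP rating is meant to bound.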

From that, you can see that there are three power levels mentioned: power input, and power output, which is a combined value of the power used in the work done by the GPU and the power lost as heat in overcoming the resistance of the circuitry. Most graphics card reviews measure the power consumption of the entire system through a Kill-a-watt and compare multiple cards to see how they affect the overall power consumption of the PC. Now, in the article you linked to (http://ht4u.net/reviews/2009/p...consumption_graphics/) they used some nifty methods to measure actual GPU-only power consumption. I think their data is pretty accurate as well, although their analysis is flawed. Using this data, you now have a quantitative value for the power put into the GPU to do work. Now you have to figure out, out of all that power, what percentage is used to do work in the GPU and what percentage is lost as heat in overcoming the resistance of the circuitry (note that I've only broken this down into two parts; anyone more versed in electrical engineering than I am could probably add other sinks that take power away from actual GPU work). Now, in a perfect world (one that defied the laws of physics :p) where you had a GPU that was 100% efficient, it would only consume the power it needed to produce graphics and wouldn't even need a heatsink, because no power would be lost as heat. That isn't the case.

To get a quantitative value for how much power the card actually uses and how much it loses as heat, my guess would be that you would need to build some kind of calorimeter to measure the BTUs the card puts out, convert that back to power, and subtract it from the measured power input to find out how much of the power is actually used to do work.
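A minimal sketch of that arithmetic in Python (the 180 W draw and the 600 BTU/h calorimeter reading are hypothetical placeholders, not measurements from this thread or the linked article):

```python
# Back-of-envelope version of the calorimeter idea above.
# All input values are hypothetical placeholders.

BTU_PER_HOUR_TO_WATTS = 0.29307107  # 1 BTU/h is roughly 0.293 W

def power_budget(measured_input_w: float, measured_heat_btu_per_h: float):
    """Split measured input power into heat and the non-heat residual
    (what the post above would call work done by the GPU at that instant)."""
    heat_w = measured_heat_btu_per_h * BTU_PER_HOUR_TO_WATTS
    residual_w = measured_input_w - heat_w
    return heat_w, residual_w

if __name__ == "__main__":
    heat_w, residual_w = power_budget(180.0, 600.0)
    print(f"heat: {heat_w:.1f} W, residual (non-heat): {residual_w:.1f} W")
```

With those made-up numbers the residual comes out tiny, which lines up with the later posts arguing that, at steady state, essentially all of the input power shows up as heat.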

EDIT: You know what a good comparison here is? A light bulb. Your standard light bulb is at most 10% efficient. Of the power (watts) it consumes, probably only 10% is emitted as light (lumens) while the rest is lost as heat (watts/BTUs/whatever).
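Worked out with an assumed 60 W incandescent bulb and the 10% figure from the post above:

$$ P_{\text{light}} \approx 0.10 \times 60\,\mathrm{W} = 6\,\mathrm{W}, \qquad P_{\text{heat}} \approx 60\,\mathrm{W} - 6\,\mathrm{W} = 54\,\mathrm{W} $$

(and even the 6 W of light is absorbed by the surroundings and ends up as heat there, too).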

Now, in the article, they say that the card's input power surpassed the manufacturer's rated TDP for the part. However, they didn't measure the actual heat output by the card. So of course power consumption can exceed the TDP at maximum usage, because power is being used to do work AND is lost as heat. The major problem would be if the heat being lost exceeded the TDP, because then the heatsinks designed for the cards might not be able to cope with the heat output, etc.

Anyway, if something is confusing here, let me know and I'll try to explain it better. Also, if anyone is more versed in this than I and would like to add or correct me, please do, I'm very interested in learning more as well.

Couple of things that might help:
http://en.wikipedia.org/wiki/Thermal_Design_Power
http://en.wikipedia.org/wiki/Work_(physics)
http://en.wikipedia.org/wiki/Electrical_work
http://en.wikipedia.org/wiki/Electrical_resistance

 

Rifter

Lifer
Oct 9, 1999
You fail at physics. Do you seriously think your GPU doesn't do anything other than create heat? You think it's nothing more than a space heater? Then why do you have one? Why do people need them? They need them to make millions of calculations, which does not result in 100% dissipation as heat, because the transistors need power to operate. Just because it does not move does not mean it's not doing anything.
 

SickBeast

Lifer
Jul 21, 2000
I'll be very interested to see the cooling setup if the GPUs do in fact run at 1GHz.

Make no mistake: this will be the card to own, albeit for the short term.
 

Rifter

Lifer
Oct 9, 1999
Yeah, can you imagine two of these in Crossfire! Maybe we will be able to play Crysis at 16x AA, lol.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: MrK6
You don't think there's any work done in a GPU (hint: transistors) *facepalm*

Originally posted by: Rifterut
You fail at physics. Do you seriously think your GPU doesn't do anything other than create heat? You think it's nothing more than a space heater? Then why do you have one? Why do people need them? They need them to make millions of calculations, which does not result in 100% dissipation as heat, because the transistors need power to operate. Just because it does not move does not mean it's not doing anything.

The transistors don't move mechanically so the energy consumed is converted to heat.
 

Kuzi

Senior member
Sep 16, 2007
It seems like the wrong timing to me to release such a card; it would be better if AMD shrank the nice RV790 chip to 40nm first. I mean, they already have 40nm cards out.

An X2 based on a 40nm RV790 would be much smaller, run cooler, and clock higher, think 1GHz+. Oh well, maybe they will do that later on anyway.
 

Keysplayr

Elite Member
Jan 16, 2003
Gates do not move within the transistors when electricity is applied or denied?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Originally posted by: error8
4890X2

It is said to be the first card with 2x 8-pin power connectors. This will probably be the hottest card on earth. :)
I would hope the 2x 8-pin power bit is wrong. Who the heck has a power supply with four of those things for Quadfire? :confused: :Q
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: Keysplayr
Gates do not move within the transistors when electricity is applied or denied?

No, they don't. The term "gate" is employed in the semiconductor industry as an analogy. The gate is simply a conductor that places a voltage bias on the gate oxide and subsequently the channel. The channel is a resistor when the gate is not applying a voltage bias to it, and it becomes a conductor when the gate is applying a voltage bias. Hence the name semiconductor: it (the channel) is sometimes a conductor and other times a resistor...i.e., a semi-conductor.
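A toy model of the behavior described above, with purely illustrative threshold and resistance values (not real device parameters):

```python
# Toy model of the channel described above: a voltage-controlled resistor.
# Threshold and resistance values are illustrative placeholders only.

V_THRESHOLD = 0.5   # assumed gate threshold voltage, in volts
R_OFF_OHMS = 1e9    # channel behaves like a large resistor with no gate bias
R_ON_OHMS = 1e3     # channel conducts (comparatively) when the gate is biased

def channel_resistance(gate_voltage: float) -> float:
    """Return the channel's effective resistance for a given gate bias."""
    return R_ON_OHMS if gate_voltage >= V_THRESHOLD else R_OFF_OHMS

# Nothing physically moves: "switching" is just the gate bias changing
# which resistance the channel presents.
print(channel_resistance(0.0))  # gate off -> acts as a resistor
print(channel_resistance(1.0))  # gate on  -> acts as a conductor
```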

There does seem to be a bit of misconception in this thread regarding power consumption versus heat.

In my view this misconception stems from the fact no one has defined what they view the term "heat" to mean.

Are you guys defining heat here as infrared photons, phonons, temperature or more precisely as the specific heat capacity?

Once we have an agreed-upon metric definition of "heat", we can have a meaningful discussion of how the energy provided by the PSU is divided up across the varying domains of the energy manifold (a dynamic process) represented by the GPU "system", and how a significantly large part of that energy comes to occupy the portion of the energy manifold most folks would consider to be "heat".

Electrical Power

And perhaps the next largest communication barrier here is that I see some folks are talking about where the energy is on the energy manifold "at an instantaneous point in time" (the light bulb example) versus some other folks are talking about where the energy eventually ends up at the point of thermodynamic equilibrium for the system.

Except for the power consumed by your hard drive in the semi-permanent storage of bits, all energy consumed by your computer (and light bulb) is eventually converted to heat by way of phonon-phonon coupling and phonon-photon coupling. It all comes down to standard statistical mechanics: if your GPU consumes 180W of power, then after some period of time you have 180W of heat to dissipate into the world; conservation of energy combined with energy manifolds and thermodynamics sees to it. (I'm sure some of you can see why you are basically arguing that perpetual motion machines could exist if this weren't true.)

Whether this period of time is the switching time of a transistor (picoseconds) or the refresh interval of the memory on your GPU, which houses the only "temporary" work created by the GPU's consumption of electrical power, is surely something that can be debated...but in the end, if your GPU is consuming 180W of power you can rest assured you are dissipating 180W of heat. The energy has nowhere else to go in the system but into the lowest energy manifold, and that is the phonon manifold we humans characterize as "heat".
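The steady-state bookkeeping behind that claim, written out using the 180 W figure from the post:

$$ \frac{dE_{\text{stored}}}{dt} = P_{\text{in}} - P_{\text{heat,out}} \to 0 \ \text{(steady state)} \quad\Longrightarrow\quad P_{\text{heat,out}} \approx P_{\text{in}} = 180\,\mathrm{W} $$

with only a vanishingly small, transient amount of energy parked in switched capacitances and stored bits at any given instant.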
 

fffblackmage

Platinum Member
Dec 28, 2007
@Idontcare
I don't understand. Basically, you're saying that if the GPU consumes 180W, then we'll see 180W of heat being dissipated? I think it makes more sense to say 180W of energy will be dissipated. Energy can be expressed in forms other than just heat.

What about energy used for electrical work?
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: fffblackmage
@Idontcare
I don't understand. Basically, you're saying that if the GPU consumes 180W, then we'll see 180W of heat being dissipated? I think it makes more sense to say 180W of energy will be dissipated. Energy can be expressed in forms other than just heat.

What about energy used for electrical work?
That's what I was getting at as well. Some of the energy has to be used to do the electrical work in the card otherwise you just have a glorified space heater that produces no graphics.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: MrK6
That's what I was getting at as well. Some of the energy has to be used to do the electrical work in the card otherwise you just have a glorified space heater that produces no graphics.

It will all be degraded to heat eventually though (I think that's the gist of Idontcare's post if you're considering the situation at equilibrium).
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: fffblackmage
@Idontcare
I don't understand. Basically, you're saying that if the GPU consumes 180W, then we'll see 180W of heat being dissipated? I think it makes more sense to say 180W of energy will be dissipated. Energy can be expressed in forms other than just heat.

What about energy used for electrical work?

No, I am saying 180W of energy will eventually (and rather quickly) become 180W of heat. There is no way around it.

There may be some transient electrical work done in the meantime, but the length of that "meantime" is rather fleeting in the world of electronics (on the order of microseconds).

Your car does mechanical work while burning gas, but in the end every BTU of energy generated is eventually converted to heat as the entropy of the system seeks to become maximized during the process of equilibration. Whether 90% of it is heat at time-zero and the remaining 10% is mechanical work that devolves into heat, or whether 10% of it is heat at time-zero and the remaining 90% is mechanical work that devolves into heat, is irrelevant: power consumption means heat (in the end), and in a confined electrical system such as your computer box all of that energy becomes heat inside the box, which is why everybody on the planet cares to equate power consumption with heat. For all practical considerations they are the same, given time, and the time here is mere microseconds.

For all the current flowing across resistors in your GPU to do calculations, the near-immediate end result is 99.99% heat and 0.01% work represented in the form of stored bits in the chip's on-die memory...which rapidly decays via leakage pathways into more heat, and requires the memory cells to be "refreshed" (more energy) every few microseconds.
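A rough back-of-envelope illustration of how tiny that "stored work" term is; every number below is an assumed order-of-magnitude placeholder, not a spec for any real GPU:

```python
# Order-of-magnitude sketch: energy momentarily "stored" as bits vs. energy
# drawn (and therefore eventually dissipated) each second. All values assumed.

CELL_CAPACITANCE_F = 30e-15  # assumed ~30 fF per storage cell
SUPPLY_VOLTAGE_V = 1.0       # assumed ~1 V supply
STORED_BITS = 1e9            # assume a billion bits held on-die
GPU_POWER_W = 180.0          # the 180 W figure used in the thread

# Energy held in one charged cell: E = 1/2 * C * V^2
energy_per_bit_j = 0.5 * CELL_CAPACITANCE_F * SUPPLY_VOLTAGE_V ** 2
stored_energy_j = energy_per_bit_j * STORED_BITS  # ~1.5e-5 J (~15 microjoules)
consumed_per_second_j = GPU_POWER_W * 1.0         # 180 J drawn every second

print(f"energy parked in bits:   {stored_energy_j:.2e} J")
print(f"energy drawn per second: {consumed_per_second_j:.1f} J")
print(f"ratio:                   {stored_energy_j / consumed_per_second_j:.1e}")
```

Even with generous assumptions, the energy held as bits at any instant is many orders of magnitude smaller than what the card draws (and dissipates) each second, which, if anything, makes the 99.99% figure above look conservative.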