Cloudfire777
Golden Member
Like I'm gonna bother replying to all of that. No thanks. Didn't even read it all.
Does it matter? The vast majority of cards won't have reference coolers anyway.
If I drop my GDDR5 speeds from 1750 MHz to 800 MHz, my power usage falls 30W on a single 7970 card, which has a 384-bit bus.
For what it's worth, some rough back-of-the-envelope calculations suggest this number actually matches AMD's claim of 85W at 8Gbps@512bit.
The following assumes that voltage is unchanged when going from 1750 to 800 MHz, and power scales linearly with frequency and the number of I/O pins.
We have a 54% drop in frequency corresponding to a 30W drop in power, which puts the original power usage at 30W/0.543 ≈ 55W.
Scaling this up to 8Gbps (2000MHz) gets us to 55W*8/7 ≈ 63W.
Scaling from 384-bit I/O to 512-bit gets us 63W*512/384 ≈ 84W, almost exactly AMD's claim of 85W.
Scaling this back down to the 5Gbps@512bit of the 290X, we get 52W, or about 20-25W more than the 1Gbps@4096bit HBM solution.
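Under the same assumptions stated above (constant voltage, power scaling linearly with data rate and bus width), the whole chain can be sketched as a quick script. The inputs are the thread's own numbers, not measured values:

```python
# Back-of-the-envelope GDDR5 I/O power scaling, per the assumptions above:
# voltage unchanged, power linear in data rate and in bus width.
# These are rough estimates, not measurements.

measured_drop_w = 30.0              # observed drop on one 7970 (384-bit)
freq_hi, freq_lo = 1750.0, 800.0    # memory clock in MHz (7 Gbps vs 3.2 Gbps effective)

drop_fraction = (freq_hi - freq_lo) / freq_hi     # ~0.543, the "54% drop"
base_power_w = measured_drop_w / drop_fraction    # ~55 W at 7 Gbps, 384-bit

p_8gbps_384 = base_power_w * 8.0 / 7.0            # ~63 W at 8 Gbps, 384-bit
p_8gbps_512 = p_8gbps_384 * 512.0 / 384.0         # ~84 W at 8 Gbps, 512-bit (AMD claims 85 W)
p_5gbps_512 = p_8gbps_512 * 5.0 / 8.0             # ~53 W at 5 Gbps, 512-bit (290X config)

print(round(base_power_w), round(p_8gbps_384), round(p_8gbps_512), round(p_5gbps_512))
```

Note the last step comes out nearer 53W than the 52W quoted above; the gap to the 1Gbps@4096bit HBM figure is unchanged either way.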
Yes, as RS was using those measurements to extract an imaginary performance clock over and above Titan, via a thermal envelope.
And I wasn't talking about dual-chip cards with CLC.
Keep in mind: scaling is not linear.
As far as the power usage of the actual memory modules goes:
http://www.tomshardware.com/reviews/intel-core-i7-5960x-haswell-e-cpu,3918-13.html
DDR4 uses about 1.5W for a 4 GB DIMM. Even if GDDR5 used 10x the power, it's still well below 20W for 4 GB of VRAM.
However, literature points to GDDR5 using less power than DDR4.
There is something about stacked on-die memory power savings, but it's too late for me to find any useful numbers showing GDDR5 power consumption:
http://www.cse.psu.edu/~juz138/files/islped209-zhao.pdf
We propose an energy-efficient reconfigurable in-package graphics memory design that integrates wide-interface graphics DRAMs with GPU on a silicon interposer. We reduce the memory power consumption by scaling down the supply voltage and frequency while maintaining the same or higher peak bandwidth. Furthermore, we design a reconfigurable memory interface and propose two reconfiguration mechanisms to optimize system energy efficiency and throughput. The proposed memory architecture can reduce memory power consumption up to 54%, without reconfiguration. The reconfigurable interface can improve system energy efficiency by 23% and throughput by 30% under a power budget of 240W.
Actually, I found this. Not sure how accurate it is:
We computed the maximum power consumption of GPU processors and memory controllers by subtracting the DRAM power from the reported maximum power consumption of Quadro® FX5800 [15], resulting in 124W. The power of 4GB DRAM is calculated as 60W, based on Hynix's GDDR5 memory [8].
DRAMs on the HD6990 eat 21.8% of the 375 Watts the card consumes = about 81 Watts for the memory chips alone!
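For what it's worth, the two memory-power figures quoted in this thread are easy to cross-check with trivial arithmetic (the 1.5W DDR4 DIMM number and the 21.8%-of-375W share are taken straight from the posts above):

```python
# Cross-checking two memory-power figures quoted in this thread.

# Tom's Hardware: ~1.5 W per 4 GB DDR4 DIMM. Even at 10x that,
# 4 GB worth of GDDR5 would stay well under 20 W by this estimate.
ddr4_dimm_w = 1.5
gddr5_upper_bound_w = ddr4_dimm_w * 10        # 15 W < 20 W

# Study quote: DRAM on the HD6990 eats 21.8% of the 375 W board power.
hd6990_dram_w = 0.218 * 375                   # ~81.75 W

print(gddr5_upper_bound_w, hd6990_dram_w)
```

The two estimates disagree by a factor of ~5, which is exactly why the thread is arguing: they can't both be right for the same memory type.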
Just look at the study and stop this pointless arguing and these out-of-a-rear power consumption calculations.
Here is my post from way back when HBM was in the news:
May 1st. Hard details plz.
International Workers' Day.
Labour Day is in September.
As difficult as it may be for you North Americans to understand, Canada/USA is not the entire world.
I am looking forward to R9 390X WCE because the benefits are truly huge.
Hybrid-cooled 980 @ 1393 MHz Boost operates 25°C cooler than a stock 980 with the reference Titan blower!
Also, should the fan/pump fail, it'll be easier to replace it with an off-the-shelf 120mm AIO. Right now, replacing the heatsink on a reference blower card, or even an after-market one, essentially means buying one of the expensive $70-90 Accelero Xtremes.
The EVGA GTX980 Hybrid costs $100 more than the standard blower 980, so the higher factory pre-overclock + 120mm AIO CLC + warranty carries about a $100 premium, which is likely what we can expect the R9 390X WCE to carry over the standard 390X.
If a $549 standard R9 390 = a 1.4GHz 980, and the R9 390X is 15% faster than a 1.4GHz GTX980 for $650, that would be a nice improvement from where we are currently sitting.
EVGA's design is decent, but it's wasted on the 980 - which runs cool, quiet, and with plenty of room for overclocking on the high-quality stock blower. It would make a lot more sense on the Titan X.
What's your next card, RS? I know it's impossible to say without reviews, but what's your preliminary guess as to your next card and when you'll purchase?