[VC][TT] - [Rumor] Radeon Rx 300: Bermuda, Fiji, Grenada, Tonga and Trinidad


Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Where is the confirmation from CM (or AMD) that the REFERENCE card will use this oft-mentioned liquid cooling?

I've seen plenty of rumors, and plenty of "sources" that agree with said rumors. But solid confirmation? Please do correct me if you have hard evidence for this.

https://www.zauba.com/import-c880-hs-code.html

12-Feb-2015 | HS Code 84733030 | AMD C880 PRINTED CIRCUIT BOARD ASSEMBLY (VIDEO GRAPHIC CARD) WITH COOLER MASTER HEATSINK P/N 102-C88001-00 (FOC) | Canada | Hyderabad Air Cargo | Qty 12 NOS | Value 909,022 | Per Unit 75,752
 

Timorous

Golden Member
Oct 27, 2008
1,969
3,850
136
TL;DR. Let me rephrase my previous post for you. You don't care about power consumption. I get that. That's fine. But I said: what if Fiji runs too hot / consumes too much power and basically requires water? What if AMD clocks it so high (to look awesome for reviews, and because WATERRRRRRR) that there is basically very little headroom left? And what if, despite its ultra-low temps, it still consumes a shitload of power, and the extra 8% OC you might get out of it exponentially shoots up power consumption?

So you are basically asking what if AMD releases what would normally be a factory overclocked card as their stock solution and because they are using water they can keep the temps and the noise down on such a card? The answer depends entirely on its performance/price/power profile vs GM200's performance/price/power profile.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Ok cool, I'm convinced it's Fiji XT, thanks. Still a stubborn sceptic about the water cooling, though
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So you are basically asking what if AMD releases what would normally be a factory overclocked card as their stock solution and because they are using water they can keep the temps and the noise down on such a card? The answer depends entirely on its performance/price/power profile vs GM200's performance/price/power profile.

Exactly. I don't think he read the thread carefully, as everything he brought up was already discussed. He speculates about what happens if the card requires water-cooling, but this thread already has plenty of NV and AMD examples where 280-300W is a piece of cake for a high-quality after-market open-air cooler. He talks about the negatives of the card not fitting SFX cases, but none of the positives of how much better it would be to exhaust 600W of heat in CF out of a mid-tower with exhausting radiators. He ignores examples of the 295X2 fitting inside a BitFenix Prodigy. As you said, without knowing stock performance or price of GM200, OC vs. OC performance is just a thing to explore later. He also ignored that XDMA is smoother than SLI.

Just like we don't make inferences about the 780Ti's overclocking potential based only on the reference blower design, why would we assume that WC 390X cards will be the only options? It's interesting how the 7970's 925mhz overclocking was downplayed and the 680 "won" despite 7970 OC > 680 OC; now it seems stock performance doesn't matter if GM200 OCs better. It's like the goal posts keep moving. I agree that overclocking matters too, but to call a card an automatic failure if it only overclocks 8-10%, without knowing its stock performance or price relative to a 980/GM200, is premature.

Remember the X850XT PE or 6800 Ultra? Those cards overclocked less than 10% and are hardly considered failures. They were good cards. All things being equal, extra overclocking headroom is good if it's there to take advantage of, but I would much rather take a factory-warrantied and guaranteed 750mhz GTX470, 1.15GHz 7970, or 1.4GHz 970/980 than have to net those gains myself with no guarantee of stability or warranty. Why wouldn't I want GM200/390X pushed to 95% of their maximum clocks right out of the box? Whichever is the fastest wins. It's more probable that GM200 will OC better, based on NV leaving more headroom in the last couple of generations, but since we don't know the stock vs. stock performance, we can't tell what's faster: 390X +10% OC or GM200 +20% OC.

He didn't even talk about NV's prices, as if they don't matter. The 780Ti cost 27% more, but today the 290X and 780Ti are tied.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Remember the X850XT PE or 6800 Ultra? Those cards overclocked less than 10% and are hardly considered failures. They were good cards. All things being equal, extra overclocking headroom is good if it's there to take advantage of, but I would much rather take a factory-warrantied and guaranteed 750mhz GTX470, 1.15GHz 7970, or 1.4GHz 970/980 than have to net those gains myself with no guarantee of stability or warranty. Why wouldn't I want GM200/390X pushed to 95% of their maximum clocks right out of the box? Whichever is the fastest wins. It's more probable that GM200 will OC better, based on NV leaving more headroom in the last couple of generations, but since we don't know the stock vs. stock performance, we can't tell what's faster: 390X +10% OC or GM200 +20% OC.

Well said. I think it's one thing to compare factory overclocked cards like the Lightning, Kingpin and Vapor-X, and another thing to say a chip is better than another because it has more OC headroom. With factory overclocked cards, they work at those speeds out of the box. With manual overclocking you are talking about a lot of variables: binning, air or water cooling, voltage overclocking and finally the silicon lottery.

He didn't even talk about NV's prices, as if they don't matter. The 780Ti cost 27% more, but today the 290X and 780Ti are tied.
I think you are being generous. Look at the games released in the last 6 months. It's one-way traffic in favour of the R9 290X against the 780 Ti, even in many GameWorks titles like AC Unity, Far Cry 4, COD: AW, Dying Light and Evolve. The 4GB of VRAM on the R9 290X allows it to play smoothly at 1440p at the highest settings without MSAA, which cannot be said for the 780 Ti, which runs out of VRAM. :thumbsup:
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Quote:
He didn't even talk about NV's prices, as if they don't matter. The 780Ti cost 27% more, but today the 290X and 780Ti are tied.

I think you are being generous.

Didn't he say that in the context of power consumption?
[Chart: average power consumption]
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Exactly. I don't think he read the thread carefully, as everything he brought up was already discussed. He speculates about what happens if the card requires water-cooling, but this thread already has plenty of NV and AMD examples where 280-300W is a piece of cake for a high-quality after-market open-air cooler. He talks about the negatives of the card not fitting SFX cases, but none of the positives of how much better it would be to exhaust 600W of heat in CF out of a mid-tower with exhausting radiators. He ignores examples of the 295X2 fitting inside a BitFenix Prodigy. As you said, without knowing stock performance or price of GM200, OC vs. OC performance is just a thing to explore later. He also ignored that XDMA is smoother than SLI.

Just like we don't make inferences about the 780Ti's overclocking potential based only on the reference blower design, why would we assume that WC 390X cards will be the only options? It's interesting how the 7970's 925mhz overclocking was downplayed and the 680 "won" despite 7970 OC > 680 OC; now it seems stock performance doesn't matter if GM200 OCs better. It's like the goal posts keep moving. I agree that overclocking matters too, but to call a card an automatic failure if it only overclocks 8-10%, without knowing its stock performance or price relative to a 980/GM200, is premature.

Remember the X850XT PE or 6800 Ultra? Those cards overclocked less than 10% and are hardly considered failures. They were good cards. All things being equal, extra overclocking headroom is good if it's there to take advantage of, but I would much rather take a factory-warrantied and guaranteed 750mhz GTX470, 1.15GHz 7970, or 1.4GHz 970/980 than have to net those gains myself with no guarantee of stability or warranty. Why wouldn't I want GM200/390X pushed to 95% of their maximum clocks right out of the box? Whichever is the fastest wins. It's more probable that GM200 will OC better, based on NV leaving more headroom in the last couple of generations, but since we don't know the stock vs. stock performance, we can't tell what's faster: 390X +10% OC or GM200 +20% OC.

He didn't even talk about NV's prices, as if they don't matter. The 780Ti cost 27% more, but today the 290X and 780Ti are tied.

Agree.

From my perspective, I just want SOMEONE to release a big-die performance GPU so I can hand my money over. Since the 2015 cards are really just stop-gaps until 14nm is here, I don't care about efficiency, just want more performance. 40-50% more than what I have now, preferably. :cool:
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Love this discussion. Not sure why there are a few thread crappers here, but this thread is clearly titled "Rumor", so discussing possible scenarios while citing previous examples is exactly what I am looking for.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Love this discussion. Not sure why there are a few thread crappers here, but this thread is clearly titled "Rumor", so discussing possible scenarios while citing previous examples is exactly what I am looking for.

Definitely agree.

Can't wait until a few of these cards get in the hands of some 'unofficial reviewers' and we get some leaked performance.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Ok cool, I'm convinced it's Fiji XT, thanks. Still a stubborn sceptic about the water cooling, though

Why else would they use a Cooler Master heatsink?

This release is going to be so disappointing. Needing a 700-800W PSU for a single card system...
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Love this discussion. Not sure why there are a few thread crappers here, but this thread is clearly titled "Rumor", so discussing possible scenarios while citing previous examples is exactly what I am looking for.


Fair call, but this thread has become "the" discussion point for the 300 series until it is either released or AMD otherwise confirms the specs. The rumors in the OP are not the only ones discussed in the past few pages.

As a result, if we were to open up a new thread for every new rumor that came along, I daresay the mods would close it and ask us to "take it to the existing thread."

Result: This is the place to discuss all rumors. On behalf of all other grumpy bastards on these boards, I was expressing a general desire for folks to link us to the specific rumors they happen to be discussing, so that we can tell whether the specific scenarios in question are likely/feasible, or whether they're more in the realm of fantasy/nonsense/WCCFTech.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Pointless arguing, but fun to read at times.

Will be nice to see some performance leaks here and there to offset the bickering.

Not really sure why all the hate towards AIO coolers. About the only real scenario I can come up with is that the GREEN team doesn't offer one for the devoted.

Looking forward to the anti-water fear-mongering "It'll leak and destroy your whole rig" smear campaign! Oops, can't forget: once the water leaks out, flames will shoot out and ignite your whole house.

I predict that being 30% faster than the 980 will be washed away by the AIO fear campaign. A larger performance gap will just step up the smear campaign. I can picture the devoted sacrificing a couple of AMD's offerings like lambs, with a nice visual sob story about how it destroyed their whole rig. Of course they won't show the pinholes they modded the cooler with.

On another note, GTX 970 #2 is being yanked and returned today. The grass is greener on the NVIDIA side... Too bad it's artificial.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
care to share your sources?

If you are using a highly-OCd CPU, this is definitely the case. My CPU alone maxes above 300w, add another 100w for MB and peripherals and I am at about 400w without a GPU. That means a 300w GPU monster added makes it 700w. Add another 100w or so for OC overhead and we are at 800w. That's just me though....

My 860W PSU is probably limited to a single, highly-OCd 2015 big-die GPU or (2) GM200s running at stock (if those are 250W parts), and the latter is likely pushing it...
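Rough back-of-the-envelope of that budget (every figure below is an assumed round number from above, not a measurement):

Code:
# Hypothetical worst-case power budget using the assumed round figures above
cpu_oc_w      = 300  # heavily overclocked CPU at full load (assumed)
board_misc_w  = 100  # motherboard, drives, fans, peripherals (assumed)
gpu_w         = 300  # rumored big-die GPU board power (assumed)
oc_headroom_w = 100  # extra margin for further overclocking (assumed)

total_w = cpu_oc_w + board_misc_w + gpu_w + oc_headroom_w
print(f"Estimated peak system load: {total_w} W")  # -> 800 W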
 

garagisti

Senior member
Aug 7, 2007
592
7
81
If you are using a highly-OCd CPU, this is definitely the case. My CPU alone maxes above 300w, add another 100w for MB and peripherals and I am at about 400w without a GPU. That means a 300w GPU monster added makes it 700w. Add another 100w or so for OC overhead and we are at 800w. That's just me though....

My 860W PSU is probably limited to a single, highly-OCd 2015 big-die GPU or (2) GM200s running at stock (if those are 250W parts), and the latter is likely pushing it...
GM200 wouldn't be much better either, but people don't criticize that as much. It's been a few years now that top-of-the-line GPUs have drawn about 300W.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
GM200 wouldn't be much better either, but people don't criticize that as much. It's been a few years now that top-of-the-line GPUs have drawn about 300W.

Agree.

I am excited about the possibility of HBM for the 3xx series because that should help allocate some of the overall TDP to shaders rather than memory. That's definitely a win. :)

I could certainly dial-down my CPU OC a bit, but that just somehow goes against my nature. :p

My ideal 14nm card will be a 2nd-tier ~200W card that I can throw a pair of in at launch. I think OC headroom and overall performance will be my big driver for either a 3xx or GM200 purchase. Really hoping Fiji allows for nice overclocks like the 5xxx/6xxx/7xxx series.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
If you are using a highly-OCd CPU, this is definitely the case. My CPU alone maxes above 300w, add another 100w for MB and peripherals and I am at about 400w without a GPU. That means a 300w GPU monster added makes it 700w. Add another 100w or so for OC overhead and we are at 800w. That's just me though....

My 860W PSU is probably limited to a single, highly-OCd 2015 big-die GPU or (2) GM200s running at stock (if those are 250W parts), and the latter is likely pushing it...

That's not really sharing a "source" as much as it is giving a situation where a 300W GPU wouldn't work on a 700-800 W PSU.

Also, if you don't have at least 600-700 W, chances are you aren't a regular purchaser of high end GPUs anyway.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why else would they use a Cooler Master heatsink?

This release is going to be so disappointing. Needing a 700-800W PSU for a single card system...

Why would you say a 300W card needs a 700-800W PSU? One of my systems ran an i7 860 @ 3.9GHz and a GTX 470 @ 750mhz on a Corsair 520W. To this day most PC gamers don't realize that high-quality PSUs are actually rated to run at 40-50C at max load 24/7. If you buy a 650W Corsair PSU, that means by "old standards" it is really an 800W unit. By modern standards it is a real 650W unit that will handle a 600-650W load not for 15 seconds but for years.

Since in CF/SLI you do not simply double the power usage of a single card, as scaling is often 65-90%, a SeaSonic, Corsair, Enermax, Antec, or EVGA Gold unit rated at 850W will power an overclocked i7 4790K and dual 390Xs. Your PSU estimates are based on manufacturer-recommended specs, which have been worthless since NV recommended a 480W PSU for a GeForce 6800 Ultra. Those are conservative allowances for crappy-quality Apevia/OEM units, etc.
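A quick illustration of that scaling point, following the same rule of thumb (the 300W card figure is just an assumption for the example):

Code:
# How the rough 65-90% CF/SLI scaling range changes combined GPU load for an assumed 300 W card
single_card_w = 300.0  # assumed flagship board power

for scaling in (0.65, 0.90):
    dual_load_w = 2 * single_card_w * scaling
    print(f"{scaling:.0%} scaling -> ~{dual_load_w:.0f} W for two cards")
# ~390 W at 65% and ~540 W at 90%, which leaves room on a quality 850 W unit
# even with an overclocked quad-core platform on top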
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I could not give a flying (hey! language!) about power consumption and doubt too many folks spending 500+ bucks on a card care much either as compared to performance. To each their own though.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Why else would they use a Cooler Master heatsink?

This release is going to be so disappointing. Needing a 700-800W PSU for a single card system...

The TDP is rumored to be close to the 290X's level.

Tell that to my 290X CrossFire system on an 850W PSU. *With multiple drives, fans, etc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I could not give a flying (hey! language!) about power consumption and doubt too many folks spending 500+ bucks on a card care much either as compared to performance. To each their own though.

:D Someone should have a poll going.

I always found it amusing when people spend $550-700 on every single flagship GPU, when in 1.5-2 years that level of performance can be purchased for $300-350, meaning their card just lost 40-50% of its value; yet an extra 50-100W should somehow matter a great deal when paying such massive premiums? We saw these same arguments used to justify the 980's $550 price tag for 10-15% more performance over a $300 Sapphire Tri-X 290X. I personally would never pay 83% more for 10-15% more performance even if the card used 1W of power.
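Spelled out with the prices quoted above (the performance gain is the assumed midpoint of that 10-15% range):

Code:
# Price premium vs. performance gain for the 980 vs. Tri-X 290X example above
gtx_980_price    = 550    # quoted price in dollars
tri_x_290x_price = 300    # quoted price in dollars
perf_gain        = 0.125  # assumed midpoint of the quoted 10-15% range

premium = gtx_980_price / tri_x_290x_price - 1
print(f"Price premium: {premium:.0%}, performance gain: ~{perf_gain:.1%}")  # ~83% vs. ~12.5%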

If you are using a highly-OCd CPU, this is definitely the case. My CPU alone maxes above 300w, add another 100w for MB and peripherals and I am at about 400w without a GPU. That means a 300w GPU monster added makes it 700w. Add another 100w or so for OC overhead and we are at 800w. That's just me though....

My 860W PSU is probably limited to a single, highly-OCd 2015 big-die GPU or (2) GM200s running at stock (if those are 250W parts), and the latter is likely pushing it...

The entire X99 platform along with a 4.3GHz 5820K uses 300W, but this is under an extremely CPU-heavy encoding task that is fully multi-threaded to take advantage of all 12 threads.

[Chart: X99 platform power consumption]


300W GPU x 2 x 90% scaling = 540W; add 300W for your platform OC and you're at 840W. Your 860W PSU would be fine because: (1) I presume it's a Seasonic/Corsair unit, and (2) x264 encoding loads every single core and HT thread to >95% usage, something no game in the world will ever do. Therefore, that 298W peak load in x264 is impossible to hit in video games.
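The same arithmetic spelled out (board power, scaling and platform draw are all the assumed figures above, not measurements):

Code:
# Dual-GPU + overclocked platform estimate, mirroring the assumed figures above
gpu_board_w = 300   # assumed board power per card
cf_scaling  = 0.90  # assumed CrossFire scaling
platform_w  = 300   # assumed overclocked CPU/platform worst case (x264-style load)

total_w = 2 * gpu_board_w * cf_scaling + platform_w
print(f"Estimated worst-case system load: {total_w:.0f} W")  # -> 840 W vs. an 860 W PSU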

Hardware Canucks used an Intel i7 4930K @ 4.7GHz, which would use more power than any platform in the chart above (>331W), and when paired with the 500W R9 295X2, this is the peak gaming load:

[Chart: R9 295X2 system power consumption at load]

www.hardwarecanucks.com/forum/hardware-canucks-reviews/65973-amd-radeon-r9-295x2-performance-review-18.html

Let's assume the worst-case scenario for you, that your 5820K OC would use the same amount of power as that 4930K @ 4.7GHz. Add 100W more for dual 300W cards over the 295X2 and you'd be at 759W, still well below the 860W your PSU is rated for in 24/7 operation. You would actually have slight OCing headroom. :thumbsup: A lot of people get way too paranoid about PSU requirements.

Remember, if you measure a 730W load at the wall and your PSU is 90% efficient, your load at the PSU level is 730W x 90% = 657W. Therefore, you have to be very careful when you look at peak load numbers in reviews: do they mean at the wall, or after accounting for the PSU's efficiency? This one aspect starts to matter a great deal because you might see an 850W peak load at HardOCP, but they state at-the-wall measurements, meaning the actual load on a 90% PSU would be 765W.

Both measurement methodologies make sense depending on what you are trying to show. HardOCP, for example, uses AT THE WALL measurements because they are trying to gauge the impact of electricity costs from a financial perspective. If you want to gauge what size PSU you would need from the same review, you have to adjust the at-the-wall reading for the efficiency of the PSU used in that particular review (i.e., a 90% efficient PSU draws roughly 10% extra from the wall as wasted energy, so we take the 850W Kill-a-Watt reading and multiply by 90% to get the actual load at the PSU level). PSU ratings from the top firms are stated at the PSU level, not at the wall. Keep this point in mind.
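A minimal sketch of that conversion (the 90% efficiency is just the assumed example figure):

Code:
# Convert an at-the-wall reading to the approximate load at the PSU's output
def psu_level_load(wall_watts: float, efficiency: float = 0.90) -> float:
    """DC load on the PSU given a Kill-a-Watt style wall reading and the PSU's efficiency."""
    return wall_watts * efficiency

print(psu_level_load(730))  # ~657 W at the PSU for a 730 W wall reading
print(psu_level_load(850))  # ~765 W at the PSU for an 850 W wall reading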
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
:D Someone should have a poll going.

I always found it amusing when people spend $550-700 on every single flagship GPU, when in 1.5-2 years that level of performance can be purchased for $300-350, meaning their card just lost 40-50% of its value; yet an extra 50-100W should somehow matter a great deal when paying such massive premiums? We saw these same arguments used to justify the 980's $550 price tag for 10-15% more performance over a $300 Sapphire Tri-X 290X. I personally would never pay 83% more for 10-15% more performance even if the card used 1W of power.



The entire X99 platform along with a 4.3GHz 5820K uses 300W, but this is under an extremely CPU-heavy encoding task that is fully multi-threaded to take advantage of all 12 threads.

[Chart: X99 platform power consumption]


300W GPU x 2 x 90% scaling = 540W; add 300W for your platform OC and you're at 840W. Your 860W PSU would be fine because: (1) I presume it's a Seasonic/Corsair unit, and (2) x264 encoding loads every single core and HT thread to >95% usage, something no game in the world will ever do. Therefore, that 298W peak load in x264 is impossible to hit in video games.

Hardware Canucks used an Intel i7 4930K @ 4.7GHz, which would use more power than any platform in the chart above (>331W), and when paired with the 500W R9 295X2, this is the peak gaming load:

[Chart: R9 295X2 system power consumption at load]

www.hardwarecanucks.com/forum/hardware-canucks-reviews/65973-amd-radeon-r9-295x2-performance-review-18.html

Let's assume the worst-case scenario for you, that your 5820K OC would use the same amount of power as that 4930K @ 4.7GHz. Add 100W more for dual 300W cards over the 295X2 and you'd be at 759W, still well below the 860W your PSU is rated for in 24/7 operation. You would actually have slight OCing headroom. :thumbsup: A lot of people get way too paranoid about PSU requirements.

Remember, if you measure a 730W load at the wall and your PSU is 90% efficient, your load at the PSU level is 730W x 90% = 657W. Therefore, you have to be very careful when you look at peak load numbers in reviews: do they mean at the wall, or after accounting for the PSU's efficiency? This one aspect starts to matter a great deal because you might see an 850W peak load at HardOCP, but they state at-the-wall measurements, meaning the actual load on a 90% PSU would be 765W.

Both measurement methodologies make sense depending on what you are trying to show. HardOCP, for example, uses AT THE WALL measurements because they are trying to gauge the impact of electricity costs from a financial perspective. If you want to gauge what size PSU you would need from the same review, you have to adjust the at-the-wall reading for the efficiency of the PSU used in that particular review (i.e., a 90% efficient PSU draws roughly 10% extra from the wall as wasted energy, so we take the 850W Kill-a-Watt reading and multiply by 90% to get the actual load at the PSU level). PSU ratings from the top firms are stated at the PSU level, not at the wall. Keep this point in mind.

Thanks for that!

Yeah, my PSU is the Corsair AX860, so it's good quality and pretty efficient as well. I still think that I would be on the lower end of PSU requirements for a dual 300W setup, but it is good to know it is at least feasible. :D