[VC][TT] - [Rumor] Radeon Rx 300: Bermuda, Fiji, Grenada, Tonga and Trinidad


AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Between you and me, I don't expect HBM at 28nm in 2015.

Could be wrong though.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It does matter to enthusiasts that run their cards at 100% load all the time. I do gpu crunching with BOINC and when I was running 290s the heat from those cards would make the room hot, I don't even want to discuss the heat from bitcoin/scrypt mining. With my 980s the room barely warms up. The difference is significant in this use case.

This, 100%. It's not so much, for many people (myself included), that the power use is higher; it's that the heat gets exhausted into the room, which becomes uncomfortably warm.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
It should matter to enthusiasts to some extent, if for nothing else other than HEADROOM. Hawaii OCs like crap and GK110 overclocks like a champ. If Fiji comes along with water cooling standard and uses 265 watts under load, neato you got a 60 C quiet card taking up 4 spots that can't fit into many SFX cases and I can almost guarantee it won't have much headroom left. AMD will already be squeezing out as much as they feasibly can with regards to yield.
No, you can't guarantee anything. The heat generated by a chip is only a small portion of the overclocking potential equation. Also, why are you trying to put enthusiast-level hardware in an SFX case? You have an entire desktop to use.

The problem with a lot of the arguments I see here is that they try to apply absolutes to something that is a spectrum. The question isn't "Does power draw matter?", as it clearly does; the better question is "When does power draw matter?" For example, if Fiji draws 300W but is 60% faster than anything on the market, how many unbiased enthusiasts are going to sit around twiddling their thumbs and not pick one up? If it's another Fermi that's 10% faster for 50-100% more power usage, then obviously it's a poor trade-off. We will see.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Fantastic post RS.

This is one of the best-written, well thought-out posts I have seen here on AT. I agree with your analysis and I think you are spot-on.

Thanks!!

It should matter to enthusiasts to some extent, if for nothing else other than HEADROOM. Hawaii OCs like crap and GK110 overclocks like a champ.

Overclocking is not as simple as drawing a direct correlation between temperatures and OCing headroom, as you want to imply. It's a lot more complex than that. If you fit the world's best water block and apply max voltage to Hawaii, it still overclocks far worse than GK110 on air with minimal voltage control, even if you get the R9 290X to run at 50-55C.

Average overclocks from HWBot:

R9 290X (default clock 1000mhz)
Air = 1145mhz
Water = 1196mhz (just 51mhz more), or less than 5% over air
Cascade = 1304mhz
LN2 = 1423mhz

GTX780Ti (default clock 902mhz)
Air = 1204mhz (!!!)
Water = 1369mhz (13.7% more than air)
Cascade = 1605mhz
LN2 = 1694mhz

As I said, temperatures are not the biggest limitation in Hawaii overclocking; something else is the bigger bottleneck: for example, electromigration, transistor density/leakage, or transistor composition/frequency scaling. The average water overclock on a GTX 780 Ti is almost as good as LN2 on a 290X, and you know which one runs cooler under those conditions.
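Those HWBot averages make the point on their own; here's a quick sketch (using only the numbers quoted above) of the relative gains:

```python
# Average HWBot overclocks quoted above, in MHz.
cards = {
    "R9 290X":    {"default": 1000, "air": 1145, "water": 1196, "ln2": 1423},
    "GTX 780 Ti": {"default": 902,  "air": 1204, "water": 1369, "ln2": 1694},
}

def pct_gain(new, old):
    """Percentage increase of `new` over `old`."""
    return (new - old) / old * 100

for name, c in cards.items():
    print(f"{name}: air +{pct_gain(c['air'], c['default']):.1f}% over stock, "
          f"water +{pct_gain(c['water'], c['air']):.1f}% over air, "
          f"LN2 +{pct_gain(c['ln2'], c['air']):.1f}% over air")
```

Water buys the 290X under 5% over air while the 780 Ti gains 13.7%; if temperature were Hawaii's limiter, better cooling would scale it the way it scales GK110.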

If Fiji comes along with water cooling standard and uses 265 watts under load, neato you got a 60 C quiet card taking up 4 spots that can't fit into many SFX cases and I can almost guarantee it won't have much headroom left. AMD will already be squeezing out as much as they feasibly can with regards to yield.

What 4 spots? This thread has countless examples of a 1x120mm radiator fitting inside mini-ITX cases. I am not sure why it matters to you specifically, though, because you would not purchase an AMD video card, so you aren't even the target customer. As I said already, except for the 0.05% of PC gamers running a Raven RVZ01 or RVZ02, almost anyone can fit a 1x120mm rad into a modern case if they really wanted to. A $100 NZXT 440 will fit 3x120mm rads and a 1x280mm rad too. You think someone who can afford $1200-1300 on 2 of these cards can't afford to spend $100 on a new case? Give me a break.

Also, with regards to heat output, I remember back when Fermi hit the market and EVERYONE laughed off Fermi's temps and power draw, calling it a space heater.

This is 100% not true. A lot of people on our forum laughed at those who focused only on the reference-design 470/480 cards, pointing out that after-market 480s were available and easily solved the noise and temperature issues. I guess some people are just close-minded and only buy blowers. Please don't rewrite history, because not everyone bashed Fermi. I bought 3 470s myself. The reason people laughed at Fermi wasn't because it used 275W of power, but because it used that much power for 15-18% more performance over a 145W 5870. It's the idea of so little extra performance for 90% higher power usage while being 6 months late: that's why people laughed at it.

Do you remember anyone laughing at GTX780Ti's 278W-286W power usage? Cuz that's what an after-market 780Ti uses. That card was very popular indeed and high-end enthusiasts purchased it in droves at $600-700. If you only want to use sub-250W cards, that's fine but don't make statements like 265W of power is some ludicrous amount when NV's flagship cards have used that much for years and years and sold like hot cakes.

Being inexperienced at the time with really high power use cards, I dismissed those arguments as well.

It has little to do with experience. If you were truly the target market for 250-300W flagship cards and 2-3 of those cards in CF/SLI, you wouldn't care much if your card used 250 or 300W of power as long as the performance, price/performance, VRAM, features, stability, etc. were there. You would need to buy a case, PSU, cooling system to accommodate >700W anyway and more if we include an overclocked i5/i7 CPU and monitor. That's why these are extreme systems not for everyone.

It's nothing against you, or anyone, but some people are obviously not the target market for these types of balls-to-the-wall cards. Yet plenty of people ran dual Titans, dual 290Xs, dual 780Tis, dual 580s, dual 7970GHz overclocked, even triple and quadruple cards. People purchase cases to accommodate those setups, get WC, and so on.

I don't know the size of the room your computer is located in. In my house, when my max-overclocked 7970s ran at full load in the winter, the room temperature was still just 20-21C. In the summer, my tower sits right under a central air vent, and the A/C is on anyway because the humidity and 30C weather around the Great Lakes make life without it unbearable, so there is no impact on room temperature. Obviously not everyone has A/C or lives in a cold climate, but if those individuals really wanted top-of-the-line performance, GM200 at 250W or a 390X at 300W with near 970 SLI levels of performance is an amazing deal even from a perf/watt point of view, minus the SLI scaling issues. Those willing to go dual or more cards would buy A/C; it's that simple. Who spends $1000s on flagship GPUs but doesn't have money for A/C, a proper case, proper cooling? These cards aren't exactly targeted at the buyer of a $500-750 pre-built PC tower from Best Buy.

It's just that YOU are not the target market for 500-750W CF/SLI setups, and that's fine. But you are making it sound like 250-300W flagship cards are something new, which they aren't, by any means. A lot of high-end enthusiasts overclocked their 580/780Ti/7970GHz/290X cards, and all of those use > 250W in overclocked states.


But I was clearly wrong. My Twin Frozr GTX 465, despite running very cool and quiet, put off quite a bit of heat after an hour or two of gaming in a normal-sized bedroom. Power draw matters, no matter how much you say it doesn't.

I think you missed my entire point about relative power usage. Did you not see where I said that when a system already draws 400-450W+ of power, what's another 50-100W?

My LCD draws 210W at max power usage alone. My overclocked i5 for sure draws at least 150-160W with the motherboard. I am already at 370-380W before even counting a single GPU. Add 190W for a hypothetical after-market GTX980 GPU at peak, and my gaming rig with the monitor with a super efficient modern NV GPU would dissipate 550W of power in the room.

Now, do you think it matters whether all my components draw 550W or 650W of power in terms of heating up my room? Think about that for a second. In the winter, in my office, 100W wouldn't make squat of a difference. In the northern US/Canada it's pretty cold for 5-6 months of the year, and in the summer the AC/central air is already on anyway, whether I had a gaming PC or not.

If you are talking about a 250W GM200 vs. a 300W 390X impacting your room temperature due to an extra 50W, on what will likely already be a 400W overclocked i5/i7 system to begin with, you would need to start using the laws of thermodynamics and room volume to prove to me that my room temperature will be affected significantly.
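As a back-of-envelope version of that thermodynamics argument (a sketch only; the ~100 W/K heat-loss figure is purely my assumption for a typical room with walls, door gaps, and some air exchange):

```python
# At equilibrium a room's temperature rise above ambient is roughly
# delta_T = P / UA, where P is the heat dumped into the room and UA is
# the room's overall heat-loss conductance in W/K. UA = 100 W/K is an
# assumed ballpark; real rooms vary a lot (insulation, AC, open doors).
UA_W_PER_K = 100.0  # assumption, not a measured value

def steady_state_rise(watts, ua=UA_W_PER_K):
    """Steady-state temperature rise (K) for a given heat input (W)."""
    return watts / ua

for system_watts in (550, 650):
    print(f"{system_watts} W rig -> ~{steady_state_rise(system_watts):.1f} K above ambient")

print(f"the extra 100 W alone -> ~{steady_state_rise(100):.1f} K")
```

On those assumptions, the 50-100W delta between cards is on the order of 0.5-1°C at equilibrium, while the 400W+ baseline system dominates; either way, AC or an open door swamps the difference.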

Fiji is coming with a Cooler Master heatsink.
Does anyone know of good Cooler Master heatsinks for GPUs?
Does heatsink mean air (non-water) cooling?

Supposedly it's Hybrid WC, with 1x120mm rad from either Asetek or CM.

AMD charged just $40 extra for a similar unit on an FX9590 at launch.

[Image: Newegg CPU listing]


A single 1x120mm rad easily copes with > 300W of power usage at load.

[Image: cooling benchmark chart]



It does matter to enthusiasts that run their cards at 100% load all the time. I do gpu crunching with BOINC and when I was running 290s the heat from those cards would make the room hot, I don't even want to discuss the heat from bitcoin/scrypt mining. With my 980s the room barely warms up. The difference is significant in this use case.

So how in the world did anyone on our forum run multiple overclocked 480s, 580s, 7970s, 780s, 780Tis, 290s, and 290Xs, when ALL of those cards use > 250W each in max overvolted/overclocked states?

Are you implying you will skip 250W cards this round? That's fine, as no one forces you to buy GM200/390X, but since spring 2010 AMD's and NV's flagship cards have used ~225-280W of power at load.

For any serious bitcoin miners, the amount of money made per card was so ridiculous that the heat was 100% irrelevant. If you weren't greedy and didn't sell all your coins at once, a single upgrade path of HD4870/4890 -> 6950/6970 -> 7970 -> 290 of non-stop mining made anywhere from $3,000-10,000 USD. That's just 1 card's upgrade path. If you had 2 or 3 flagship cards from the HD4870 days of mining, when a single 4870 made > 1 bitcoin a day, then double or triple that value.

It's interesting to see how so many of you now think 265W of power is a lot. So what do you guys think GM200/390X should use? 185W? :hmm:

An after-market 980 already uses > 200W.
[Image: peak power consumption chart]


IF you guys are seriously complaining about 250-300W flagship cards, you are obviously not the target market for those products, and for sure not for SLI/CF configs of them. Every combination of flagship NV/AMD cards in the last 5 years used that much power when overclocked, and it goes up further in pairs. I don't know what exactly you expect out of flagship products. That's why NV makes the GTX970/980 for people who don't want 250-300W GPUs... why is this so hard to understand?
 
Last edited:
Feb 19, 2009
10,457
10
76
I don't think R9 390X will match GM200 in perf/watt but I personally don't care about 50-100W of extra power when discussing 400W+ overclocked i5/i7 rigs. Some people on our forum seem to care though.

Why would you assume such a large power use gap for GM200 vs 390X?

Do you think it's going to be a 200W part? Hardly; it's a monster die, bloated for compute.

Recall the GK104 -> GK110 situation, where GK104 was ~180W and GK110 ~230W, but with a 35% performance lead (which was higher at 4k). It was also unique in that GK110 had a much larger die than GK104, so it sacrificed perf/mm2 to reach the same perf/w efficiency.

As I posted before, the 980 is ~180W (similar to the 770), so how much higher performance do you expect from GM200? 35-50%? Then add that to the wattage, because efficiency at best will remain similar; at worst, the extra compute "fat" may make it less efficient in perf/W. Note that at ~600mm2 it's ~50% larger, a smaller jump than the GK104 -> GK110 die size increase.

If NV wants performance (980 + 40-50%), we're looking at a 240-270W part from GM200. Otherwise it's a milder 980 + 30% and we're getting ~780 Ti power use.
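The arithmetic behind that estimate is just iso-efficiency scaling (a sketch; the ~180W GM204 baseline and the performance targets are this post's figures, not official specs):

```python
GM204_WATTS = 180.0  # ~GTX 980 power, per the post above

def projected_power(perf_scale, base_watts=GM204_WATTS):
    """If perf/W stays flat, power scales linearly with performance."""
    return base_watts * perf_scale

for scale in (1.3, 1.4, 1.5):
    print(f"980 + {round((scale - 1) * 100)}% -> ~{projected_power(scale):.0f} W")
```

+30% lands near 780 Ti power use, and +40-50% lands around 250-270W on a flat-efficiency assumption, which is where the 240-270W estimate comes from.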
 
Feb 19, 2009
10,457
10
76
For the record, my prediction is that NA review sites will compare GM200 vs. 390X at 4K with 8x MSAA or DSR/VSR and push past that 4GB VRAM limit, so the 390X will show bad "frame latency" via FCAT as it flushes VRAM, while GM200 retains smooth "unplayably low fps" thanks to its 6/8GB of VRAM. :0

Otherwise at normal 4k settings, 390X should be a beastly GPU with insane bandwidth & low latency vram keeping those shaders fed.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
You misunderstand, RS. I'm saying that if all things performance-wise are equal, I'm going to buy the lower-wattage card, because if I have the choice I want to be more comfortable in my office. Power usage (as it relates to heat output) isn't a primary consideration, but it is one factor I consider. If, for instance, NV cards had mined equally to AMD cards while using less power, of course my farm would have been composed of them. However, since I'm down to SLI now rather than CF and I do crunch with my cards, I'm happy to say my office is much cooler with the 980s than with the 290s I had.
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,584
14
81
As I said, temperatures are not the biggest limitation in Hawaii overclocking but something else is the bigger bottleneck. For example, electron-migration, transistor density/leakage, transistor composition/frequency scaling, etc. The average water overclock on GTX780Ti is almost as good as LN2 on 290X, and you know which one runs cooler under those conditions.

Hawaii's higher transistor density makes it harder for the card to reach GK110 clocks. With a decent cooler Hawaii would surely still run hotter than GK110 (density is not the cause of that), but not this bad.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You misunderstand, RS. I'm saying that if all things performance-wise are equal, I'm going to buy the lower-wattage card, because if I have the choice I want to be more comfortable in my office. Power usage (as it relates to heat output) isn't a primary consideration, but it is one factor I consider. If, for instance, NV cards had mined equally to AMD cards while using less power, of course my farm would have been composed of them. However, since I'm down to SLI now rather than CF and I do crunch with my cards, I'm happy to say my office is much cooler with the 980s than with the 290s I had.

I get what you meant now. :thumbsup:

All things being equal I would also prefer a card that used less power. But chances are pricing, VRAM, overclocked performance, SLI vs. CF scaling will not really be equal. Add to that different features for NV and AMD, rendering and compute performance. Then there is GSync vs. FreeSync, GW titles and promo game bundles. Then there is also XFX's lifetime warranty vs. EVGA's warranty, display outputs (will 390X have HDMI 2.0, DP 1.2a/1.3?), idle power usage with multiple-monitors, etc.

And what about how the second-tier 390 non-X matches up against a second-tier GM200?

Imo, in the context of all of these factors, the 50W of power usage seems like a more minor point, no?
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Why would you assume such a large power use gap for GM200 vs 390X?

Do you think it's going to be a 200W part? Hardly; it's a monster die, bloated for compute.

Recall the GK104 -> GK110 situation, where GK104 was ~180W and GK110 ~230W, but with a 35% performance lead (which was higher at 4k). It was also unique in that GK110 had a much larger die than GK104, so it sacrificed perf/mm2 to reach the same perf/w efficiency.

As I posted before, the 980 is ~180W (similar to the 770), so how much higher performance do you expect from GM200? 35-50%? Then add that to the wattage, because efficiency at best will remain similar; at worst, the extra compute "fat" may make it less efficient in perf/W. Note that at ~600mm2 it's ~50% larger, a smaller jump than the GK104 -> GK110 die size increase.

If NV wants performance (980 + 40-50%), we're looking at a 240-270W part from GM200. Otherwise it's a milder 980 + 30% and we're getting ~780 Ti power use.

Remember, Maxwell reuses DP shaders for SP work, like GCN does. It does not have separate 32-bit and 64-bit shaders like Kepler does, so die bloat will be greatly decreased.

GM200 should be under 600 mm^2. While the compute resources of the chip (shaders, ROP, L2, and Memory controller) are doubled, the video engine, display, etc. are not.

I expect efficiency on par with the 980. Size really does not matter. GM204 and GM107 are pretty much on par in terms of efficiency. GM206 is a little behind.

I expect GM200 to be capable of +40-50% performance over the 980 at around that much more in power. Who knows, Nvidia might implement some other performance features.
 
Feb 19, 2009
10,457
10
76
Remember, Maxwell reuses DP shaders for SP work, like GCN does. It does not have separate 32-bit and 64-bit shaders like Kepler does, so die bloat will be greatly decreased.

GM200 should be under 600 mm^2. While the compute resources of the chip (shaders, ROP, L2, and Memory controller) are doubled, the video engine, display, etc. are not.

I expect efficiency on par with the 980. Size really does not matter. GM204 and GM107 are pretty much on par in terms of efficiency. GM206 is a little behind.

I expect GM200 to be capable of +40-50% performance over the 980 at around that much more in power. Who knows, Nvidia might implement some other performance features.

That's my point: efficiency in the best-case scenario is similar to the already-efficient GM204. Thus, performance gains will come with similar power increases.

If you expect 40-50% faster than the 980, then it's 40-50% more power use, at which point it's at least a 250W GPU.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
That's my point: efficiency in the best-case scenario is similar to the already-efficient GM204. Thus, performance gains will come with similar power increases.

If you expect 40-50% faster than the 980, then it's 40-50% more power use, at which point it's at least a 250W GPU.

Pretty much. Though Nvidia may (and likely will) make further improvements like they did with GM204 over GM107.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Before I quote your massive wall of text, I just want to point out (and I will also point it out several times below) that you ENTIRELY missed what I was trying to convey. ENTIRELY. It flew way over your head. You either didn't look up or it was just waaaay too high for you to see.

TL;DR: let me rephrase my previous post for you. You don't care about power consumption. I get that. That's fine. But I said: what if Fiji runs too hot / consumes too much power and basically requires water? What if AMD clocks it so high (to look awesome in reviews, and because WATERRRRRRR) that there is basically very little headroom left? And what if, despite its ultra-low temps, it still consumes a shitload of power and the extra 8% OC you might get out of it shoots power consumption up exponentially?

You gotta breathe a little bit and fully analyze what some people are saying before you go into your novel-like responses.

Overclocking is not as simple as drawing a direct correlation between temperatures and OCing headroom as you want to imply.

You're right. It has other variables, like the surface area available to dissipate the heat, the transistor density, the type of transistors used, the overall engineering, etc. Surface area and power consumption are probably the two most determining factors of heat.

But I'm not talking about heat. I'm talking about AMD backing themselves into a corner with respect to TDP on a single chip and forcing themselves to use water, not because it's cool, but because they might have no other choice. Such a chip would not have much headroom.


What 4 spots? This thread has countless examples of a 1x120mm radiator fitting inside miniITX cases. I am not sure why it matters to you specifically though because you would not purchase an AMD videocard so you aren't even the target customer.

Russian, quit making stuff up. You analyze things great sometimes, but then other times in your massive walls of text you drop snide comments like "so and so would never buy AMD." I have never once said I'd never buy AMD, and in fact I have bought several used AMD cards when I was months between systems and/or had my previous card fail on me. Please stop saying idiotic stuff like that in an attempt to demean others or just make yourself feel better in some weird way.

As I said already except for the 0.05% of PC gamers running Raven RVZ01 and RVZ02,

Anyway, the Node 304 won't fit a 295X2 unless you sacrifice CPU cooling. As you said, the RVZ01 won't work, the ML07 won't work, the SG05 won't work, Tiki-style systems won't work, many SFX Lian Lis won't work, and several other upcoming Sharkoon cases as well as other Silverstones won't work either. The N1M won't work, the A4-SFX won't work; the list goes on and on, and that's barely scratching the surface. You try to shoehorn SFX setups into some infinitesimally small userbase, but the fact is it's not nearly as small as you EXAGGERATE it to be, and it's growing way faster than outdated, oversized cases like Corsair and Antec mid-towers.

This is 100% not true.

You could not be any more wrong. It 100% is true. AMD made viral videos poking fun at Fermi's power draw. Every AMD fanatic on here went to town on Fermi's power draw, nonstop. Reference or not, it did not matter. It was a massive source of ridicule (and looking back on it, rightly so).

The reason people laughed at Fermi wasn't because it used 275W of power, but because it used that much power for 15-18% more performance over a 145W 5870. It's the idea of so little extra performance for 90% higher power usage while being 6 months late -- that's why people laughed at it.

It was laughed at for ALL those reasons. It used way more power than any GPU before it, ever. It was going to get laughed at regardless of whether the performance was in line or not.

If you only want to use sub-250W cards, that's fine but don't make statements like 265W of power is some ludicrous amount when NV's flagship cards have used that much for years and years and sold like hot cakes.

I don't think 480s and 580s "sold like hot cakes." I think 460s and 560 Tis sold like hot cakes. I think 5850s and 5770s sold like hot cakes. But I guess our definitions differ there. If AMD releases a card that requires water cooling, and with water cooling and ~55-60C temps it averages 265W power draw and only manages a 10% OC, then it's a piece of SH!T. If you want to stick a piece of SH!T in your computer, be my guest. I guess if AMD sells it at break-even or console margins they might get people to overlook how far they are falling behind the curve, but whatever.

If it's a closed-air design exhausting all its hot air, runs 85 degrees, isn't too terribly loud, and has 15+% OC headroom, then awesome: water cooling will just make it that much better. But require water cooling? Yeah, no thanks.

It has little to do with experience. If you were truly the target market for 250-300W flagship cards and 2-3 of those cards in CF/SLI, you wouldn't care much if your card used 250 or 300W of power as long as the performance, price/performance, VRAM, features, stability, etc. were there. You would need to buy a case, PSU, cooling system to accommodate >700W anyway and more if we include an overclocked i5/i7 CPU and monitor. That's why these are extreme systems not for everyone.

I'm not arguing with you; those are all valid and important factors. But so are 30% faster performance OC vs. OC and 200 fewer watts. 200 fewer watts after the 7-8 hours of straight gaming that these EXTREME SYSTEM GAMERS do will have a noticeable impact on non-basement-dwelling gamers.

It's nothing against you, or anyone, but some people are obviously not the target market for these types of balls-to-the-wall cards. Yet plenty of people ran dual Titans, dual 290Xs, dual 780Tis, dual 580s, dual 7970GHz overclocked, even triple and quadruple cards.

By plenty, you mean plenty of people on forums. Not plenty of people in comparison to the entire video card buying market. Not even close. And don't even try to compare people who use Trifire or 3-way SLI to people who use SFX systems. Neither of us knows jack squat.

It's just that YOU are not the target market for 500-750W CF/SLI setups, and that's fine. But you are making it sound like 250-300W flagship cards are something new, which they aren't, by any means. A lot of high-end enthusiasts overclocked their 580/780Ti/7970GHz/290X cards, and all of those use > 250W in overclocked states.

Sigh. No, I'm not making anything about flagship cards sound like an atrocity. Read between the lines, Russian. Read what I said. It wasn't difficult to interpret. If Fiji requires water cooling, runs super cool with that water, still consumes 265 watts despite its very low temps, and doesn't have any OC headroom, then it sucks. Do you not agree with that assessment?


I think you missed my entire point about relative power usage. Did you not see where I said that when a system already draws 400-450W+ of power, what's another 50-100W?

No, I saw it, and you're right, another 50W doesn't matter. But, as you demonstrated and I pointed out just above, you missed my entire point: what if Fiji requires water cooling, still doesn't have headroom, and still draws more power than the competition despite its ultra-low temps?

And 50 watts is never just 50 watts. Not with what you call "the target market." 50 watts becomes 200 watts or more per card once OCing is factored in.

It's interesting to see how so many of you now think 265W of power is a lot. So what do you guys think GM200/390X should use? 185W? :hmm:

Sigh... point was still missed here I see.


IF you guys are seriously complaining about 250-300W flagship cards...

I give up. When did I ever say 250 watts is too much?
 
Feb 19, 2009
10,457
10
76
I think it's going to come down to how much NV is willing to push GM200 above GM204. If it's tame at ~30%, then power wouldn't be much of an issue at ~780 Ti levels.

It's exciting times ahead either way, as we'll be with these big 28nm GPUs for a while, with signs of 14/16nm FinFET being pushed out further. I just hope both GM200 and the 390X are tuned for performance, because in many recent games 980-class performance isn't cutting it at normal resolutions and is basically worthless at 4K (or 1080p/1440p with DSR/VSR).

Also, it's the first time AMD has gone with a huge monolithic GPU approach since acquiring ATI. That's a milestone right there.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I would not be so quick to say that

If AMD releases a card that requires water cooling, and with water cooling and ~55-60C temps it averages 265W power draw and only manages a 10% OC, then it's a piece of SH!T.

is necessarily true, as it depends greatly on performance. That would be awesome at 70% over a 980 and terrible at 20% over a 980.

I will say that I agree with the assessment that requiring watercooling will limit adoption. Personally I do not want a watercooled card and would never buy one.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I would not be so quick to say that



is necessarily true, as it depends greatly on performance. That would be awesome at 70% over a 980 and terrible at 20% over a 980.

I will say that I agree with the assessment that requiring watercooling will limit adoption. Personally I do not want a watercooled card and would never buy one.

I'll wager that if the chip outperforms a 980 by 70%, it won't REQUIRE water cooling, because AMD will have a gold mine on their hands without needing to push the chip to the max and leave no headroom.
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Are people really arguing about the overclocking headroom and power consumption of an unreleased card?
Some of you should take a break.

The card is coming. Even if it beats NV there is nothing to be afraid of; relax. Maxwell will sell like hot cakes regardless of its performance and competitiveness, as long as it is branded Nvidia.

AMD may gain some bragging rights, but market share and financials will continue to grow on the green side.

Even if AMD releases a whole new lineup that destroys Maxwell (which they are not going to do), it will not make a dent in NV, just like GCN couldn't compete with Fermi at the market level.

Relax.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm surprised at just how little market share changed when Bitcoin mining was big and AMD was ruling it. Hardly a few percent at most; given the mindshare they had, I would've thought they went up a lot more than that...

Rory Read missed the boat and lacked the foresight to plan production to meet demand. They sold every chip they had and still lost market share. Then, when AMD finally upped supply, the mining market dried up. Anyone on these boards who was following mining could have predicted when the market was going to dry up; Read, though, couldn't predict what time the sun would rise the next day with the Weather Channel helping. Now they are left with unsold inventory and have to delay their next product. If he had done nothing and accepted that he simply missed the boat, he would have been better off. How incompetent do you have to be for doing nothing to have been the better option? /rant
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I check this thread daily for updates on the potential release date. Instead, I find pages of arguing over a GPU the majority of whose specs are still TBC. I wish I had the same level of passion as some of you, but there will be MORE than ample time to 'discuss' real-world performance/price/etc. when the cards actually hit the market.

Btw, last I heard we still don't have confirmation precisely when the flagship will arrive, right? The consensus seems to be Computex but AMD themselves have not confirmed it. Even if it is Computex, that could easily just be the unveiling, with actual cards only going on sale at a later date.

Right? Right?!

With that in mind, let's remember that we are currently in the month of February. Long time to go yet.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Yay... a red vs. green discussion based on nothing but speculation? Let me jump in.

The R9 390X is going to clock at like 2000MHz and have like 24GB of HBM VRAM on a 1024-bit bus... Nvidia clearly has no chance against that.

The card will also be sold for $100 only.

With those superior stats it is clear that AMD will own 95% of the market share by the end of the year.


*siiiiigh*


Can we just wait until we can actually benchmark GM200 and the 390X? Or rather the 380X vs. GTX 980 and the 380 vs. GTX 970... because that matters far more than some $1000-2000+ dual-GPU card. We all know the highest profit is not in those gigantic e-peen cards.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I agree with you about all the excessive posturing. What's the point? If you like it you'll buy it; if you don't, you won't. I'm not saying not to discuss the possibilities, etc.; that's why we're all here. But some people just want to show how right they are, when by the time the cards come nobody will even remember what anyone said.

As far as when we'll see cards? It's hard to say. AMD is so ninja about their GPUs.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yay... a red vs. green discussion based on nothing but speculation? Let me jump in.

The R9 390X is going to clock at like 2000MHz and have like 24GB of HBM VRAM on a 1024-bit bus... Nvidia clearly has no chance against that.

The card will also be sold for $100 only.

With those superior stats it is clear that AMD will own 95% of the market share by the end of the year.


*siiiiigh*


Can we just wait until we can actually benchmark GM200 and the 390X? Or rather the 380X vs. GTX 980 and the 380 vs. GTX 970... because that matters far more than some $1000-2000+ dual-GPU card. We all know the highest profit is not in those gigantic e-peen cards.

Nah. People will still buy nVidia. ;)
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Cooler Master had better do a good heatsink for Fiji.
Everyone uses the terrible-performing reference 290X for performance comparisons in their (NV's :p) reviews. Real cards, in the meantime, provide much (I mean MUCH) better performance (30MHz factory OC):
[Image: Shadow of Mordor benchmark chart]


That is 20% faster with only a 3% clock increase. This gain comes mostly from a good cooler and no throttling. It is like a whole new level of GPU performance, and we are talking about the same Hawaii GPU.

They need to nail the reference design to stop NV from continuing to exploit it in their reviewer guides.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Where is the confirmation from CM (or AMD) that the REFERENCE card will use this oft-mentioned liquid cooling?

I've seen plenty of rumors, and plenty of "sources" that agree with said rumors. But solid confirmation? Please do correct me if you have hard evidence for this.