[Sweclockers] Radeon 380X coming late spring, almost 50% improvement over 290X


raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Do you work for AMD or an AIB? How do you know that Fiji is 30-40% faster, not 45-50% compared to the 290X?

I don't get how a 550mm² chip with 4096 SPs, 256 TMUs, and HBM would not be at least 45% faster at high resolutions, unless we are talking about CPU-limited benches.

I am quite sure that AMD has a >50% performance improvement on the high-end R9 390X. The benches we saw yesterday are quite realistic: 60% in pure shader-limited benchmarks and an average of >50% (51%) in games. That number should be closer to 60% in the most demanding games, which are primarily limited by shader performance.

http://www.guru3d.com/news-story/amd-r3xx-benchmarks.html

GM200 and R9 390X should be quite close at 1080p and 1440p/1600p. The R9 390X should win decisively at 4K, though.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
[Image: AMD-putting-the-finishing-touches-on-300-series.jpg]
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Also, it appears to me that the island used to cover up the name on the latest leak is Treasure Island (located in the Fiji Islands).

[Image: X6ti8JY.jpg]

[Image: DkefhB4.jpg]

Would be cool if true. I like that kind of leak, where it isn't thrown in your face like most PR "leaks" are.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
First off, this is all speculation about the perceived efficiency gains of this card; we don't have any real numbers, just stuff purportedly leaked by ChipHell. Secondly, he is saying that if AMD needs an AIO to compete with an air-cooled NV solution, then they have failed, especially if GM200 has similar efficiency gains and doesn't rely on water cooling to attain that level of performance.

Also, I care about aesthetics when I build PCs; I don't need or want a couple of ugly AIOs bolted outside my system, and I suspect many others don't either.

This is *exactly* what I am saying. What I want is for AMD to deliver something that competes with the current NV offerings, not only in performance but in efficiency gains. My PC is already a space heater while gaming with 2x 670s. That's fine right now in winter, but in summer... I don't want to turn the AC up. I would LOVE to pick up a single card that gets me a small perf gain and lowers my power requirements. Right now, that's a 980... which I am holding off on.

Believe me: I've had my problems with AMD in the past...but they need to continue to exist. NV is showing that with the 970 debacle.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is *exactly* what I am saying. What I want is for AMD to deliver something that competes with the current NV offerings, not only in performance but in efficiency gains.

Did you miss all the points I made in this post? I'll re-iterate this point -- context matters:

Videocard efficiency (An engineer's perspective) vs. Total system gaming efficiency (A gamer's perspective)


Engineer: 165W TDP 980 vs. 300W TDP 380X. If the 380X is only 35% faster, it sounds absolutely horrible. The R9 380X is a horribly inefficient architecture/SKU. Doom and gloom. Big bonuses for the NV engineers, everybody!! :thumbsup::thumbsup:

PC Gamer: But wait, I am not an engineer; all I care about is my total system's efficiency in games, because that's what I bought my card for. Architectural efficiency is fine and dandy, but how does that translate to my gaming rig's overall efficiency?

So what do we get?

[Image: Power_03.png (total system power consumption chart)]


What if, in a game, the i7-4770K system with a 380X were 35% faster than the i7-4770K + 980, and its total system power usage were 35% greater, i.e. 290W x 1.35 ≈ 392W?

Notice that in that case the i7-4770K + 380X system's overall power efficiency is just as good as the i7-4770K + 980 system's.

Conclusion: from a gamer's, not an engineer's, point of view, perf/watt HAS to be considered in the context of overall system power usage relative to performance, because that is the efficiency a gamer actually experiences!

>>> If you compare perf/watt ONLY on a graphics card basis, you are comparing SKU/architectural efficiencies of the cards, not the gaming rig's efficiency. And since you can't game just on a videocard without CPU/Mobo/RAM/HDD/SSD, what do you think actually matters more? Add in the power usage of your gaming monitor too btw!
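To make the arithmetic explicit, here is a minimal Python sketch using the numbers from the post above (the 165W/300W TDPs, the ~290W system draw from the chart, and the assumed +35% performance); these are the post's hypothetical figures, not measurements:

```python
# Minimal sketch of card-level vs. system-level perf/watt,
# using the hypothetical numbers from the post above.

CARD_TDP_980 = 165.0        # W, rated TDP of the GTX 980
CARD_TDP_380X = 300.0       # W, the post's assumed TDP for the 380X
SYSTEM_POWER_980 = 290.0    # W, total system draw of the i7-4770K + 980 from the chart
PERF_980 = 1.00             # normalise the 980 system's performance to 1.0
PERF_380X = 1.35            # assumption from the post: the 380X system is 35% faster

# The post's hypothetical: total system power also rises by 35%.
system_power_380x = SYSTEM_POWER_980 * 1.35   # = 391.5 W, rounded to ~392 W above

# Card-only ("engineer's") perf/watt
card_ppw_980 = PERF_980 / CARD_TDP_980
card_ppw_380x = PERF_380X / CARD_TDP_380X
print(f"Card-only perf/W ratio (380X / 980): {card_ppw_380x / card_ppw_980:.2f}")  # ~0.74

# Whole-system ("gamer's") perf/watt
sys_ppw_980 = PERF_980 / SYSTEM_POWER_980
sys_ppw_380x = PERF_380X / system_power_380x
print(f"System perf/W ratio (380X / 980): {sys_ppw_380x / sys_ppw_980:.2f}")       # = 1.00
```

Same hardware, same hypothetical numbers; which ratio you quote just depends on where you draw the box around the measurement.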

My PC is already a space heater while gaming with 2x 670s. That's fine right now in winter, but in summer... I don't want to turn the AC up. I would LOVE to pick up a single card that gets me a small perf gain and lowers my power requirements. Right now, that's a 980... which I am holding off on.

You are looking at the wrong card, then. The flagship 300 series isn't about 670 SLI performance at 180W; it's about exceeding the 980's performance.

If you want a card as fast as 680 SLI at ~180W, that's a 980. Get a reference blower 980 and it's a 165W card.

[Image: power_peak.gif (peak power consumption chart)]


[Image: perfrel_2560.gif (relative performance at 2560 chart)]


The flagship R9 300 is about getting as close as possible to the 500W 295X2 at 300W. You are just looking at a completely different class of cards, which is why what you want out of a graphics card upgrade and what the flagship R9 300 card aims to achieve are misaligned.

Also, if you upgrade every 2.5 years, you can just get 970s today and get something faster in 2.5 years. The 4GB of VRAM is unlikely to have a major impact if you are the type who upgrades within 2-2.5 years. Alternatively, you can wait for the GM200/R9 390 cards and just buy a discounted 980. Whether you buy a GM200 or an R9 390X (name?) in the summer of 2015, in 1.5 years or so a mid-range 14nm Pascal card will match their performance at 180W and make both of them look inefficient. That's just the nature of the GPU market.

If you want the most efficient and most future-proof gaming rig, the best way to achieve that is to upgrade more often. That's all I can say.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
The monitor point is pretty sound. I had my monitors set brighter than I wanted or needed, and I turned them down to comfortable/low a few days ago. The result was that my power consumption went down by 50-60W. That's a bigger difference than any change in video card load power would make.

Also, compare power at gaming load. The single biggest thing AMD could do to improve their TDP figures and make them competitive with NVIDIA's would have no bearing on performance, because it would simply be adopting the NV way of specifying TDP.

Flagships are for performance. Every step down from there is to trade that performance away for price and power. If the flagship doesn't hit the power numbers you want, that doesn't matter. What matters is if the part that hits the power target you want hits the performance numbers you want. The existence of a higher performance, higher power part does not prevent you from having a lower power part that you want.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
@RussianSensation
You're all about logic and cold hard math, which rhetoric somehow just won't absorb. The same points have been answered several times over in the last few pages. I really appreciate and respect your tenacity.

@RampantAndroid
Believe me, I've had more driver/hardware problems on Nvidia, as have all my friends who now won't touch anything Nvidia with a bargepole; I shared as much in the other thread.

What is efficiency? Performance per watt. Is the 380X, or whatever it ends up being called, an excellent performer on that metric? Yes, as good as any other card on the market, especially since performance doesn't increase linearly with power consumption. You could even say it is better than any other product on the market, and you would be right.
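As a rough illustration of why performance doesn't scale linearly with power, here is a Python sketch using the standard dynamic-power approximation (P ≈ C·V²·f); the 20% clock and 10% voltage bumps below are made-up illustrative values, not figures for any real card:

```python
# Rough illustration: dynamic power scales roughly with V^2 * f, so chasing
# higher clocks costs power superlinearly relative to the performance gained.
# The numbers below are illustrative assumptions, not data for any real GPU.

def relative_power(voltage_scale: float, clock_scale: float) -> float:
    """Dynamic CMOS power approximation: P ~ C * V^2 * f (capacitance held fixed)."""
    return voltage_scale ** 2 * clock_scale

clock_scale = 1.20     # assume a 20% higher core clock...
voltage_scale = 1.10   # ...which needs ~10% more voltage to stay stable (assumption)

perf_gain = clock_scale - 1.0                                   # ~+20% in a GPU-bound game
power_gain = relative_power(voltage_scale, clock_scale) - 1.0   # 1.1^2 * 1.2 - 1 = +45%

print(f"Performance: +{perf_gain:.0%}, power: +{power_gain:.0%}")
print(f"Perf/W vs. stock: {clock_scale / relative_power(voltage_scale, clock_scale):.2f}x")
```

A chip pushed well past its efficiency sweet spot can therefore look poor on perf/watt even when the underlying architecture is fine.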

As RS advised, if you want something that could replace your 670 SLI at a certain TDP, then you're evaluating the wrong hardware. You need to look specifically at cards that will do that, not at top-of-the-line cards that are yet to be released, and certainly not at the full-blown chips. Do you think a GM200 will draw only as much as a 980? I certainly don't; I don't see how a card with 50% more shaders, more performance, and extra compute capability compared to the 980 won't exceed the 980's TDP. Does that make GM200 a horrible card too, as you keep incessantly repeating the 380X is? No. Neither of these cards is bad (we don't know yet whether someone is going to mess up; we just don't), and while GM200 may be priced horribly, as Nvidia cards have been of late, that still doesn't make it a bad card. Very certainly a bad buy if you have any brains, but a bad card? No.

On the other hand, I'm genuinely puzzled by your behaviour: you keep incessantly bringing up performance per watt, but never mention performance per dollar (or any currency). So much for a level-headed discussion of upcoming cards and their merits.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
My prediction: Power consumption won't matter again if the 380x uses less than GM200
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Does that make GM200 a horrible card too, as you keep incessantly repeating the 380X is?

Where in the hell have I said the 380 is a bad card? I seem to understand better than the rest of you that it's an unreleased card, and I put absolutely no faith in any of the leaks. I'm speaking generally. I'm waiting to see what the 380 is.

The flagship R9 300 is about getting as close as possible to the 500W 295X2 at 300W. You are just looking at a completely different class of cards, which is why what you want out of a graphics card upgrade and what the flagship R9 300 card aims to achieve are misaligned.

If you want the most efficient and most future-proof gaming rig, the best way to achieve that is to upgrade more often. That's all I can say.

You seem to be lumping the 380 and 390 together here. Moreover, your entire example of the 380 vs. 980 with a 4770K is based on the idea that performance and power scale at a 1:1 ratio... which I highly doubt will be true.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Where in the hell have I said the 380 is a bad card? I seem to understand better than the rest of you that it's an unreleased card, and I put absolutely no faith in any of the leaks. I'm speaking generally. I'm waiting to see what the 380 is.

You seem to be lumping the 380 and 390 together here. Moreover, your entire example of the 380 vs. 980 with a 4770K is based on the idea that performance and power scale at a 1:1 ratio... which I highly doubt will be true.

Right, were you reading your own posts, where you criticised a card for using (almost) as much power as a 290X while providing about 40% more performance, calling that not good in as many words, and only because it has an AIO water cooler for reference? Sure, you didn't use the word 'bad', but the implication was obvious enough. Not many of those replying to your posts missed it.

You keep banging on about efficiency, but you clearly don't have a grasp of performance per watt, or we wouldn't be replying to the same posts, worded slightly differently, again and again.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
yeah I agree, after costs, drivers, smoothness, quietness and power consumption, I wonder what's next?
I have an answer, but the source is most likely under NDA... and the answer is, [drumroll], DX12. The source, Fottemberg, a contributor to Bits and Chips, mentioned something along the lines of current Maxwell and a storm of fecal matter that may arrive in April. Which is why I think that's when Windows 10 is due to hit the shelves.
 

utahraptor

Golden Member
Apr 26, 2004
1,078
282
136
yeah I agree, after costs, drivers, smoothness, quietness and power consumption, I wonder what's next?

We have hardly scratched the surface!

  • Does the side of the card feature a configurable light up LED logo?
  • Is the card too long / too short?
  • Is there a back plate?
  • You say the coils are not whining? Yes they are! I can hear them from here!
  • There may not be enough accessories in the box.
  • The chosen display outputs are clearly not optimal. They should be mini if they are full size and they should be full size if they are mini.
  • Does the factory of the brand you chose feature more or less suicides per hundred workers than my brand?
  • Your card blows out the rear of the case and it should exhaust internally to be quiet.
  • Your card exhausts in the case and causes the heat to just build and build and build!
  • You can hardly call that a 2 slot solution. That is clearly 2.25 or 2.5 slots. You could not TRI SLI/Crossfire that on most mobos.
  • Oh, I see the card has a 6 + 8 power connector set up. If you had that on nitrogen you are going to tap out on power before it even gets cold.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
We have hardly scratched the surface!

  • Does the side of the card feature a configurable light up LED logo?
...
Your list is better than my post. Cheers!
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
We have hardly scratched the surface!

  • Does the side of the card feature a configurable light up LED logo?
...
Hahahaha, wtf bro, stop making me laugh. People around me are looking at me funny.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
This is *exactly* what I am saying. What I want is for AMD to deliver something that competes with the current NV offerings, not only in performance but in efficiency gains. My PC is already a space heater while gaming with 2x 670s. That's fine right now in winter, but in summer... I don't want to turn the AC up. I would LOVE to pick up a single card that gets me a small perf gain and lowers my power requirements. Right now, that's a 980... which I am holding off on.

Believe me: I've had my problems with AMD in the past...but they need to continue to exist. NV is showing that with the 970 debacle.

It sounds like what you want is for the 380X (or whatever their upcoming flagship is called) to have near the same TDP as the 980 and near the same performance with a good reference air cooler. Is that the efficiency you're talking about? If so, just get a 980.

There isn't anything exciting about staying at the status quo. We've had basically the same level of performance since the 780Ti came out almost 1.5yrs ago. From the looks of it, most people in this thread want Nvidia and AMD to release no-holds-barred flagships with as much performance as possible. If TDP has to hit 250W+ to get there, so be it. I'd much rather see Nvidia/AMD put the efficiency gains to good use in a true flagship card with killer performance.
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
But I guess mature conversation is hard to come by these days...

Case in point...

My prediction: Power consumption won't matter again if the 380x uses less than GM200

yeah I agree, after costs, drivers, smoothness, quietness and power consumption, I wonder what's next?

We have hardly scratched the surface!

  • Does the side of the card feature a configurable light up LED logo?
...

It always comes down to PhysX. Can't believe you guys forgot about that. Surely, NV's marketing is failing there.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Right, were you reading your own posts, where you criticised a card for using (almost) as much power as a 290X while providing about 40% more performance, calling that not good in as many words, and only because it has an AIO water cooler for reference? Sure, you didn't use the word 'bad', but the implication was obvious enough. Not many of those replying to your posts missed it.

You keep banging on about efficiency, but you clearly don't have a grasp of performance per watt, or we wouldn't be replying to the same posts, worded slightly differently, again and again.

I'm saying two things. First, cards that need an AIO to function are bad in my eyes; I'm looking at you, 295X2. If the 380/390 ran so hot that it needed an AIO to function, I would view it as a failure (I was quite clear on this in my first post that started this chain). Second, I view the efficiency improvements Maxwell came with as a very good thing, and I would like to see AMD make similar improvements. If you want a non-GPU example, look at Ivy Bridge compared to Haswell: Haswell (using numbers from AT) consumed 25% less power at idle, while under load it drew 11% more than IB but delivered 13% higher performance.
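Taking those AT-sourced figures at face value, a quick worked calculation shows what they mean in perf/watt terms (the 25%/11%/13% numbers are from the post above; the arithmetic is only illustrative):

```python
# Worked arithmetic on the Ivy Bridge -> Haswell figures quoted above
# (25% lower idle power, 11% higher load power, 13% higher performance).

idle_power_ratio = 1.0 - 0.25   # Haswell idle power relative to Ivy Bridge
load_power_ratio = 1.0 + 0.11   # Haswell load power relative to Ivy Bridge
perf_ratio = 1.0 + 0.13         # Haswell performance relative to Ivy Bridge

load_perf_per_watt = perf_ratio / load_power_ratio   # ~1.018, i.e. ~2% better under load
print(f"Idle power: {idle_power_ratio:.0%} of IB")
print(f"Load perf/W: {load_perf_per_watt:.3f}x IB (~{load_perf_per_watt - 1:.1%} better)")
```

On those numbers, the big generational win is at idle, with a more modest gain under load.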

Cards that ship with an AIO as the default (regardless of reason) are, in my eyes, a worse value than a proper heatsink solution. Moreover, I think that if AMD's default solution were an AIO with a radiator you need to mount on the back of the case, they would see serious issues. In the past, cards like the 5970 were too long to fit in some cases. With an AIO, some cases won't have a mounting spot within reach, or some users might already have an H100 or similar taking up that space. AMD would limit themselves by adding another "will it fit" question to buying their card. It's one thing for a 5970 or a 295X2 to have this problem, given how expensive those cards are. It's another thing if a more mainstream card, such as a 380, has it.

It sounds like what you want is for the 380X (or whatever their upcoming flagship is called) to have near the same TDP as the 980 and near the same performance with a good reference air cooler. Is that the efficiency you're talking about? If so, just get a 980.

There isn't anything exciting about staying at the status quo. We've had basically the same level of performance since the 780Ti came out almost 1.5yrs ago. From the looks of it, most people in this thread want Nvidia and AMD to release no-holds-barred flagships with as much performance as possible. If TDP has to hit 250W+ to get there, so be it. I'd much rather see Nvidia/AMD put the efficiency gains to good use in a true flagship card with killer performance.

I've looked at a 980 because it is the flagship out there right now that isn't a downgrade for me. A 970 would be a downgrade in some instances, and a 290X similarly isn't a real upgrade.
 
Feb 19, 2009
10,457
10
76
Case in point...

It wouldn't have degenerated to that without users coming in here claiming that, even with +50% performance and efficiency gains, the card is automatically bad because the reference cooler is water. That kind of stupid begets a non-serious response.

You are free to redirect the discussion back to sane topics if you want to contribute something useful.