R9 380X rumor and speculation thread


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Don't get me wrong. I couldn't care less how good of a deal it is for AMD. They are the ones who were dumb enough to reduce the price of the 290's instead of releasing the 390's last year when the GM204 hit (or, if they were really smart, before it hit). They did the same thing with the 7900's. You could get a 3GB 7950 for $200 in Q4 2013, IIRC. That was while people were mining with them, too.

AMD needs to keep the engineers designing and building new product and get some people in marketing whose job it is to sell stuff. Engineers are way too inside-the-box in their thinking to be effective at sales.

For example, how many times do we hear that AMD is selling old tech because they stick with the GCN naming while nVidia goes from Kepler, to Maxwell, to Pascal, to Volta? People assume they are getting the same old thing from AMD and something entirely new, improved, and exciting to rave and hype over from nVidia. In reality the industry still hasn't caught up to GCN yet. It's had hardware capability just sitting there waiting for Windows to take advantage of it. DX12 is here and AMD doesn't have to change their hardware at all to take advantage of it.

Meanwhile, despite claims that nVidia hardware has been DX12 capable since Fermi, we don't even know whether or not Pascal is going to be able to completely take advantage of DX12 in hardware. So nVidia with their marketing machine convinces everyone that it doesn't matter. By the time you need it we'll have it, and you'll want a new card by then anyway. And if we don't have it we'll emulate it in software. And because you know we are absolutely awesome with drivers and software (because we have everyone tell you that we are), you just know we'll do it. And it'll be even better than having hardware do it directly, even though that's half the point of DX12. But don't worry. Here, buy another card now anyway. We'll let you know when you need a new one.
 

Rannar

Member
Aug 12, 2015
52
14
81
Or maybe we'll see two variants:

  • Variant 1: 2048 shaders, 256-bit, 4GB, targeting the price-conscious 1080p market.
  • Variant 2: 2048 shaders, 384-bit, 6GB, targeting 1440p. It would be more efficient and cheaper to produce than Hawaii.


A 380X with 2048 shaders, 384-bit, and 3GB for around 240€ (Eastern Europe) would be the perfect card for 1080p.
4GB and 6GB versions would reach the 290's price range and not be worth it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yeah, but then the 380X will be crapped on for not having HDMI 2.0 or 4K HEVC, and for worse perf/watt against Maxwell. At least with a 290 the consumer gets amazing performance for the price. NV-loyal customers won't move over to a 380X; they'll either save money and get a 960 or pay more for a 970. The only way is if AMD offers more of everything, which means they would need to price the 380X 4GB at $159. Only then might NV loyalists even consider it over the 950/960.

Think about it: all the same marketing that sells the 950/960 will be intact against the 380X, but the 380X will be even slower than a 290. How in the world would it prove to be more successful in sales than a $250 290 that failed to do so for 8-9 months?

Again, it's letting the engineers run things. They look at DP and can't understand why anyone would want to use HDMI 2.0 instead. What does it cost to add a little chip and an electrical switch to convert one of the DP outs to HDMI 2.0? A dollar? Two dollars? Cheap, anyway. To be honest, I don't know what they'd need to do for HEVC. As far as perf/W goes, I'm pretty convinced that when the software starts using more of the fixed-function hardware in GCN it'll be just fine. Look at how much more efficient GCN has been than Kepler/Maxwell in certain compute tasks that are capable of using AMD's hardware.

We're back to marketing. Efficiency depends on the task. They need to show the benefit (not just describe the feature) of the things GCN is good at. Instead they just sit there and act like their customers buy hardware to find Easter eggs after the fact.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
A 380X with 2048 shaders, 384-bit, and 3GB for around 240€ (Eastern Europe) would be the perfect card for 1080p.
4GB and 6GB versions would reach the 290's price range and not be worth it.

AMD's 384-bit memory bus isn't cost efficient. The 512-bit bus for Hawaii actually uses fewer transistors and takes up less space than the 384-bit bus. The 256-bit bus is less than half the size of the 384-bit one and, with the compression techniques they have now, offers plenty of bandwidth. It's not like you are going to need more than 4GB of RAM either.
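As a rough sanity check on the bandwidth side (a back-of-the-envelope sketch, not the poster's numbers: the GDDR5 data rates below are just typical speeds for a 285-style and a 280X-style card, and the ~40% figure is AMD's own best-case compression claim for Tonga):

```python
# Back-of-the-envelope GDDR5 bandwidth comparison. Data rates are typical
# retail speeds (5.5 Gbps for an R9 285-style card, 6.0 Gbps for an R9
# 280X-style card), not confirmed 380X specs.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s for a GDDR5 memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

raw_256 = bandwidth_gb_s(256, 5.5)   # ~176 GB/s
raw_384 = bandwidth_gb_s(384, 6.0)   # ~288 GB/s

# Tonga-style delta color compression; ~40% is AMD's best-case marketing
# figure, so treat the "effective" number as an upper bound.
effective_256 = raw_256 * 1.4        # ~246 GB/s best case

print(f"256-bit raw:       {raw_256:.0f} GB/s")
print(f"384-bit raw:       {raw_384:.0f} GB/s")
print(f"256-bit effective: ~{effective_256:.0f} GB/s with compression")
```

Even best case the 256-bit bus doesn't quite match the raw 384-bit figure, but it gets close enough that, for a 1080p card, the die area and board cost savings arguably win out.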
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
They need to show the benefit (not just describe the feature) of the things GCN is good at. Instead they just sit there and act like their customers buy hardware to find Easter eggs after the fact.

Yeah, the problem is not a lack of technology. The problem is that they can't turn their technology into sales. And that is partly because they are thinking too far into the future.

I would have released Fiji (the Nano first and then the Fiji Fury; I wouldn't even bother with the Fiji Fury X) alongside 1-2 DX12 games specifically optimized for the GCN architecture, a day or two after the Windows 10 release.
Give those games to reviewers to include and benchmark in their reviews and watch the crowd be amazed.

I don't believe anyone would say that the Fiji Nano at $650 is overpriced after that ;)

edit: That would also bring lots and lots of R9 3xx sales.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Yeah, the problem is not a lack of technology. The problem is that they can't turn their technology into sales. And that is partly because they are thinking too far into the future.

I would have released Fiji (the Nano first and then the Fiji Fury; I wouldn't even bother with the Fiji Fury X) alongside 1-2 DX12 games specifically optimized for the GCN architecture, a day or two after the Windows 10 release.
Give those games to reviewers to include and benchmark in their reviews and watch the crowd be amazed.

I don't believe anyone would say that the Fiji Nano at $650 is overpriced after that ;)

edit: That would also bring lots and lots of R9 3xx sales.

I think you guys may be too harsh on AMD. What was Mantle if not a direct attempt to maximise the value of GCN's inherent capabilities?

Yeah, AMD knew Mantle was never going to take off if NV wasn't on the train, but it also knew that it would force the industry to abandon DX11 much faster, which is what it did.

So I do think AMD played its hand fairly well given its constraints. It's just that the console contracts turned out to be a chain on AMD rather than a boon. NV could change their archs a lot more to suit the moment. Maxwell got so energy efficient by basically throwing out a ton of stuff. If it's 2014 and someone tells you that by 2016 your brand-new Maxwell card won't be so great at DX12, does it affect your purchase? I think most people upgrade on a 2-3 year cadence anyway.

The issue for AMD can't be blamed on Mantle, on GCN or even on console contracts. They have been underdogs in the GPU industry for a long time and I don't think that's a coincidence. Just saying it boils down to NV FUD is wishful thinking.

It's their plunge from 35% market share - which they held for many years - down to just 15% that alarms me. That's new and unprecedented. And if they play their cards right, they ought to catch up with NV in the next year or so. Longer term, however, they probably need to split up and/or be bought by someone. Their GPU business has been artificially hamstrung for far too long. The recent changes, which grant Raja and his team more autonomy and independence, are a step in the right direction, but they're still not enough.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think you guys may be too harsh on AMD. What was Mantle if not a direct attempt to maximise the value of GCN's inherent capabilities?

Yeah, AMD knew Mantle was never going to take off if NV wasn't on the train, but it also knew that it would force the industry to abandon DX11 much faster, which is what it did.

So I do think AMD played its hand fairly well given its constraints. It's just that the console contracts turned out to be a chain on AMD rather than a boon. NV could change their archs a lot more to suit the moment. Maxwell got so energy efficient by basically throwing out a ton of stuff. If it's 2014 and someone tells you that by 2016 your brand-new Maxwell card won't be so great at DX12, does it affect your purchase? I think most people upgrade on a 2-3 year cadence anyway.

The issue for AMD can't be blamed on Mantle, on GCN or even on console contracts. They have been underdogs in the GPU industry for a long time and I don't think that's a coincidence. Just saying it boils down to NV FUD is wishful thinking.

It's their plunge from 35% market share - which they held for many years - down to just 15% that alarms me. That's new and unprecedented. And if they play their cards right, they ought to catch up with NV in the next year or so. Longer term, however, they probably need to split up and/or be bought by someone. Their GPU business has been artificially hamstrung for far too long. The recent changes, which grant Raja and his team more autonomy and independence, are a step in the right direction, but they're still not enough.

Mantle was good and so is Freesync. The engineering is fantastic. Hardware and software. They just don't sell it worth a damn.
 

Rannar

Member
Aug 12, 2015
52
14
81
AMD's 384-bit memory bus isn't cost efficient. The 512-bit bus for Hawaii actually uses fewer transistors and takes up less space than the 384-bit bus. The 256-bit bus is less than half the size of the 384-bit one and, with the compression techniques they have now, offers plenty of bandwidth. It's not like you are going to need more than 4GB of RAM either.

Yes, but the 384-bit memory controller is still there; why not use its full potential?
PCB costs would be higher, but one less gigabyte of memory would compensate for that compared to the 4GB 256-bit version.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Mantle was good and so is Freesync. The engineering is fantastic. Hardware and software. They just don't sell it worth a damn.
This is what I've been saying about AMD this whole time. The underlying idea is good. The execution is a joke.
 
Oct 27, 2012
114
0
0
Yes, but the 384-bit memory controller is still there; why not use its full potential?
PCB costs would be higher, but one less gigabyte of memory would compensate for that compared to the 4GB 256-bit version.

It was rumored that it had a 384-bit bus, but I don't believe it does. If they had it, don't you think they would use it? Unless they had a large batch with bad controllers or something, but I doubt that. Also, I can't say for sure, but I've read somewhere that it is actually cheaper to go with a 256-bit bus and 4GB of RAM than 384-bit and 3GB; the 1GB less of RAM does not compensate for it.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Yes, but the 384-bit memory controller is still there; why not use its full potential?
PCB costs would be higher, but one less gigabyte of memory would compensate for that compared to the 4GB 256-bit version.

I think when you compare the 280X to the 285, you can see that Tonga does just fine with the 256-bit bus.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It was rumored that it had a 384-bit bus, but I don't believe it does. If they had it, don't you think they would use it? Unless they had a large batch with bad controllers or something, but I doubt that. Also, I can't say for sure, but I've read somewhere that it is actually cheaper to go with a 256-bit bus and 4GB of RAM than 384-bit and 3GB; the 1GB less of RAM does not compensate for it.

If this is Tonga, it sure has a 384-bit memory controller.

[Tonga die shot: tonga-crys800.jpg]


And Tahiti (HD7970):
[Tahiti die shot: Lc2Qf.jpg]
 

xorbe

Senior member
Sep 7, 2011
368
0
76
I've changed my mind; those do look like 6 ports on Tonga. The internal tiles seem rotated.
 

gamervivek

Senior member
Jan 17, 2011
490
53
91
AMD's 384-bit memory bus isn't cost efficient. The 512-bit bus for Hawaii actually uses fewer transistors and takes up less space than the 384-bit bus. The 256-bit bus is less than half the size of the 384-bit one and, with the compression techniques they have now, offers plenty of bandwidth. It's not like you are going to need more than 4GB of RAM either.

384-bit bus for Tahiti or Tonga?

I doubt that AMD went with the ROP mismatch and a crossbar for Tonga à la Tahiti. Tonga with 48 ROPs would rock if AMD could somehow get those shaders working on par with Maxwell's.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
384-bit bus for Tahiti or Tonga?

I doubt that AMD went with the ROP mismatch and a crossbar for Tonga à la Tahiti. Tonga with 48 ROPs would rock if AMD could somehow get those shaders working on par with Maxwell's.

From the Tonga die shot above, I don't believe there are more than 32 ROPs.
 

naukkis

Golden Member
Jun 5, 2002
1,020
853
136
If I'm not mistaken, ROPs are decoupled from the MC from Tahiti onward; that means Hawaii, Tonga, and Fiji.

Nope, only Hawaii, and it's just plain stupid; running ROP data through a crossbar increases the GPU's power consumption for a small amount of saved die size.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Nope, only Hawaii, and it's just plain stupid; running ROP data through a crossbar increases the GPU's power consumption for a small amount of saved die size.

Tahiti (GCN 1.0) ROPs are decoupled from MC

Hawaii (GCN 1.1) ROPs are decoupled from MC

click


Tonga and Fiji (GCN 1.2) ROPs are decoupled from the MC

[slide: AMDGeoFront_575px.jpg]
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Don't get me wrong. I couldn't care less how good of a deal it is for AMD. They are the ones who were dumb enough to reduce the price of the 290's instead of releasing the 390's last year when the GM204 hit (or, if they were really smart, before it hit). They did the same thing with the 7900's. You could get a 3GB 7950 for $200 in Q4 2013, IIRC. That was while people were mining with them, too.

AMD needs to keep the engineers designing and building new product and get some people in marketing whose job it is to sell stuff. Engineers are way too inside-the-box in their thinking to be effective at sales.

For example, how many times do we hear that AMD is selling old tech because they stick with the GCN naming while nVidia goes from Kepler, to Maxwell, to Pascal, to Volta? People assume they are getting the same old thing from AMD and something entirely new, improved, and exciting to rave and hype over from nVidia. In reality the industry still hasn't caught up to GCN yet. It's had hardware capability just sitting there waiting for Windows to take advantage of it. DX12 is here and AMD doesn't have to change their hardware at all to take advantage of it.

Meanwhile, despite claims that nVidia hardware has been DX12 capable since Fermi, we don't even know whether or not Pascal is going to be able to completely take advantage of DX12 in hardware. So nVidia with their marketing machine convinces everyone that it doesn't matter. By the time you need it we'll have it, and you'll want a new card by then anyway. And if we don't have it we'll emulate it in software. And because you know we are absolutely awesome with drivers and software (because we have everyone tell you that we are), you just know we'll do it. And it'll be even better than having hardware do it directly, even though that's half the point of DX12. But don't worry. Here, buy another card now anyway. We'll let you know when you need a new one.

Agreed with a lot of this post, but woof, that is some sour grapes at Nvidia.



I personally don't see AMD benefiting from this attempt to shake off the value image. It's basically burned onto them at this point. For too long they sat smiling in their perf/cost crown.

What they need to do is drop the AMD name and bring ATI back. With a decent product and a higher price point, people might just remember the ATI of old, and gladly pay the cost of admission (I know I would).

Or better yet, sell the GPU patents to Intel and let Intel curbstomp Nvidia. That would be the best :D
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Agreed with a lot of this post, but woof, that is some sour grapes at Nvidia.



I personally don't see AMD benefiting from this attempt to shake off the value image. It's basically burned onto them at this point. For too long they sat smiling in their perf/cost crown.

What they need to do is drop the AMD name and bring ATI back. With a decent product and a higher price point, people might just remember the ATI of old, and gladly pay the cost of admission (I know I would).

Or better yet, sell the GPU patents to Intel and let Intel curbstomp Nvidia. That would be the best :D
I think they should at least try effective marketing first. That's usually a good strategy.