[BitsAndChips] 390X ready for launch - AMD ironing out drivers - Computex launch

Page 45

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why do you always have to write a freaking article every time you respond to someone? It at least makes me less inclined to reply.

I only wrote 4 small paragraphs to you and the last one is a summary of where I would like the prices to be. The paragraphs above it explain why I think the prices should be where they are. If you don't want to read that info, you could have skipped to the pricing I provided instead. Is English not your first language? It takes me less than 3 min to type what I just typed which means it's < 3 min to read!

Who told you the Hawaii will continue to be priced at $249 in the 300 series?

No one. My point is the R9 290X is already $280 today and the R9 290 is $240. If AMD prices Hawaii rebrands at > $300, they should all be faster than a 970, which is essentially near 980 level of performance (380 = 290X, 380X = 980).

Shouldn't AMD be rewarded for putting out more efficient GPUs, like Nvidia is with their cards? They are not running a charity.

Sure, some small reward for perf/watt. I personally assign very little value to perf/watt as electricity is dirt cheap for me and my PSU could handle 3x275W cards with ease. That's why I found 980's $550 price absurd compared to 290X/970. If someone does assign a lot more value to perf/watt, I guess this release sounds exciting to them. I care about performance and price/perf not power usage. If AMD can't deliver on those 2 metrics, R9 300 series is a fail to me. If R9 390X uses 1W of power, costs $549 and was as fast as a 980, I would not be impressed.

AMD has low prices on the cards now probably because the cards look bleak in comparison to Maxwell, and to clear out inventory of current TSMC 28nm cards. Efficiency is a big part of the reason the GTX 980 is selling like it does.

I think the efficiency hype behind it is mostly marketing. Review sites tarnished the image of all R9 200 cards while continuously ignoring after-market options. In reality, an after-market 290X is both cool and quiet. So those 2 negatives are irrelevant for after-market buyers, but the average Joe ignores reviews of after-market cards post launch.

As far as power usage goes, by the time you build an entire i7 system with a 980 or 290X, the power usage is not worlds apart. Again, pure marketing at work. At North American electricity rates, this is pocket change, and we are still talking about an entire rig that uses < 400W. Perf/watt is one of the most misleading marketing tactics GPU makers use to sell cards. I was really hoping AMD doesn't go down this path of charging huge premiums for R9 300 cards because of that....
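Back-of-envelope, the "pocket change" claim checks out. Here's a quick sketch — every input is an assumption for illustration (a ~60W full-system gap under load, 4 hours of gaming a day, a typical North American rate of $0.12/kWh), not a measured figure:

```python
# Rough annual electricity cost of a ~60 W power gap between two rigs.
# All inputs are assumptions for illustration, not measured figures.
watt_gap = 60          # extra watts drawn by the less efficient rig under load
hours_per_day = 4      # assumed daily gaming time
rate_per_kwh = 0.12    # assumed North American rate, USD

kwh_per_year = watt_gap * hours_per_day * 365 / 1000   # 87.6 kWh
annual_cost = kwh_per_year * rate_per_kwh              # ~$10.51
print(f"{kwh_per_year:.1f} kWh/year -> ${annual_cost:.2f}/year")
```

Even doubling the wattage gap or the daily hours keeps the difference well under $50 a year.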

A $2xx R9 370X with HBM and a TDP of 140W will sell a heck of a lot more than an R9 290 at, say, 270W. Trust me, the price is very good for what you are getting ;)

Ya, you are right. For me though, I care about perf., not power usage. So a 140W R9 370X that costs the same as a 290 but uses half the power is an amazing engineering accomplishment, but it does little to move price/perf or overall perf. I guess that's what the market wants. I will focus my attention on R9 390/390X/GM200 because, from what you are hinting, the R9 380X does not sound like it will even beat a 290X by much, with most of the focus on perf/watt. That's meh for me.

If it is ~GTX780 and costs $259 ;) with 140W of TDP, it's a hell of a GPU.

How? That's too expensive. The R9 290X is nearly 25% faster than a 780. At that point, if I wanted a power-efficient card, I'd try to find a GTX970 for $295-300: $40 more for 15% more performance. Most of the market will buy a slower NV card for more $. If the R9 370X is $259, most people will keep buying a 970. NV can just drop the GTX970 to $299 MSRP, and with $20 rebates from AIBs, it's game over for the 370X at $259. I guess if we compare it to a 960 4GB, then it would be amazing. :awe: I still think the price is a tad too high.
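For what it's worth, at those (hypothetical, unconfirmed) prices the two cards land at almost identical performance per dollar, which is exactly why the extra absolute performance of the 970 would win the sale. A quick sketch with the assumed numbers from the post above:

```python
# Relative performance per dollar at hypothetical prices.
# perf is normalized to the rumored R9 370X = 1.00; the discounted
# GTX 970 is assumed 15% faster, per the post above. Illustration only.
cards = {
    "R9 370X ($259)": (259, 1.00),
    "GTX 970 ($299)": (299, 1.15),
}
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf units per $1000")
```

Both come out near 3.85-3.86 perf units per $1000 — the same value proposition, so the tiebreakers (absolute perf, efficiency, brand) all favor the 970.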
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
RS, I think the R9 380 will be around GTX980 level; HP didn't put it in their next-gen computers without a purpose.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Maybe my joke that the definition of TDP was AMD's low-hanging fruit in addressing the TDP gap was something people were taking seriously.

That's another frustrating thing. Review sites slanted all their opinions on reference R9 290/290X cards by focusing on poor noise levels and temperatures while ignoring after-market cards. Conversely, they hyped up the GTX970/980's perf/watt using reference cards, but hardly provided negative feedback on NV's marketing practices when it comes to the power usage of after-market offerings. Double standards FTW!

TDP is such a marketing BS nowadays, AMD should just assign TDP ratings 30-50W lower like NV does.

[Attached chart: Power_02.png]


RS, I think the R9 380 will be around GTX980 level; HP didn't put it in their next-gen computers without a purpose.

I don't expect 380 to be as fast as a 980. 380X.

Also, I checked how well the GTX780 performs in comparison to the R9 290. It's the same performance, at least at somewhere around Full HD and 1440p.

780 is slower than an R9 290 and 290 OC beats 780 OC too. This is where marketing needs to come in -- NV style. AMD needs to launch R9 380 with 7-10% faster factory pre-overclocked AIB versions. Send only those for reviews. No reference cards.

This is exactly what NV does a lot of the time. They send only after-market cards for reviews with superior coolers that result in lower temps, noise levels and because these cards are factory pre-overclocked, they boost higher in games than a reference card.

This way they can get 7-10% extra 'free' performance by playing the same game.
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I only wrote 4 small paragraphs to you and the last one is a summary of where I would like the prices to be. The paragraphs above it explain why I think the prices should be where they are. If you don't want to read that info, you could have skipped to the pricing I provided instead. Is English not your first language? It takes me less than 3 min to type what I just typed which means it's < 3 min to read!

Considering your long-winded replies and micro-quoted-to-hell-and-back posts have been called out in every single thread I've seen you participate in, by different members, it'll be difficult to paint him as in the wrong on this one.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
After I heard that Skylake tops out at 4 cores, I decided to move the 4K build up to now instead of the end of the year. The build has the budget for two GPUs.

The 980 Ti, if the past is any indication, will offer Titan X performance at a $250 discount. I've been eyeing Titan X SLI performance; I wouldn't pay $2k, but I could part with ~$1500 for that performance. AMD, for the sake of both of us, please convince me to buy two 390Xs and a FreeSync monitor.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,697
2,622
136
So which models, if any, has AMD *officially* stated will have HBM? Have they officially even released the model numbers?

No speculation, no "personal contacts", no extrapolations from assumptions, but confirmed statements from AMD.

AMD titled one of their talks at the Hot Chips conference "Fiji, the World's First Graphics Processor with 2.5D High Bandwidth Memory". They then retitled the talk to "AMD's Next Generation GPU and Memory Architecture".
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Considering your long-winded replies and micro-quoted-to-hell-and-back posts have been called out in every single thread I've seen you participate in, by different members, it'll be difficult to paint him as in the wrong on this one.

Reading isn't particularly hard, and thorough responses are good because they add something more than continual back and forth over the same issues. It'd be a pretty significant loss if his posts were trimmed back to suit people afraid of a few words.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Considering your long-winded replies and micro-quoted-to-hell-and-back posts have been called out in every single thread I've seen you participate in, by different members, it'll be difficult to paint him as in the wrong on this one.

1. When I am multi-quoting different posts in the thread, sometimes I have to edit my post. I see this as a better solution than posting 3-4 consecutive posts. I think multi-quoting is by far the preferred method on the forum per the guidelines.

2. OK, I will try to be more concise with my replies. I try to break down individual points and address them one by one, which is why you see me break a particular post into various quotes. I don't see how my replies to him were so difficult to read, or how the main message was hard to comprehend.

3. If you notice, on various occasions many members do not read longer posts — not just mine, but from other members too. A lot of these very posts have key info. The result is we have posters repeating things that were covered 10-15 pages ago, such as the R9 390X being limited to only 4GB of VRAM vs. 8GB. Had they read the posts, this question wouldn't be repeated 5-10X over. Similarly, had people actually read the data and threads on how to decipher TDP, they wouldn't be using TDP to measure power usage. Yet, 45 pages in, after it was proven that TDP != power usage and that we can't compare AMD's TDP vs. NV's, TDP is again used to gauge power usage. That means some posters either don't bother reading other members' replies, or they purposely ignore things that contradict what they want to post.

AMD titled one of their talks at the Hot Chips conference "Fiji, the World's First Graphics Processor with 2.5D High Bandwidth Memory". They then retitled the talk to "AMD's Next Generation GPU and Memory Architecture".

I saw that. Sounds like a 100% confirmation that Fiji has 2.5D HBM.

After I heard that Skylake tops out at 4 cores, I decided to move up the 4K build to now instead of end of the year.

Congrats! Certain popular games really benefit from higher resolution. I read that AMD delayed their FreeSync CF drivers indefinitely. Also, I haven't followed much on good-quality 4K G-Sync monitors. What are the options today?
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Reading isn't particularly hard, and thorough responses are good because they add something more than continual back and forth over the same issues. It'd be a pretty significant loss if his posts were trimmed back to suit people afraid of a few words.

Except it just adds more back and forth. When trying to hit on multiple points in a single post, the person you're responding to isn't reading the entire post, and someone else will respond to one of the non-pertinent points, and the thread goes off on a needless tangent. Reading being easy isn't the point. The people he's responding to aren't reading his posts, which makes it pretty impossible to get your point across. Less is more in this case.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Except it just adds more back and forth. When trying to hit on multiple points in a single post, the person you're responding to isn't reading the entire post and someone else will respond to one of the non-pertinent points and the thread goes off on a needless tangent.

I read almost everyone's replies, no matter how long they are, if I am interested in the topic. I am not sure how to respond to your comment. What is one supposed to do then, ignore 4-5 points in someone's post and respond to just 1 of them? I read his posts but he doesn't read mine. It wasn't just me who pointed out that you have to be very careful when you try to gauge perf/watt from TDP ratings. Tonga XT with 2048 SPs has a 125W TDP for Apple, which sounds amazing on paper against a 250W HD7970, right? But what exactly does that do for a 7970/680/770 gamer? Does it move the market forward in terms of total performance at the $200 range? Does it drastically change price/performance against current NV/AMD cards?

I guess some people really care about power usage and 370X with 140W power usage sounds exciting to them. The point here is if AMD solely focused on power usage, it isn't exciting for a lot of gamers because all NV needs to do is just lower prices and R9 300 series is irrelevant. As an example, if $349 R9 380X uses 180W of power and is as fast as a reference 290X, how is that exciting? A PC gamer could have had that Sept 2014 with a $330 970. This is the dilemma AMD faces today -- focusing just on perf/watt is not enough.

AMD needs more performance with R9 300 series and right now we still have no indication at all where 380/380X/390/390X sit based on that 370X data point.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
That would indicate 2 stacks of 512MB.

2 stacks of memory Hynix isn't selling (yet, at least).

I have a feeling someone is faking those results. A 384SP part with expensive memory doesn't make sense either.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
2 stacks of memory Hynix isn't selling (yet, at least).

I have a feeling someone is faking those results. A 384SP part with expensive memory doesn't make sense either.

It's only your assumption that it's expensive. It looks like it's not at all.
 

jpiniero

Lifer
Oct 1, 2010
17,211
7,585
136
It's only your assumption that it's expensive. It looks like it's not at all.

Maybe the cost difference isn't so bad to make it worthwhile for AMD to do considering the power savings.

As an example, if $349 R9 380X uses 180W of power and is as fast as a reference 290X, how is that exciting? A PC gamer could have had that Sept 2014 with a $330 970. This is the dilemma AMD faces today -- focusing just on perf/watt is not enough.

That's the reality of things today though. The days of performance at any cost are over. Plus, $/transistor is roughly the same at 20 nm. Maybe they can justify the design cost because of competitive reasons but actually increasing the core count at each level would just make it too expensive.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It's only your assumption that it's expensive. It looks like it's not at all.

If it weren't for the rebrands in the 15.3B driver, and for the fact that OEMs are now selling rebranded 300 series in mobile, it would have been much more likely. And even better if the memory used matched something Hynix officially sells.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
TDP is such a marketing BS nowadays, AMD should just assign TDP ratings 30-50W lower like NV does.

AMD never assigned any official TDP rating to the Hawaii cards. They told Anandtech that the "average gaming power scenario" is 250W, but the reviewer thought that the real TDP was "closer to 300W". TechPowerUp's tests indicate that AMD is right about average power consumption, but the actual TDP for the reference card in Uber mode is 315W.

Enthusiasts may not care (though the market indicates that many of them do), but OEMs have to consider the fact that they need to spec out bigger (and more expensive) PSUs for GCN cards as compared to Maxwell. And that's without even getting into laptops - Maxwell's power efficiency absolutely slaughters AMD when it comes to discrete GPUs in laptops.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I have a feeling someone is faking those results. A 384SP part with expensive memory doesn't make sense either.

Everyone assumes that HBM is really expensive, but I haven't seen any actual evidence of that. The truth is that everything people are saying about pricing, yields, and configurations for HBM is almost pure speculation.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That's the reality of things today though. The days of performance at any cost are over.

What you are saying would only be true if flagship cards went from using 250-275W of power to just 150-200W of power. That's not what's happening. What you described is exactly what NV wants us to believe. They created a new marketing strategy whereby a videocard's worth is tied directly to its perf/watt, rather than its marginal utility (price/perf). This is a brilliant marketing strategy because once the consumer believes this marketing BS, NV can double prices but still sell you a GTX560Ti (980) and a GTX580 (Titan X) under the veil of perf/watt. Think about it: in the past, did newer GPUs improve perf/watt over older ones? Absolutely, but AMD/NV hardly used it as a major selling point to get you to upgrade. Today, perf/watt is marketed as THE most important factor for upgrading. It's not as if the Titan X or a 980 uses significantly less power than a GTX580 or 560Ti. They don't. Then why all of a sudden are they attractive enough to warrant double the price of their historical lineage predecessors?

If you ask a new PC gamer, they wouldn't know that GTX560Ti was from the 680-980 lineage and cost $249 while GTX280/480/580 were from the Titan lineage and cost $499. The perf/watt marketing worked but NV still continues to sell 250W flagships, just today they ask double the price. I think AMD will still give us a flagship 250W card that is 40-50% faster than the R9 290X. AMD has remained the price/performance king since HD4850/4870 series and I don't see R9 300 changing that trend. :p Since R9 290X sells for $280-300 today, even if AMD releases a flagship at $550-650, it should be enough to cover the 500mm2+ die size and HBM and make more $ than they are currently making on those 290 cards.

Plus, $/transistor is roughly the same at 20 nm. Maybe they can justify the design cost because of competitive reasons but actually increasing the core count at each level would just make it too expensive.

While I hate to go off-topic on financials, in this case it's on topic. Increased manufacturing costs of larger-die GPUs on lower nodes in no way offset the major price increases certain GPUs have experienced in the last 5 years. NV's FY2010 gross margins were 35.4%, and they increased every single year to 55.5% as of FY2015. Therefore, the theory that GPU makers can't afford to manufacture GPUs with more transistors at similar die sizes as in the past doesn't fly. I truly believe perf/watt is used as a marketing tactic to justify the price increases today because performance increases have slowed down (it took 3 years for the 780Ti to double the 580). That means GPU makers cannot sell us on performance as easily anymore (just look at the 960 vs. 760 = total disaster).

How do you market something a consumer doesn't really need? You need to devise a strategy that makes his/her existing product seem vastly inferior in some way, so that he/she thinks it's outdated tech. Today, the easiest way to do this is perf/watt marketing. Even Intel is doing it. Intel will probably try to launch 35W Skylake CPUs that are nearly as fast as the i5 4690K/i7 4770K. The focus on perf/watt suddenly makes your perfectly fine Haswell CPU look outdated. I think the focus on perf/watt today is because it's the easiest metric to market and the easiest one to hype up, since it shows the greatest improvement from one gen to another among all the metrics consumers actually understand. All of a sudden you can market a 35W CPU that's slightly faster than a 65W one as twice as good! :cool:
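The arithmetic behind that last claim is simple. A sketch with purely hypothetical numbers — a 35W CPU assumed to be 5% faster than a 65W baseline:

```python
# Perf/watt ratio of a hypothetical 35 W CPU that's 5% faster than a 65 W one.
# Both performance figures are assumptions for illustration.
old_perf, old_watts = 1.00, 65   # baseline (e.g. an existing Haswell chip)
new_perf, new_watts = 1.05, 35   # hypothetical low-power successor

ratio = (new_perf / new_watts) / (old_perf / old_watts)
print(f"perf/watt improvement: {ratio:.2f}x")   # 1.95x -- i.e. "twice as good"
```

Nearly a 2x perf/watt jump on paper, from a chip that is only 5% faster in practice — which is the whole marketing trick being described.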
 
Last edited:

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The problem is it's already been proven many times that not only can you not compare NV's TDP within NV's own product stack, you definitely cannot compare NV's TDP to AMD's GPU TDP. 370X with 140W TDP vs. 980 with 165W TDP tells me absolutely very little about the real world power usage of MSI Gaming/Gigabyte Windforce 370X vs. MSI Gaming/Gigabyte Windforce 980 as an example.

At TPU, reference 980 peaks at 184W and Gigabyte Windforce 980 at 204W. For those of us who buy mostly after-market cards, those reference TDP ratings are marketing BS. Also, as other posters mentioned, AMD often overstates their TDP but NV has a tendency to understate it. Until we can compare real world gaming power usage, it's hard to reach accurate conclusions here.

I always go by TPU's maximum power consumption figures, not whatever the manufacturer claims the TDP is. Here's what they came up with for various GTX 980 cards:

  • Reference: 190W
  • MSI Gaming: 207W
  • Asus Strix: 199W
  • Gigabyte G1 Gaming: 342W (!!!)
  • Asus Matrix: 193W
As you can see, all of these cluster roughly in the same range except the Gigabyte, which is a gross outlier - apparently this AIB disabled the chip's power-limit functionality altogether, resulting in power consumption regressing to the GTX 480 days. I would describe the GTX 980's real TDP as 190W for the stock card, with factory OC'd versions naturally having higher figures. I had almost forgotten that Nvidia was claiming 165W - that's definitely too low.
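Putting those TPU figures against the claimed 165W makes the gap explicit (measured numbers as quoted above; the percentage math is just illustration):

```python
# How far each measured GTX 980 max-power figure (TPU, quoted above)
# sits above Nvidia's claimed 165 W TDP.
claimed_tdp = 165  # Nvidia's official figure, in watts
measured = {
    "Reference":          190,
    "MSI Gaming":         207,
    "Asus Strix":         199,
    "Gigabyte G1 Gaming": 342,
    "Asus Matrix":        193,
}
for card, watts in measured.items():
    over = 100 * (watts - claimed_tdp) / claimed_tdp
    print(f"{card}: {watts} W ({over:+.0f}% vs claimed TDP)")
```

Even the reference card comes in roughly 15% above the official number, and the Gigabyte more than doubles it.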


Then I would be shocked if, on the same 28nm without HBM, Hawaii can be re-spun to increase clocks 15% to match the 980 but drop real-world power usage to 190W. I don't see how that's possible, as that's a perf/watt revolution similar to what a full node jump or a brand new architecture would accomplish. Without a new architecture, or some new revolutionary PowerTune algorithm and the card completely shutting off all DP transistors, I don't see how this is even possible on the same node.

If you're talking about a simple respin, I agree. On the other hand, if you're talking about a new chip with the same GCN core count as Hawaii, then you get a lot of efficiencies by going to GCN 1.2, cutting the memory bus to 384-bit, cutting out Double Precision performance, and so forth. Adding whatever efficiencies GloFo 28nm SHP may be able to offer, 190W doesn't seem unreasonable.
 
Feb 19, 2009
10,457
10
76
I don't expect 380 to be as fast as a 980. 380X.

These numbers have been around for a long time. AIBs have also said they are only waiting on volume.

One of the earliest leaks (Captain Jack?) has it 20% above the 980 with similar power usage. The cut-down version would be ~= 980.

AMD's presentation claims the 390X is at least 65% faster than the R9 290X. Thus, the performance gap from 390X to 380X is in line with expectations.

It will be interesting at the reveal to see what is causing their delays: HBM yields, GloFo yields just being bad, or simply a case of "drivers not ready for new tech".
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I honestly don't think many people at all buy based on performance/watt. Whenever it's mentioned in forums, it's typically in conjunction with other metrics like noise and heat, as well as performance. The GTX 980, for example, doesn't cost what it costs simply because its performance/watt is greater than that of a 290X. It, in fact, beats it in every metric I can think of: it's more efficient, it's faster, it's cooler, it's quieter. That's before we even get into the developer/vendor penetration, which Nvidia has been much more successful at than AMD, and all this is before we even get into marketing, which, again, Nvidia has been more successful at than AMD.

Point being, there are a lot of reasons Nvidia has been more successful at moving product than AMD. Performance/watt is but one, and IMO a rather small one compared to some of the others.