R9 380x rumor and speculation thread

Page 13 - AnandTech Forums

tviceman

Diamond Member
Mar 25, 2008
Without question the R9 390/970 are worth the extra $50 right now, though I hardly saw the same argument made on these forums for the R9 290 vs. 960. Hmmm. Anyway, I'm not sure why die sizes have anything to do with what the consumer pays, though.

It doesn't but the discussion isn't limited to just consumer spending, is it?
 

3DVagabond

Lifer
Aug 10, 2009
It does smack the 960 around in TPU's review, being 28% faster at 1080p, and it overclocks well too since the vanilla clocks are lower than the 280X's. TPU got 1136MHz on the Strix card, which is 17% better than the stock 970MHz; a couple of other reviews I saw hit 1150MHz. Volt modding will get it up to 1.2-1.25GHz.

It does get close to the 290 in some games in Guru3D's review, even matching the 970 in Thief, but it also falls behind the 280X in some.

The lower power usage seems to be Asus doing a better job with it once again, as with their Fury version.

Asus is likely the single biggest contributor (ads and perks) to these sites. You never see a review where the Asus product doesn't shine.

So did the 380. Unless they lower the price of this to $200 to compete directly with the 960 and lower the 4GB 380 to $150 to take on the 950, this is a worthless card. Even then, it's still worse than the 280X and isn't really justifiable over the cheaper 380, so it's a waste of R&D.

Unless you want the added features and ~20% better efficiency.
 

poofyhairguy

Lifer
Nov 20, 2005
GCN 1.2 Tonga/Fiji does have way better encoding/decoding than previous AMD chips, though.

http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/4

From that link:

What you won’t find though – and we’re surprised it’s not here – is support for H.265 decoding in any form... H.265 is still in its infancy, but given the increasingly long shelf lives of video cards, it’s a reasonable bet that Tonga cards will still be in significant use after H.265 takes off.

As in today.
 

LTC8K6

Lifer
Mar 10, 2004
But Tonga and Fiji are still way better at what most people do with images and video today.

And:

But to give AMD some benefit of the doubt, since a hybrid mode is partially software anyhow, there’s admittedly nothing stopping them from implementing it in a future driver (NVIDIA having done just this for H.265 on Kepler).
 

poofyhairguy

Lifer
Nov 20, 2005
But Tonga and Fiji are still way better at what most people do with images and video today.

"Today" people are buying 4K TVs left and right. In fact I expect them to be the hot item on Black Friday. Yet even AMD's newest GPU can't connect properly to those TVs via HDMI 2.0 or play an HEVC 4K file. That is a big deal for someone spending money on something they plan to use for a few years.


Ryan was being nice and gave AMD the benefit of the doubt that they would eventually roll out a hybrid HEVC decoder. They never did, as far as I know.
 

alexruiz

Platinum Member
Sep 21, 2001
"Today" people are buying 4K TVs left and right. In fact I expect them to be the hot item on Black Friday. Yet even AMD's newest GPU can't connect properly to those TVs via HDMI 2.0 or play an HEVC 4K file...

So you are telling us that all those GTX 950s are better for 4K?
Like playing a game at 4K at 60fps?
That is what I thought... /sarcasm

HDMI 1.4 can drive a 4K display at 30Hz, plenty for movie/video watching.
Blu-ray content at 1080p is "only" 23.976 fps, and plays fine.
HEVC content? The same one that a lot of people called a useless feature in Carrizo, but that becomes important when the other guys have it?

HDMI 2.0 is marketing. Short of GTX 980 Tis in SLI / R9 Fury Xs in CrossFire, there are no GPUs with enough firepower to drive a full 4K display at over 30Hz.
Was it an omission on Fiji? Absolutely!
But below Fiji / Grenada GPUs, it is marketing only.
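The HDMI 1.4 vs. 2.0 claim above can be sanity-checked with the standard 4K video timing (4400x2250 total pixels per frame including blanking, per the CTA-861 4K formats) against the TMDS clock limits of each HDMI version. A rough back-of-the-envelope sketch, not part of the original post:

```python
# Rough sanity check of the HDMI 1.4 vs 2.0 argument above.
# Assumes the standard 4K timing of 4400x2250 total pixels
# (3840x2160 active plus blanking) and 8-bit RGB.
TOTAL_H, TOTAL_V = 4400, 2250          # pixels per frame, incl. blanking
HDMI_1_4_MAX_HZ = 340_000_000          # max TMDS clock, HDMI 1.4
HDMI_2_0_MAX_HZ = 600_000_000          # max TMDS clock, HDMI 2.0

def pixel_clock(fps: float) -> float:
    """Pixel clock required to drive 4K at the given refresh rate."""
    return TOTAL_H * TOTAL_V * fps

print(pixel_clock(30) / 1e6)   # 297.0 MHz -> fits within HDMI 1.4
print(pixel_clock(60) / 1e6)   # 594.0 MHz -> needs HDMI 2.0
```

So 4K at 30Hz (297 MHz) squeaks under the HDMI 1.4 limit, while 4K at 60Hz (594 MHz) requires HDMI 2.0, which is exactly the limitation being debated here.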
 

boozzer

Golden Member
Jan 12, 2012
To be honest, why are you all talking about this?

Before we can have any decent conversation on this, don't we need the numbers of actual PC gamers who want to play games in 4K on a 4K TV instead of a monitor?

Without them, both sides of the argument are just blowing smoke or just stating their own singular personal opinion, which matters jack @#$% in this kind of discussion.
 

LTC8K6

Lifer
Mar 10, 2004
I keep hearing that there's no reason to buy a 380/380X with the 280/280X so cheap.

Well, here is a reason to have a GCN1.2 card.

With this newest generation of UVD, AMD is finally catching up to NVIDIA and Intel in H.264 decode capabilities. New to UVD is full support for 4K H.264 video, up to level 5.2 (4Kp60). AMD had previously intended to support 4K up to level 5.1 (4Kp30) on the previous version of UVD, but that never panned out and AMD ultimately disabled that feature. So as of GCN 1.2 hardware decoding of 4K is finally up and working, meaning AMD GPU equipped systems will no longer have to fall back to relatively expensive software decoding for 4K H.264 video.

On a performance basis this newest iteration of UVD is around 3x faster than the previous version. Using DXVA checker we benchmarked it as playing back a 1080p video at 331fps, or roughly 27x real-time. For 1080p decode it has enough processing power to decode multiple streams and then-some, but this kind of performance is necessary for the much higher requirements of 4K decoding.

Speaking of which, we can confirm that 4K decoding is working like a charm. While Media Player Classic Home Cinema’s built-in decoder doesn’t know what to do for 4K on the new UVD, Windows’ built-in codec has no such trouble. Playing back a 4K video using that decoder hit 152fps, more than enough to play back a 4Kp60 video or two. For the moment this also gives AMD a leg-up over NVIDIA; while Kepler products can handle 4Kp30, their video decoders are too slow to sustain 4Kp60, which is something only Maxwell cards such as 750 Ti can currently do. So at least for the moment with R9 285’s competition being composed of Kepler cards, it’s the only enthusiast tier card capable of sustaining 4Kp60 decoding.
 

Mercennarius

Senior member
Oct 28, 2015
HDMI 2.0 is marketing. Short of GTX 980 Tis in SLI / R9 Fury Xs in CrossFire, there are no GPUs with enough firepower to drive a full 4K display at over 30Hz.

If you mean maxing everything out on the newest games at 60+FPS, then yes, you are correct. But my 290X runs 4K@60Hz perfectly, maxing out older games (Skyrim and similar). In Ashes of the Singularity my 290X at stock clocks gets about 20-30FPS with everything maxed at 4K@60Hz.
 

poofyhairguy

Lifer
Nov 20, 2005
So you are telling us that all those GTX 950s are better for 4K?
Like playing a game at 4K at 60fps?

Not everyone buys GPUs to play games. In fact most of the ones I have bought in my life were purely for HTPCs.

Plus a lot of people play older/less intensive games that would run at 4K on a 950 or 960.

HDMI 1.4 can drive a 4K display at 30Hz, plenty for movie/video watching.
Blu-ray content at 1080p is "only" 23.976 fps, and plays fine.

Video content may play fine, but you get to that video content via a remote-driven GUI in 2015. A GUI at 30fps feels very sluggish; I know from personal experience because I have a 4K screen in my kitchen with that limitation.

HEVC content? The same one that a lot of people called a useless feature in Carrizo, but that becomes important when the other guys have it?

I don't know what anyone else says. I think it is important because we are less than a year away from 4K Blu-rays that will be full of such content. Already we are seeing HEVC used for scene encodes all over. HEVC decoding is a big deal, or it will be within the life of this card.
 

.vodka

Golden Member
Dec 5, 2014
280 and 280X also appear to lack full freesync support?

GCN 1.0 parts don't support FreeSync; that's only 1.1 and up. So 79xx/R9 280/280X don't support it.


So from a feature set point of view, 380/380x are excellent replacements for 280/280x.
 

RussianSensation

Elite Member
Sep 5, 2003
It doesn't but the discussion isn't limited to just consumer spending, is it?

So the point you are trying to make is that some XYZ NV card with ABC die size is performing well for its ABC die size and DEF power usage. Ok, that's nice but unless we are shareholders or electrical engineers, so what? Now tell me how that NV card stacks up in price/performance and overall performance against a $230 R9 290.

An after-market R9 290 such as the one I linked is roughly on par with a reference R9 290X:

77% faster than a GTX960 at 1080P
84% faster than a GTX960 at 1440P
https://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/23.html

Even if GTX960 had a 9mm2 die size and used 1W of power, at $170 is it a good buy for gaming against a $230 after-market R9 290? Absolutely not.
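The point can be put in perf-per-dollar terms using only the figures quoted above (the 77%-faster-at-1080p number and the $170 / $230 street prices). A back-of-the-envelope sketch, not a formal benchmark:

```python
# Perf-per-dollar from the figures quoted above:
# GTX 960 = baseline (1.0) at $170, after-market R9 290 = 1.77x at $230.
cards = {
    "GTX 960": {"perf": 1.00, "price": 170},
    "R9 290":  {"perf": 1.77, "price": 230},
}

for name, c in cards.items():
    # normalize to performance per $1000 to get readable numbers
    print(f"{name}: {c['perf'] / c['price'] * 1000:.2f} perf per $1000")

ratio = (1.77 / 230) / (1.00 / 170)
print(f"R9 290 delivers ~{(ratio - 1) * 100:.0f}% more performance per dollar")
```

Even at the higher sticker price, the after-market R9 290 comes out roughly 31% ahead on 1080p performance per dollar, which is the whole argument in one line.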

GCN 1.0 parts don't support FreeSync; that's only 1.1 and up. So 79xx/R9 280/280X don't support it.


So from a feature set point of view, 380/380x are excellent replacements for 280/280x.

True but in all honesty, I think FreeSync on R9 380/380X is a pointless marketing gimmick. Think about it, most gamers don't have GSync or FreeSync monitors. If you are going to go out and buy a FreeSync monitor today, chances are you will want an overall better monitor than what you currently have. To achieve that you'll want to get something decent, not some $199 mediocre TN 1080P panel. By this point I wouldn't be surprised that anyone who is upgrading from some older 19-23" 1080P 60Hz monitor is going 1080P 144Hz or 1440P 60Hz or 1440P 144Hz, etc. What are the chances someone is going to upgrade to such a FreeSync monitor and not have another $50 to step up to an R9 290/390?

If you are building a rig from scratch, then the R9 380X makes even less sense. Let's say it costs you $800-1000 to build a new rig with a new monitor. Which would you take: an $800 rig with an R9 380X or an $860 rig with an R9 390/970? The higher the price of your rig, the less an extra $50 for a much faster GPU matters.

I keep hearing that there's no reason to buy a 380/380X with the 280/280X so cheap.

Well, here is a reason to have a GCN1.2 card.

I don't think anyone said there is no reason at all. The most obvious reason is that you simply cannot buy the 280/280X at reasonable prices in most countries today, which means the R9 380/380X are the default choice on the AMD side.

Also, if someone can find an R9 380 for a good price ($115-125), that's a legitimate enough reason since some gamers cannot spend $200 on a 280X.

But, let's look at the perspective a bit here. R9 280X launched for $299 more than 2 years ago. Those models also often came with free games and XFX ones with lifetime warranty. Further, I remember distinctly that HD7970 1Ghz+ models like MSI TwinFrozr dropped as low as $270 on the release of R9 280X.

So let's see now - what's better for a PC gamer: to wait 2+ years just to get to R9 280X level of performance at $229-239, OR to have purchased an HD7970GHz for $270-280 two years ago? The opportunity cost of waiting 2 years to end up basically in the same spot as an HD7970 1GHz, with some trivial HTPC features added, is too high.

You can bring all kinds of HTPC features into play but that's like putting lipstick on a pig because someone who paid $280-300 for an HD7970Ghz/R9 280X has enjoyed gaming on it for 2+ years already. What makes it even worse is that there have been sooooo many deals on after-market R9 290 cards for $250-260, it's mind-boggling. In that context, a $229 R9 380X isn't anything special.

Another major point is that anyone who was opportunistic and purchased an HD78xx/79xx card going back to early 2012 could have been mining bitcoins. That means almost no matter how you slice it, there were plenty of opportunities to better time the purchase of an HD7970/7970GHz/R9 280X/R9 290. If someone was an HTPC-focused user, why wouldn't they have just bought a GTX960 almost a year ago? Especially since many of us knew that the R9 380X wouldn't have HDMI 2.0 or an evolved UVD.

It's easy to look at the R9 380X and find 1-2 points that make it better than some of those AMD cards I mentioned earlier, but guess what, it's been almost 4 years since the HD7970 came out. Compare a GPU from January 2008 to the HD7970 of January 2012 (the same 4 years) and look at the dramatic difference in perf/watt, features, and VRAM. How does the R9 380X stack up compared to that? Meh. At this point the only reason the R9 380X has a tiny amount of appeal isn't that it's a good graphics card for 2015, but simply that NV's 960 2GB-4GB are often even worse for gaming at the price. If we simply consider the time, the context, and how the R9 380X stacks up against the 2-year-old R9 280X, the card is nothing but a disappointment.

I am not surprised because I truly believe that the new market sweet-spot for mid-range cards is the $290-350 segment moving forward. Prices have increased over time and our expectations are changing. Today the best 'value' cards are R9 290/290X/390/970, nothing below that segment, unless one is a casual or finds a super deal.

It's still possible to purchase an R9 290 for $220 and $235 and those cards come with a free racing game. In that context, the R9 380X still makes no sense. Drop PowerTune to -20%, do a TIM swap, undervolt, and the reference R9 290 will still beat any R9 380X without much effort.

Also, let's not ignore that the R9 290 still has 64 ROPs and massive amounts of memory bandwidth. How is the R9 380X going to stack up against the R9 290 in DX12 games? Probably not very well.

[benchmark chart: 1080pi7.png]


Early benchmarks for R9 290 series in Star Citizen highlight how this card is in a league of its own for price/perf.

[benchmark chart: Intel-Arc-1080P.jpg]


Oh, and I actually missed that there is an after-market XFX IceQ Turbo R9 290 for $230 on Newegg. This card peaks in the low 70s °C. RIP the $230 R9 380X as long as that card is for sale.
 

.vodka

Golden Member
Dec 5, 2014
True but in all honesty, I think FreeSync on R9 380/380X is a pointless marketing gimmick. Think about it, most gamers don't have GSync or FreeSync monitors. If you are going to go out and buy a FreeSync monitor today, chances are you will want an overall better monitor than what you currently have. To achieve that you'll want to get something decent, not some $199 mediocre TN 1080P panel. By this point I wouldn't be surprised that anyone who is upgrading from some older 19-23" 1080P 60Hz monitor is going 1080P 144Hz or 1440P 60Hz or 1440P 144Hz, etc. What are the chances someone is going to upgrade to such a FreeSync monitor and not have another $50 to step up to an R9 290/390?

If you are building a rig from scratch, then the R9 380X makes even less sense. Let's say it costs you $800-1000 to build a new rig with a new monitor. Which would you take: an $800 rig with an R9 380X or an $860 rig with an R9 390/970? The higher the price of your rig, the less an extra $50 for a much faster GPU matters.


I agree.


Although there is the rest of the world, where GPU prices are more or less MSRP and there's a sizeable price difference between, say, a 380X and a 390/390X, and that price difference could make or break such a build.

In those cases, it's good to have the 300 series refreshes (apart from the Pitcairn-based 370 cards), which are mostly GCN 1.1 and 1.2. FreeSync is especially useful here, where you don't have as much GPU horsepower as in the 390 and are more prone to drop to 30-40FPS; that's where the technology shines, and Tahiti, although more powerful, can't hide its FPS slowdowns with FreeSync. Going forward, having FreeSync support in their products, top to bottom, is a must if they want it to succeed vs G-Sync.

The 380X slaps the 960 around most of the time; its redeeming quality is just that. The 960 should stop being recommended altogether if one only games and doesn't require the 960's other capabilities.


But yeah, by all means, if the budget allows stepping up to a 390, that's what should be done; Tonga is no match for Hawaii in terms of raw horsepower at 1080p and up.

On the other hand, if one comes across a cheap 280x and only needs GPU horsepower and doesn't care for Tonga's improved features, that's what should be bought. At this rate Tahiti has another year of useful life, I think, before actually becoming obsolete. Insane longevity.

And there's the 8800GT of our time, the 290. That's another must buy this late into the 28nm era if available, if one wants a cheap upgrade that is more than enough for 1080p gaming at good quality levels.
 

RussianSensation

Elite Member
Sep 5, 2003
The 960 should stop being recommended altogether if one only games and doesn't require the 960's other capabilities.

That's not going to happen, because it implies most PC gamers are knowledgeable, read many reviews, and aren't brand biased. I guarantee that if the R9 380X were priced at $149, even with its 30% higher performance than a 960, it would never outsell the 950/960. Some people only buy NV, no matter the price/performance. To them it's: I have a budget of $XXX, what's the fastest NV card I can afford? AMD cards do not exist to them. That's probably one of the major reasons Lisa Su raised prices so much this generation - she realized this very point. It's like the Toyota Corolla/Camry of the world. Do you think if Audi, BMW, Mercedes, or Cadillac released a car in that class it would outsell the Corolla or Camry? :D

And there's the 8800GT of our time, the 290. That's another must buy this late into the 28nm era if available, if one wants a cheap upgrade that is more than enough for 1080p gaming at good quality levels.

True, but some gamers only buy NV. Look how many PC gamers keep asking for a 960Ti, still. They would literally rather keep riding their old GTX460/560/470/560Ti/560Ti 448 for years and not upgrade to an after-market 290 - just keep waiting for this mythical 960Ti. Imagine if the R9 290 were an NV card - it would have matched the 8800GT in sales.
 