[VC]AMD Radeon R9 390X WCE Speculation Thread

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Feb 19, 2009
10,457
10
76
But a lot of people still bought AMD GCN products even though they were inferior: more power hungry, hotter, and slower than Nvidia products. That's clearly brand loyalty there.

Compare it during the Kepler generation. On the low-midrange, AMD was faster, only a bit more power hungry and custom models ran cooler, while being a lot cheaper. Makes perfect sense to buy it.

On the high end, R290/X vs 780/ti, it was very close on metrics such as performance, cooling, power use (225 vs 250W isn't a huge gap), with CF spanking SLI hard at high resolution. R290/X were also much cheaper. Again makes perfect sense for brand agnostic gamers to go with AMD. Mid-range, AMD had 3GB vram vs NV's 2GB. High end, it was 4GB vs 3GB vram. In fact, there were very few legit reasons (perhaps PhysX & "ecosystem" tax is fair for some) to go with NV during these times. So people who bought NV during these times were clearly NV loyalist.

It wasn't a blow out until Maxwell, in particular 970/980 that made AMD's lineup look obsolete due to major power use & performance gap.

But re Titan X, the reasons to go with NV & pay more are the same as ever, in the points I've outlined. These have always been factors that sway loyal gamers to pay more for less or equivalent, or pay a LOT more for slight performance gains.

Is it a game changer if AMD suddenly had superior performance?
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
AMD can't support feature level 12_1, Conservative Rasterization, or Rasterizer Ordered Views on any current GCN. Maxwell fully supports all of them.

https://developer.nvidia.com/content/dont-be-conservative-conservative-rasterization

https://developer.nvidia.com/content/maxwell-gm204-opengl-extensions

Actually it's Nvidia that doesn't have full support of the new level. The 12_1 denomination was created by MS because Nvidia wanted the higher-level sticker even though they are limited to tier 2; only AMD GPUs support up to tier 3.

http://www.hardware.fr/news/14135/gdc-d3d12-guerre-specs-vue.html
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
...Mid-range, AMD had 3GB vram vs NV's 2GB. High end, it was 4GB vs 3GB vram. In fact, there were very few legit reasons (perhaps PhysX & "ecosystem" tax is fair for some) to go with NV during these times. So people who bought NV during these times were clearly NV loyalist.

Generally speaking, if someone who already has an Nvidia card that is working well wants a new card, unless there is a dramatic price / perf difference I would rec they stick with Nvidia. Conversely, I'd rec that an AMD card user stick with AMD. This is because the replacement card is truly plug in and go, no driver issues.

Point being the performance metrics aren't the only thing buyers and sellers will be concerned with. Sellers don't want the support call, and buyers don't want the hassle.

As with any retail space, it is an uphill battle trying to win over someone else's customer. Doesn't matter if you're talking about cars, clothing stores, or video cards.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Yes, but memory speed should be in Hz, no? That is what I'm referring to in the leaked image.

Every Gbps means roughly 10^9 bits of data transferred per second, so you can convert between the Hz and the bits and it all works out.

Where GDDR5 transfers 4 bits per clock per pin at 1250 MHz, HBM transfers only a bit or two per clock, but over a 32-times-wider interface (1024 bits per stack vs. a 32-bit GDDR5 chip). The lower clocks are what make HBM so power efficient, and the interposer and die-stacking technology are what make it so expensive.
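As a rough sanity check of that arithmetic (the 1250 MHz / 512-bit GDDR5 figures match a 290X-class card; the 500 MHz DDR, four-stack HBM figures are assumptions based on first-gen HBM, not confirmed 390X specs):

```python
def bandwidth_gbs(clock_mhz, bits_per_clock_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return clock_mhz * 1e6 * bits_per_clock_per_pin * bus_width_bits / 8 / 1e9

# GDDR5 on a 290X-class card: 1250 MHz, 4 bits per clock per pin, 512-bit bus
gddr5 = bandwidth_gbs(1250, 4, 512)    # 320.0 GB/s

# First-gen HBM: 500 MHz, 2 bits per clock per pin (DDR), 1024 bits x 4 stacks
hbm = bandwidth_gbs(500, 2, 1024 * 4)  # 512.0 GB/s

print(gddr5, hbm)
```

So despite the far lower clock, the much wider interface is what gets HBM ahead.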
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
1. 12GB vram
2. CUDA
3. NV ecosystem & "better drivers"
4. GameWorks
5. GPU PhysX

There's a few; surely they will apply for many NV enthusiasts. Even if the 390X ends up faster, they won't switch. I saw a few posts on other forums; one guy even claims he will never ditch NV for AMD unless his NV GPU rapes his wife and burns down his house. That is what it would take for the switch. I had to lol.

Those 5 reasons are pretty weak, to be honest.
12GB GDDR5 vs. 8GB HBM. That's like taking an AMD CPU over an Intel one because it has more cores.
CUDA. How many gamers use CUDA? .001%?
NV ecosystem and drivers is marketing. We'll see how AMD does this time with marketing the 390.
Gameworks. For me that's a reason not to support nVidia. So that's a double edged sword. Gameworks just flies in the face of the direction the industry is moving in with API's being more open. Even msft had to capitulate. I imagine nVidia will as well.
GPU PhysX is about as dead as a technology can be.


You are right though, there are people for whom AMD only exists to reduce the price of the nVidia card they want. I'll be curious, though, if it's not the halo card, whether they'll still be satisfied with the premium they have to pay.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Generally speaking, if someone who already has an Nvidia card that is working well wants a new card, unless there is a dramatic price / perf difference I would rec they stick with Nvidia. Conversely, I'd rec that an AMD card user stick with AMD. This is because the replacement card is truly plug in and go, no driver issues.

Point being the performance metrics aren't the only thing buyers and sellers will be concerned with. Sellers don't want the support call, and buyers don't want the hassle.

As with any retail space, it is an uphill battle trying to win over someone else's customer. Doesn't matter if you're talking about cars, clothing stores, or video cards.

Very true!

In the end no product is perfect. As humans we tend to be far more tolerant of things we are fond of.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
With the recent performance chart from AMD I think Titan X and 390X will be extremely close in performance.

The 390X TDP is 300W and the Titan X TDP is 250W. With the price rumored at $700 for the R9 390 and "much higher" for the R9 390X due to HBM, perhaps both the Titan X and 390X will cost $999. I think AMD will have the edge this time with watercooling and better 4K gaming performance than the Titan X, because of HBM.

The only reason AMD would lose sales is if the cooling solution is noisy. But with Cooler Master having designed the cooling system, I can't see that happening.
Or a price above $999, say $1200. But this is AMD. I can't see that happening either.

As a Nvidia enthusiast, I can't find a single reason to pick the Titan X over the 390X if everything above turns out true. Not a single one :confused:

Really? I'd like to see AMD get back in the ring, but there's no way they will be able to charge $999 for a single-GPU card. Whether or not you or I agree, Nvidia is considered a more premium brand than AMD, so they can charge more for equivalent performance. It may be unfair, but it's the way things are. If AMD charges the same as Nvidia for similar performance, they won't get any sales. Especially if the AMD solution is hotter and more power-hungry. I hope this isn't the case, but if these leaks are legit, two 8-pin power connectors on a single-GPU reference board are not a good sign. If the R9 390X takes the performance crown by a non-negligible margin, AMD might be able to get $700 for it, if they're lucky. Otherwise, $599 at most.

Personally, I find CLC to be an ugly hack and wish it would go away. Air cooling with as few and as high-quality fans as possible is the way to go.
 

Kippa

Senior member
Dec 12, 2011
392
1
81
I am brand agnostic and would say that getting either an 390X or a Titan X both will be good for gaming. Obviously there will be some difference in price or performance between the two, but I don't think that it will be "hugely different" like 25%+ performance difference between the two. For most it will come down to different factors like the price, power consumption and raw processing power. I do photography and see the same fanboy wars between Nikon and Canon, never mind AMD and Nvidia. You will be in good hands if you get either card.

As for whether I will buy one of these cards, I like to wait for the official reviews, then wait a month or two and let people use them in the real world then figure if it is worth it. The fact is that no one really knows how these cards perform yet.
 

mutantmagnet

Member
Apr 6, 2009
41
1
71
As a Nvidia enthusiast, I can't find a single reason to pick the Titan X over the 390X if everything above turns out true. Not a single one :confused:

Well that depends.

Before the pricing rumors, ULMB was the only reason. Until someone like ToastyX makes a driver that enables AMD cards to work with the ULMB mode of the Acer display, I still require Nvidia.


I strongly suspect the 390X is going to be a poor overclocker. After the rumors that the 390X is going to be over $700 my attitude is that the Titan will highly likely be more cost effective even if the 390X is $800. Either way I'll just stick to 970s. I don't want to pay for Titan performance even if the alternative is roughly equal on a per dollar basis while at a lower price point.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
given the info from this closed/combined thread.
http://forums.anandtech.com/showthread.php?t=2424814


amd's rumored specs are impressive..
1. 8.6 TFLOPS.. (27% faster than titan x) (this spec will matter)
2. 768 GB/s bandwidth.. (128% faster than titan x) (this spec will be moot)
3. 8gb vram.. (-33% less than titan x)(this spec will be moot too)
4. dual 8 pin.. (gonna be hot and power hungry no matter what) (nvidia wins here)
5. $700 price point.. (this will be a much better overall value - hot or hungry or not)
6. multi-gpu pcie bridge / scaling.. (amd wins here)



for the "few" of us that such specs "do" actually matter.
need to know the TMU and ROP specs for "quad fire on 7680x1600". (the expected spec "should" be a non issue)
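those rumored percentages can be sanity-checked in a couple of lines (the titan x baseline figures of ~6.8 TFLOPS and 336.5 GB/s are assumptions for illustration, not confirmed specs):

```python
def pct_diff(ours, theirs):
    """Percent difference of `ours` relative to `theirs` (positive = bigger/faster)."""
    return (ours / theirs - 1) * 100

# rumored 390x figures vs assumed titan x figures
tflops = pct_diff(8.6, 6.8)        # ~ +26% compute
bandwidth = pct_diff(768, 336.5)   # ~ +128% memory bandwidth
vram = pct_diff(8, 12)             # ~ -33% vram capacity

print(round(tflops), round(bandwidth), round(vram))
```

the bandwidth and vram deltas line up with the list above; the compute delta depends on exactly which titan x clock you assume.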
 
Last edited:

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
Compare it during the Kepler generation. On the low-midrange, AMD was faster, only a bit more power hungry and custom models ran cooler, while being a lot cheaper. Makes perfect sense to buy it.

<snip>

Is it a game changer if AMD suddenly had superior performance?

:thumbsup::thumbsup:

for those of us that are non loyal and easily switch color.

nice summary.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
So lets play devils advocate here,

If the leaks are real, then we must also accept that this came from an event AMD hosted to rally up their AIB partners. So most likely no one has these chips outside of AMD. I also get hints they are still in the earlier development stages, with all this talk.

If these slides are real, then I am pretty sure that AMD themselves were behind these recent rumors and leaks that popped up (especially after the Titan X announcement). All this talk about AMD working on an 8GB Fiji and such appeared really, really recently. That is concerning to me. If AMD is behind all the recent leaks, then their only objective is to try to slow Titan X adoption. If they can hold people back from buying, they have accomplished something. For this reason, I expect price rumors to start popping up once the Titan X is for sale. There might be sites claiming the 390x is just so many weeks away and will be so much cheaper. Whether those rumors are true doesn't matter. The longer you hold off Titan X adoption, the better it is for AMD.

So lastly, since this is an event intended to rally up AiBs in the wake of the impressive Titan X......
You can bet the performance shown is the absolute best case scenario. And it is directly from AMD. We know that you can't trust best-case-scenario charts from the companies themselves. Now this is actually the part that concerns me the most. I mean, look at the games listed. Tomb Raider? That game is over 2 years old. What? And it is the lowest improvement of all the titles. BF4, when there is a newer one; why is that?
Then there is this issue with the 290x in the first place. AMD had an interesting launch plan with the 290x. It came with default mode and über mode. The range of performance is quite large when you consider the worst case throttling reference 290x and the best case aftermarket editions. When AMD is comparing the 390x to the 290x here, what are we talking about?
Since this is best case scenario, a reference 290x at stock settings would be ideal. This would be the way I would do it. Especially since we want to showcase how great the 390x WCE is. You would not want to compare it to the 290x über, it would only make the 390x less impressive.

I know there might be people who take offense to what I say and get really defensive, but please understand that I am not trying to take anything away from how awesome the 390x might be. I am just saying. There are people who don't understand why review sites still use 290x reference and über modes. They want to pretend, now that custom vendor options exist, that the reference 290x doesn't exist. But that is the card that AMD launched. When they are comparing the 390x to their former 290x card, it would be hard to imagine them not using a reference 290x. And there is a real difference between a heavily throttled reference 290x's out-of-the-box performance vs the awesome custom aftermarket cards you can get today.

Then you must also accept that best case gains are what anyone would use if they were trying to rally up partners in the wake of nvidia releasing yet another impressive card while they have nothing new for all these months.

It could be that these leaks are all fake. And when I first saw the 65% performance boost in the graph, I really got excited. I am not just saying that. I really did. But then I really started analyzing and thinking. Wait a minute........

I don't like pre-hype; I don't like building up great expectations before a card ever launches. These leaks look great on the surface, but I am not sure how we should actually interpret them. Everyone is drooling, but I want to say there is a real risk in having high expectations. If the card launches and is amazing, it will only be that much more impressive on launch day. But if you're already expecting amazing and then some, then if it reaches that bar it only met expectations. If it falls just shy of it, a really great improvement will be a letdown.
Just saying....
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Really? I'd like to see AMD get back in the ring, but there's no way they will be able to charge $999 for a single-GPU card. Whether or not you or I agree, Nvidia is considered a more premium brand than AMD, so they can charge more for equivalent performance. It may be unfair, but it's the way things are. If AMD charges the same as Nvidia for similar performance, they won't get any sales. Especially if the AMD solution is hotter and more power-hungry. I hope this isn't the case, but if these leaks are legit, two 8-pin power connectors on a single-GPU reference board are not a good sign. If the R9 390X takes the performance crown by a non-negligible margin, AMD might be able to get $700 for it, if they're lucky. Otherwise, $599 at most.

Personally, I find CLC to be an ugly hack and wish it would go away. Air cooling with as few and as high-quality fans as possible is the way to go.

It's newer tech. It supports the latest standards. It's worth a premium, you would think. $600 at most is stoopid considering it'll be many months before you can buy the same tech from nVidia. :D
 
Last edited:
Feb 19, 2009
10,457
10
76
If the R9 390X takes the performance crown by a non-negligible margin, AMD might be able to get $700 for it, if they're lucky. Otherwise, $599 at most.

So you think that with similar performance and about 35W more power use, AMD can only charge close to 1/2 the price of NV? Even if the Titan X is $1000 (it's rumored to be $999 to $1,300!), a $400 price gap is considerable for the "NV tax".

NV is doing something right when many people are willing to pay such a hefty tax.

Seriously even I would give them a small GameWorks tax because they got lots of nice titles I want to play.. but $400 is too much. lol
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So lets play devils advocate here,

<snip>

Wow! Where do you get this from? AMD's using heavily throttled cards for the comparison? Default and uber? Reference and uber? It was quiet and uber. AMD supplied the leaks? I suppose this could all make sense to you, but realize there is nothing leading you to believe this except your imagination.

We've only seen rumored performance from either of these cards, so either/both/neither could end up a disappointment. We should be excited about both of them and hope they both compete hard enough that neither manufacturer can take advantage.

The performance improvement we are seeing from both companies, considering there's been no node shrink, is quite impressive. If these archs had been released on time along with a shrink, we would likely have seen a greater than 2x performance increase. It's been a long while since we've seen that. I'm impressed with what we are seeing from both of these companies.

And before you think I'm getting defensive, I'm not. I'm just a bit shocked at how you view these two cards. :)
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
But a lot of people still bought AMD GCN products even though they were inferior: more power hungry, hotter, and slower than Nvidia products. That's clearly brand loyalty there.

What is your definition of inferior?

GCN in its Tahiti to Hawaii iterations turned out to be better, or at least aged much more gracefully than Kepler in the long run, and that was a trend that one could see a while back, it's nothing new.

I'd also like to see your position on GF100/110's truly ridiculous power consumption (and in the 480's case, only a 10% performance advantage for over 100W more vs the 5870) over AMD's HD5000 series, where the exact opposite happened in that regard, and which you nV guys didn't give a damn about. In fact AMD held the perf/W crown for quite a while during their VLIW5/4 days, and suddenly, when nV starts marketing that particular metric with Kepler, it becomes the holy grail. 780Ti vs 290x is a comparatively microscopic difference in power usage, yet the 290x was given the image of requiring a nuclear reactor to be able to use it, when it's quite the opposite. If anything the 290x's perf/W has improved quite nicely, and I'd say it has surpassed the 780Ti's considering the latest performance metrics. Impressive.

Anyway, Kepler vs GCN is a closed chapter. As always, nV's marketing was successful in brainwashing their userbase and AMD's was as inept as always, not able to put up a fight in that regard. I hope that changes for the better with the 390x's arrival.



The 390x and its up-to-date GCN are Maxwell's competition; Hawaii just had to put up a fight with GM204, and it did quite successfully, considering there's just a 10% difference between the 290x and GTX 980.

With that kind of performance increase over the 290x, if priced accordingly there isn't much of a point in the Titan X... AMD will probably force nV's hand... if only they can advertise their best card in quite a while and get it to sell as it should.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
<snip>
if only they can advertise their best card in quite a while and get it to sell as it should.

AMD has badly screwed up their releases since GCN. All of them. BADLY! The 7970 with immature drivers and underclocked. Then they took months to react to the market when Kepler came out. Hawaii with the terribly inadequate reference cooler. Not handling the mining craze even close to right: no cards for months, and then when they did have them the mining market crashed and they were left with a glut that they are still apparently trying to dig themselves out from under. The 285, about 12 months too late and about 20% too expensive; it at least needed 4GB of VRAM.

So far this appears to be different. They look to be making a bigger splash and not leaving nVidia with so much to exploit. They actually have a lot of things they can leverage right now. The new API's moving in the direction they laid out, Freesync, HBM, reportedly competitive, or maybe even superior performance, a whole new lineup, all seem to be good for them to have a very successful launch of their latest generation. People are going to make a big deal out of what they can. OMG!!! It has 2x 8pin connectors! 290W vs. 250w, what a failure! It needs Niagara Falls to cool it! It better be 1/2 the price of nVidia because nVidia is so superior a name. That all seems like nothing to overcome though.

AMD needs to have planned this right, execute it right, and have planned for all of the variables. People can make excuses that they couldn't have foreseen Kepler, the mining market, the damage that the reference cooler would cause, how long before they'd have supplies for the AIBs to get their custom designs out, etc. But that's called vision. That's why you pay the CEO the big bucks. You could hire anyone if they didn't have to know what needed to be done. Well, Read is gone. Let's hope his replacement is qualified, or we're going to end up with one dGPU company and nobody for them to compete with to drive innovation and balance the market.
 

loccothan

Senior member
Mar 8, 2013
268
2
81
loccothan.blogspot.com
What is your definition of inferior?

<snip>

You got the point my Bro :colbert: it's called nV Propaganda lol

IMO all GCN's are Superb GPU's and have H/W Features way ahead of their Time
DX11.1 11.2 11.3 + Vulkan + DX12.3 all Thanks to Mantle o_O
============================================
So the nV and their "New" Maxwell (Fermi rev.5!) (+ the 970 Lie/Fail).
The "company" I have Ignore Mode ON for, always ;-)
+ nV Game/FailWorks, and New Games only with DX11_0 H/W!
SoM, DyingLight, pCars, Witcher3 etc., all in DX11
Lol, DX11? Dunno, 7 years old now?
======================================
But 11.1 + 11.2 have been out 1 year:
AMD Gaming Evolved: Performs well on AMD and nV :cool:
DX11.1 present in:
BF4, StarCitizen, Alien Isolation, NFS Rivals + maybe others, Dunno
DX 11.3 + DX12.3 Support (Mantle in FrostBite 3.1 still :) :
StarWars Battle.. , DeusEX Universe, TombRaider Next + there will be DX12.3 in StarCitizen 100% and other upcoming titles :p

So nV go home and give Pascal already...

Here's the Proof of nV DX11 = DX11_0 in BF4 (780Ti & 980) (75 deg. cels)

-> https://www.youtube.com/watch?v=ae5dqR5BWZU

And on My RIG, BF4 Maxed at 1793x1344, so >FHD, in DX11.1 (1050/1550, 60 deg. cels):

->



Don't look into GPU-Z or the nV Panel for the H/W support cuz it's a lie!
Anybody can write code to show DX11.2 H/W support! DX11.2 is the only API that can support games in 11.2 (no game is using it!)
GPU-Z has an error cuz it shows DX11.2 SM5.0, lol, DX11.2 has shader model 5.2!
Every instance of DX has better optimised SM so:
DX11 5.0
DX11.1 5.1
DX11.2 5.2 and so on :D

-> here proof -> http://www.hardwaresecrets.com/article/131
No Comments. Knowledge is POWER and SCENE always Rules... Still and beyond
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
AMD has badly screwed up their releases since GCN. All of them. BADLY! The 7970 with immature drivers and underclocked. Then they took months to react to the market when Kepler came out. Hawaii with the terribly inadequate reference cooler. Not handling the mining craze even close to right: no cards for months, and then when they did have them the mining market crashed and they were left with a glut that they are still apparently trying to dig themselves out from under. The 285, about 12 months too late and about 20% too expensive; it at least needed 4GB of VRAM.

I agree with everything here, but...

So far this appears to be different. They look to be making a bigger splash and not leaving nVidia with so much to exploit. They actually have a lot of things they can leverage right now. The new API's moving in the direction they laid out, Freesync, HBM, reportedly competitive, or maybe even superior performance, a whole new lineup, all seem to be good for them to have a very successful launch of their latest generation. People are going to make a big deal out of what they can. OMG!!! It has 2x 8pin connectors! 290W vs. 250w, what a failure! It needs Niagara Falls to cool it! It better be 1/2 the price of nVidia because nVidia is so superior a name. That all seems like nothing to overcome though.

What are you basing this on? I don't think I've even seen a real FreeSync demo in person yet. Besides that point, it just sounds like you're giving them the benefit of the doubt on all the things that could be. Well, the HD 7970 got hyped, and read your post above. Then Hawaii got hyped up with similar info, "it's gonna be a huge die" and some other stuff I can't remember now, and read your post above.


I feel people are just giving too much benefit of the doubt to AMD. They've basically failed to deliver since 2012 and we're acting like "this is the year, things will be better." That's beaten-wife syndrome.

AMD, make me want to buy you again. I even upped my budget, but if you fail like you did the last time I had a pocket full of cash, sorry, but NV gets that up-to-$1K sale.

AMD needs to have planned this right, executed it right, and planned for all of the variables. People can make excuses that they couldn't have foreseen Kepler or the mining market, the damage that the reference cooler would cause, how long it would take before they'd have supplies for the AIBs to get their custom designs out, etc. But that's called vision. That's why you pay the CEO the big bucks. You could hire anyone if they didn't have to know what needed to be done. Well, Read is gone. Let's hope his replacement is qualified, or we're going to end up with one dGPU company and nobody for them to compete with to drive innovation and balance the market.

What!? What vision? Have you seen their CPU department? I don't think Rory did anything right. And this new CEO hasn't really said anything reassuring.

We're back to hoping and wishing.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
What!? What vision? Have you seen their CPU department? I don't think Rory did anything right. And this new CEO hasn't really said anything reassuring.

We're back to hoping and wishing.

AMD's CPU department is working on Zen and K12. If they fail, then your criticism is valid. Rory Read got Jim Keller and Raja Koduri to lead the CPU and GPU design efforts. Keller's impact will be known only when Zen and K12 launch; Raja's impact will be felt in the next-gen 14nm FinFET products.

http://www.anandtech.com/show/6129/apple-a4a5-designer-k8-lead-architect-jim-keller-returns-to-amd

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1720866

http://www.anandtech.com/show/6907/the-king-is-back-raja-koduri-leaves-apple-returns-to-amd

Designing a brand-new CPU architecture is hard, but Keller and his team are giving it their best shot. Zen and K12 will launch in Q3 2016. If AMD fails with these two cores, then they are dead.

I do not think you can say Rory Read did nothing. He knew AMD would die if it did not get back a competitive CPU core, so he got the best architect who could make it happen. All this talk about no vision is just rubbish. Zen is a bet-the-company move. I am optimistic that Keller and his team will deliver a competitive core.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I agree with everything here, but...



What are you basing this off? I don't think I've even seen a real FreeSync demo in person yet. Besides that point, it just sounds like you're giving them the benefit of the doubt on all the things that could be. Well, the HD 7970 got hyped, and read your post above. Then Hawaii got hyped up with similar info ("it's gonna be a huge die" and some other stuff I can't remember now), and read your post above.

What I'm basing it off of is the info we've been given. The 390X was supposed to come out first and be 4GB HBM. Well, now it looks like they are going to magically (I say magically because supposedly 4GB was all that was possible) make it 8GB, and they are not going to relabel the rest of the line but are going to give us all-new chips. We aren't going to get the 5870 cooler pressed into service again. I've also heard that they have built up a very large supply of GPUs for the launch. In the past they would have simply rushed out the first thing they had ready.


I feel people are just giving too much benefit of the doubt to AMD. They've basically failed to deliver since 2012 and we're acting like "this is the year, things will be better." That's beaten-wife syndrome.

What's this thrown in here for? Just a bit of flame bait? That's rude, insulting, and completely uncalled for. It makes me want to delete this whole post and put you on ignore. I doubt you realize how ignorant what you said here is, though. You just think that's what someone who doesn't crap all over AMD is like.

AMD, make me want to buy you again. I even upped my budget, but if you fail like you did the last time I had a pocket full of cash - sorry, but NV gets that up-to-$1K sale.

If this is the case, did you buy Tahiti or Hawaii, or did you buy GK104 and GK110? Both of those designs were inferior to Tahiti and Hawaii, but nVidia's hype machine made them appear better.

What!? What vision? Have you seen their CPU department? I don't think Rory did anything right. And this new CEO hasn't really said anything reassuring.

Where did I say Rory had any vision or did anything right? Here is some of your issue with my post: you didn't even read what I said. I don't know where the confusion comes from. I spat all over Read. As far as what the new CEO says, I don't even pay attention. They all BS and say nothing. They are like politicians: they stand up there talking with their hands because the words that come out of their mouths are meaningless. Are you telling me you listen to Jen-Hsun? He's just trying to fleece you out of your lunch money. lol The new CEO, IMO, has done everything right so far. It's early days though.

We're back to hoping and wishing.

I'm not sure what you are back to. Maybe just continuing to buy from the best-promoted and best-marketed brands? I'm waiting and seeing. I'm sure not looking into what AMD's CPU division is doing. Maybe next year, we'll see. Or was that just said to belittle AMD, like your "beaten-wife syndrome" comment? It's of no relevance to the 390 or our conversation.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
After 2.5 years of 28nm GPUs, I'm waiting for an R9 380/GM200 cut-down version with higher-than-GTX980 performance at $300-350 max.
If that doesn't happen, I will not upgrade my GPU and will not recommend any of these new GPUs from either company.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
After 2.5 years of 28nm GPUs, I'm waiting for an R9 380/GM200 cut-down version with higher-than-GTX980 performance at $300-350 max.
If that doesn't happen, I will not upgrade my GPU and will not recommend any of these new GPUs from either company.

You might be asking for a bit too much. About the same as the 980 for $350ish is probably closer; better than that might be a bit more. And this is only if the 380(X) is truly competitive with GM204. I don't see a cut-down Fiji for that price.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I'm strongly considering AMD this round and I'm willing to wait in order to compare it with GM200. Yet I will miss TXAA and GPU PhysX (Batman and Witcher 3). I hate not being able to turn on everything :D
And I hope AMD gets their CPU overhead down for GTA V. If NV is like 30-50% faster here in CPU-bound benchmarks (open world...), it will be hard to switch to AMD.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
You might be asking for a bit too much. About the same as the 980 for $350ish is probably closer; better than that might be a bit more. And this is only if the 380(X) is truly competitive with GM204. I don't see a cut-down Fiji for that price.

They both can have a $999 special-edition GPU, I don't care. But GTX980 + 5-10% higher perf at $300-350 should be doable after 2.5 years of 28nm.