[Rumor/Speculation] GTX Titan X 12GB vs R9 390X 4GB vs Unknown GM200 GPU


DiogoDX

Senior member
Oct 11, 2012
757
336
136
March 16 for a closed-door presentation, but they still didn't give a date for the NDA to lift.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
WCE, so will one version be air cooled? Or is one an AIO and the other closed-loop ready?
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
WCE = 4096 shaders & 8GB HBM
Air cooled = 3712 shaders & 4GB HBM
?

I have no idea. Makes little sense to me
 
Last edited:

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Well, I might get 2 x R9 390X if:
A) The price is right. Max $600 per GPU for the WCE.
B) The noise level vs the Titan X is alright.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I'm pretty blah about this stuff normally, but I'm pretty curious how this will play out.
I still vote for the 390X to be the $500-ish high-end mainstream-enthusiast card at launch, like the 290X was, rather than a Titan competitor. That would be more of a coup than I think current tech from anyone would allow. Maybe a higher-end card later, or a 395X2 or whatever. I'll be amused to be proven wrong, though.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
I'm pretty blah about this stuff normally, but I'm pretty curious how this will play out.
I still vote for the 390X to be the $500-ish high-end mainstream-enthusiast card at launch, like the 290X was, rather than a Titan competitor. That would be more of a coup than I think current tech from anyone would allow. Maybe a higher-end card later, or a 395X2 or whatever. I'll be amused to be proven wrong, though.

I think, based on the tech, the 390X will quite likely compete favorably against the GM200 in the Titan X.

It may or may not have the same compute capability, depending on what Nvidia debuts inside the Titan X variant of the GM200 (surely the 9## version of the GM200 will be crippled in FP64 compute, aka double precision).

But in terms of gaming performance, the 980 wasn't THAT far ahead of the 290X. Yes, the GM204 is a smaller die, but it is Nvidia's current architecture, and they can only scale that performance so much between the GM204 and the full-size GM200 die. For AMD, meanwhile, the 390X is a new generation of the architecture, and while not a complete redesign, it is definitely a refinement over past iterations. Will it scale to 200% of the last full-size die they released? I sincerely doubt it. But I would not expect it to perform only 30% higher than the 290X.

If AMD's 390X bests the 290X by 40%, it may not perform better than the GM200 in every benchmark, but if they release it at the original MSRP for the 290X (around $550-600), they may very well take this generation in a landslide, at least until Nvidia can answer with a reasonably-priced 9## variant of the GM200.
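To put rough numbers on that reasoning, here is a minimal sketch of the comparison; the 980-vs-290X gap and the GM200 scaling factor are illustrative assumptions, not measurements:

```python
# Rough relative-performance sketch, normalized so the 290X = 1.0.
# All percentages below are assumptions for illustration only.

R9_290X = 1.00
GTX_980 = 1.15         # assume the 980 averages roughly 15% ahead of the 290X
GM200_SCALING = 1.30   # assume the full GM200 scales about 30% above the GM204

TITAN_X = GTX_980 * GM200_SCALING  # hypothetical Titan X
R9_390X = R9_290X * 1.40           # the "+40% over the 290X" scenario

print(f"Titan X (assumed): {TITAN_X:.2f}x a 290X")
print(f"R9 390X (+40%):    {R9_390X:.2f}x a 290X")
# -> roughly 1.5x vs 1.4x: close enough that price, not raw performance,
#    would likely decide the generation.
```

Under those assumptions the two cards land within about 10% of each other, which is why the pricing argument matters more than the outright benchmark crown.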

I don't honestly expect the 390x to beat the Titan X, but in some ways, it is very much possible.

If it DOES take the performance crown from the Titan X, I doubt it will release at $550. But in no way do I see AMD attempting to compete with Nvidia at the Titan's price level. That would be suicide and, frankly, one of the worst business decisions a major company has made in at least a few decades. Perhaps not quite that bad, since I'm not about to research the worst business mistakes of recent decades, but even if it's hyperbole, it is not outside the realm of possibility in this scenario. ;)

If AMD somehow bests the Titan X, I think AMD would be wise to release the GPU at $650. It is unlikely Nvidia would be willing to release a 9## version of the GM200 (let's call it the 985) at anything less than $650, and until Nvidia can match or counter the 390X's price, AMD will rake in the money compared with the laughably priced Titan.

Make no mistake: the Titan X, regardless of the competition, will make money for Nvidia simply because of the clout and brand presence Nvidia carries at this point. I'd love for AMD to be able to command the premiums Nvidia gets away with; I don't want prices that high, but it would at least mean AMD is healthy again. For now, as long as AMD can target around $600 or higher, and actually sell in good numbers, they can meet revenue targets. It is when Nvidia is able to counter with cards priced under $400 that AMD's revenue tanks, at least when the card being undercut is AMD's largest die.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Where does it end? Every gen we go up 100W? That can't happen.
Why not stay below 300 watts? Why does it have to be +100W with every new gen? Why does it have to be all or nothing? It doesn't have to be that way. If the 390X can give me 50% more performance than the 290X while staying at about the same wattage, why not? That is a heck of an improvement per watt considering it is on the same 28nm.

IMO, if the 390X can give me a 100% improvement over the 290X at 400W under full load, I would still buy it. 100W more for 100% more performance, why not? :cool:
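To put numbers on those two scenarios, here is a minimal performance-per-watt sketch; the 290X baseline power figure is an assumption for illustration, since measured draw varies by board and workload:

```python
# Performance-per-watt check for the two hypothetical 390X scenarios above.

BASELINE_PERF = 1.0     # 290X gaming performance, normalized
BASELINE_POWER = 290.0  # assumed 290X power draw under load, in watts

scenarios = {
    "390X: +50% perf at the same wattage": (1.5 * BASELINE_PERF, BASELINE_POWER),
    "390X: +100% perf at 400 W":           (2.0 * BASELINE_PERF, 400.0),
}

baseline_eff = BASELINE_PERF / BASELINE_POWER

for name, (perf, power) in scenarios.items():
    eff = perf / power
    print(f"{name} -> {eff / baseline_eff:.2f}x the 290X's perf per watt")
```

Both hypothetical cards improve on the 290X's perf/watt, with the same-wattage case (1.50x) slightly ahead of the 400W case (1.45x), despite the higher absolute draw in the latter.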
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Why not stay below 300 watts? Why does it have to be +100W with every new gen? Why does it have to be all or nothing? It doesn't have to be that way. If the 390X can give me 50% more performance than the 290X while staying at about the same wattage, why not? That is a heck of an improvement per watt considering it is on the same 28nm.

IMO, if the 390X can give me a 100% improvement over the 290X at 400W under full load, I would still buy it. 100W more for 100% more performance, why not? :cool:
Just saying: most people fight over GPU coolers and GPU watts [heat] now that NV's midrange cards run cooler and only draw 180 watts, lol. But adding 3 billion transistors for the Titan X somehow won't draw any more power; to hear some peeps tell it, the GTX 960 uses 0 watts.

I don't care myself; anything under 800 watts for GPUs falls in my safety zone, and the heat goes into warming the stone foundation in the crawl space.

As for cost, if it were important, I would switch my 6kW electric hot water heater to gas, since it runs into the smart meter's peak time after I leave for work after the morning shower [funny time-saver crap].
 
Last edited:

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Why not stay below 300 watts? Why does it have to be +100W with every new gen? Why does it have to be all or nothing? It doesn't have to be that way. If the 390X can give me 50% more performance than the 290X while staying at about the same wattage, why not? That is a heck of an improvement per watt considering it is on the same 28nm.

IMO, if the 390X can give me a 100% improvement over the 290X at 400W under full load, I would still buy it. 100W more for 100% more performance, why not? :cool:

Then you agree with me. Getting 50% more performance at the same wattage is precisely increasing performance by increasing efficiency.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't see this having anything to do with what I said. I never said they are mutually exclusive.



Again, I'm talking long term as I've stated many times. Sure, jump up to 400W GPUs, but then what? Go to 600W for the next big increase? As you can see, major gains must be made in efficiency to keep moving forward in performance.

What I was trying to communicate here is that efficiency doesn't always mean lower power. You can have higher efficiency with higher power, and you can have higher efficiency with lower performance.

Yes, everybody would like higher performance with lower power, but that is not always achievable for every need. It is mostly doable with a new node or in the lower performance segments.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
What I was trying to communicate here is that efficiency doesn't always mean lower power. You can have higher efficiency with higher power, and you can have higher efficiency with lower performance.

Yes, everybody would like higher performance with lower power, but that is not always achievable for every need. It is mostly doable with a new node or in the lower performance segments.

If future GPUs are designed to run in an electrical and thermal envelope of 250-300W like right now, then higher efficiency means higher performance. Yes, it is that simple.
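As a quick sketch of why that follows: if performance is modeled as efficiency times power, then with power pinned to the envelope, any efficiency gain maps one-to-one onto performance (the figures below are purely illustrative):

```python
# With board power capped by the envelope, performance can only grow as fast
# as efficiency does: performance = efficiency * power.

POWER_ENVELOPE = 275.0  # watts, middle of the assumed 250-300W band

def performance(perf_per_watt: float, power: float = POWER_ENVELOPE) -> float:
    return perf_per_watt * power

current_gen = performance(1.00)  # arbitrary efficiency units
next_gen = performance(1.40)     # hypothetical 40% efficiency improvement

print(f"Performance gain at the same envelope: {next_gen / current_gen - 1:.0%}")
# -> 40%: the efficiency improvement shows up entirely as extra performance.
```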