[Ars] AMD confirms high-end Polaris GPU will be released in 2016


exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I don't understand why AMD is even spending R&D money on Fiji x2 unless they see something in DX12 that will solve most multi GPU problems

Totally agree. If AMD had released this in 2015 at or under $1000, it would be in my rig now. It is too late now...
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
I don't understand why AMD is even spending R&D money on Fiji x2 unless they see something in DX12 that will solve most multi GPU problems
The burden of multi gpu implementation and support lies almost entirely on the developer.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
The burden of multi gpu implementation and support lies almost entirely on the developer.

I see this becoming a good thing, and there will be source code out there with multiple multi-GPU implementations. It will take a while for this to get out there. I'm excited to see what can finally be done.
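For anyone unfamiliar with what "implemented by the developer" means in practice: the classic strategy the drivers used to hide behind game profiles is alternate-frame rendering (AFR), where frames are handed out to the GPUs round-robin. A toy sketch of just the scheduling idea (illustrative Python, not any real D3D12 API — the "GPUs" here are only indices):

```python
# Toy sketch of alternate-frame rendering (AFR) scheduling, the classic
# multi-GPU strategy that DX12 lets an engine implement itself instead
# of relying on driver profiles. Illustrative only; real engines also
# have to manage cross-GPU resource copies and frame pacing.

def afr_schedule(frame_indices, num_gpus=2):
    """Assign each frame to a GPU in round-robin order, as AFR does."""
    return {frame: frame % num_gpus for frame in frame_indices}

# Even frames land on GPU 0, odd frames on GPU 1.
for frame, gpu in afr_schedule(range(6)).items():
    print(f"frame {frame} -> GPU {gpu}")
```

The hard part (and the source of the driver-team pain quoted later in the thread) is everything this sketch omits: frame N+1 often depends on data produced while rendering frame N, which has to be copied between GPUs without stalling either one.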
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Totally agree. If AMD had released this in 2015 at or under $1000, it would be in my rig now. It is too late now...

Yup. I was only interested because it meant no 2-card hassles. I could just pop in 1 card and play at 4K. But now it's way too late. This will have to hit R9 295X2 pricing for me to care.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
The burden of multi gpu implementation and support lies almost entirely on the developer.

Sort of. I think we would need a developer to weigh in on this, but the profiles themselves are developed by AMD and Nvidia, and a not-insignificant amount of work goes into those too.


edit - Nvm, somehow totally missed the context of your post.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
The burden of multi gpu implementation and support lies almost entirely on the developer.

Not according to a former Nvidia driver developer. This was back with DirectX 10, but somehow I doubt DirectX 12 is going to magically remove all the patching and workarounds the driver teams are doing now:

"The fourth lesson: Multi GPU (SLI/CrossfireX) is fucking complicated. You cannot begin to conceive of the number of failure cases that are involved until you see them in person. I suspect that more than half of the total software effort within the IHVs is dedicated strictly to making multi-GPU setups work with existing games. (And I don't even know what the hardware side looks like.) If you've ever tried to independently build an app that uses multi GPU - especially if, god help you, you tried to do it in OpenGL - you may have discovered this insane rabbit hole. There is ONE fast path, and it's the narrowest path of all. Take lessons 1 and 2, and magnify them enormously."

For lessons 1 and 2, and quite a bit of other interesting information follow the link:

http://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019
 

coercitiv

Diamond Member
Jan 24, 2014
7,372
17,472
136
The burden of multi gpu implementation and support lies almost entirely on the developer.
Not according to a former Nvidia driver developer. This was back with DirectX 10, but somehow I doubt DirectX 12 is going to magically remove all the patching and workarounds the driver teams are doing now
Wonder how Oxide making GeForce+Radeon multi-GPU setups work in AOTS fits in with this theory.
 

coercitiv

Diamond Member
Jan 24, 2014
7,372
17,472
136
That's DX12, not DX11. In 12 mGPU is implemented in the engine by the dev, in 11 and prior the driver was handling mGPU.
I was under the impression DX12 mGPU implementation was exactly the subject at hand here, at least that's what the original comment implied.
I don't understand why AMD is even spending R&D money on Fiji x2 unless they see something in DX12 that will solve most multi GPU problems
 

garagisti

Senior member
Aug 7, 2007
592
7
81
All AMD needs to do to stand out for enthusiasts is sell a bare card for watercooling folks, or even a binned bare card like EVGA did with their Kingpin 980 Ti (but naked, and less exorbitant). That, and ensure an overclocking utility with overvolting is available day one, every time, no exceptions.
You mean like the 290 and 290X that VisionTek was selling? Heck, a 290X with a factory-fitted block was selling for $350 about six months to a year ago.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You mean like the 290 and 290X that VisionTek was selling? Heck, a 290X with a factory-fitted block was selling for $350 about six months to a year ago.
Yup. I mean, AMD has made the right moves regarding cooling, if slowly. It's usually 1 gen too late though.

The Fury X successor will probably allow non-reference designs. 1 gen too late, but still.
 

Tweak155

Lifer
Sep 23, 2003
11,449
264
126
Possibly stupid question, but since I haven't done an mGPU rig yet I may as well ask...

If your mGPU card is a single card dual GPU solution, does the motherboard still have to support xFire?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Possibly stupid question, but since I haven't done an mGPU rig yet I may as well ask...

If your mGPU card is a single card dual GPU solution, does the motherboard still have to support xFire?

Pretty sure all motherboards support CrossFire out of the box. It's the silly SLI option one has to pay extra for.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
He was talking about DX12, so ...

Nah, he was talking about DX11/OpenGL and older. The overall context was DX12/Vulkan, but not when he says things like
"I suspect that more than half of the total software effort within the IHVs is dedicated strictly to making multi-GPU setups work with existing games."
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Possibly stupid question, but since I haven't done an mGPU rig yet I may as well ask...

If your mGPU card is a single card dual GPU solution, does the motherboard still have to support xFire?

No it doesn't. Although the only reason a modern mobo wouldn't support CrossFire is if it physically lacked two x16 slots.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I don't understand why AMD is even spending R&D money on Fiji x2 unless they see something in DX12 that will solve most multi GPU problems

I don't think AMD necessarily sees anything in DX12 that will solve multi GPU problems (i.e. scaling, microstutter and compatibility), however I believe they see something in VR which will solve all of those issues, thanks to the inherent multi-frame rendering scheme of VR (i.e. you have to render two separate frames at the same time for VR, one for each eye).

Fiji x2 will likely remain the most powerful single-card VR solution for the foreseeable future (pretty much until AMD/Nvidia release a 14nm dual-GPU card), and by marketing it as such AMD could see a moderate amount of success.
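The reason VR sidesteps the usual multi-GPU headaches is that the two eye views are independent frames with no frame-to-frame dependency, so they can be farmed out to two GPUs simultaneously. A toy Python sketch of the idea (illustrative only, not a real graphics API — the thread pool just stands in for the two GPUs on a dual-GPU card):

```python
# Toy sketch of per-eye VR rendering split across two GPUs. Unlike AFR,
# where frame N+1 may depend on frame N, the left- and right-eye views
# of the same frame are independent and can render fully in parallel.
from concurrent.futures import ThreadPoolExecutor

def render_eye(eye, scene):
    # Stand-in for a per-eye render pass; each "GPU" works on its own view.
    return f"{eye} eye of '{scene}' rendered"

def render_vr_frame(scene):
    # One worker per simulated GPU; both eyes render concurrently and the
    # frame is complete when both results are in.
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, "left", scene)
        right = pool.submit(render_eye, "right", scene)
        return left.result(), right.result()

print(render_vr_frame("demo"))
```

Because neither eye waits on the other, scaling is close to ideal and there is no microstutter of the AFR kind — which is presumably the case for a Fiji x2-style card in VR that the post above is making.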
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,208
537
126
Out of curiosity, how many chips do you think nVidia will release as consumer GPUs this year? GP104 seems to be a given, but other than that there seem to be a lot of different opinions on what we'll see.
Last gen we got GK104 (294mm²) in March 2012, GK107 (118mm²) in April, and GK106 (221mm²) in September, with GK110 (561mm²) following in February (Titan) and March (780) the following year. I don't think we'll have to wait a full six months this time to get GP107, 106 and 104, but I would be surprised if nVidia did launch top to bottom in the span of a few months.

Nvidia sat on GK110 because their only competition was themselves. Their previous-gen cards still held the high-end crown (in everything but 4K, which was and still is a niche market). Likewise, AMD had yet to release a card that would rival the 780 Ti, let alone the 980, until the Fury X, and Nvidia timed the launch of the 980 Ti (GM200) to take the thunder out of AMD's Fury X announcements by releasing their card while AMD was introducing the Fury X (introducing is the correct name for the event, as the Fury X wouldn't be on the market for another 2 months after the introduction, while you could in theory buy a 980 Ti in stores), stealing a news cycle that might have given AMD some steam.

Nvidia has been 2-3 steps ahead of AMD the last few years, and as such has been able to time releases to cause the most market damage possible to their competition, while also extracting the most money from the market by not needlessly cannibalizing their own existing products. I mean, I want AMD to get their act together, if only to force Nvidia to compete and release new technology instead of sitting on things for 8+ months simply because they can.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
If Fury had released back in 2012 to compete with Kepler it would have done rather well :) The Maxwell release was quite staggered too of course, so no reason to expect this one won't be.

Will have to see with AMD too, of course. They haven't done a coherent top-to-bottom launch in a long time, but the hope has to be that they'll at least get close this time.
 

Tweak155

Lifer
Sep 23, 2003
11,449
264
126
No it doesn't. Although the only reason a modern mobo wouldn't support Crossfire is if it physically lacked two X16 slots.

Ah ok. I've thought about mGPU in the past but never pursued it much (if that's not obvious already :)).

I recently got a 4K monitor and I'm starting to think it might be worth the investment... the talk about a new dual-GPU card from AMD might be interesting and worth taking the first leap on.
 

maddie

Diamond Member
Jul 18, 2010
5,157
5,545
136
Interesting read:
http://www.gamecrate.com/interview-amds-roy-taylor-dawn-virtual-reality-age/12842

The second thing is, I mentioned just now that we're going to need the minimum specs to be available at a much more aggressive target price to drive the number of platforms available. We're ahead to market with 14 nanometer FinFET process, way ahead of our competitors, so our ability to ramp high-performance parts which are at a very good price with low power consumption is also going to be an advantage for us.

Fighting words indeed. Looks like next gen might not be as expensive as some of our experts claimed.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Interesting read:
http://www.gamecrate.com/interview-amds-roy-taylor-dawn-virtual-reality-age/12842

The second thing is, I mentioned just now that we're going to need the minimum specs to be available at a much more aggressive target price to drive the number of platforms available. We're ahead to market with 14 nanometer FinFET process, way ahead of our competitors, so our ability to ramp high-performance parts which are at a very good price with low power consumption is also going to be an advantage for us.

Fighting words indeed. Looks like next gen might not be as expensive as some of our experts claimed.

Well, that quote seems pretty accurate. AMD likely does have a massive lead on nVidia in getting to 14nm FinFET, and no one knows when, or even if, nVidia will catch up and release a 14nm GPU.

As for price, we'll have to wait and see whether a cost advantage for AMD parlays into a price advantage for us.
 

maddie

Diamond Member
Jul 18, 2010
5,157
5,545
136
Well, that quote seems pretty accurate. AMD likely does have a massive lead on nVidia in getting to 14nm FinFET, and no one knows when, or even if, nVidia will catch up and release a 14nm GPU.

As for price, we'll have to wait and see whether a cost advantage for AMD parlays into a price advantage for us.
You're a funny guy.:D

we're going to need the minimum specs to be available at a much more aggressive target price to drive the number of platforms available

This implies a selling price advantage for us, at least for the GTX 970/R9 290 class.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
You're a funny guy.:D

This implies a selling price advantage for us, at least for the GTX970/R290 class.
No it doesn't. The enthusiast-class (& halo) products are always going to be expensive, especially at launch. The deep(er) price cuts only happen when the competition has a better product at a similar or slightly lower price level. Kind of like how the Fury X was rumored to be $750 but ended up $100 short of that number as the 980 Ti stole its thunder. AMD will most likely exploit any high-margin (low-volume) product for as long as they can, & they should :thumbsup: