SimianR

Senior member
Mar 10, 2011
609
16
81
AMD definitely needs to move to 20nm ASAP, which I think is more than likely what they'll do, unless the 390X is the last card on 28nm. Who knows. AMD is in a tough spot, but I don't think people should count them out just yet. It is discouraging to read stories about them laying off engineers, or deals where they have exchanged engineers/resources for access to other tech. Cutting anything to do with R&D seems like the last thing they should be doing.
 

JM Popaleetus

Senior member
Oct 1, 2010
375
47
91
heatware.com
I feel AMD could easily skip a GPU generation, just coasting on income from console sales and an R9 revision (the chips are still competitive, especially if made cheaper/cooler).

Use that year to develop something new from the ground up and actually beat Pascal.

Then when the money is pouring in again, start R&D on CPUs/APUs to maybe beat Intel (or at least go back to being equal and cheaper).
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
The pseudo-speculation by the basement-dwelling pundits in this thread hurts my brain.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
I feel AMD could easily skip a GPU generation, just coasting on income from console sales and an R9 revision (the chips are still competitive, especially if made cheaper/cooler).

Use that year to develop something new from the ground up and actually beat Pascal.

Then when the money is pouring in again, start R&D on CPUs/APUs to maybe beat Intel (or at least go back to being equal and cheaper).
This is a bit off topic, but I never saw AMD showing chips or a tech demonstration like Nvidia did with Pascal.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If AMD is actually moving to 20nm AND simultaneously using HBM, they should recapture a comfortable performance lead over anything 28nm Maxwell can offer. Right now, none of that is 100% confirmed, and even if it does happen, no one knows for sure when.

Nvidia is riding a high wave right now and AMD is lucky if they aren't drowning.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
20nm is not going to do anything for AMD in that regard, because nVidia has access to the exact same process. So the process node as such is irrelevant between the GPU makers.

AMD can only compete by developing a better uarch. And GCN 1.2 wasn't that.

This is amusing.

Do tell me, how was GCN 1.2 a bad uarch?

There are two faults in GCN 1.2 as far as I can see:

1) It wasn't leaps and bounds ahead of GCN 1.0 in terms of performance, but then it's 1.2, not 2.0; it's an evolution, not a revolution.

2) It was unable to offer good power efficiency compared to the competition.


GCN 1.2 did well in bringing AMD close to Nvidia's Kepler in terms of performance; the 290X is what, 10-12% slower than the 780 Ti?

It has excellent compute performance too, & it was a smaller die compared to Kepler. The only place it lost absolutely was in power consumption, which was made worse by the lackluster stock cooler.

It's just that now that Maxwell has dropped, all these Nvidia boys are screaming about how AMD's GCN 1.2 is bad & AMD can't do s***.

You're forgetting that AMD has yet to release a uarch to compete with Maxwell.
GCN 1.2 was supposed to compete with Kepler & it did an okay job at that.

And I love how you guys are some future-predicting magicians, screaming about how AMD has nothing down the line that can get close to Nvidia, while you don't have a concrete idea of what AMD is working on.

Take my advice: go predict the future on the stock market if you are so sure about your view of how the future is going to turn out. At least you'll make some money. :sneaky:

@Paul98

Techhog's posts are just cringeworthy.
No point in wasting time replying. :thumbsup:

Infraction issued for member callout.
-- stahlhart
 
Last edited by a moderator:
Feb 19, 2009
10,457
10
76
Actually, Hawaii is very competitive versus Kepler. It's a smaller die that manages to come very close to GK110 in performance. Custom models have similar power use to the 780 Ti too, so the efficiency isn't as bad as some here would believe.

Then factor in its performance at 4K, with multi-card configs beating SLI 780 Ti on a smaller die; it's actually quite a good feat of engineering. AMD had not managed to come so close to, or match, NV's large dies in performance with their mid-range die for many generations (if ever) until Hawaii.

But the situation now is that NV has a new-generation uarch while AMD is still competing with revised Tahiti "old tech", so of course it would look bad. Compare Maxwell versus Kepler and the same outcome happens.

AMD is late with their true next-gen. How late they are will determine how much they bleed.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
This is amusing.

Do tell me, how was GCN 1.2 a bad uarch?

There are two faults in GCN 1.2 as far as I can see:

1) It wasn't leaps and bounds ahead of GCN 1.0 in terms of performance, but then it's 1.2, not 2.0; it's an evolution, not a revolution.

2) It was unable to offer good power efficiency compared to the competition.


GCN 1.2 did well in bringing AMD close to Nvidia's Kepler in terms of performance; the 290X is what, 10-12% slower than the 780 Ti?

It has excellent compute performance too, & it was a smaller die compared to Kepler. The only place it lost absolutely was in power consumption, which was made worse by the lackluster stock cooler.

It's just that now that Maxwell has dropped, all these Nvidia boys are screaming about how AMD's GCN 1.2 is bad & AMD can't do s***.

You're forgetting that AMD has yet to release a uarch to compete with Maxwell.
GCN 1.2 was supposed to compete with Kepler & it did an okay job at that.

And I love how you guys are some future-predicting magicians, screaming about how AMD has nothing down the line that can get close to Nvidia, while you don't have a concrete idea of what AMD is working on.

Take my advice: go predict the future on the stock market if you are so sure about your view of how the future is going to turn out. At least you'll make some money. :sneaky:

@Paul98

Techhog's posts are just cringeworthy.
No point in wasting time replying. :thumbsup:
Well, yes, raw performance is very important, but other things matter too, and that is the reason why AMD's market share is far behind Nvidia's. AMD mainly focuses on raw performance and lacks in efficiency, power consumption, stability and cooling, and that is why AMD lost most of the mobile market to Nvidia and is practically nonexistent in the CPU market.
 
Feb 19, 2009
10,457
10
76
It wasn't that long ago that AMD had the major share in notebooks. So I guess back then they had stability (is this your way of saying good drivers)?
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
Well, yes, raw performance is very important, but other things matter too, and that is the reason why AMD's market share is far behind Nvidia's. AMD mainly focuses on raw performance and lacks in efficiency, power consumption, stability and cooling, and that is why AMD lost most of the mobile market to Nvidia and is practically nonexistent in the CPU market.

I agree that things other than raw performance matter.
If Hawaii were 10% faster than Kepler while consuming more power, it would actually be praised by some people because AMD went for raw performance.
If Hawaii were slower but more efficient than Kepler, it would also have been praised.

But as Hawaii is both slower & less power efficient compared to Kepler, it gets a lot more flak. Combine that with AMD's shaky driver support, & people completely forget that AMD managed to keep up with Nvidia on a smaller budget and a smaller die, & that it's a good compute chip.

Nvidia fans don't give enough credit to AMD for their XDMA bridgeless CrossFire. It's both better & more convenient than Nvidia's solution. Maxwell is here; where is Nvidia's answer to that?

Also, personally speaking, in the limited time I have had playing with graphics cards, AMD's drivers have never caused any trouble for me. Not to undermine or invalidate the issues other people have had; just my personal experience.
 
Last edited:
Feb 19, 2009
10,457
10
76
Yes, I mean all in terms of stability.

If you're saying that back then AMD had better stability/drivers, then I would disagree, and so would many.

In fact, prior to the 5800 series, AMD/ATI's drivers were well known to be atrocious; it's only been in these recent generations that they have improved to the point where I, as a user of both vendors, consider them both good.

So in fact, whether either company gets design wins in notebooks or not, drivers don't seem to be a factor. It's probably due to marketing & efficiency (which matters more in notebooks).
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Would love to see that compute power used in games. I wonder if GCN 2.0 will go even more double-precision heavy. The difference now between Hawaii and Maxwell is quite astonishing.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Do tell me, how was GCN 1.2 a bad uarch?
There are two faults in GCN 1.2 as far as I can see:
2) It was unable to offer good power efficiency compared to the competition.
From what I understand (from reading, not personal experience), Maxwell is only power-efficient in games. Once you approach 100% load (as with distributed computing, which is what I do), there is little to no gain in power efficiency compared to their prior 28nm architecture.
 

CakeMonster

Golden Member
Nov 22, 2012
1,630
810
136
Well, yes. Every chip (or not every chip, but a lot of them) can be power efficient given the right voltage/clock speed. If Maxwell had been released with slightly higher clocks, we would not have praised it for efficiency at all, but for its huge performance gains. Funny how perception can change over small adjustments...
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Do tell me, how was GCN 1.2 a bad uarch?

I never said it was a bad uarch. There were quite a few good improvements in it. It just can't compete with the competition, and something radically new is needed to do that.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I agree that things other than raw performance matter.
If Hawaii were 10% faster than Kepler while consuming more power, it would actually be praised by some people because AMD went for raw performance.
If Hawaii were slower but more efficient than Kepler, it would also have been praised.

But as Hawaii is both slower & less power efficient compared to Kepler, it gets a lot more flak. Combine that with AMD's shaky driver support, & people completely forget that AMD managed to keep up with Nvidia on a smaller budget and a smaller die, & that it's a good compute chip.

Nvidia fans don't give enough credit to AMD for their XDMA bridgeless CrossFire. It's both better & more convenient than Nvidia's solution. Maxwell is here; where is Nvidia's answer to that?

Also, personally speaking, in the limited time I have had playing with graphics cards, AMD's drivers have never caused any trouble for me. Not to undermine or invalidate the issues other people have had; just my personal experience.
I disagree. They wouldn't be praised unless they had both. Remember that the Nvidia fanboys didn't care about efficiency in the Fermi days at all.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I think the 2012/2013 AMD layoffs hurt R&D, and that is apparent now. AMD maxed out their arch with the 290/290X, as can be seen by the absurd power usage. The same thing happened with their FX line: they maxed it out until they could no longer compete with Intel. Look where FX is now.

I would suggest the consoles actually had a bigger impact than the layoffs. Designing two new console chips takes a lot more effort than anyone here is willing to admit, and it probably took key people off developing next-gen GPUs. Historically, you can pretty well match console releases with dodgy desktop GPUs: Xbox = GeForce FX 5900, Xbox 360 = HD 2900 XT. Hence AMD got the console wins, but now they are paying the cost in PC graphics.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I would suggest the consoles actually had a bigger impact than the layoffs. Designing two new console chips takes a lot more effort than anyone here is willing to admit, and it probably took key people off developing next-gen GPUs. Historically, you can pretty well match console releases with dodgy desktop GPUs: Xbox = GeForce FX 5900, Xbox 360 = HD 2900 XT. Hence AMD got the console wins, but now they are paying the cost in PC graphics.

If AMD expended a lot of R&D on the console offerings, they are even less wise than I give them credit for. Those are very small-margin products; they generate revenue, but that's essentially why NV wasn't even that interested to begin with. It is also not as if they designed a brand-new arch for either the GPU or CPU portions, but some good work was done in integrating the two into an efficient design. I really doubt it was a large distraction from normal CPU or GPU operations. If it was, it may have been a serious misstep for them in the long term...
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
AMD definitely needs to move to 20nm ASAP, which I think is more than likely what they'll do, unless the 390X is the last card on 28nm. Who knows. AMD is in a tough spot, but I don't think people should count them out just yet. It is discouraging to read stories about them laying off engineers, or deals where they have exchanged engineers/resources for access to other tech. Cutting anything to do with R&D seems like the last thing they should be doing.

This.

It remains to be seen what else AMD can squeeze from 28nm, but NV has thrown us a few surprises, so why not AMD too? It is telling, however, that information on their new products is pretty thin. AMD has some good products with the 280/280X/290/290X, and they can cut prices until 20nm is here, but that will eat into profits. NV definitely has them beat on efficiency, but if I can get an equal-performing (or just barely slower) card for a lot less $$$, it's a good buy. AMD is definitely not lacking any features either (yet), although I am excited about the Oculus Rift/SLI feature of the 9xx series. :)
 

CakeMonster

Golden Member
Nov 22, 2012
1,630
810
136
It's been a year since the 29*; AMD could probably easily have planned a slightly tweaked version of the same chip on a more mature 28nm and taken back the performance crown from the GTX 980 (it wouldn't take much effort, although power usage would still be high). And I suspect that's what they will do. If they do anything more than that short term, it's simply a bonus.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Reopening thread. Stop the personal attacks and stay on topic.
-- stahlhart
 
Sep 27, 2014
92
0
0
I am pretty excited about the end of this year, with 980 Ti and 990 rumors floating around, and then 20nm and HBM coming from AMD in 2H 2015... I always enjoy it when both companies bring all they have to the table.

I am really just trying to hold out for a great single-card 4K solution. The 295X2 has been tempting me for some time, but the rumors of what's coming in 2015 make me want to hold on to my 780 for a while longer...
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
didn't care about efficiency in the Fermi days at all.

To some, but not all -- personally, I offered the constructive nit-pick that performance/watt was lacking in the first iterations of Fermi. Also, the market didn't enjoy the lack of efficiency from nVidia, given that they lost overall discrete market share to AMD.

Performance per watt is probably the most important metric for the marketplace.