
2015 looking to be an exciting year.


Bateluer

Lifer
Jun 23, 2001
27,730
8
0
If AMD expended a lot of R&D on the console offerings, they are even less wise than I give them credit for. Those are very low-margin products; they generate revenue, but that's essentially why NV wasn't even that interested to begin with.

Remember, Nvidia competed against AMD for those console contracts, and only became disinterested after losing all 3 of them. If Microsoft, Sony, and/or Nintendo had awarded a contract to Nvidia, you can bet they would be touting whatever console win they got. Perhaps they'd have won one if they hadn't pissed off Sony with the PS3's GPU fiasco. Secondly, the GPU silicon in these APUs is from 2012; AMD has already recouped those R&D costs. Even the CPU silicon was over a year old when the XB1 and PS4 launched. Additionally, there really wasn't much done to adapt them for the consoles, aside from a die shrink, that would have caused AMD to burn through cash on R&D.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
To some but not all -- personally, the constructive nit-pick I offered for the first iterations of Fermi was performance/watt. Also, the market didn't enjoy the lack of efficiency from nVidia, given they lost overall discrete market share to AMD.

Performance/watt is probably the most important metric for the marketplace.

NVIDIA actually lost market share at that time because they didn't have a competitive GPU for 6 months, not because Fermi wasn't power efficient.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
2014 has been pretty uninteresting without any major advancements.

2015 has a lot coming, and I agree it will be very interesting.

NV: 980 Ti / 990
AMD: 390X, adaptive sync

4k will get cheaper, we'll get better screens.

It has all of the ingredients for a great year in the GPU world. Now I just hope the pricing doesn't ruin it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Also, the market didn't enjoy the lack of efficiency from nVidia, given they lost overall discrete market share to AMD.

I think the 480 being 6 months late and the 460 taking almost a year had more to do with that than power usage.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
The Radeon R9 285 is the only GCN 1.2 GPU available.

The 290X is GCN 1.1; if something is gonna be criticized, at least it should be done correctly.


The last console gen sold over 250,000,000 systems, and they're still selling. A console win is a very good thing. And hopefully this generation brings some audio improvements to the PC too.


Personally, I'm very interested in the Steam controller, iGPU's, APU's, freesync and 4K.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
NVIDIA actually lost market share at that time because they didn't have a competitive GPU for 6 months, not because Fermi wasn't power efficient.



Here is the 1st quarter of 2010 -- when nVidia didn't execute:

On a quarter-to-quarter basis Nvidia gained in the notebook integrated, and discrete segments as well as the desktop integrated segment. AMD gained a fraction in the desktop discrete segment and over four percent in notebook integrated.

http://jonpeddie.com/press-releases...irst-quarter-shipments-of-pc-graphics-increa/
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I think the 480 being 6 months late and the 460 taking almost a year had more to do with that than power usage.

Believe what you want --- the first iterations of Fermi were not compelling enough to keep share and lessons may have been learned --- with Kepler and Maxwell there was tremendous focus and effort on Performance/watt!
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
To some but not all -- personally, the constructive nit-pick I offered for the first iterations of Fermi was performance/watt. Also, the market didn't enjoy the lack of efficiency from nVidia, given they lost overall discrete market share to AMD.

Performance/watt is probably the most important metric for the marketplace.
Price/performance is the ratio that drives most video card sales, not performance/watt.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
If that were the case, AMD would have dominated sales for years and years.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
I do sometimes wish AMD had kept the ATI name. I believe some of the brand recognition was lost over the last 4 years.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Brand loyalty trumps both your metrics. Once you stop pretending that consumers are level-headed entities, you will see this.

Forums like these are where you'll see the fewest of those sorts of realizations. Examples of the most zealously devoted are generally found on tech forums.

You still need good product metrics, though. You can't dominate by converting consumers with marketing alone. Good products plus good marketing work well; good marketing plus weak products, not so much.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136

And look what happened the next quarter (Q2 2010), when AMD had high volume of HD 5xxx cards, from the sub-$100 HD 5670/50 all the way up to the $379 HD 5870.
NVIDIA released Fermi GF100 (GTX 480) at the end of March 2010, which is practically the start of Q2, and the GTX 460 in July 2010, while AMD had had the full HD 5xxx series on the market since January 2010.

HD 5870/50 official release September 2009 ($379, $259)
HD 5770/50 official release October 2009 ($159, $129)
HD 5670/50 official release January 2010 ($90, $60)

GTX 480/470 official release March 2010 ($499, $349)
GTX 460 official release July 2010 ($229)

Edit: Have a look at what happened in mobile with AMD leveraging HD 5xxx series mobile GPUs

[Image: gpumsq2.gif]
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
And look what happened the next quarter (Q2 2010), when AMD had high volume of HD 5xxx cards, from the sub-$100 HD 5670/50 all the way up to the $379 HD 5870.
NVIDIA released Fermi GF100 (GTX 480) at the end of March 2010, which is practically the start of Q2, and the GTX 460 in July 2010, while AMD had had the full HD 5xxx series on the market since January 2010.

HD 5870/50 official release September 2009 ($379, $259)
HD 5770/50 official release October 2009 ($159, $129)
HD 5670/50 official release January 2010 ($90, $60)

GTX 480/470 official release March 2010 ($499, $349)
GTX 460 official release July 2010 ($229)

Edit: Have a look at what happened in mobile with AMD leveraging HD 5xxx series mobile GPUs

[Image: gpumsq2.gif]

And in the second and third quarters AMD gained market share and captured share leadership against new products from nVidia --- the first iterations of Fermi couldn't retain share leadership.

AMD was so confident in their share leadership that they decided to retire the ATI brand, imho.

Brand loyalty trumps both your metrics. Once you stop pretending the consumers are level headed entities you will see this.

One of the hardest things to create is a brand name and loyal customers, imho! Metrics like performance, performance/dollar, performance/watt, support, features, tools and flexibility, quality of tools and flexibility, developer relations/content all help build the brand name and loyal customers, imho.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
It's unfortunate that certain things have plagued AMD (ATI) going all the way back to their Rage Pro days -- like drivers. I feel like the negative reputation surrounding their drivers is actually bigger than the problems themselves. For example, my friend was asking me for recommendations for a new PC he wanted to put together. He doesn't really follow new releases much, but every so often he'll build a new PC. Before I could make any recommendations on GPUs he said "but I won't use AMD video cards, their drivers are shit," and I said "oh yeah, have you had trouble with them in the past?", followed with "No, that's just what I hear everyone say." If you jump into any random Counter-Strike or BF4 match and ask what GPU people are using, they'll name anything from 560 Tis to GeForce 760s, and they'll all tell you "BECAUSE AMD DRIVERS STINK!~" with the majority of them having never used an AMD/ATI card.

I remember a few years back, when AMD did the major restructuring, they cut a big portion of their PR team. I'm sure to many people that seems ideal compared to cutting engineers or anyone involved in R&D, but sometimes you realize how much of an impact marketing & PR makes, both good and bad.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Believe what you want --- the first iterations of Fermi were not compelling enough to keep share and lessons may have been learned --- with Kepler and Maxwell there was tremendous focus and effort on Performance/watt!

You keep repeating this false fact over and over that NV lost market share because Fermi was inefficient. NV lost a lot of market share from Q4 2009 to Q2 2010 because Fermi never even showed up until April 2010, and when it did, the 470/480 cost significantly more than the 5850/5870. Yet despite that, NV actually gained market share once the 470/480/460 launched. And as you know, a 5850 OC was way more efficient than a 460 OC. Your theory doesn't explain at all how in the world NV gained market share in Q3 2010, Q4 2010 and beyond, when the 6950 destroyed 570 and 580 in performance/watt and you could buy nearly two unlocked 6950s for the price of a 580!

JHH also mentioned that NV failed to gain notebook market share in late 2009/early 2010 because they were 6 months late due to Fermi fabric being broken - you basically lose market share because you don't show up for the Fall 2009 - Holiday 2009 shopping season. Just like right now AMD is going to bleed market share because 970M and 980M are out and AMD has nothing in response at all.

[Image: perfwatt.gif]

[Image: 06Qvn3i.jpg]


And in the second and third quarters AMD gained market share and captured share leadership against new products from nVidia --- the first iterations of Fermi couldn't retain share leadership.

No, they did not gain market share in Q3 2010. Your statements don't match reality, because it was 1st-gen Fermi that gained market share for NV, not lost it. Q2 2010 is when Fermi first came out in volume. Q2 2010 was 44.5% for AMD, but Q3 2010 was 41% and Q4 2010 was 38.8%. Since 570/580 came out November 2010, and 560Ti was not out until late January 2011, most of NV's sales in Q3 2010 and Q4 2010 were the old Fermi GTX 400 series, the horribly inefficient kind. During this period AMD went from 44.5% to 38.8%! And even after that, the 570/580's efficiency was still well below the 6950/6970's. The reason NV lost market share before Q2 2010 is that they were competing with the GTX 200 series and no Fermi cards.

Throughout all of Fermi generation from April 2010 to March 2012 when 680 launched, I don't recall any of the vocal NV owners on our boards switching to AMD due to superior performance/watt. It's like that metric was completely ignored in their purchasing decision and now with Kepler and Maxwell, it's the thing to talk about. :sneaky:

You can see a much better explanation for NV gaining market share - 8800GTX/8800GTS, etc. and then 2900XT from AMD. During this period, many people switched to NV and remained with them since. NV's market share was about 59% in Q1 2007 due to 8800GTX/8800GTS and right now 7.5 years later it's about 61-62%. Also, if you look at Q2 2014, AMD gained market share despite 750/750Ti and their 860M versions cleaning up in performance/watt for that entire quarter. How do you explain that if you think performance/watt drives consumer purchases?

Brand loyalty trumps both your metrics. Once you stop pretending the consumers are level headed entities you will see this.

I agree with that. Most NV users will pay hundreds of dollars more for similar performance. Further, AMD struggled to gain much market share despite having superior performance/watt and/or performance/$ all the way from the HD 4850 days until the HD 6970. Therefore, price/performance and performance/watt aren't the primary drivers of market share. Even when you could get almost two 290s for the price of a 780 Ti, gamers still bought the 780 Ti. And I've seen gamers buy a $450-$550 780 3-6GB over a $360-370 R9 290, or a $400-450 770 4GB over a $300 R9 280X/7970 GHz.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
You keep repeating this false fact over and over that NV lost market share because Fermi was inefficient.

nVidia couldn't retain overall discrete share leadership with the first iterations of Fermi 'till Q4 and a new generation. The lessons learned here made nVidia a better company and sharpened its focus on improving performance/watt, imho.


6950 destroyed 570 and 580 in performance/watt

nVidia improved performance/watt with the 5xx series and the GTX 570's competition was the HD 6970.

http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/30.html

http://www.techpowerup.com/reviews/HIS/Radeon_HD_6970/31.html

JHH also mentioned that NV failed to gain notebook market share in late 2009/early 2010 because they were 6 months late due to Fermi fabric being broken

Really:

Q1 2010:

On a quarter-to-quarter basis Nvidia gained in the notebook integrated, and discrete segments

http://jonpeddie.com/press-releases...irst-quarter-shipments-of-pc-graphics-increa/


Since 570/580 came out November 2010, and 560Ti was not out until late January 2011

They were shipping these cores before the public release date.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Now GPUs are often sold together with CPUs, and 2015 is damn exciting from that perspective: Denver, A57, 14nm coming, perhaps even a new Qualcomm arch after the 810. Mantle is mature, DX12 is coming with Win10.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The HD 5xxx 40nm Mobility parts were only released in January 2010; that is why AMD had a huge increase in mobile market share in Q2 2010, while NVIDIA was fighting with 65nm/55nm G200 GPUs.

I don't disagree that execution played a part, but the first iterations of mobile Fermi didn't shine in performance/watt and efficiency -- and the major reason for the performance/watt focus we see today is mobile revenue moving forward. The HD 5xxx mobile parts were first, but they were also very efficient.

If anything, performance/watt and efficiency are paramount here, and ya can't continue to compete with performance/watt as lackluster as the first iterations of Fermi.

Everything out of nVidia's mouth is performance/watt and understandably so.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't disagree that execution played a part, but the first iterations of mobile Fermi didn't shine in performance/watt and efficiency -- and the major reason for the performance/watt focus we see today is mobile revenue moving forward. The HD 5xxx mobile parts were first, but they were also very efficient.

If anything, performance/watt and efficiency are paramount here, and ya can't continue to compete with performance/watt as lackluster as the first iterations of Fermi.

Everything out of nVidia's mouth is performance/watt and understandably so.

Well, perf/watt is not always low power, you have to remember that. You can have high perf/watt with high power consumption. Also, I will agree that high perf/watt is needed in mobile, but desktop and especially high-end gaming was always, and still is, more about performance than perf/watt. I don't think people with 2, 3 or 4 GPUs in CF/SLI care about energy consumption. ;)
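The distinction is easy to see with a toy calculation; the fps and wattage figures below are made up purely for illustration, not real benchmark numbers:

```python
# Hypothetical cards (made-up numbers): Card B draws more total power
# than Card A, yet has the better performance-per-watt ratio.
cards = {
    "Card A": {"fps": 60.0, "watts": 150.0},
    "Card B": {"fps": 100.0, "watts": 220.0},
}

for name, c in cards.items():
    ratio = c["fps"] / c["watts"]  # fps per watt
    print(f"{name}: {c['fps']:.0f} fps at {c['watts']:.0f} W -> {ratio:.3f} fps/W")

# Card B leads on perf/watt (0.455 vs 0.400 fps/W) while pulling 70 W more.
```

So a high perf/watt part isn't automatically a low-power part; the ratio says nothing about the absolute ceiling.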
 

Kippa

Senior member
Dec 12, 2011
392
1
81
Could we realistically have a single-GPU card that can handle 4K gaming well by the end of 2015? I am all for new tech and better processing power, though for me I'll only make the jump to 4K when we have reasonably priced, high-quality 60Hz monitors and a single graphics card that can handle 4K gaming.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Could we realistically have a single-GPU card that can handle 4K gaming well by the end of 2015? I am all for new tech and better processing power, though for me I'll only make the jump to 4K when we have reasonably priced, high-quality 60Hz monitors and a single graphics card that can handle 4K gaming.

Looks like it; we are almost there already with the 980. With new AMD cards, Big Maxwell, and DX12 coming, I would hope so.