[JPR] YoY Graphics Card Sales Continue to Decline, Marketshare by Supplier


MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
... Just sell off the GPU division already. I give up. Who's ready for $1500 midrange GPUs and a monthly subscription to get drivers from Nvidia? I'd say that I'm going to switch to consoles, but I have no idea what's going to happen to them when AMD dies in 2017... I guess this is the end of gaming rope for me.

Maybe they'll actually shrink Tonga (470/X) and Fiji (490/X), and throw in a new GPU around Hawaii performance (480/X) and a new card (Fury 2/2X)? That would work, but I'm sure it's wishful thinking...

Within 2 generations Intel's Iris will be powerful enough to compete with NV's "budget" tiers, so it'll be potent enough to give them at least some competition.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Within 2 generations Intel's Iris will be powerful enough to compete with NV's "budget" tiers, so it'll be potent enough to give them at least some competition.

Who's gonna buy a $300 CPU for gaming to use integrated graphics? If Intel moved it down to i3s then sure...
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Again, Raja said that there will be only 2 brand-new GPUs next year. It means the whole AMD GPU line will be the same as this year apart from 2 new GPUs, whereas Nvidia will launch Pascal from top to bottom.

http://wccftech.com/amd-raja-koduri...ng-discrete-gpu-market-share-gains-2016-2017/

Two GPUs would cover a decent bit of the market, since that will probably be 4 or 5 cards. The rest would need to be fleshed out with rebrands. It would be nice if nVidia could launch Pascal top to bottom in 2016, but I wouldn't hold my breath on that. They still haven't pushed GM20X down under $150 and they don't have anything Maxwell-based at the $100 price point, over a year and a half after they launched Maxwell 1. I'll be pleasantly surprised if nVidia can get a full range of Pascal cards out the door in 2016.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Coupled with the incoming holidays, AIBs are likely to keep prices down and volume high.

Ya, but if you look at the BIG picture, it's the complete opposite of what you are saying - GPU prices are going up and unit volumes are going down from 5 years ago, and it's getting worse. I am not surprised though, because so many console-to-PC ports are unoptimized turds, and the well-optimized PC games can be played fine on older GPUs. With lower unit sales, NV and AMD jacked up the prices on GPUs and split the generation into parts - aka milking us in the end.

What NV doesn't want anyone to know is how dGPU unit sales are bombing, kinda like MS doesn't want to discuss XB1 unit sales anymore. That's actually bad, very bad for us consumers, since it makes it even more likely that NV will cement the generation-splitting strategy it first started with Kepler, and since AMD cannot compete financially, they will do the same. In the end, these firms are just laughing at us while feeding us $500-550 mid-range chips and holding back the real $650 flagships. What's happening in the current dGPU market segment is just horrible. We went from a $250 GTX560Ti to a $500 GTX680/$550 GTX980, and now AMD cannot even compete on price/performance since they cannot afford to. What have we ended up with as a result? The most overpriced GPUs and the slowest trickle-down of technology EVER. GTX960 is easily the worst x60 series card ever made. On the AMD side, we have mostly refreshes since they had no money to design new 28nm products top-to-bottom.

When you say "keeping prices down and volume high", that's not at all what's happening. The $450 GTX980 was a $250 GTX560Ti, while the $650 980Ti is a cut-down flagship, aka a $350 GTX570. That means NV is keeping prices HIGH while volumes keep falling. AMD is so financially strapped that they aren't even competing on market share or price/performance with Fiji products. In the end, PC gamers got the short end of the stick. If it wasn't for dGPU demand out of China and some emerging markets in Brazil and Russia, dGPU sales would have tanked even harder. MOBAs, World of Tanks, Starcraft 2 expansions and some other key games are keeping the dGPU segment in some demand in China/Brazil/Russia. What happens moving forward? I hope things improve for our wallets' sake.

Sounds like we, the consumers, are stuck between a rock and a hard place. Instead of just looking at NV vs. AMD, I am looking at the big picture and it's looking awful.

Also, 2015 had some of the most anticipated games across the entire PC gaming community:

- GTA V
- MGS: V
- Batman AK
- Fallout 4
- The Witcher 3
- Starcraft 2: Legacy of the Void
- Star Wars: Battlefront

There were some other big games, such as Homeworld Remastered, Kerbal Space Program, Pillars of Eternity, Dying Light, Ori and the Blind Forest, Heroes of the Storm, Project CARS, Total War: Attila, Assassin's Creed Syndicate, Evolve, Arma 3: Marksmen DLC, Grey Goo RTS, and on top of that some early access titles like ARK Survival, Killing Floor 2, and of course Star Citizen. YET, despite ALL of these games, this is thus far THE WORST year for discrete GPU unit sales when looking at Q1+Q2+Q3 2015 vs. the last 5 years of GPU sales.

This is alarming because we went from 40 million to > 125 million Steam members in the last 3-4 years, and yet discrete GPU unit sales are absolutely bombing - probably the worst period for dGPU graphics in a decade if we extended that graph back to 2005. :eek:

Again, Raja said that there will be only 2 brand-new GPUs next year. It means the whole AMD GPU line will be the same as this year apart from 2 new GPUs, whereas Nvidia will launch Pascal from top to bottom.

That could mean a lot of things. Let's go back to 2012. "Only" 2 GPUs could generate:

HD7990
HD7970Ghz
HD7970
HD7950 V2
HD7950
&
HD7870XT
HD7870
HD7850

That's just the desktop. On top of that they have Fury, Nano and Fury X that they can shrink. They have a lot of options. What matters is their execution and pricing. NV's current generation below GTX970 has been extremely weak, but their marketing and design win execution is what allowed them to win in that market segment.

If AMD's "only 2" new GPUs are extremely competitive as HD7950 and HD7970 were, and if AMD can shrink some of their existing products (i.e., the Nano shrunk could be a great laptop GPU), then AMD may have a good line-up of cards.

If AMD cannot deliver a competitive GPU architecture and gain share in 2016, there is no hope for them in the discrete GPU business. AMD needs to do the same in 2017 in CPUs/APUs. Otherwise they will not survive. This hammering which AMD is getting from Intel/Nvidia cannot go on for long. rip AMD.

Do you buy graphics cards based on their market share or what the card is? If AMD had 50% market share, would that make Fury X or 390 a better videocard? If NV had 18.8% market share, would it make 970/980Ti bad products?

As long as AMD aims to achieve profitability, it's better to have 5% market share with positive cash flow than 95% market share with negative cash flow. I am not suggesting that AMD is about to start making millions of dollars off their graphics cards, but your point that AMD might die in 2017 is premature. You are not taking into account any positive net cash flow that will start coming in from Nintendo's NX, some other unannounced design wins, and Zen. It's also not easy to turn the momentum of OEMs when most of the laptop design wins have been spoken for through all of 2015 and AMD has no competitive products in laptops. That means the only major segment where AMD can somewhat regain market share is desktop dGPUs. That's less than 50% of the entire dGPU market, and it's only been 1 quarter.

... Just sell off the GPU division already. I give up. Who's ready for $1500 midrange GPUs and a monthly subscription to get drivers from Nvidia? I'd say that I'm going to switch to consoles, but I have no idea what's going to happen to them when AMD dies in 2017... I guess this is the end of gaming rope for me.

Why are you so pessimistic? The only card from NV that's head and shoulders better than anything AMD has is the 980Ti. The rest of NV's line-up is completely unremarkable against AMD's current offerings. The reason AMD is bombing so badly has mostly to do with NV's execution, marketing and OEM relationships, because AMD basically forfeited the notebook dGPU market for the last 4 years. However, as far as actual technical performance and price/performance are concerned, AMD's current line-up across the $100-$500 segments is extremely strong.
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/23.html

Will Pascal destroy Arctic Islands? Possibly, but what we can say with 100% certainty is that even if AMD produces the world's best graphics card in 20 years, it won't matter if their execution is crap. AMD needs to focus on launching sooner, more game bundles, more AMD GE games. And honestly, even if we went back to the epic era of ATI vs. NV when ATI was wiping the floor with NV, if NV had used the GW's tactics back then, ATI wouldn't have stood a chance. If you take the overall picture into account, it's pretty hard to envision how AMD can possibly get back to 50/50 market share since they don't have the $ to Pay to Win with game developers. :cool:

Within 2 generations Intel's Iris will be powerful enough to compete with NV's "budget" tiers, so it'll be potent enough to give them at least some competition.

Intel is going to need Kaby Lake just to catch up to a 750Ti. After that it's going to take them probably 2 more generations just to catch up to an R9 280X/960. Now we are looking at 2018, when NV will have Volta and a $150 GPU will be as powerful as a GTX980. Intel itself will be no competition to AMD/NV. It's actually far worse than that. If PC games stagnate and Intel's GPUs keep getting better, fewer and fewer PC gamers will even need dGPU products. It's not even a matter of AMD/NV competing, as they simply won't be able to. Luckily for AMD/NV, there is VR and 4K gaming, and possibly next-gen PS5/XB2 consoles in 2019. If any one of these things takes off, AMD/NV will be able to sell $200-700 graphics cards due to increased demand.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Who's gonna buy a $300 CPU for gaming to use integrated graphics? If Intel moved it down to i3s then sure...
laptops bro, laptops. you know the version of pc that sells the most? if intel can really pull it off in 2 more gens, it will turn every new laptop into a gaming machine.

at that point, gpu makers/vendors can all die and I wouldn't bat an eye.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
laptops bro, laptops. you know the version of pc that sells the most? if intel can really pull it off in 2 more gens, it will turn every new laptop into a gaming machine.

at that point, gpu makers/vendors can all die and I wouldn't bat an eye.

And who's going to buy an expensive laptop with gaming at lowest settings as a priority? Hardly any laptops have Iris Pro, and for the ones that do it's nothing more than a bonus. I can't see too many people with a $1500+ budget choosing a laptop with Iris Pro when they were originally considering a gaming laptop at that price point. It's just different audiences. I'm not saying that the decline won't continue, but Iris Pro will not play a huge part in it, especially if pricing doesn't change.
 
Mar 10, 2006
11,715
2,012
126
laptops bro, laptops. you know the version of pc that sells the most? if intel can really pull it off in 2 more gens, it will turn every new laptop into a gaming machine.

at that point, gpu makers/vendors can all die and I wouldn't bat an eye.

Intel has a lot of work to do before it can really replace dGPUs in gaming laptops, both in terms of raw hardware power as well as driver work/developer relations. NVIDIA is miles ahead of Intel here on both counts.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I find it so sad though. All those people blindly buying inferior hardware. All those 960s and 970s sold. jeez. Real PC gaming tragedy
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
The biggest mistake AMD made was not answering NVDA by taking back the crown for the fastest and most powerful graphics card that does not require a 6-pin connector. This is a HUGE market that they blatantly ignored. They could take it right now with a 2GB HBM half-a-Fury die run at Nano clocks and voltages. Charge $30 more if they have to, whatever it takes, but they have to be able to clearly hold the performance crown for cards that do not require a 6-pin connector.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
The only tragedy is comments like yours.

In the upcoming AAA games, where do you expect the 960 to end up compared to the 285/380?

Unfortunately there are a ton of people being fooled into buying weaker cards. I can see both the 960 and 970 falling off hard.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
2 GPUs isn't that big of a deal, guys. The 7000 and 600 series both launched with 2 GPUs, with larger dies coming later. This is how it goes now.

I'd love it if it was all 3 - small -> medium -> large GPUs at once - instead of getting charged high-end prices for the mid-range chip, but that ship has sailed. I don't know why this wouldn't be expected by now.

2 GPUs with probably 3 die-cut variations of each GPU = 6 cards. The very low end and the very high end usually have their own schedules. Dual-GPU Fury will stopgap the very top, and I imagine we will see a Pitcairn equivalent on 14nm that will get cut down to the lower mid-range. That is my prediction.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Do you buy graphics cards based on their market share or what the card is?

No, I don't think anyone does. With that said, when alternatives are so close, that goes into someone's equation for sure.

For example, if I am comparing the 970 vs the 390 right now for 1080p gaming, there isn't a "clear winner." In both cases you are kinda making a bet on the future and on what will matter more: DirectX 12-enabled ACEs and 4GB more VRAM, or the fact that the GTX 970 is the most popular dedicated GPU on Steam right now, so developers will target it over the 390? I know we all want to believe that DirectX 12 will level the playing field, but someone who argues that the popularity of the 970 means we will see more Gameworks games in the future than asynchronous compute games probably isn't crazy (I still think the 4GB of VRAM matters, though).

Or look at the GTX 960 vs the R9 380. When they have the same amount of RAM these cards trade blows, so you are looking at details (DirectX 12 performance vs HDMI 2.0) to pick the best card. The fact that the 960 is a lot more popular and will probably be mentioned by name as the minimum requirement for 2017 games will matter to some people.

AMD is very competitive right now, but unless you are playing at high resolutions (and most aren't), the clearest win they have had in a year was that $250 R9 290 that too many people ignored.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Or look at the GTX 960 vs the R9 380. When they have the same amount of RAM these cards trade blows, so you are looking at details (Directx 12 performance vs HDMI 2.0) to pick what is the best card. The fact that the 960 is a lot more popular and will probably be mentioned in name as the minimum requirement for 2017 games will matter to some people.

But they do not. Any reputable site shows the 380 outperforming the 960. Look at how much the 950/960 outsell the 380 and how far behind they are:

[chart: TechPowerUp relative performance, 1920x1080]


Do you remember any ATI vs. NV generation where A LOT of people would recommend and buy the much slower videocard just to save power? Can you name any generation of cards prior to NV's perf/watt marketing in 2012 when anyone would sacrifice 25-60% of performance to save power? Can you logically explain how a budget gamer would predominantly choose a GTX750Ti for $140 over an R9 270/270X for $140-150 when the latter is 35-44% faster?
http://www.computerbase.de/2015-08/nvidia-geforce-gtx-950-test/3/#abschnitt_tests_in_1920__1080

How can thousands of people ignore the 960's 2GB VRAM issue for the 9 months prior to NV launching the 960 4GB? How was it that thousands of gamers chose a 2GB GTX960 over a $200 R9 280X or $250 R9 290? You are telling me that's not marketing, perception and brand bias? If GTX960 was an AMD card and 280X/290 were NV cards, the 960 would have been ridiculed from day 1 for being utter garbage and you know it. Same for the 950.

This generation more than any other proved that popularity perception (what GPUs YouTubers/Twitch gamers use and recommend), GW's marketing and NV's brand name are what win, not actual objective overall performance or price/performance data. People went crazy over a $330 GTX970 when, for the 3-4 months prior to its September 2014 launch, it was easily possible to purchase an after-market Sapphire Tri-X 290 for $350. The situation got so bad that people were buying the 980 over a $650 295X2 or a $280 R9 290X. Do you realize what a 295X2 does to a 980 today? It destroys it, while a 980 is just 20-25% faster than a 290X but cost almost double for 6+ months.

The 950/960 might be trading blows with a 380 in games that heavily favour NV, but not overall if you take many PC games into account. The R9 280X is a whopping 50% faster than a 950 at 1080P. Who is buying an R9 280X over a GTX950/960? Almost no one. You are telling me that's not marketing/perception or OEM-volume driven?

It's the same thing as the point sm625 made above. NV offered 750/750Ti cards without a 6-pin connector, so no matter what AMD did, it wasn't even an option in that market segment. In other words, even if AMD positioned the R9 290 at $150, it wouldn't have outsold the GTX750Ti. Why? Because it doesn't meet the market's (i.e., large OEM customers') overall requirements.

This forum also has a tendency to highlight games in which NV performs very well, like FO4, and to downplay situations where AMD performs well. This is also the case for the market overall. For instance, did you know that the R9 390 is 14% faster at 1080P and 25% faster at 1440P than a GTX970 in BO3's multiplayer?
http://www.computerbase.de/2015-11/...-test/3/#diagramm-cod-black-ops-iii-2560-1440

This constant myth that AMD's drivers suck and that NV's cards are dominating in GW's titles usually rests on the 980Ti beating everything, and it misses the standing of the rest of the line-up (halo effect). Secondly, it misses the other side of the equation - all the games where AMD is winning - because I guess those games don't matter since they are less popular.

Based on the current market share, you'd think it's far worse than the HD2900 vs. GeForce 8 era. Can you honestly say AMD's R9 200/300 lineup from $100-500 is as bad as the HD2900 era? No, not even remotely close, but it's as if AMD were behind by 20-30% in performance in every price category. The reality is not even close to that, yet AMD has barely 18.8% market share.

Go back to the HD7970's launch and just how badly it wiped the floor with the GTX580. This is not even remotely comparable to how the 980 outperformed the 290X. Yet people hated on the 7970, ignored its overclocking (even though on day 1 a 7970 OC was 40-80% faster than a GTX580 OC -- and we heard how AMD's drivers sucked during the 7970's launch, ha!), etc. Now ask yourself, which card for the first 12 months of its launch was by far the more popular and liked: the 980 or the 7970?

The 980 delivered less of everything against the 290X than what the 7970 brought over the 580. The 980 was one of the least impressive next-gen $550 cards in years, and yet it sold like hot cakes. Why? Remember, this very forum and the online community as a whole ridiculed the 7970 925MHz for being just 25% faster than a stock 580 while delivering double the VRAM for just $100 more. They also largely ignored how a 7970 OC was 40-80% faster than a GTX580 OC, its forward-looking GCN architecture, bitcoin mining, etc. The launches of the 7970 vs. the 980 tell you a lot about the average PC gamer.

Another example: the R9 285 was ripped for having 2GB of VRAM. The GTX960 2GB? Nah, not a problem. How many professional reviews mentioned that it took NV more than 1.5 years to launch a 960 that was barely 14% faster than a GTX760? No one besides Computerbase. How many professional review sites warned PC gamers on day 1 of launch that they should not buy a $199 card with 2GB of VRAM? None; instead they gave the 960 Silver and Gold awards. It makes you really question the independence and integrity of some professional review sites.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
But they do not. Any reputable site shows 380 outperforming the 960. [...]

+1

Also to point out, Media/Press had a lot to do with that perception over the last 12-18 months. They constantly favored NV cards for reasons not mentioned 1-2 years ago (like perf/watt). And don't get me wrong, Maxwell has the better perf/watt, but the perf/price was AMD's all along and nobody in the media made any positive comment. It was like perf/price was completely irrelevant for everyone, when a few years ago NV was releasing good $200 perf/price cards. :rolleyes:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
+1

Also to point out, Media/Press had a lot to do with that perception the last 12-18 months. [...]

Not just favouring perf/watt. The media flat out started ignoring price/performance, which was historically the most important metric for all mainstream GPUs besides the flagship ones. All of a sudden when NV started losing in price/performance, the focus shifted to perf/watt.

Look at this TR review of the GTX960 2GB around launch - $209-219 960s compared:
http://techreport.com/review/27702/nvidia-geforce-gtx-960-graphics-card-reviewed/2

In this very review, the conclusion chart has the 290 matching the 970 and dominating the 960:
[chart: TechReport value (price vs. FPS) scatter from the conclusion]


You'd think the conclusion would discuss how the 2GB of VRAM matters, and how a card for $60 more offering almost 50% more performance would be worth considering. NO. Instead, the author flat-out ignores the 2GB VRAM limit, ridicules AMD's AIBs by implying they are desperate to offload R9 290 inventory with MIRs, and flat-out declares:

"Nvidia has made big strides in efficiency with the introduction of Maxwell-based GPUs, and the GeForce GTX 960 continues that march. Clearly, Nvidia has captured the technology lead in GPUs. Only steep price cuts from AMD have kept the Radeons competitive—and only then if you don't care about your PC's power consumption.

... What's not to like?"


Really now? How can an objective reviewer ignore that one card is 50% faster, has 2X the VRAM and costs $270, while the other card is DOA with 2GB of VRAM out of the box and costs $210 ($270 / $210 = 29% more expensive for 50% more performance)?

Shockingly, things got even worse for the 960 since that review. Today a reference R9 290 is 67% faster at 1080P and 73% at 1440P.
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/23.html
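To sanity-check the arithmetic above, here's a quick sketch using the thread's rough numbers - a $270 R9 290 vs. a $210 GTX 960 2GB, with the 290 taken as ~50% faster at launch. All figures are the post's approximations, not fresh benchmark data:

```python
# Rough numbers from the thread: R9 290 at $270 vs GTX 960 2GB at $210,
# with the 290 taken as ~50% faster (per the TechReport value chart cited above).
price_290, price_960 = 270.0, 210.0
perf_290, perf_960 = 1.50, 1.00  # normalized relative performance

price_premium = price_290 / price_960 - 1   # ~0.29 -> the "29% more expensive"
perf_gain = perf_290 / perf_960 - 1         # 0.50  -> the "50% more performance"

print(f"{price_premium:.0%} more expensive for {perf_gain:.0%} more performance")
# -> 29% more expensive for 50% more performance

# Performance per dollar (scaled per $1000 for readability):
print(f"perf/$1000: 290 = {perf_290/price_290*1000:.2f}, 960 = {perf_960/price_960*1000:.2f}")
```

By this crude measure the 290's perf-per-dollar comes out roughly 17% ahead, and that gap only widened as the 960's 2GB buffer aged.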

Since that reviewer will never admit to his awful review, horrible recommendations and inability to look ahead to more demanding next-gen games over 2-3 years, he doesn't care that gamers got burned buying an inferior product. Ask yourself this: if you ran an independent review site and you recommended to thousands of readers to buy a $210 960 2GB over a $270 290, how would you feel right now? You should feel remorse, realizing that you made the wrong recommendation as a professional reviewer. Where is the article acknowledging this, and the full-out apology?

But the problem is the PC gaming community never stood up to incompetent professional reviewers who are biased or don't understand the fundamentals. When the media flat-out ignores issues with Apple products, we call them out on it. When certain media start blatantly favouring one brand over the other for whatever reason and ignore the fundamentals of how GPUs have been recommended for decades, we have to pay attention and ask ourselves if we should listen to them. Does the media have our best interests in mind, or just the profits/ad revenue of its website? Does the media favour a quick recommendation with no forward-looking view of the card's longevity, or is it more likely to recommend an inferior card that makes the company that feeds it the most samples the happiest? Let's face it, when > 80% of the market is owned by one firm, that firm has more control over the media, because the media cannot afford to tarnish its reputation with that firm or they'll have to buy ALL of their future review samples.

Do you guys realize this isn't just about us, but it's about them, the reviewers? They are now at the mercy of NV unless they buy their own samples. How likely is someone going to be to criticize any product from a market leader?

Unfortunately too many console-to-PC converts who are just entering the PC space read this type of garbage, like the 960 review I linked, and believe that it's actually good advice. :thumbsdown:

I think all the major GPU reviewers should be buying their own GPUs. If anyone is receiving free review samples, marketing perks or media perks, that site automatically requires additional due diligence. Since they don't release this information to us, we have to be far more critical of reviews when we see something that flat-out makes no sense.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Go way back to the introduction of HD7970's launch and just how badly it wiped the floor with a GTX580. [...]

To be fair, the general gist of the disappointment with the GCN introduction wasn't so much that they were bad cards, but that they were too expensive and weren't good enough given the brand-new architecture and new 28nm node. The stock 7970 was 30% faster than a 6970 but moved up 60% in price, from $350 to $550. That was exacerbated when they launched the 7770 a month later, priced above a 6870 but considerably slower than it.

In hindsight the 7970 was a great card, but it's understandable that people were not thrilled with it at the time. I remember waiting up until the NDA was lifted and AT posted their 7970 review, and there was definitely a feeling of "That's it? A brand-new arch and a new node, and we get 30% better performance for 60% more money?"
It's an interesting thing to keep in mind as we wait for Pascal/Arctic Islands and 16nm.
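That disappointment is easy to put in numbers. A quick sketch using the figures above (the post's approximations: $350 -> $550 and ~30% more performance, so these are illustrative, not exact benchmarks) shows that performance per dollar actually went backwards at the flagship tier:

```python
# The thread's rough launch numbers: HD 6970 at $350, HD 7970 at $550,
# with the 7970 ~30% faster. Approximations from the post, not exact data.
price_6970, price_7970 = 350.0, 550.0
perf_6970, perf_7970 = 1.00, 1.30

price_increase = price_7970 / price_6970 - 1   # ~0.57 (the post rounds to 60%)
perf_increase = perf_7970 / perf_6970 - 1      # 0.30

# Performance per dollar regressed with the new flagship:
ppd_change = (perf_7970 / price_7970) / (perf_6970 / price_6970) - 1

print(f"+{price_increase:.0%} price for +{perf_increase:.0%} perf; "
      f"perf/$ changed by {ppd_change:+.0%}")
# -> +57% price for +30% perf; perf/$ changed by -17%
```

By this rough measure the 7970 launched with worse perf-per-dollar than the card it replaced, which is exactly why the "30% more for 60% more money" framing stung at the time.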
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Who's gonna buy a $300 CPU for gaming to use integrated graphics? If Intel moved it down to i3s then sure...

Very, very good point. If the kid's parents have enough money to buy the pre-built that is twice as expensive (and the ones that come with an i7 are), those don't come without a dGPU anyway.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
To be fair, the general gist of the disappointment with the GCN introduction wasn't so much that they were bad cards, but that they were too expensive and not good enough given the brand new architecture and new 28nm node. The stock 7970 was 30% faster than a 6970 but moved up 60% in price, from $350 to $550. That was exacerbated when they launched the 7770 a month later, priced above a 6870 but considerably slower than it.

In hindsight the 7970 was a great card, but it's understandable that people were not thrilled with it at the time. I remember waiting up until the NDA was lifted and AT posted their 7970 review on launch day, and there was definitely a feeling of "That's it? A brand new arch and a new node, and we get 30% better performance for 60% more money?"
It's an interesting thing to keep in mind as we wait for Pascal/Arctic Islands and 16nm.

Well, this time the 14/16nm GPUs are being designed for DX12 and compute, and they will have DX12 games to be tested on launch day. So I'm expecting that in DX12 titles the new GPUs will show a substantial performance increase, way higher than what the first 28nm cards had back in 2012.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Within 2 generations Intel's Iris will be powerful enough to compete with NV's "budget" tiers, so it'll be potent enough to give them at least some competition.
I have two reasons to doubt that:
Firstly, Ivy Bridge:
[Image: Ivy-Bridge_Die_Label.jpg]

Skylake (a mock-up, but good enough to show my point):
[Image: SKYLAKE_H_Map_Speculation.png]

That's quite a costly way of merely competing with NV's budget tiers.


Secondly, by then those budget tiers will probably have seen two new generations of performance increases as well. It's a moving goalpost.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I think this forum is actually far more balanced towards AMD GPUs than the public at large. Outside this forum most people really only know GeForce. AMD's branding is and has always been kind of idiotic. GeForce sounds cool and powerful. Radeon sounds like something that will literally hurt you, like give you cancer or something. Who knows how many sales have been lost due to that silliness alone. But AMD keeps doing it. A10 and FX are unsearchable terms. i7, i5, i3, Pentium, and even Celeron are searchable terms. Sure, you might get some BMW products mixed in with your results, but by and large when you search for these terms you tend to get Intel products. When you search for AMD A10, the first thing that happens is the search engine autocorrects "AMD" to "AND". lol. And "A10" yields a whole bunch of ARM products, usually cheap Chinese tablets.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I think this forum is actually far more balanced towards AMD GPUs than the public at large. Outside this forum most people really only know GeForce. AMD's branding is and has always been kind of idiotic. GeForce sounds cool and powerful. Radeon sounds like something that will literally hurt you, like give you cancer or something. Who knows how many sales have been lost due to that silliness alone. But AMD keeps doing it. A10 and FX are unsearchable terms. i7, i5, i3, Pentium, and even Celeron are searchable terms. Sure, you might get some BMW products mixed in with your results, but by and large when you search for these terms you tend to get Intel products. When you search for AMD A10, the first thing that happens is the search engine autocorrects "AMD" to "AND". lol. And "A10" yields a whole bunch of ARM products, usually cheap Chinese tablets.

10/10 :D
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I think this forum is actually far more balanced towards AMD GPUs than the public at large. Outside this forum most people really only know GeForce. AMD's branding is and has always been kind of idiotic. GeForce sounds cool and powerful. Radeon sounds like something that will literally hurt you, like give you cancer or something. Who knows how many sales have been lost due to that silliness alone. But AMD keeps doing it. A10 and FX are unsearchable terms. i7, i5, i3, Pentium, and even Celeron are searchable terms. Sure, you might get some BMW products mixed in with your results, but by and large when you search for these terms you tend to get Intel products. When you search for AMD A10, the first thing that happens is the search engine autocorrects "AMD" to "AND". lol. And "A10" yields a whole bunch of ARM products, usually cheap Chinese tablets.

I don't know what engine you are using, but I just searched "AMD A10" on Google and it found the real AMD A10 APUs on the first page.
When I searched for "i3" I got mixed results with Intel and BMW. It also found this: https://i3wm.org/
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't know what engine you are using, but I just searched "AMD A10" on Google and it found the real AMD A10 APUs on the first page.
When I searched for "i3" I got mixed results with Intel and BMW. It also found this: https://i3wm.org/

Used Google. Searched "i3"; the first hit was Intel.

Searched "A10"; the first hit was free games online.

Hell, the website is www.a10.com, ha-ha.