Why are graphics cards not getting cheaper?!


StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
More performance at the same price they were charging last year and you think that's ripping customers off?

ROFL!

Those Sandy Bridge users who keep lamenting the lack of CPU progress are just dumb. Just WTF is so bad about *not* giving any more money to Intel if market conditions still leave you satisfied with a 4-year-old CPU?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Those Sandy Bridge users who keep lamenting the lack of CPU progress are just dumb. Just WTF is so bad about *not* giving any more money to Intel if market conditions still leave you satisfied with a 4-year-old CPU?

I'm pretty pissed my 4770K isn't going to be outdated soon. Intel is a pretty bad company, tbh. I wanted this 4770K to last 2 years. Instead, it looks like it'll last 5. I'll probably just throw it in a dumpster next year, screw Intel. /sarcasm

The reason I bought the 4770K WAS Sandy Bridge.

Intel said, "We're going to give you performance SO fast, you won't even want to upgrade. It's not even expensive either; the mobo/CPU costs less than a flagship GPU."

Was I supposed to say no to that deal?

If Nvidia/AMD had offered me the same deal, I would have been ecstatic.

Intel checked EVERY box with its processors.
FAST (like, so fast you don't even need to upgrade).
EFFICIENT
FLEXIBLE (applies to newer processors, but man, having something like Quick Sync available? That's a game changer for my little bro, who wouldn't have had a way to do fast encodes otherwise and would have had his main PC tied up for hours)
etc.

You have to TRY to think of ways to be mad about your CPU purchase with Intel. You really do.

Intel has done such a bang-up job with CPU performance and CPU longevity, ESPECIALLY for gamers, that I'll probably buy the HEDT platform before I need to upgrade, just because I'll want something new to play with.

If I want to be mad on the gaming front, there are a lot of things to get mad about, but if you're an i5/i7 user, you're not mad about your CPU purchase, that's for sure.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
They are not buying; that is why we have RECORD LOW NUMBERS, record low numbers!!!!!

Only morons with no brains buy them, or people who own 4+ year old hardware, so they have no choice! Everyone who owns an Nvidia 600 series or AMD 7000 series doesn't need to upgrade, so why should they? They have overpriced cards, with only a few that actually make sense, like a 290 on sale or a 280x on sale, etc...

If you can get a 290 for $240-250, it's worth it; if you can get a 280x at $170, it's worth it, but again, you have to look at sales and stuff.

Otherwise you have to be a moron or a blind Nvidia worshiper to be buying their overpriced turds! Even religious zealots are not so zealous about eating up what their god (Nvidia) serves them! To some people Nvidia is their god; they probably have shrines to which they pray before they go to bed!
Or, like me, you need some Nvidia-exclusive feature (CUDA... cough), and in the mid-range you're stuck with a 960 or forking over another Benjamin for the 970, with absolute jack squat in between.

Not saying the 960 is bad; if you find one cheap, it's a pretty rocking chip. But good lord, would it kill Nvidia to plop something into that giant performance gap?
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I'm pretty pissed my 4770K isn't going to be outdated soon. Intel is a pretty bad company, tbh. I wanted this 4770K to last 2 years. Instead, it looks like it'll last 5. I'll probably just throw it in a dumpster next year, screw Intel. /sarcasm

The reason I bought the 4770K WAS Sandy Bridge.

Intel said, "We're going to give you performance SO fast, you won't even want to upgrade. It's not even expensive either; the mobo/CPU costs less than a flagship GPU."

Was I supposed to say no to that deal?

If Nvidia/AMD had offered me the same deal, I would have been ecstatic.

Intel checked EVERY box with its processors.
FAST (like, so fast you don't even need to upgrade).
EFFICIENT
FLEXIBLE (applies to newer processors, but man, having something like Quick Sync available? That's a game changer for my little bro, who wouldn't have had a way to do fast encodes otherwise and would have had his main PC tied up for hours)
etc.

You have to TRY to think of ways to be mad about your CPU purchase with Intel. You really do.

Intel has done such a bang-up job with CPU performance and CPU longevity, ESPECIALLY for gamers, that I'll probably buy the HEDT platform before I need to upgrade, just because I'll want something new to play with.

If I want to be mad on the gaming front, there are a lot of things to get mad about, but if you're an i5/i7 user, you're not mad about your CPU purchase, that's for sure.

Compared to 2-year-old Haswell, desktop Skylake is just plain embarrassingly pointless if you don't give a flying crap about USB 3.1. SB users are still having a good laugh at how much they got out of a 4-year-old platform.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
They are not getting cheaper because both AMD and NV are greedy mot*******.
But people have already spoken, because they're literally not selling many dGPUs.
 

Carnage1986

Member
Apr 8, 2014
92
0
0
Because Moore's Law is not a "law" anymore. (As far as I know, Moore himself was uneasy about his observation being referred to as a law.)

They have been manufacturing on 28 nm for 3.5-4 years. Yields on 20 nm have not been very good. Brand-new chips are more expensive today than in the past:

[attached image: EEOL_2014APR22_PL_NT_01_01.jpg]
 

tajoh111

Senior member
Mar 28, 2005
348
389
136
I don't think it's fair to blame everything on TSMC/GloFo, etc. Even before we knew that the 20/22nm GPU node was dead, NV launched the GTX680 2GB for $499 and the 680 4GB for $579. At that point, how was it TSMC's fault that NV doubled the price of a mid-range chip from $249 with the GTX560Ti to $499 with the GTX680? Gamers bought those cards, so NV said, great, we'll raise the mid-range 680's successor to $550 and see if it sells. Sold like hot cakes!

Looking at NV's gross profit, it's not TSMC that's screwing us/causing high prices. NV is simply raising prices and pocketing the profits. :biggrin:

6-months ended July 2008 (GTX280 series)
32%

Q4 2010->Q4 2011 (GTX480/580 series)
44-48%

Q2 2013 (GTX670/680 series) (quarter ended July 2012)
52%

Q2 2015->Q2 2016 (GTX970-980Ti series)
55-56%

If wafer prices/the prices of newer nodes have skyrocketed so much, how come NV's gross margins increased from 32% to 44-48% and are now at a record-high 55-56%?
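For what it's worth, the margin arithmetic in that quote is easy to sanity-check. A minimal sketch with hypothetical unit costs (the 40% cost bump and the $249/$499 prices are illustrative assumptions, not NV's actual books):

```python
# Hypothetical numbers for illustration only -- not NVIDIA's actual costs.
def gross_margin(revenue, cogs):
    """Gross margin = (revenue - cost of goods sold) / revenue."""
    return (revenue - cogs) / revenue

# Era 1: a $249 card at a 32% gross margin implies ~$169 of unit cost.
old_cogs = 249 * (1 - 0.32)

# Era 2: suppose the newer node pushes unit cost up 40%, while the
# card's price doubles ($249 -> $499).
new_cogs = old_cogs * 1.4

print(f"old margin: {gross_margin(249, old_cogs):.0%}")  # -> 32%
print(f"new margin: {gross_margin(499, new_cogs):.0%}")  # -> ~52%
```

Even a 40% jump in unit cost can't offset a doubled price: the margin climbs from 32% to roughly 52%, which is the quote's point that wafer cost alone can't explain record-high margins.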

The Titan and Titan X are what the GTX280/580 cards used to be -- $499 products. What the 980Ti is, a cut-down flagship, is what the GTX470/570 used to be. What used to be a GTX560/560Ti is now the GTX970/980. AMD is saddled with a massive $2B in debt and trying to compete with Intel and NV on both fronts, which is essentially impossible, so NV is just loving every minute of it. ():) The end result is a 14% increase in performance from a GTX760 -> 960 in nearly 2 years. :thumbsdown:



I would add that AMD/NV bifurcating a generation is likely the new normal. AMD has already publicly stated that they will try to release newer GPUs quicker than the time-span between the R9 290X and Fury X. This means it's probably better for them to squeeze out 20-25% more performance every 12-15 months rather than wait 18-24 months to get 40-45%. NV can just feed off AMD's vulnerable financial position and lack of resources and slowly trickle down performance to maximize profits. A 350mm2 GP204 with 8GB HBM2 and 20% more performance than the 980Ti, priced at $550, would sell like hot cakes, while a GTX970 successor at $350-400 with near-980Ti performance would look incredible. I think $330-375 will become the new sweet spot for a desktop gaming GPU.

Even if NV gives the 960's successor a 70% boost, that's barely faster than an after-market 290 (reference 290X). The sub-$200 space is becoming extremely weak. Historically, once we passed the $200-250 mark, diminishing returns started to kick in quickly, but today it's actually better to spend extra on a $250-300 card and skip the lower-end ones.

[attached image: perfrel_2560.gif]

It's ridiculous to blame all this on Nvidia when it was AMD that fired the first shot.

For the most part, AMD has made mid-range-sized parts and charged prices slightly above mid-range for them. What started this rip-off generation was AMD charging high-end prices for mid-range-sized parts.

AMD released the 7970 at $550 months before the GTX 680 launched at $499. The worst part was that AMD didn't particularly move the price-to-performance bar with this move. Its price-to-performance was the same as the GTX 580's, which didn't provide good price-to-performance in the first place.

https://www.techpowerup.com/reviews/AMD/HD_7970/30.html

The GTX 680 provided a generous increase in price-to-performance over the 7970 (less so over the GTX 580, as prices had already fallen).

The fact that a GTX 680 had better price-to-performance than a 7950 just showed how out of line AMD's pricing was. It wasn't just Tahiti that raised the pricing bar; it was Pitcairn too.

The 7870 was priced at $350 at launch. You have to remember the die size of Pitcairn was 212mm2. That's more along the lines of a low-end die size, and AMD was charging more than GTX 970 pricing.

And this was all before Kepler launched. AMD's horrible pricing is what led to the collapse of their market share, as most people sat on the fence to see what Nvidia had in store.

If AMD hadn't decided to raise the pricing bar, all of Nvidia's line would have been cheaper. It's mostly because of Nvidia that all of AMD's lineup got price cuts and they became the value company again.

AMD's value pricing at the moment is more a reflection of the cards not selling and the market correcting the pricing than of AMD's generosity.

When the traditional value brand starts raising prices, it gives the market leader the ability to raise prices too.

BTW, it's the partners that decide how much additional memory costs, not Nvidia.
 

Vinny N

Platinum Member
Feb 13, 2000
2,278
1
81
I'm really glad this thread exists... it seems to sum up all of the current pain points of naturally upgrading a desktop PC piecemeal. I went from a Core 2 Duo 1.8GHz (2006-2011) to a Xeon X3360 (2012-2014) and finally to an i7-4790K (current).

For all three of these CPUs I have used the same NVIDIA 1GB GTX 560 (2011-current), and with the X3360 I was actually able to play most of the games I wanted at reasonable settings, usually 720p and medium-high. I upgraded to Haswell feeling it was overdue, and to avoid being CPU-bottlenecked at the time of an eventual GPU upgrade.

The more I read, the more torn I was about where to buy in. There had been many times I had eyeballed a used 660 or even a 750, but it never looked like it would be the leap the GTX 560 was (from an ATI HD 2600).

I just recently returned a refurb 4GB GTX 960 SSC I got for $195; it still didn't feel like a sizable leap. Some hungrier games (Witcher 3) or poorly optimized engines (Dreamfall Chapters running Unity 4) ran almost the same as on the GTX 560!

While waiting for my refund, I've been reading threads like these and looking at more TechPowerUp summary charts, thinking the GTX 970 is the answer, but as it stands, I want to hold out for a GTX 1000 series. I spent a good hour yesterday turning settings up and down in Witcher 3, and I think I should just play it at 720p at 35-45fps.

And it's not as though the GTX 970 isn't fair at 50% more money for 50%+ more performance over a GTX 960, right? I think I'm just looking for the astounding leap I would expect from the 4-year gap between the 560 and 960 at about the same price. US$185 in 2011 is about US$196 in 2015.
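(That inflation figure checks out. A quick sketch of the conversion, using approximate US CPI-U annual averages, which are assumed values on my part:)

```python
# Approximate US CPI-U annual averages -- assumed for illustration.
CPI = {2011: 224.9, 2015: 237.0}

def adjust(amount, from_year, to_year):
    """Scale a dollar amount by the ratio of the CPI index values."""
    return amount * CPI[to_year] / CPI[from_year]

print(f"${adjust(185, 2011, 2015):.0f}")  # -> $195, close to the ~$196 quoted
```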

So, I guess my question is, are we expecting too much innovation? Will upgrades just give ever-diminishing returns in satisfaction from here on out? This feels a lot like replacing a vehicle. There might be some bells and whistles or some gains to be had, but at the end of the day, going up 4 model years in a vehicle isn't going to mean much unless there was a major redesign...
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
I just recently returned a refurb 4GB GTX 960 SSC I got for $195; it still didn't feel like a sizable leap. Some hungrier games (Witcher 3) or poorly optimized engines (Dreamfall Chapters running Unity 4) ran almost the same as on the GTX 560!
Same resolution and settings? Hard to believe you cannot tell the difference at 1920x1080 or lower resolutions.
 

mikeymikec

Lifer
May 19, 2011
21,134
16,336
136
So, I guess my question is, are we expecting too much innovation?

I upgraded from a 5770 to a 750Ti. The only gaming difference I noticed was when I played Tomb Raider 2013 in DX11 mode, as the 5770 really struggled with it and the 750Ti handled it fine (both at 1080p). NB: My gaming habits these days don't generally revolve around the latest games. In recent years I've played a lot of Skyrim, SC2, and XCOM: EU, and completed TR2013 once.

However, the upgrade really paid off in the way I hoped, because the noise levels dropped sharply. My 5770 would run the fan at full tilt (5000RPM) when gaming to keep the temperature below 80-90C (nothing to do with dust levels), which was pretty damn noisy; even the simplest things, like browsing with hardware acceleration enabled, would cause a few seconds of noticeable noise on every single page load! Now I would describe the system noise levels when gaming as "slightly elevated", and I no longer turn the volume up to drown out the PC noise. What my 5770 did at around 80C, the 750Ti does at 40C or less.

I think my graphics card upgrade habits will in future revolve around the occasions when a new game is released that piques my interest enough to upgrade. Same goes for the CPU, even though I've experienced the upgrade itch more than once for my Ph2 960T: I've had my eye on a 4790K and a cheap-but-good mobo, aaand the 4790K is about £100 cheaper than the Skylake i7, and I won't have to worry about re-using my RAM in a compromise board. Yup, the itch is still there :)
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
I upgraded from a 5770 to a 750Ti. The only gaming difference I noticed was when I played Tomb Raider 2013 in DX11 mode, as the 5770 really struggled with it and the 750Ti handled it fine (both at 1080p).
That is another understatement. From the HD 5770 to the GTX 750 Ti there is a performance increase of around 100% (double the performance).

I get that low-to-mid-range graphics cards are not as powerful as people wish, but those cards are not as bad as some people here are saying.

I can't find a direct comparison between the GTX 560 and GTX 960, but here is a comparison between the GTX 560 Ti (which is around 10% better than the GTX 560) and the GTX 760 (which is at least 10% slower than the GTX 960).
 

mikeymikec

Lifer
May 19, 2011
21,134
16,336
136
That is another understatement. From the HD 5770 to the GTX 750 Ti there is a performance increase of around 100% (double the performance).

No it isn't, it's a statement of fact.

Please read what I wrote again:

I upgraded from a 5770 to a 750Ti. The only gaming difference I noticed was when I played Tomb Raider 2013 in DX11 mode, as the 5770 really struggled with it and the 750Ti handled it fine (both at 1080p). NB: My gaming habits these days don't generally revolve around the latest games. In recent years I've played a lot of Skyrim, SC2, and XCOM: EU, and completed TR2013 once.

I did not say "there is no noticeable difference in performance between these two cards". There may be a huge performance difference in some games, just not in the games I usually play.

I should point one thing out though - I bought Skyrim after the 750Ti, so I've never seen it run on the 5770.
 

Sunaiac

Member
Dec 17, 2014
124
172
116
lolilol AMD raised the price

dafuq did I just read?

AMD raised the price of graphics cards by offering a 7970 40% faster than a GTX580 for 10% more, and we should thank Nvidia for fighting to keep prices low with a 680 10% slower than a 7970GHz for only 10% more money?

really?
REALLY?

People buying that rip-off that was the 680 IS exactly what started that whole "high-end 1000€, mid-range 550€, low-end 200€" thing.

Thank anyone who bought a 680/770/780/Titan when AMD had faster for less, not someone who bought a 7970 when nVidia had nothing within 40% of its performance.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
I did not say "there is no noticeable difference in performance between these two cards". There may be a huge performance difference in some games, just not in the games I usually play.
Then it's like comparing two gaming cards by using the PC to browse the net. That is OK if this is all you need, but do not dismiss the improvements with statements like "the only gaming difference". It is at least misleading when in fact there is a doubling of performance.
 

mikeymikec

Lifer
May 19, 2011
21,134
16,336
136
Then it's like comparing two gaming cards by using the PC to browse the net. That is OK if this is all you need, but do not dismiss the improvements with statements like "the only gaming difference". It is at least misleading when in fact there is a doubling of performance.

The only people it will mislead are those who fail at reading comprehension. I supplied a reasonable amount of context for my post that left very little ambiguity for those who successfully comprehended its content and didn't jump to illogical conclusions.

Furthermore, saying things like "in fact there is a doubling of performance" is much less helpful without sufficient context. For example, I very much doubt that those who read your comments will have the same hardware as was used to run the AnandTech benches (which isn't mentioned, from what I can see). It's not exactly a reasonable expectation that someone wanting to play up-to-date games is going to shell out on a Core i7* and then only buy a 750Ti (which, in terms of mainstream gaming cards, could only have been described in its time as low to mid range), and I'm pretty damn sure that if I played the games listed in the AT bench you mentioned on my Ph2 960T, some of them would be CPU-bound before the graphics card got pushed to its full potential.

In addition, some of those benches are run on low quality, some on high, etc., which I would be surprised has any kind of consistency amongst significant percentages of gamers in the real world. I, for example, play at 1080p on high graphics settings for every game I play, with minimal AA/AF if I can get away with it (I prefer as smooth a frame rate as possible while getting as much image quality as I can), and the only time I can recall messing with graphics settings otherwise was in SC2, with the options that were marked as being more CPU-dependent. I personally would probably consider a graphics card upgrade (if I knew it would help) before accepting a significant downgrade in 3D image quality, but again this comes down to a personal question of standards of performance/quality acceptability and budget; again, not elements that can easily be addressed by a single reviewer's benchmarks.

Vinny N's (the guy I responded to) gaming habits are probably radically different from my own (which I won't claim to be in any way definitive). I'm not sure I understand why he upgrades CPUs so often but evidently not the graphics card, so his gaming setting preferences probably differ from mine and from those in the AT bench. Sure enough, he plays at 720p, which is an odd combination with such a high-end CPU, but that's exactly why throwing around statements like "double the performance" as if it's a universally applicable fact is unhelpful. Is it reasonable to assume that a typical 750Ti user is playing the latest games? I don't know; neither do you.

All I can do is make what I consider to be an accurate statement, provide adequate context for it and let people make up their own minds about whether the statements I've made have any validity for their own purchasing strategies. I don't attempt to speak for what people should expect in general.

FWIW, upgrading to the 750Ti was the most personally rewarding graphics card upgrade I can remember doing (for the reasons I've already mentioned), if I exclude that time in 1998 or thereabouts when I swapped out a Cirrus Logic 5446 for a Permedia 2 and experienced Tomb Raider 2 in PROPER 3D for the first time :)

* - which is a reasonable thing to do when a reviewer wants to try and ensure that they're pushing a graphics card to its limits, btw.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
You can check other sites for benchmarks. You will see that the performance doubling is evident.

Regarding the use of a weak CPU as a bottleneck: that is another artificial condition, which may be valid in a particular case (such as yours), but not in a thread discussing the general performance and prices of cards, or when making a general statement. A $50 Haswell Pentium is enough not to bottleneck a GTX 750 Ti in most games, so it is not like you would need a $350 CPU for that anyway.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
36
51
So, I guess my question is, are we expecting too much innovation? Will upgrades just give ever-diminishing returns in satisfaction from here on out?

We've been stuck on the same process for much longer than ever before. Engineers haven't had the free lunch of going down a node and cramming double the transistors onto a piece of silicon to bring you more performance. They still managed to increase performance over that time, but it was all hard-fought efficiency gains.

We're about to get another node next year, but I don't expect to see the one after that until the 2020s. Graphics performance increases are going to slow down now that node shrinks are getting ever harder to reach.
 

flopper

Senior member
Dec 16, 2005
739
19
76
We've been stuck on the same process for much longer than ever before. Engineers haven't had the free lunch of going down a node and cramming double the transistors onto a piece of silicon to bring you more performance. They still managed to increase performance over that time, but it was all hard-fought efficiency gains.

We're about to get another node next year, but I don't expect to see the one after that until the 2020s. Graphics performance increases are going to slow down now that node shrinks are getting ever harder to reach.

Next we will all enjoy 16nm AMD powa with DX12, err, Mantle.
28nm for 5 years, and maybe we'll have 16nm for even longer?
Ouch. I will enjoy next year like Christmas every day once the new lineup is out.
Hardware nerd, totally.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
It's ridiculous to blame all this on Nvidia when it was AMD that fired the first shot.

For the most part, AMD has made mid-range-sized parts and charged prices slightly above mid-range for them. What started this rip-off generation was AMD charging high-end prices for mid-range-sized parts.

AMD released the 7970 at $550 months before the GTX 680 launched at $499. The worst part was that AMD didn't particularly move the price-to-performance bar with this move. Its price-to-performance was the same as the GTX 580's, which didn't provide good price-to-performance in the first place.

https://www.techpowerup.com/reviews/AMD/HD_7970/30.html

The GTX 680 provided a generous increase in price-to-performance over the 7970 (less so over the GTX 580, as prices had already fallen).

The fact that a GTX 680 had better price-to-performance than a 7950 just showed how out of line AMD's pricing was. It wasn't just Tahiti that raised the pricing bar; it was Pitcairn too.

The 7870 was priced at $350 at launch. You have to remember the die size of Pitcairn was 212mm2. That's more along the lines of a low-end die size, and AMD was charging more than GTX 970 pricing.

And this was all before Kepler launched. AMD's horrible pricing is what led to the collapse of their market share, as most people sat on the fence to see what Nvidia had in store.

If AMD hadn't decided to raise the pricing bar, all of Nvidia's line would have been cheaper. It's mostly because of Nvidia that all of AMD's lineup got price cuts and they became the value company again.

AMD's value pricing at the moment is more a reflection of the cards not selling and the market correcting the pricing than of AMD's generosity.

When the traditional value brand starts raising prices, it gives the market leader the ability to raise prices too.

BTW, it's the partners that decide how much additional memory costs, not Nvidia.

The problem here is that this assumes Tahiti was originally conceived as a mid-range chip, and it ignores AMD's small-die strategy. So, I have to disagree with you. AMD simply ended up falling so far behind that their high-end chip was on par with Nvidia's mid-range chip. They weren't doing the same thing Nvidia did with the 680 and 980.
 

flopper

Senior member
Dec 16, 2005
739
19
76
The problem here is that this assumes Tahiti was originally conceived as a mid-range chip, and it ignores AMD's small-die strategy. So, I have to disagree with you. AMD simply ended up falling so far behind that their high-end chip was on par with Nvidia's mid-range chip. They weren't doing the same thing Nvidia did with the 680 and 980.

It would have worked if 28nm hadn't gone on as long as it did.
The small-die strategy depended on die shrinks, and when those weren't happening, they had to go big too. Best card this generation is the Fury, for sure.
The future is AMD.
 

Vinny N

Platinum Member
Feb 13, 2000
2,278
1
81
Same resolution and settings? Hard to believe you cannot tell the difference at 1920x1080 or lower resolutions.

Yes. With Witcher 3, 720p was the threshold; at 1080p, the gap between the 560 and 960 was small, especially on the higher side of the settings. I heard that performance is continuing to improve on the 900 series, though, as the drivers mature further.

With Dreamfall Chapters, the resolution didn't matter; both cards scaled the same, only reaching 60fps when all shadows were disabled. I'm pretty sure it's an engine issue. The next release is supposed to update to Unity 5. Of course, my dilemma is that if I keep waiting to play at max or higher settings, I might as well wait for a next-gen card, or just play without shadows, since the performance is the same at my desired resolution.

There's an interesting series of videos on YouTube called "Q6600 and GTX 560" where they show many games being playable at 720p or 768p.

I wasn't planning on the X3360 CPU upgrade in between the E6300 and 4790K, but I know I was slightly CPU-bottlenecked when playing Dragon Age: Inquisition. The X3360 gave me the breathing room to get the GTX 560 up to a playable 40-50fps range at 720p medium/high settings.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
dafuq did I just read?

AMD raised the price of graphics cards by offering a 7970 40% faster than a GTX580 for 10% more, and we should thank Nvidia for fighting to keep prices low with a 680 10% slower than a 7970GHz for only 10% more money?

really?
REALLY?

People buying that rip-off that was the 680 IS exactly what started that whole "high-end 1000€, mid-range 550€, low-end 200€" thing.

Thank anyone who bought a 680/770/780/Titan when AMD had faster for less, not someone who bought a 7970 when nVidia had nothing within 40% of its performance.
I agree.
Seems he does not know of the Titan Z, lol.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Because Moore's Law is not a "law" anymore. (As far as I know, Moore himself was uneasy about his observation being referred to as a law.)

They have been manufacturing on 28 nm for 3.5-4 years. Yields on 20 nm have not been very good. Brand-new chips are more expensive today than in the past.

Somebody gets it: economics will be by far the most limiting factor for both gaming hardware and software from here on. A shrinking dGPU market and Konami quitting video games altogether are warning signs of trouble and stagnation ahead, because this stuff is just getting too expensive to develop.

At the very least consoles are a primary dev platform with a larger install base; for PCs it's going to be a vicious cycle of hurt: shrinking PC sales + even more expensive dGPUs -> lower demand for dGPUs -> even less incentive for devs to push graphics on PCs -> even lower demand for dGPUs -> even more expensive dGPUs due to diminished economies of scale -> repeat.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Moore's Law still works.

But since double patterning, you have to choose between two options: cheap IC design and high transistor cost, or expensive IC design and low transistor cost.
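A toy cost model of that trade-off (every figure below -- NRE, per-transistor cost, volumes -- is a made-up illustrative number): the one-time design cost (NRE) is amortized over unit volume, so the cheap-design option wins at low volume and the cheap-transistor option wins at high volume.

```python
# Toy model of the double-patterning trade-off -- all numbers hypothetical.
def cost_per_chip(nre, cost_per_transistor, transistors, volume):
    """One-time design cost (NRE) amortized over volume, plus silicon cost."""
    return nre / volume + cost_per_transistor * transistors

TRANSISTORS = 5e9  # roughly a large GPU

for volume in (100_000, 1_000_000, 10_000_000):
    # Option A: cheap design, expensive transistors (simpler flow)
    a = cost_per_chip(50e6, 12e-9, TRANSISTORS, volume)
    # Option B: expensive design, cheap transistors (double patterning)
    b = cost_per_chip(300e6, 6e-9, TRANSISTORS, volume)
    print(f"{volume:>10,} units: A=${a:,.0f}  B=${b:,.0f}")
```

At 100k units the cheap design wins ($560 vs $3,030 per chip); at 10M units the lower transistor cost dominates ($65 vs $60) and the expensive design wins.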
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
Wait, I know what OP wants. Titan performance at IGP prices.

*prepares for thrown rotten vegetables*