[TPU] Nvidia prepares "price cuts across its entire lineup"

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
With AMD detailing its Radeon R9 and R7 series, especially at some very attractive sub-$299 price-points for the most part, there are jitters being felt at NVIDIA. The company is expected to unveil one or two new sub-$250 GeForce GTX SKUs around mid-October, 2013. The company is also expected to introduce price-cuts across its entire lineup, to make it competitive with AMD's. NVIDIA could tap into its existing GK104 and GK106 silicons to carve out the two new SKUs ranging between $149.99 and $249.99. The idea here would be to topple Radeon R9 270X. Price-cuts could be directed at the likes of GeForce GTX 760 and GTX 770, to make them competitive with the Radeon R9 280X, while in anticipation of the $599 pricing of the R9 290X, NVIDIA could rethink pricing of its $650 GeForce GTX 780, and $1000 GTX TITAN.

Source:
http://www.techpowerup.com/191829/nvidia-prepares-two-new-sub-250-skus-price-cuts.html

This, folks, is why competition is an awesome thing. Great for everyone regardless of which brand anyone prefers!

Hopefully the 780 drops to the 6 hundy mark.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Competition? I thought it was because both companies have to sell the same products for 2½-3 years. The increased time between node shrinks is the real killer. And we get more and more rebrands.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
At least we get a little bit more choice these days. 10 years ago, we'd have 1 to 3 models per generation. Now we have a slew of options to choose from. Yes, some are rebrands, but they do sometimes knock the price points down.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Duopoly much? Let's stagger the releases so price drops = the last model's highest price. (580 = $550; Titan, the successor, = $1k.) Dropping the price will still probably leave it above the MSRP of the last-gen card. The 770 is just the same as the 680, which is the 560 Ti successor. If it hits $300 or less, that's only the 560 Ti's normal price (after it, as the 680, has been milked at $500 for 1.5 years).

I agree it's a good thing, but people "forget" they just gouged everyone for the whole generation (both sides, but NV particularly badly), so the price "drops" just bring prices back to the last generation's initial price.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Competition? I thought it was because both companies have to sell the same products for 2½-3 years. The increased time between node shrinks is the real killer. And we get more and more rebrands.

The node shrinks are part of it. Most companies won't be able to use new nodes from the outset because the number of units a firm has to sell just to break even is astronomical. Unless things dramatically change in the next 8 months or so, I imagine only Apple and Qualcomm will even have access to TSMC 20nm in 2H 2014, and TSMC's 20nm isn't as good as intel's 22nm in terms of leakage due to the lack of FinFETs. It will be interesting to see how further node shrinks play out on the market.

If intel made discrete GPUs they would have an astronomical advantage over Nvidia and AMD because they (intel) are 3 years ahead of TSMC in terms of technology.... of course they don't, so.... Broadwell's mobile iGPU should be pretty amazing, though. ;)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
The node shrinks are part of it. Most companies won't be able to use new nodes from the outset because the number of units a firm has to sell just to break even is astronomical. Unless things dramatically change in the next 8 months or so, I imagine only Apple and Qualcomm will even have access to TSMC 20nm in 2H 2014, and TSMC's 20nm isn't as good as intel's 22nm in terms of leakage due to the lack of FinFETs. It will be interesting to see how further node shrinks play out on the market.

But without node shrinks, nothing really happens on the dGPU front, as we can see with 2-3 rebrands of the same product. Slight tweaks if we are lucky. And as you say, 20nm dGPUs are a good year away at best.

Not to mention first they want to sell us 20nm dGPUs, then 20nm with FinFETs. Sorry, I mean "16nm" with FinFETs. :biggrin:

I have a feeling 20nm will start at $600-700 (if not more) to make sure it can carry price cuts over the years.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
If intel made discrete GPUs they would have an astronomical advantage over Nvidia and AMD because they (intel) are 3 years ahead of TSMC in terms of technology.... of course they don't, so.... Broadwell's mobile iGPU should be pretty amazing, though. ;)

Neither you nor anybody else actually believes this. And before you claim it's because Intel isn't interested, the GPU market is worth many billions of dollars per year that Intel could certainly make use of.

They don't have the architecture to compete with AMD or Nvidia in GPU - they are about a node behind in performance while being a physical node ahead.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Neither you nor anybody else actually believes this. And before you claim it's because Intel isn't interested, the GPU market is worth many billions of dollars per year that Intel could certainly make use of.

Many billions? AMD's entire graphics and visual division has what, $1.2-1.3B in yearly revenue? I don't know about nVidia.

I am quite sure it's a better deal to manufacture CPUs than GPUs. That's also why AMD and nVidia can't even compete on wafer price with Qualcomm and Apple for 20nm.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Neither you nor anybody else actually believes this. And before you claim it's because Intel isn't interested, the GPU market is worth many billions of dollars per year that Intel could certainly make use of.

They don't have the architecture to compete with AMD or Nvidia in GPU - they are about a node behind in performance while being a physical node ahead.

Yeah, actually, Intel isn't interested because they don't really care about gaming. Intel got into the iGPU business primarily because Apple urged them to, and when Apple gives you more business and sales than any other PC vendor, well, you do the math.

Intel could certainly do it if they wanted to; they have the technology to fit billions more transistors into the same die size while having substantially better leakage characteristics. So yeah, if intel wanted to, they could dominate the dGPU industry and it wouldn't be difficult. I don't think you understand: the technology they have allows them a far higher transistor budget with much better leakage and efficiency than AMD or nvidia will have access to for the next 3 years. Furthermore, TSMC's 20nm will still be substantially worse than Intel's 22nm - without FinFETs it will be essentially identical to their 28nm node in terms of leakage. That is to say, TSMC's 20nm will have similar efficiency characteristics to 28nm - in other words, NO IMPROVEMENT IN LEAKAGE - but higher density.

That's it. At least Intel isn't outright deceptive with their node descriptions, because TSMC's "16nm" is an outright lie. There is no other way to describe it. Their 16nm is actually their 20nm with FinFETs, and won't be ready until what, 2015? 2016? And each vendor will need to sell 750,000 units to break even on the 16nm wafer costs. Yeah, have fun with that. If you think anyone but giant corporations will have access to these new nodes at TSMC, you may be in for a rude awakening. The wafer costs are sky-high and the break-even points require sales near 1 million.
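
For a rough sense of how a break-even number like that falls out (the cost and margin figures below are purely my own illustrative assumptions, not published TSMC or vendor numbers):

Code:
# Back-of-envelope break-even: units = upfront node costs / margin per unit
# (assumed figures only - nothing here is a published number)
upfront_costs = 300e6    # assume ~$300M for design, masks, and tooling on the new node
margin_per_unit = 400.0  # assume ~$400 gross margin per high-end card

break_even_units = upfront_costs / margin_per_unit
print(f"{break_even_units:,.0f} units to break even")  # 750,000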

Like I said, intel doesn't care about gaming. If they did, they could easily destroy both AMD and nvidia - they have both the money and technology to do so. Again, it isn't hard when intel has the capability to fit billions more transistors with better efficiency than AMD or nvidia have access to for the next 3 years. But intel doesn't care. They only care about mobile efficiency and mobile SKUs - a far larger market. Which is fine by me.
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Competition? I thought it was because both companies have to sell the same products for 2½-3 years. The increased time between node shrinks is the real killer. And we get more and more rebrands.


While I agree that the slowdown in node shrinks certainly factors in, given the timing of these price cuts I'm going to assume competition has a lot to do with it.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Many billions? AMD's entire graphics and visual division has what, $1.2-1.3B in yearly revenue? I don't know about nVidia.

I am quite sure it's a better deal to manufacture CPUs than GPUs. That's also why AMD and nVidia can't even compete on wafer price with Qualcomm and Apple for 20nm.

What CPUs do you mean? Atom, for example, is losing Intel well over a billion dollars a year - I'm sure they would rather be making money on GPUs instead, regardless of how little that would be.

I mean, they are even getting into zero-margin industrial stuff with Quark, so basically Intel covers everything except discrete GPUs and consoles. If there is money to be made, Intel goes there. They don't go into discrete GPUs because they can't compete.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I thought Intel cared only about $... Color me surprised!
Intel's process superiority is shown broadly in the epic iGPU failure, where they need a process advantage and 1.5x the die size to match the performance of AMD APUs.

I know a genius who could design an APU with the CPU performance of 4x a 4960 and the GPU performance of 10 Titans, on a 40nm node with a die size of 100 mm²... Guess what... He doesn't care!
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I thought Intel cared only about $... Color me surprised!
Intel's process superiority is shown broadly in the epic iGPU failure, where they need a process advantage and 1.5x the die size to match the performance of AMD APUs.

I know a genius who could design an APU with the CPU performance of 4x a 4960 and the GPU performance of 10 Titans... Guess what... He doesn't care!

Look at the overall picture. AMD can't compete with intel in terms of mobile efficiency, and AMD is getting slaughtered in the ultrabook/convertible market - which is what the iGPU is designed for. AMD can't compete with Iris Pro in terms of performance, and none of their mobile SKUs have the performance per watt of intel's offerings. That's why you see very few mobile SKUs with AMD APUs while you see HD 5000, HD 5200, and Iris Pro everywhere. Intel makes a better overall product for mobile - balanced, with great CPU performance, great GPU performance, and better efficiency. AMD mobile APUs might have decent GPU performance, but they have poor CPU performance and worse efficiency; they are not balanced products. If you're talking about desktop LGA, nobody really gives a crap about that, certainly not intel. Intel is going after the market that is selling (mobile).

I appreciate AMD GPUs and love the competition between AMD/NV, but AMD trying to compete with intel in mobility is going to be futile. You can call it what you want, but intel's process advantage gives them more transistors with better efficiency than AMD (or nvidia) will have access to for the next 3 years - AMD can compete on price and that is about it. This is getting off topic though, I digress.

Back to the topic...

Anyway, what I was saying is that neither AMD nor nvidia can compete with intel on a technology basis, since intel is far ahead of TSMC - but luckily, that is a non-issue. Intel isn't focused on gaming; they are focused on mobility. That is giving us the great competition between AMD and nvidia, and as you're seeing, it presumably is going to result in nvidia price cuts. We, the consumers, win.
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Any word on a Titan price drop? I could go for a 4th card.

Every penny counts... I would wait a week or two.

Look at the overall picture. AMD can't compete with intel in terms of mobile efficiency, and AMD is getting slaughtered in the ultrabook/convertible market - which is what the iGPU is designed for. AMD can't compete with Iris Pro in terms of performance, and none of their mobile SKUs have the performance per watt of intel's offerings. That's why you see very few mobile SKUs with AMD APUs while you see HD 5000, HD 5200, and Iris Pro everywhere. Intel makes a better overall product for mobile - balanced, with great CPU performance, great GPU performance, and better efficiency. AMD mobile APUs might have decent GPU performance, but they have poor CPU performance and worse efficiency; they are not balanced products. If you're talking about desktop LGA, nobody really gives a crap about that, certainly not intel. Intel is going after the market that is selling (mobile).

I appreciate AMD GPUs and love the competition between AMD/NV, but AMD trying to compete with intel in mobility is going to be futile. You can call it what you want, but intel's process advantage gives them more transistors with better efficiency than AMD (or nvidia) will have access to for the next 3 years - AMD can compete on price and that is about it. This is getting off topic though, I digress.

Back to the topic...

Anyway, what I was saying is that neither AMD nor nvidia can compete with intel on a technology basis, since intel is far ahead of TSMC - but luckily, that is a non-issue. Intel isn't focused on gaming; they are focused on mobility. That is giving us the great competition between AMD and nvidia, and as you're seeing, it presumably is going to result in nvidia price cuts. We, the consumers, win.

Don't make me redirect you to the Iris Pro performance and power consumption benchmarks. It loses on performance, at the same power consumption, to a higher-clocked variant without the big iGPU paired with a GT 650M.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Well, this could go well for me if my buddy purchases my GTX 470. I was in the mood for the 760, but if the 770 comes down enough, I'll get that.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Great, I hope AMD fires back. The 7850 2GB needs to come down to $120 before rebates. It touched $130 for a quick sec on Newegg - may have been a misprice though; the same card is now $170.
 

Edgy

Senior member
Sep 21, 2000
366
20
81
Hmm... Not sure if Intel can dominate the ENTIRE graphics market (especially discrete) without AMD/Nvidia graphics IP... I thought that's one of the primary reasons why Larrabee was based on x86 tech?

Personally, I don't give a damn what process node they're on or whether the new graphics cards are new silicon or rebrands. All they need to do is improve in performance, efficiency, and value compared to the previous "generation."

Nothing is 100% confirmed yet, but it looks as if AMD (and Nvidia, by planning to slash prices) are meeting at least my requirements.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Duopoly much? Let's stagger the releases so price drops = the last model's highest price. (580 = $550; Titan, the successor, = $1k.) Dropping the price will still probably leave it above the MSRP of the last-gen card. The 770 is just the same as the 680, which is the 560 Ti successor. If it hits $300 or less, that's only the 560 Ti's normal price (after it, as the 680, has been milked at $500 for 1.5 years).

I agree it's a good thing, but people "forget" they just gouged everyone for the whole generation (both sides, but NV particularly badly), so the price "drops" just bring prices back to the last generation's initial price.
*nods*
Although, since AMD sees fit to release the new generation's mid-range cards with less VRAM than the last-gen main-line cards, now's the time to finally get that 7950 or 7970 at a reasonable price - the same or better than the new mid-range cards, with the same or more VRAM.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Would they even price-cut Titan? The card is irrelevant now, and they ought to just discontinue it. My current 780s are faster than my Titans were and cost $330 less if you get the air-cooled models.

The R9 290X is going to make that card look even worse. My opinion: make the 780 a second-tier card at $500 and redo the Titan as a 780 Ti with 3GB of VRAM for $600-650.

Nvidia is going to be in a bad spot, with AMD likely taking the Battlefield 4 benchmarks with the R9 290X, and who knows how bad it will look once Mantle is released for BF4 in December. Battlefield is the game that drives upgrades, because it's one people play for years and buy GPU upgrades for specifically. Nvidia did so well with the 680 at launch in part because of how it performed in BF3. Maybe we will see some deep price cuts to match the R9 290X and 280X. Unless the R9 290X really steals the show in BF4 and makes me want to switch, I could go for a third 780 with a price slash on it :D
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
I don't think you understand: the technology they have allows them a far higher transistor budget with much better leakage and efficiency than AMD or nvidia will have access to for the next 3 years. Furthermore, TSMC's 20nm will still be substantially worse than Intel's 22nm - without FinFETs it will be essentially identical to their 28nm node in terms of leakage. That is to say, TSMC's 20nm will have similar efficiency characteristics to 28nm - in other words, NO IMPROVEMENT IN LEAKAGE - but higher density.

I think you should stop spreading misinformation.

http://www.eda-stds.org/edps/Papers/4-4 FINAL for Tom Quan.pdf

Refer to slide 19:

                                   16FF/28HPM    16FF/20SoC
Speed @ same total power              38%           20%
Total power saving @ same speed       54%           35%
Gate density                          2x            1.1x

16FF is 20% faster at the same total power compared to 20SoC, and 38% faster compared to 28HPM. So if we set speed on 28HPM as 1x, speed on 16FF is 1.38x, and speed on 20SoC works out to 1.38x / 1.2 = 1.15x.

So there is a 15% speed improvement at the same power from TSMC 28HPM to TSMC 20SoC. Remember, 28HPM is the best high-k 28nm process at TSMC, superior to TSMC's 28HP and 28HPL.
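
To sanity-check that arithmetic (this uses nothing beyond the slide's own ratios):

Code:
# Derive the implied 20SoC-over-28HPM speedup from the two published ratios
speed_16ff_vs_28hpm = 1.38  # 16FF is 38% faster than 28HPM at the same power
speed_16ff_vs_20soc = 1.20  # 16FF is 20% faster than 20SoC at the same power

# (16FF / 28HPM) divided by (16FF / 20SoC) leaves 20SoC / 28HPM
speed_20soc_vs_28hpm = speed_16ff_vs_28hpm / speed_16ff_vs_20soc
print(round(speed_20soc_vs_28hpm, 2))  # 1.15 -> 15% faster at the same power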

The key with 20nm is that there is close to a doubling of transistor density, at 1.9x. This allows much bigger chips to be produced; they can be clocked at lower speeds/voltages so that within the same power budget you still get a significant performance improvement. Look at the Apple Cyclone core on Samsung's 28nm process.

http://www.anandtech.com/show/7355/chipworks-provides-first-apple-a7-die-shot

http://www.anandtech.com/show/7335/the-iphone-5s-review/5

Cyclone clocked at 1.3 GHz is competing with Bay Trail boosting to 2.4 GHz. Even in multithreaded benchmarks like Cinebench R11.5, Bay Trail is running at close to 2.4 GHz; you can derive that from the scaling from single-threaded to multi-threaded performance.
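
To put rough numbers on that wider-but-slower trade-off (first-order dynamic-power scaling; the specific voltage and clock figures are my own illustrative picks, not measured values):

Code:
# Dynamic power scales roughly as P ~ C * V^2 * f
# (switched capacitance, supply voltage, clock frequency)
def dynamic_power(c, v, f):
    return c * v ** 2 * f

base = dynamic_power(c=1.0, v=1.00, f=1.0)  # old chip, normalized
wide = dynamic_power(c=2.0, v=0.85, f=0.7)  # 2x transistors, -15% voltage, -30% clock
print(round(wide / base, 2))  # ~1.01 -> essentially the same power budget
# If the extra units scale well, 2x the units at 0.7x the clock
# still gives roughly 1.4x the throughput.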

That's it. At least Intel isn't outright deceptive with their node descriptions, because TSMC's "16nm" is an outright lie. There is no other way to describe it. Their 16nm is actually their 20nm with FinFETs, and won't be ready until what, 2015? 2016? And each vendor will need to sell 750,000 units to break even on the 16nm wafer costs. Yeah, have fun with that. If you think anyone but giant corporations will have access to these new nodes at TSMC, you may be in for a rude awakening. The wafer costs are sky-high and the break-even points require sales near 1 million.
TSMC, Samsung, and GF are well funded to pursue these multi-billion-dollar efforts. Most importantly, these foundries have customers who sell hundreds of millions of chips every year - Qualcomm, Apple, Nvidia, AMD, and Samsung themselves. Their chips power desktop, mobile, console and PC gaming, HPC/servers, and other markets like embedded. So there is no shortage of customers for these foundries.

If anything, I have to say further node progress is getting difficult, and it's out of Intel's control as lithography becomes the primary constraint, not just the transistor device. There is a very good chance that Intel will not get to 7nm before the end of the decade; the 2-year node cadence is going to be very difficult for Intel to sustain going forward. Intel 14nm requires double-patterning immersion litho, 10nm requires quadruple-patterning immersion litho, and 7nm is mostly not possible without EUV. That's the reason Intel, TSMC, and Samsung have invested in ASML for EUV development. Still, EUV seems likely to be ready only by the end of the decade.

http://www.intc.com/releasedetail.cfm?ReleaseID=690165
http://www.asml.com/asml/show.do?ctx=5869&rid=46974
http://www.asml.com/asml/show.do?ctx=5869&rid=46903
http://www.forbes.com/sites/jimhandy/2012/08/27/whys-everyone-investing-in-asml/


Like I said, intel doesn't care about gaming. If they did, they could easily destroy both AMD and nvidia - they have both the money and technology to do so. Again, it isn't hard when intel has the capability to fit billions more transistors with better efficiency than AMD or nvidia have access to for the next 3 years. But intel doesn't care. They only care about mobile efficiency and mobile SKUs - a far larger market. Which is fine by me.
This is utter crap. Why isn't Intel able to destroy Nvidia in HPC, where Intel is competing with Knights Corner? Nvidia's and AMD's architectures are vastly superior to Intel's for GPU compute and HPC. Nvidia dominates the HPC market with their Teslas. :biggrin:
 
Last edited:

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
Would they even price-cut Titan? The card is irrelevant now, and they ought to just discontinue it. My current 780s are faster than my Titans were and cost $330 less if you get the air-cooled models.

The R9 290X is going to make that card look even worse. My opinion: make the 780 a second-tier card at $500 and redo the Titan as a 780 Ti with 3GB of VRAM for $600-650.

Nvidia is going to be in a bad spot, with AMD likely taking the Battlefield 4 benchmarks with the R9 290X, and who knows how bad it will look once Mantle is released for BF4 in December. Battlefield is the game that drives upgrades, because it's one people play for years and buy GPU upgrades for specifically. Nvidia did so well with the 680 at launch in part because of how it performed in BF3. Maybe we will see some deep price cuts to match the R9 290X and 280X. Unless the R9 290X really steals the show in BF4 and makes me want to switch, I could go for a third 780 with a price slash on it :D


I was playing BF4 beta with the 13.4 drivers last night and I was mostly happy with the performance already.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
NV got to make money hand over fist for months due to lack of competition, and now they have the margin built in to lower prices and negate AMD's late-to-the-party launch.

This isn't exactly the 4870 launching into GT200, that is for certain.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Would they even price-cut Titan? The card is irrelevant now, and they ought to just discontinue it. My current 780s are faster than my Titans were and cost $330 less if you get the air-cooled models.

Believe it or not, people use graphics cards for more than gaming. High-end gaming is an extremely small percentage of the entire GPU market. Titan is not a bang-for-buck champion in the gaming arena. However, compared to this:

http://www.newegg.com/Product/Produc...82E16814132010

it's an absolute bargain. For many professionals, the Titan is good enough for their line of work (same performance, minus the ECC RAM and scalability), which saves them serious money over a Tesla card. That's why NVidia has sold so many Titans - not because gamers are knocking down their front door to buy them. With Titan's unique position of uncrippled compute capability, the price does not need to be dropped. It's competing with no one.

Why isn't Intel able to destroy Nvidia in HPC, where Intel is competing with Knights Corner? Nvidia's and AMD's architectures are vastly superior to Intel's for GPU compute and HPC. Nvidia dominates the HPC market with their Teslas.

Because GPU architectures are ideal for HPC while general-purpose CPUs are not. Intel has the superior technology, but that doesn't matter if it's the wrong tool for the job. Intel certainly has the capability to dominate the dGPU market, but it would take time to catch up and a huge upfront investment, for a market that won't return enough to make that investment worthwhile.