The GTX 780, 770, 760 Ti Thread *First review leaked, $700+?*

Status
Not open for further replies.

CFP

Senior member
Apr 26, 2006
544
6
81
Yeah if this card came in at 599, I'd probably regret just buying my 7970. But if it comes in at 699, then that's a much tougher sell.

Let's take a moment to remember the good old days where top tier cards didn't cost this much.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I remember buying the GeForce4 Ti 4600 for $360 (got 10% off as a Software Etc. employee) in Feb 2002...

Account for 3% inflation over 11 years, i.e. 1.03^11 (a conservative rate), undo the 10% discount, and you're at $553.69.

Just saying, dunno what that means to you.

The math:
360*(1.03^11)*(1/.9)= $553.69
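The compounding above can be sketched in a few lines of Python; the 3% rate and the 10% discount are the post's own assumptions, not official CPI figures:

```python
# Inflation adjustment from the post above: $360 paid in Feb 2002,
# compounded at an assumed 3% per year for 11 years, then the 10%
# employee discount undone by dividing by 0.9.
base_price = 360.00   # GeForce4 Ti 4600 price paid, after discount
years = 11            # 2002 -> 2013
rate = 0.03           # assumed annual inflation (conservative)

adjusted = base_price * (1 + rate) ** years / 0.9
print(f"${adjusted:.2f}")  # -> $553.69
```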
 

tential

Diamond Member
May 13, 2008
7,348
642
121
That's a little extreme. Here's a useful tool for a situation like this: http://data.bls.gov/cgi-bin/cpicalc.pl . As you can see, that $360 is only $465 today.

It depends on the numbers you use, obviously. Everyone has a different measure of inflation. I chose 3% because that's just a general target; actual inflation may have been less, of course, and inflation differs by sector.
The only reason I didn't use CPI is that it includes a LOT of items, such as food, whose large swings have little to do with underlying inflation.
But yes, you can get different numbers pretty easily depending on what you choose. I just picked 3% as the inflation target, but a quick wiki search shows the actual US inflation target is 1.7%-2%.

Either way, no matter which method we choose, we can see that GPUs are selling for more today than they did in the past. That's exactly why I refuse to buy cards over $500. Feels way too overpriced.

Edit: I knew I wasn't far off even using 3% off the top of my head. You didn't account for the 10% discount; account for that and the number is in fact $517.31.

I confirmed their numbers with CPI tables just to be safe and they're right. I like my major way too much....
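To illustrate how much the assumed rate matters, here is a quick sketch comparing the 1.7%-2% target range mentioned above with the 3% used earlier, on the same $360 base and 10% discount:

```python
# Sensitivity of the inflation-adjusted price to the assumed rate.
base_price, years = 360.00, 11
for rate in (0.017, 0.02, 0.03):
    adjusted = base_price * (1 + rate) ** years / 0.9  # /0.9 undoes discount
    print(f"{rate:.1%} -> ${adjusted:.2f}")
```

At 1.7% the figure comes out near $481, at 2% near $497, and at 3% $553.69, so the spread across plausible rates is roughly $70.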
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Now go look at how many GPUs Nvidia has over 400mm2.

Since the big-die strategy, nvidia has offered smaller GPUs with the flagship moniker twice -- the 9800 GTX (G92) and the GTX 680 (GK104) -- it's not like they haven't done this before.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Since the big-die strategy, nvidia has offered smaller GPUs with the flagship moniker twice -- the 9800 GTX (G92) and the GTX 680 (GK104) -- it's not like they haven't done this before.

Pauly, IIRC the 9800 GTX didn't replace the 8800 Ultra. The Ultra was still the fastest card.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Not sure a twice-cut-down chip should sell for $200-$300 over the same-sized 580 from Dec. 2010, and 2x 580s (at ~$900) are almost equal (in 3DMark 11, anyway), but maybe it's just me.

Imho,

I thought of this point -- this chip reminds me of the GTX 560 Ti 448 chip for the 5XX family, or the GTX 465 for the 4XX family -- and more so, a third-tier GK110 derivative.

Product offerings for GK110:

2688 CUDA cores - GeForce/Professional

2496 CUDA cores - Professional

2304 CUDA cores - GeForce

Still no fully enabled core.

Based on market conditions and competitive advantages, nVidia can use a third-tier GK110 derivative as their flagship for the 7XX series, compared to a fully enabled GF110 for the 5XX series.

Based on market conditions and competitive advantages, nVidia offered a new sector for compute and gaming named Titan, and this wasn't a fully enabled core.

Translation: AMD couldn't compete with the monolith, and this generation was the tipping point where the two separated.
 

Saylick

Diamond Member
Sep 10, 2012
4,121
9,641
136
Pauly, IIRC the 9800 GTX didn't replace the 8800 Ultra. The Ultra was still the fastest card.

Wasn't the 9800 GTX a rebranded 8800 GTS (G92) with higher clocks? The whole 9xxx generation was mostly a rebrand of the 8xxx series, IIRC. The 9800 GTX+ was one of the few cards that generation that offered something new, until nVidia launched their 200 series.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Imho,

I thought of this point -- this chip reminds me of the GTX 560 Ti 448 chip for the 5XX family, or the GTX 465 for the 4XX family -- and more so, a third-tier GK110 derivative.

Product offerings for GK110:

2688 CUDA cores - GeForce/Professional

2496 CUDA cores - Professional

2304 CUDA cores - GeForce

Still no fully enabled core.

Based on market conditions and competitive advantages, nVidia can use a third-tier GK110 derivative as their flagship for the 7XX series, compared to a fully enabled GF110 for the 5XX series.

Based on market conditions and competitive advantages, nVidia offered a new sector for compute and gaming named Titan, and this wasn't a fully enabled core.

Translation: AMD couldn't compete with the monolith, and this generation was the tipping point where the two separated.

Yep, it's basically a 560 Ti 448 @ $700. (How much were those again? $270?)

What a rip-off. (Assuming the $700 is true at this point.)

For those desperate enough to drag inflation into this: the price jumped in ONE generation, not over 10 years. You don't take the entire history of a product and suddenly apply inflation to it. The previous flagship was $550, and inflation is a few percent.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Yep, it's basically a 560 Ti 448 @ $700. (How much were those again? $270?)

What a rip-off. (Assuming the $700 is true at this point.)

Imho,


Market conditions and competitive advantages have changed, thanks to AMD and nvidia offering evolutionary, incremental price/performance with 28nm -- and the market accepting it, rightly or wrongly!
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Imho,


Market conditions and competitive advantages have changed, thanks to AMD and nvidia offering evolutionary, incremental price/performance with 28nm -- and the market accepting it, rightly or wrongly!

It's called a monopoly, or duopoly. No need to sugarcoat it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

Not really. AMD and nVidia offered impressive price/performance over the years thanks to strong competition. This is the first arch generation where both had windows of opportunity, or competitive advantages -- maybe the first in history. AMD had the execution advantage and set launch prices and 28nm performance; nVidia sustained that price/performance with GK104 and extended it with the monolith.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
It depends on the numbers you use, obviously. Everyone has a different measure of inflation. I chose 3% because that's just a general target; actual inflation may have been less, of course, and inflation differs by sector.
The only reason I didn't use CPI is that it includes a LOT of items, such as food, whose large swings have little to do with underlying inflation.
But yes, you can get different numbers pretty easily depending on what you choose. I just picked 3% as the inflation target, but a quick wiki search shows the actual US inflation target is 1.7%-2%.

Either way, no matter which method we choose, we can see that GPUs are selling for more today than they did in the past. That's exactly why I refuse to buy cards over $500. Feels way too overpriced.

Edit: I knew I wasn't far off even using 3% off the top of my head. You didn't account for the 10% discount; account for that and the number is in fact $517.31.

I confirmed their numbers with CPI tables just to be safe and they're right. I like my major way too much....

http://www.shadowstats.com/alternate_data

http://www.shadowstats.com/imgs/charts/alt-cpi-home2.gif?hl=ad&t=

http://www.shadowstats.com/imgs/sgs-cpi.gif?hl=ad&t=

http://www.shadowstats.com/imgs/sgs-m3.gif?hl=ad&t=1368291331
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Imho,

I thought of this point -- this chip reminds me of the GTX 560 Ti 448 chip for the 5XX family, or the GTX 465 for the 4XX family -- and more so, a third-tier GK110 derivative.

Product offerings for GK110:

2688 CUDA cores - GeForce/Professional

2496 CUDA cores - Professional

2304 CUDA cores - GeForce

Still no fully enabled core.

Based on market conditions and competitive advantages, nVidia can use a third-tier GK110 derivative as their flagship for the 7XX series, compared to a fully enabled GF110 for the 5XX series.

Based on market conditions and competitive advantages, nVidia offered a new sector for compute and gaming named Titan, and this wasn't a fully enabled core.

Translation: AMD couldn't compete with the monolith, and this generation was the tipping point where the two separated.

Imho,


Market conditions and competitive advantages have changed, thanks to AMD and nvidia offering evolutionary, incremental price/performance with 28nm -- and the market accepting it, rightly or wrongly!

Imho,

Not really. AMD and nVidia offered impressive price/performance over the years thanks to strong competition. This is the first arch generation where both had windows of opportunity, or competitive advantages -- maybe the first in history. AMD had the execution advantage and set launch prices and 28nm performance; nVidia sustained that price/performance with GK104 and extended it with the monolith.

So, let's see... It's AMD's fault? Poor nVidia, getting stuck in this situation with no other choice. Please be merciful, nVidia, and just finish us off. :D
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Since the big-die strategy, nvidia has offered smaller GPUs with the flagship moniker twice -- the 9800 GTX (G92) and the GTX 680 (GK104) -- it's not like they haven't done this before.

Yes, and one can guess the reason is that there was no big-die GPU ready to launch in those years.

My point still stands.

Nvidia has a big-die GPU strategy, based around their Tesla chips. FACT.

GK110 is their Kepler Tesla chip, and it was too expensive and low-yielding to make up the desktop lineup this time around. Instead they seized the day and launched it as a $1000 Titan, whilst bitching to TSMC about the cost of 28nm, 20nm, and so on.

Bad news for consumers: Nvidia has worked out that people will pay $1000 for its big-die GPUs.

AMD needs to knock 20nm out of the park, because Nvidia's pricing model has shifted forever.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes, and one can guess the reason is that there was no big-die GPU ready to launch in those years.

My point still stands.

Nvidia has a big-die GPU strategy, based around their Tesla chips. FACT.

GK110 is their Kepler Tesla chip, and it was too expensive and low-yielding to make up the desktop lineup this time around. Instead they seized the day and launched it as a $1000 Titan, whilst bitching to TSMC about the cost of 28nm, 20nm, and so on.

Bad news for consumers: Nvidia has worked out that people will pay $1000 for its big-die GPUs.

AMD needs to knock 20nm out of the park, because Nvidia's pricing model has shifted forever.

You're assuming AMD will save us and not just follow suit. I'm not so optimistic.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Yes, and one can guess the reason is that there was no big-die GPU ready to launch in those years.

My point still stands.

Nvidia has a big-die GPU strategy, based around their Tesla chips. FACT.

GK110 is their Kepler Tesla chip, and it was too expensive and low-yielding to make up the desktop lineup this time around. Instead they seized the day and launched it as a $1000 Titan, whilst bitching to TSMC about the cost of 28nm, 20nm, and so on.

Bad news for consumers: Nvidia has worked out that people will pay $1000 for its big-die GPUs.

AMD needs to knock 20nm out of the park, because Nvidia's pricing model has shifted forever.

I believe people buying Titan had to pay for "DP" whether they use it or not. Titan is more compute-focused than a gaming monster. I admire its engineering prowess even if I can't admire its pricing.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
You're assuming AMD will save us and not just follow suit. I'm not so optimistic.

AMD can't justify a 380mm2 GPU for $1000; they also don't have any large-die GPUs at all.

The price of the next AMD 8970 is going to be $500-$550 unless AMD wants to be aggressive. I think AMD will be first with 20nm, but they won't be generous on pricing.
 