GTX 680 really a 660 Ti?


BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Actually, from what I've seen looking back at GF114 vs. GF110, there is a ~40% TDP increase for a ~30% performance increase.

GF110 isn't all that much less efficient than GF114.
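A quick back-of-the-envelope check of that claim (just a sketch, using the ~40%/~30% figures above rather than measured numbers):

```python
# Rough perf-per-watt comparison using the figures quoted above
# (assumed: GF110 draws ~40% more power for ~30% more performance than GF114).
gf114_perf, gf114_power = 1.00, 1.00   # GF114 as the baseline
gf110_perf, gf110_power = 1.30, 1.40   # ~30% faster, ~40% higher TDP

gf110_eff = gf110_perf / gf110_power   # ~0.93 vs GF114's 1.00
print(f"GF110 perf/watt relative to GF114: {gf110_eff:.0%}")
# -> ~93%, i.e. GF110 is only ~7% less efficient on these numbers
```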

Keep in mind Nvidia is shipping GK104 with 8+6+6 connectors on the reference PCB.

GK104 gave the so-called "enthusiast" websites like [H] a product that has a high-end name and marginally more performance than the turd AMD put out, while being considerably quieter and using less power to do it.

Neither the 7970 nor the 680 is what I'd call a next-gen enthusiast card. Nvidia pandered to the weakest link in our market: the people who buy an i7-2600K, see how far it can go on the stock cooler, then complain about how hot/loud it gets at 4GHz.


Still waiting for an upgrade / 680 OCing getting fixed / the price to go down. Hopefully we see a 685 or something based on GK110 with a 250W+ TDP that all the so-called "enthusiast" websites hate, but that I can put on water and get 50%+ more performance out of than this 680.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
Actually, from what I've seen looking back at GF114 vs. GF110, there is a ~40% TDP increase for a ~30% performance increase.

GF110 isn't all that much less efficient than GF114.

Keep in mind Nvidia is shipping GK104 with 8+6+6 connectors on the reference PCB.

GK104 gave the so-called "enthusiast" websites like [H] a product that has a high-end name and marginally more performance than the turd AMD put out, while being considerably quieter and using less power to do it.

Neither the 7970 nor the 680 is what I'd call a next-gen enthusiast card. Nvidia pandered to the weakest link in our market: the people who buy an i7-2600K, see how far it can go on the stock cooler, then complain about how hot/loud it gets at 4GHz.


Still waiting for an upgrade / 680 OCing getting fixed / the price to go down. Hopefully we see a 685 or something based on GK110 with a 250W+ TDP that all the so-called "enthusiast" websites hate, but that I can put on water and get 50%+ more performance out of than this 680.

Exactly what I want; I'll take the normal TDP of 250W+ for a REAL enthusiast's card. I'm thinking a GTX 685 will come.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
So they did a better refresh on the GTX460 than they did on the GTX280 - how does that prove anything related to a generation leap?

AMD/ATi:
X700 vs 9800XT - nope
X1600 vs X850XT - nope
HD2600XT vs X1950XTX - nope
HD4850 vs HD3870 - yes... here's the only one for AMD/ATi (HD48xx was a killer product)
HD5770 vs HD4890 - nope (no refresh here), same for HD5830
HD7850 vs HD6970 - nope - jacked up pricing too on the 7-series

nVidia:
6600GT vs FX5900Ultra - yes... here's the only one for nVidia (FX series was a total failure)
7600GT vs 6800Ultra - nope (no refresh here)
8600GT vs 7900GTX - nope
GTS250 vs 9800GTX - those are the same cards... D:
GTX460 vs GTX285 - nope
GTX660Ti(?) vs 580GTX - can't say yet - wouldn't hold my breath though!

I'm saying that the 460 could easily have had GTX 560 Ti performance if they'd wanted. Only the thermals would be different from the GTX 560 Ti.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
I'm saying that the 460 could easily have had GTX 560 Ti performance if they'd wanted. Only the thermals would be different from the GTX 560 Ti.

And it also would:
- have lower yields (= higher price), as fewer chips would hit target clocks
- need better cooling (= louder, or beefier = pricier)
- need a better PSU (OEMs don't like that)
- have far lower OCing capability (so it wouldn't be such a hit in the enthusiast segment)

So... it wouldn't really have been as popular as the GTX 460 was/is, and it wouldn't have sold as much. Instead, they put a product on the market which sold like hot cakes and made its buyers happy. As simple as that. There's no conspiracy here, really ;)
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
And it also would:
- have lower yields (= higher price), as fewer chips would hit target clocks
- need better cooling (= louder, or beefier = pricier)
- need a better PSU (OEMs don't like that)
- have far lower OCing capability (so it wouldn't be such a hit in the enthusiast segment)

So... it wouldn't really have been as popular as the GTX 460 was/is, and it wouldn't have sold as much. Instead, they put a product on the market which sold like hot cakes and made its buyers happy. As simple as that. There's no conspiracy here, really ;)

I think GF100 had more of a say than the things you listed. Nvidia was behind with the 480, missing their target core count and probably clocks too, which affected the 470 as well.

GF104 had to be clocked low enough not to outpace the 470, and still allow the 470 to look decent considering it was $100 more.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
And it also would:
- have lower yields (= higher price), as fewer chips would hit target clocks
- need better cooling (= louder, or beefier = pricier)
- need a better PSU (OEMs don't like that)
- have far lower OCing capability (so it wouldn't be such a hit in the enthusiast segment)

So... it wouldn't really have been as popular as the GTX 460 was/is, and it wouldn't have sold as much. Instead, they put a product on the market which sold like hot cakes and made its buyers happy. As simple as that. There's no conspiracy here, really ;)

But if GF100 had flopped any more than it did, they had a plan B. The GTX 680 seems like plan B all the way.
 

rusina

Member
Mar 20, 2012
31
0
66
Huh? Since when is "high end" determined by die size? It is the fastest GPU available, that alone makes it "high end".

I think people are grasping at straws here. Probably the same people that complained about Fermi 1.0 being big, loud, and hot.

Die size is one thing indicating this. There's poor double-precision performance and relatively low power consumption as well. It's easy to see that GK104 isn't the best possible GPU Nvidia could build for the consumer market.

Also, the claim was that GK104 wasn't originally meant as a high-end GPU. That doesn't necessarily mean Nvidia couldn't sell it as one.

Well... we're now seeing a 28 nm GPU vs a 40 nm GPU. If the design were exactly the same as the 560 ti, it would still be significantly smaller due to feature size change... on the order of 50%. So... you've got a chip that SHOULD be half as big but is only 20% smaller, meaning there's a whole bunch of extra stuff.

It would only be half as big if it were just a die shrink. For example, the GTX 460's GPU had 500 million more transistors than the GPU on the GTX 260, and it was still faster and smaller.
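For reference, the area-scaling arithmetic being argued over works out roughly like this (a sketch assuming ideal scaling and the approximate transistor counts mentioned above; real designs never shrink perfectly):

```python
# Ideal area scaling for the same design moved from 40 nm to 28 nm.
old_node_nm, new_node_nm = 40.0, 28.0
ideal_area_scale = (new_node_nm / old_node_nm) ** 2
print(f"Ideal shrink of an identical design: {ideal_area_scale:.0%} of the old area")  # ~49%

# Approximate transistor counts from the example above (in billions; rough figures).
gtx260_transistors = 1.4   # GT200-based GTX 260
gtx460_transistors = 1.9   # GF104-based GTX 460, ~0.5B more as stated above
extra = gtx460_transistors - gtx260_transistors
print(f"GTX 460 carries ~{extra:.1f}B more transistors, yet was smaller and faster")
```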
 

Mopetar

Diamond Member
Jan 31, 2011
8,162
6,890
136
I think GF100 had more of a say than the things you listed. Nvidia was behind with the 480, missing their target core count and probably clocks too, which affected the 470 as well.

GF104 had to be clocked low enough not to outpace the 470, and still allow the 470 to look decent considering it was $100 more.

I don't think it affected the 470 too much. GF100 certainly had problems, given that the most CUDA cores any of its products shipped with was 480, but the 470 had 448 cores enabled whereas GF104 only had 336, leaving plenty of breathing room. The difference in die size alone was staggering, leaving the 470 enough room that it wouldn't have been necessary to artificially gimp the 460.
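To put a rough number on that breathing room (a sketch using the core counts mentioned above; the full-chip figures in the comments are added context):

```python
# Rough headroom check using the CUDA core counts from the post above.
gtx470_cores = 448   # GF100 part with 448 of a possible 512 cores enabled
gtx460_cores = 336   # GF104 as shipped (the full chip has 384 cores)

headroom = gtx470_cores / gtx460_cores - 1
print(f"GTX 470 has ~{headroom:.0%} more enabled cores than the GTX 460")  # ~33%
```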
 

Mopetar

Diamond Member
Jan 31, 2011
8,162
6,890
136
But if GF100 had flopped any more than it did, they had a plan B. The GTX 680 seems like plan B all the way.

I'm guessing that nVidia just didn't want to repeat the whole thing a second time. Assuming that GK100 was anything like GF100, it would have been massive, probably hot (GK104 is a little hotter than expected, but that could just be the cooler design), and likely would need to have cores disabled just to yield decently. nVidia didn't want to come stumbling out of the gate again, especially if their wafers were even more limited this time around. It just wouldn't be cost effective, or allow them to get things done on time.

I don't know if they rolled the dice or had enough information about what AMD was doing to make a good judgement call, but it's worked out well enough for them. We don't have any real clue what the schedule was looking like on GK100, but it was apparently either behind by enough that nVidia needed to release something or there were some design hurdles that couldn't be fixed without a major overhaul so that it was scrapped. Without any better evidence it's just as likely that nVidia made a decision to cancel a GK100 that might have eventually worked, albeit with lower yields and at a later date than hoped for, or they had no choice but to cancel a GK100 that had no real hope of working as planned without major work and months of time.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't think it affected the 470 too much. GF100 certainly had problems, given that the most CUDA cores any of its products shipped with was 480, but the 470 had 448 cores enabled whereas GF104 only had 336, leaving plenty of breathing room. The difference in die size alone was staggering, leaving the 470 enough room that it wouldn't have been necessary to artificially gimp the 460.

480 > 470 was only one fewer cluster; the only way to go from $350 to $500, when the per-clock performance difference was going to be around 5%, was to set the 470's clock rate noticeably lower than the 480's, and the 480 was already gimped at 700MHz...

So you drop per-clock performance by 5%, and you still need another drop; starting from a 700MHz base you're going to need a decent performance gap between the two, and 256MB more VRAM isn't going to be enough to warrant $150... 5% per clock + a 13% clock reduction = the $350 470.
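Putting that arithmetic together (a sketch only, using the ~5% per-clock figure above and the published 700/607 MHz reference clocks as assumptions, not benchmarks):

```python
# Rough GTX 480 vs GTX 470 gap implied by the figures above.
per_clock_gap = 1.05        # ~5% per-clock advantage from the extra cluster (figure from above)
clock_ratio   = 700 / 607   # reference core clocks: 700 MHz (480) vs 607 MHz (470), ~13% apart

implied_gap = per_clock_gap * clock_ratio - 1
print(f"Implied GTX 480 advantage over the GTX 470: ~{implied_gap:.0%}")  # ~21%
```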



Anyway, I'm just saying the 460's TDP was so low, and it overclocked well enough, that they obviously had headroom in both TDP and clocks, but they were limited more by GF100 than by the other factors listed.


GF114 is GF104, and that's what GK104 is. Will we ever see another GF100-style "BigK" card? I dunno; Nvidia took so much shit over GF100 and barely skated by with GF110, so what's the point?

Look at the reviews for the 680; reviewers seem to love these cards. None of them care that it has awful overclocking potential; none of them care that it isn't much faster than the 580 they've had for over a year. They don't even care that it has no compute power.

All they care about is that it has better perf/watt than the 580/7970, that it's barely any faster, that it's noticeably quieter, and that it's called the GTX 680. Personally, I've never felt so detached from review sites before in my life; they've seemingly killed BigK and welcomed with open arms a product that has no place in a high-end system, yet carries a high-end price tag.
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
I do hope that everyone complaining that it should be called a 660 Ti (and priced around the 560 Ti mark) actually realises that if Nvidia did do this it would basically kill AMD's high-end graphics division.

[Attached image: gtx670ti.jpg]


I think this 660 talk only started after yesterday's release and the architectural comparison between GK104 and GF114.

The picture above, which surfaced several weeks ago, indicated it may originally have been targeted as a GTX 670 Ti, and possibly the second batch of silicon (what do they call it, A2?) allowed them to clock it fast enough to be marketed as the GTX 680.

Or it's possible that the GPU above is completely unrelated and is now something like GK106.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,111
136
I think this 660 talk only started after yesterday's release and the architectural comparison between GK104 and GF114.

The picture above, which surfaced several weeks ago, indicated it may originally have been targeted as a GTX 670 Ti, and possibly the second batch of silicon (what do they call it, A2?) allowed them to clock it fast enough to be marketed as the GTX 680.

Or it's possible that the GPU above is completely unrelated and is now something like GK106.

I think most of the 660 talk resulted from the GPU being called GK104. The 670 Ti talk started because there were some rumors of it being called that - but as you point out, that could have just been confusion by some after seeing 670 Ti-marked parts (which may have been mock-ups of the real 670 Ti that has yet to come). It's all pretty crazy actually :\
 

Destiny

Platinum Member
Jul 6, 2010
2,270
1
0
http://www.overclock.net/t/1254426/...gtx-690-is-a-prime-example-of-nvidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!

I never doubted you... I had a suspicion, but just needed evidence... When I saw the leaked test results for the GTX 670 it just didn't add up, because the performance was too close to the GTX 680 - similar to the GTX 560 Ti and GTX 560... Now they are selling the GTX 660 Ti (now the GTX 680) and the GTX 660 (now the GTX 670) as high-end cards because the performance does designate them as high-performance parts... it also may explain the low supply, because they were caught off guard by the performance of the card...

Good Job btw! :biggrin:
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
http://www.overclock.net/t/1254426/...gtx-690-is-a-prime-example-of-nvidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!

Please credit the proper source, which is VR-Zone, not some forum.

http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html

This reminds me of the Intel Core scenario where Intel messed up with Netburst after a while and had to dig deep into their Israeli "high efficiency/mobile/Pentium M" team which was working on innovative energy efficiency measures. They took the high-efficiency architecture lessons from Pentium M and turned those lessons into Core and Core 2 Duo.

First there were energy guzzlers, but the heat/power/noise got so bad that performance hit a wall.

Then they turned into high-efficiency architectures.

Then they supercharged the high-efficiency architectures to make something both fast and relatively efficient.

P.S. This is also somewhat old news. TPU has had this article up for ages: http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
You never own the fastest video card made for very long, because a faster one is just around the corner! That being said, last week I bought what was the fastest Nvidia card until the GTX 690 debuted. Did I pay too much? It's all relative. I was in the market for a modern single GPU to power 3 monitors at a combined 5760 x 1080 resolution. My realistic choices were an AMD 7970 vs. an Nvidia GTX 680. Was I wrong? I don't think so, but that's only me. BTW, I have the card, and but for a failed SSD, the system runs much better and smoother in games than the 6970 did. For me it was worth it. I have no buyer's remorse, as I'm flying a WWI plane across 3 monitors with incredible fps at high resolution. My fellow posters who own the 7970 can make the same claims, and I'm glad for them too.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
So I heard several rumors about this and found several people posting about it. Is it true that Nvidia released their GTX 660 Ti (or another lower-end card) as a 680 because they didn't consider ATI to offer enough competition?

Don't think it was that simple; GK110 would have been the 680, but it probably had yield issues. GK104 didn't have its big brother's compute, so it clocked well enough to beat or equal the 7970 - so let's get it out the door to start the 28nm lineup... probably why they have the 690 out now rather than the smaller GPUs in the lineup...
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
We agree with everything you are saying... all we are trying to say is that this card was not supposed to be NV's top-end card.

What the heck are they going to call the real 680 when it comes out? If I was NV I would have called it a 660 Ti just to bust AMD's balls and to let them know they have a monster waiting.

The OP asked if it's a relabeled GPU, and it is; it's 100% not a real 680, just called one because it beats AMD's top-end GPU.

Don't forget it's on a 256-bit bus also.

Interestingly enough, this is exactly what I would have done too... that would have killed any enthusiasm for the AMD lineup, with the thought of higher-end parts to come. I guess it's moot, as there is no further high end, but then I guess they wouldn't have been able to charge $500 for a mid-range part...
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
This reminds me of the Intel Core scenario where Intel messed up with Netburst after a while and had to dig deep into their Israeli "high efficiency/mobile/Pentium M" team which was working on innovative energy efficiency measures. They took the high-efficiency architecture lessons from Pentium M and turned those lessons into Core and Core 2 Duo.
If memory serves me right, Pentium D (Netburst) had also been developed by the same Israeli team.