Any GeForce 660/660 Ti Speed Theories?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Anyway, one other factor might be the PCB and power circuitry: an over-engineered board (say, a 7970 able to OC from 200W to 350W) might 'waste' a high percentage of power, while a board with little headroom (say, a reference GTX 670) might 'waste' only a medium percentage.

I think they can squeeze out a 10-15W reduction in power by dropping the complexity of the PCB with a 192-bit bus. The bus tends to be heavy on power consumption. Also, TDP != real-world power consumption, and those specs are still not 100% proven (although Sweclockers says it's a 100% correct leak). BTW, the HD7970 doesn't go from 200W to 350W of power. It's more like from 190W to 237-245W at 1150MHz. 350W of power on a 7970 is probably when applying 1.3-1.35V+ and overclocking to 1350-1375MHz on water. If those specs are correct, it's odd how NV was able to get a 192-bit bus and 32 ROPs working, unless they decoupled them somehow OR the card has 2 failed memory controllers and all this time NV was piling up the failed GTX670 chips.
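For a rough sanity check on those numbers: dynamic GPU power scales roughly linearly with clock and with the square of voltage. A quick sketch of that rule of thumb (the ~925MHz/~1.175V reference point and the simple P ∝ f·V² model are my assumptions, not measured data):

```python
def scaled_power(p_watts, f_old_mhz, f_new_mhz, v_old, v_new):
    """Estimate power after an overclock, assuming dynamic power
    scales linearly with clock and with voltage squared."""
    return p_watts * (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

# Start from ~190W at the reference ~925MHz / ~1.175V:
print(round(scaled_power(190, 925, 1150, 1.175, 1.175)))  # 1150MHz at stock voltage: ~236W
print(round(scaled_power(190, 925, 1250, 1.175, 1.300)))  # 1250MHz at 1.3V: ~314W
```

That lines up with the 237-245W figure quoted for 1150MHz, and puts the 1250MHz/1.3V ballpark in the low 300s; the remaining gap to a measured ~337W is plausibly leakage rising with voltage and temperature, which this simple model ignores.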
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
BTW, the HD7970 doesn't go from 200W to 350W of power. It's more like from 190W to 237-245W at 1150MHz. 350W of power on a 7970 is probably when applying 1.3-1.35V+ and overclocking to 1350-1375MHz on water. If those specs are correct, it's odd how NV was able to get a 192-bit bus and 32 ROPs working, unless they decoupled them somehow OR the card has 2 failed memory controllers and all this time NV was piling up the failed GTX670 chips.

Yes, I was actually praising the engineering on those 7970 PCBs, but the quoted figure is from ht4u.net and it is indeed
MSI Radeon HD 7970 Lightning [OC: 1250 / 1600 MHz – 1.3 V]: 336.97W

As for the GTX 660, I'd say piling up failed 670s is exactly what they have been doing: Nvidia are very, very good at die harvesting.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Same as the 550 Ti. It's doable. You basically suffer a performance penalty on some parts of the memory.

Also, ROPs aren't linked to the memory bus.
With its 192-bit bus, the 550 Ti has 24 ROPs, not 32, and that's because they supposedly are linked to the bus width on Nvidia's architecture.
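On Fermi (and Kepler) parts, each 64-bit memory controller has carried one ROP partition, which is why cutting the bus has cut ROPs with it. A sketch, assuming the 8-ROPs-per-partition layout of those chips:

```python
def rop_count(bus_width_bits, rops_per_partition=8):
    """ROPs implied by bus width when each 64-bit memory controller
    carries one ROP partition (Fermi/Kepler-style layout)."""
    return (bus_width_bits // 64) * rops_per_partition

print(rop_count(192))  # 24 -> GTX 550 Ti
print(rop_count(256))  # 32 -> GTX 680
print(rop_count(384))  # 48 -> GTX 580
```

Under that layout, a 192-bit card with 32 ROPs would indeed require decoupling the ROPs from the memory controllers, which is what makes the leaked spec look odd.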
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Who the hell gives GPU power consumption AT THE WALL?

Surely not ze germans.

Nice chart btw
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Looks like we did get trolled this generation. NV is rumored to sell the same GK104 chip with 2GB of VRAM to us for $299 now (August 16th). :D

GK104 = mid-range Kepler is all but sealed now if these shots are real.

[Attached images: alleged GTX 660 Ti box/card shots]


If NV can afford to sell GK104 for $299 just 4.5 months after launching GTX680 for $499, despite higher wafer costs of 28nm generation, ummmm...ya....
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
If NV can afford to sell GK104 for $299 just 4.5 months after launching GTX680 for $499, despite higher wafer costs of 28nm generation, ummmm...ya....



but but... only a few days ago GK104 was barely manufacturable,
and now all of a sudden yields of fully functional dies are 100%?

:hmm:
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
some of our forum friends too :|

Anyway, I've got no problem with any price in a free market, but

$500 for a 290mm2 die with +25% over last gen was never an imba offer to begin with.

This GPU I could buy one day for, say... $250??

Remember Brad Pitt in Snatch?
It's a fair deal, take it

:D
 
Last edited:
Feb 19, 2009
10,457
10
76
With the bandwidth on this card, it's not a case where a simple OC can catch up to the higher tier's performance. I.e. GTX 670 OC ~= GTX 680 OC, a tiny difference not noticeable in gameplay. But with much less bandwidth to start... not gonna happen: GTX 660 Ti OC < GTX 670.
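The bandwidth gap is easy to put a number on. Assuming the rumored 192-bit bus and the same 6008MT/s GDDR5 the 670/680 use (both assumptions, since the specs are still just a leak):

```python
def bandwidth_gbps(bus_width_bits, effective_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfer rate."""
    return (bus_width_bits / 8) * effective_mtps / 1000

bw_660ti = bandwidth_gbps(192, 6008)  # ~144.2 GB/s
bw_670   = bandwidth_gbps(256, 6008)  # ~192.3 GB/s
deficit = (bw_670 / bw_660ti - 1) * 100
print(f"660 Ti would need a +{deficit:.0f}% memory OC just to match a stock 670")
```

A one-third memory overclock just to reach parity is not realistic on GDDR5, which is the point: the core OC can't buy back what the narrower bus takes away.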

So they are differentiating the market, this "mid-range" product is to stay mid-range in performance.

The question is why bother with such a product when a ~$310 7950 3GB with an excellent custom cooler (not that lame reference crap) easily hits GTX 680 OC speeds. Or rather, when the prices on the 78xx are so low now, and those are just as or more power efficient...

Still, it'll sell by the bucket load because mid-range cards typically sell great and loyal NV fans don't really care about details.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Looks like we did get trolled this generation. NV is rumored to sell the same GK104 chip with 2GB of VRAM to us for $299 now (August 16th). :D

GK104 = mid-range Kepler is all but sealed now if these shots are real.

If NV can afford to sell GK104 for $299 just 4.5 months after launching GTX680 for $499, despite higher wafer costs of 28nm generation, ummmm...ya....

That's assuming the 660 chips are fully functioning GK104's. If not (highly likely), anything they get for them is a bonus.
 

MrMuppet

Senior member
Jun 26, 2012
474
0
0
That has to be at the wall. With their 80Plus Gold PSU, that's about 300W.
Why would it be at the wall? Clearly their entire test system wouldn't draw only ~46W with an ASUS Radeon HD 7750 OC at load, so I think it's safe to assume that it's what the actual card draws, no?

Anyway, it says here (translated from the German):
We determine the power consumption of the graphics card on its own, using a modified PCI-Express adapter in our lab. The values obtained this way correspond only to the consumption of the graphics card itself, not the power consumption of the entire system. The power through the PCI-Express slot, as well as through the 12-volt supply lines, is measured simultaneously using a clamp ammeter. The (constant) power draw of the 3.3-volt rail is calculated separately and is included in the overall results shown. For more details and background on the measurements, see our article on the power consumption of graphics cards.
http://ht4u.net/reviews/2012/amd_radeon_hd_7970_ghz_edition_tahiti_xt2_test/index14.php

edit: Here are the charts sorted from lowest to highest power consumption (the first chart is idle, the second "Load (games)", and the third Furmark, obviously):
http://ht4u.net/reviews/2012/amd_radeon_hd_7970_ghz_edition_tahiti_xt2_test/index15.php
edit2: Those are at stock, so for reference only, I guess.
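To make the two readings concrete: ht4u sums the current on the slot's 12V pins and the auxiliary 12V leads (clamp ammeter) plus a fixed 3.3V-rail estimate, so a figure like 336.97W is DC into the card, not wall power. A sketch with hypothetical numbers (the 3.3V constant and the current split are made-up illustrative values, not ht4u's data):

```python
def card_power_w(slot_12v_amps, aux_12v_amps, rail_3v3_w=2.0):
    """DC power into the card: measured 12V currents plus a constant 3.3V share."""
    return 12.0 * (slot_12v_amps + aux_12v_amps) + rail_3v3_w

reading = card_power_w(5.0, 22.9)  # hypothetical split adding up to ~337W
print(round(reading, 1))           # card-only DC power, per ht4u's method

# The earlier "about 300W" guess went the other way: it assumed 337W was
# a wall figure and multiplied by ~90% (80Plus Gold) to get the DC side:
print(round(336.97 * 0.90))  # ~303
```

Since ht4u measures at the card, no PSU-efficiency correction applies; the ~303W interpretation only follows if you assume a wall-side measurement.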
 
Last edited:

Haserath

Senior member
Sep 12, 2010
793
1
81
Nvidia sold a GF100 die for under $300 a couple of months after launch. Does that mean GF100 was always meant for the $300 price point?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Nvidia sold a GF100 die for under $300 a couple of months after launch. Does that mean GF100 was always meant for the $300 price point?

I think that was more about random price cuts; NV never lowered the official MSRP of the GTX470 a couple of months after launch. The MSRP on the GTX470 was $350, and it was almost 6 months before cards started to drop significantly (and that's because the GTX570/580/6950 were introduced). Yes, there were occasional sales on the 470 with rebates (like that TigerDirect GTX470-for-$190-a-piece sale in July 2010). Also, that time was tricky, since the HD5850 was going for $260-280 vs. $350 for a GTX470!

[Screenshot: Newegg pricing]


Also, the GTX460 was not made using the GF100 die, which is the main point: GK104 is going into a mid-range GTX660Ti. The GTX465 had used failed GF100 dies, as a limited card run.

NV has in the past released mid-range cards on the same chip as the flagship (G92, for example), but that was not on the 1st round of a new nm node. They waited until wafer prices dropped. The original G80 (90nm) was the flagship, with G92 (8800GT) serving as the mid-range almost 1 year later. Then for the "fake/arbitrary" GeForce 9 generation they used G92 in the 9800GTX "refresh", but that was because it was built on the much smaller/cheaper 65nm. Before that, we'd have to go all the way back to GeForce 4 for NV using 1 chip from mid-range to flagship on the 1st round of a node.

In this case, right off the bat on the supposedly expensive 28nm node, they suddenly have no problem selling GK104 for $300? 3DVagabond made a valid point that these could be failed GK104 chips, which means NV would rather sell them than throw them out. However, if the GTX660Ti is to be a mainstream card, that's a lot of failed chips they'd need to meet demand.

The question is why bother with such a product when a ~$310 7950 3gb with excellent custom cooler (not that lame reference crap) easily hitting gtx680 OC speeds. Or rather, when the prices on the 78xx are so low now, which are just as or more power efficient...

Still, it'll sell by the bucket load because mid-range cards typically sell great and loyal NV fans don't really care about details.

There you go. That reference 660Ti cooler won't stand a chance against the PowerColor PCS+ or MSI TwinFrozr. I've been recommending the 7950 and not waiting for the GTX660Ti because of overclocking, great aftermarket coolers, and the bonus 3GB of VRAM for mods. I somehow doubt an overclocked GTX660Ti could beat an overclocked 7950.
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Looks like the 660Ti is getting the same shitty reference cooler the GTX 670 did. It has this annoying humming noise, even at idle. This is due to the fan being mounted in a plastic housing. Many reference GTX 670's also had bad coil whine.

In short, better to stay away and buy cards with custom coolers instead.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Looks like the 660Ti is getting the same shitty reference cooler the GTX 670 did. It has this annoying humming noise, even at idle. This is due to the fan being mounted in a plastic housing. Many reference GTX 670's also had bad coil whine.

In short, better to stay away and buy cards with custom coolers instead.

Right, but chances are most of those custom versions will then be $10-20 more expensive. In that case unless GTX660Ti has 35-40% overclocking headroom, HD7950 will probably be the enthusiast choice in the segment.

MSI TwinFrozr 7950 is now $310 with 3 free games
Gigabyte Windforce 3X is $320 with 3 free games

As a single card, the 660Ti doesn't look much better than the 7950 right now. I can see how it will do well for the mainstream market that won't be overclocking and appreciates the lower power consumption, though. The dark horse may be $650 custom-cooled GTX660Ti SLI. That could mop the floor with a single GTX680. :thumbsup:
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Right, but chances are most of those custom versions will then be $10-20 more expensive. In that case unless GTX660Ti has 35-40% overclocking headroom, HD7950 will probably be the enthusiast choice in the segment.

MSI TwinFrozr 7950 is now $310 with 3 free games
Gigabyte Windforce 3X is $320 with 3 free games

The 660Ti is a tough sell for me, to be honest. I can see how it will do well for the mainstream market that won't be overclocking and gets great power consumption, though. The dark horse may be $650 custom-cooled GTX660Ti SLI. That could mop the floor with a single GTX680. :thumbsup:

I have to admit, even if I like Nvidia cards, those MSI and Gigabyte 7950's are really looking good for their price.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
RS, keep in mind that the GK104 die is smaller than the GF104/114,
i.e. this gen's flagship die < previous gen's mid-range.

Also remember that, courtesy of high perf/W, all the other board components are cheaper than their Termie counterparts; so indeed, why couldn't Nvidia afford to sell GK104 for $299?
They seem to be doing fine selling larger GPUs for the same money.

So what are you trying to say :hmm:
That Nvidia makes $#load of money on 670/680?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So what are you trying to say :hmm:
That Nvidia makes $#load of money on 670/680?

:biggrin: Yup. That's what I am saying. I think GK104 was upper-mid-range all along (probably planned as GTX660Ti/670 for the GTX670/680 slots), but NV couldn't get GK100 out on time (for cost, die size, and capacity/wafer-constraint reasons). Then when they saw the HD7970, they realized they could use GK104 as the GTX670/680. Many different sources point to this: NV was very worried before the HD7970 launched (when was the last time NV was very worried that AMD would beat their flagship?); per quotes, NV was 'relieved' when the HD7970 launched and underwhelmed by its performance (really? If a 10% lead over the 7970 is huge, then I guess I have a different definition of underwhelmed; but if GK104 was mid-range and still beat the 7970 by 10%, that's underwhelming for AMD indeed); the GK104 codename marks it as the successor to GF114; the lack of GPGPU compute, when NV has paid millions of dollars to promote it since G80; NV deciding to sell GK100 as the K20 for $3-5k to professional markets; GK104 having the smallest lead over the previous flagship of any true next-generation NV flagship before it; and GK104 not increasing memory bandwidth over the previous flagship (which, IIRC, has never happened with an NV next-gen flagship part)....

That's a lot of coincidences. If GK104 was profitable from day 1 at $299, it's no wonder NV can sell it in a GTX660Ti (unless 3DVagabond's view is the correct one in that most of these are stockpiled / failed GTX670 chips with damaged/disabled memory controllers).

I don't think this matters anymore, though, since GK100/110 isn't coming to the desktop this year it seems, so the GTX680 is the flagship card for all intents and purposes. But all I was trying to say is that this generation we didn't get a true flagship from NV. 35% faster than the GTX580 for $500 is not great, especially since a GTX480 can get you 580 speeds and that card is more than 2 years old! Perhaps NV just decided to trade die size and performance for efficiency, though, because a lot of consumers seem to care about a 40-50W power difference on flagship cards lately.
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
But we already knew that this gen's perf gains would be modest,
ever since Tahiti landed, and ever since we learned that GK104 is smaller than 300mm2.
Granted, we still kinda hoped for GK100/110 to make it.

The thing is......... Nvidia's human resources have been/are heavily diverted into Tegra.
And with that in mind, I'm not too unhappy with what they brought to the table.
TXAA, NVENC (a hardware H.264 encoder), and Bindless Textures are a neat little set of new features.


Also, Nvidia really did focus hard on perf/W this gen.
Everything in the design was subjected to improving perf/W.

In the end it was a good decision. Perf/W is what gave them all those IB mobile wins.
Perf/W is what gave them the mindshare they lost with Termie.
And they have huge expectations for mobile parts this year.

In the end, perf/W is what makes it possible to put in $#tty parts, and the GPU still won't blow up :)
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
But we already knew that this gen's perf gains would be modest,
ever since Tahiti landed, and ever since we learned that GK104 is smaller than 300mm2.
Granted, we still kinda hoped for GK100/110 to make it.

The thing is......... Nvidia's human resources have been/are heavily diverted into Tegra.
And with that in mind, I'm not too unhappy with what they brought to the table.
TXAA, NVENC (a hardware H.264 encoder), and Bindless Textures are a neat little set of new features.


Also, Nvidia really did focus hard on perf/W this gen.
Everything in the design was subjected to improving perf/W.

In the end it was a good decision. Perf/W is what gave them all those IB mobile wins.
Perf/W is what gave them the mindshare they lost with Termie.
And they have huge expectations for mobile parts this year.

In the end, perf/W is what makes it possible to put in $#tty parts, and the GPU still won't blow up :)

True... The only reason I went for GTX 670's was that my power-hungry GTX 570 died. I think it literally burnt up because Nvidia implemented too weak VRM circuitry on the GTX 570 vs. the GTX 580.

At first I bought a reference GTX 670, which I was very disappointed with. It had bad coil whine and a shitty cooler making a grinding noise, even at idle. I later bought an AC Twin Turbo II for it, but could never get rid of the terrible coil whine. It ended up in my secondary PC (I should have sent it back while I had the chance). The only reason I bought this card was that custom GTX 670's were not in stock, and I did not think the ASUS GTX 670 DC II would fit in my case (it did fit after all).

Then I went for 2x ASUS GTX 670 DC II 'Non-TOP'. You may ask why I bought two. Well, because of the nice power usage I went all in. And because I could afford it. Very happy with these cards. No coil whine, and very quiet, even under load.

But if my GTX 570 had not died, all this might never have happened. It really had good enough performance for 1080/1200p.