GTX 460 overclocks from 675 MHz to 830 MHz without a voltage bump - faster than GTX 470


SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
What they "need" is immaterial, what they can and will deliver to the market is the only important thing to consider. No news on refresh. No news on a dual-chip card yet.

Further, having the performance crown means very little - the reason you would want that crown is to make money. They need products with good margins and, despite the amount of FUD surrounding yields, the general consensus is that yields are low and manufacturing costs are high - not a good combination.

He said 'Fermi', not 'they'......do you understand the difference?...And as for performance crown meaning very little, tell that to the Football, Rugby or Softball world champions.....

But I guess you know what a GPU business requires ah?....not!
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
He said 'Fermi', not 'they'......do you understand the difference?...And as for performance crown meaning very little, tell that to the Football, Rugby or Softball world champions.....

But I guess you know what a GPU business requires ah?....not!

Chill dude.

You are the one misunderstanding here.

Have a look at happy medium's full post:

They don't need it.
They need the faster dual card to release; they have the fastest single gpu card.

See? Fermi needs the faster dual card? Fermi has the fastest single gpu card?

And by the way, Italy is in a delicate situation in the WC, and France, which finished 2nd last time, has gone home already. :)
 
Last edited:

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Gtx 475 with 384 shaders and gf 104 core?
Sounds possible. If this gt 460 with 336 cores = a gtx 465, then a full blown 384 shader part could beat a gtx 470.

http://www.fudzilla.com/graphics/graphics/graphics/nvidias-gf104-is-crippled-from-the-start

I think I remember Charlie from SemiAccurate saying the same thing. :hmm:

But like I said, it's just a rumour...


http://www.semiaccurate.com/2010/06/21/what-are-nvidias-gf104-gf106-and-gf108/

I love Charlie; his sense of humour has an edge and makes some of us not like him much.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Chill dude.

You are the one misunderstanding here.

Have a look at happy medium's full post:



See? Fermi needs the faster dual card? Fermi has the fastest single gpu card?

And by the way, Italy is in a delicate situation in the WC, and France, which finished 2nd last time, has gone home already. :)

Meh, maybe you're right...!? Italy... LOL, who would have thought we would hold them to a draw? LOL, and France, can't stop laughing!
 

tincart

Senior member
Apr 15, 2010
630
1
0
He said 'Fermi', not 'they'......do you understand the difference?...And as for performance crown meaning very little, tell that to the Football, Rugby or Softball world champions.....

But I guess you know what a GPU business requires ah?....not!

Reading comprehension fail.

Further, video cards are not football teams. I'll let you struggle with that one for a little bit, you might need to break out the crayons and draw a diagram.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Italy... LOL, who would have thought we would hold them to a draw? LOL, and France, can't stop laughing!

Didn't notice you were from New Zealand. Well done! :)

I still prefer the 7-0 hammering my country gave, minnows or no minnows. :)
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
http://www.semiaccurate.com/2010/06/21/what-are-nvidias-gf104-gf106-and-gf108/

I love Charlie; his sense of humour has an edge and makes some of us not like him much.
"It looks like the initial GF104s are going to have at least one block of 32 shaders disabled if not two. The 336 shader number is likely an artifact of the stats program not being fully aware of the new chip yet, but Nvidia might have added the ability to disable half a cluster. GF108s shown at Computex had 96 of 128 shaders active."

funny how Charlie claims all this inside knowledge yet he is too stupid to realize that the GF104 appears to be using 24sp clusters.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
funny how Charlie claims all this inside knowledge yet he is too stupid to realize that the GF104 appears to be using 24sp clusters.

He addressed that in the forums.

But in fact 24sp clusters would be a departure from the traditional 16sp clusters of previous architectures and even the 32sp clusters of GF100.

And while 16sp and 32sp clusters are very similar in how they work (the execution model that all NVIDIA GPUs since G80 operate on has a SIMD width of 32), 24sp could imply quite a difference in how you write drivers, CUDA applications, etc., for it.

NVIDIA could even be using the old 16sp clusters, which would give GF104 24 x 16sp clusters and leave the rumoured 336-core card (21 16sp clusters) with 3 disabled clusters. That might make more sense - it is not as if 24sp would be the same architecture as GF100 either, and 16sp clusters would keep all existing software compatible.
 
Last edited:
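A minimal sketch of the cluster arithmetic discussed in the post above, assuming a full GF104 of 384 shaders and the rumoured 336-shader salvage part (neither figure is confirmed; the variable names are just for illustration):

Code:
# Hypothetical GF104 shader counts under different cluster-size guesses.
# 384 (full die) and 336 (rumoured cut-down part) are thread speculation, not confirmed specs.
FULL = 384
SALVAGE = 336

for cluster in (16, 24, 32):
    full_clusters, full_rem = divmod(FULL, cluster)
    cut_clusters, cut_rem = divmod(SALVAGE, cluster)
    print(f"{cluster}sp clusters: full die = {full_clusters} (rem {full_rem}), "
          f"336sp part = {cut_clusters} (rem {cut_rem}), "
          f"disabled = {full_clusters - cut_clusters}")

With 16sp clusters both counts divide evenly (24 vs 21 clusters, 3 disabled); with 24sp you get 16 vs 14 (2 disabled); with 32sp the 336 figure only works if half a cluster can be disabled, which is exactly the caveat in the SemiAccurate quote earlier in the thread.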

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Reading comprehension fail.

Further, video cards are not football teams. I'll let you struggle with that one for a little bit, you might need to break out the crayons and draw a diagram.

Yes, right, right... I only took in the last line he said... my bad! However, I still think you miss the point re: the crown. Too bad!
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
He addressed that in the forums.

But in fact 24sp clusters would be a departure from the traditional 16sp clusters of previous architectures and even the 32sp clusters of GF100.

And while 16sp and 32sp clusters are very similar in how they work (the execution model that all NVIDIA GPUs since G80 operate on has a SIMD width of 32), 24sp could imply quite a difference in how you write drivers, CUDA applications, etc., for it.

NVIDIA could even be using the old 16sp clusters, which would give GF104 24 x 16sp clusters and leave the rumoured 336-core card (21 16sp clusters) with 3 disabled clusters. That might make more sense - it is not as if 24sp would be the same architecture as GF100 either, and 16sp clusters would keep all existing software compatible.
So did you forget that the previous architecture was based on 24sp clusters? The GTX 260/280/275/285/295 and several other lower-end cards all had 24sp clusters.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
So did you forget that the previous architecture was based on 24sp clusters? The GTX 260/280/275/285/295 and several other lower-end cards all had 24sp clusters.

http://www.anandtech.com/show/2549/4

While, as you say, they are physically organized in groups of 24 (actually 3 groups of 8sp), they work as groups of 32 - 4 x 8sp groups.

The problem is that GF100 shader groups are dual issue and much more interconnected to allow that.

On G80/G90/GT200, NVIDIA's implementation retires one SIMD instruction per SM over 4 clock cycles - 8 SIMD "threads" per clock, or 1 per core.

On GF100, you get dual issue: two SIMD instructions are retired over two clocks - 2 x 16 SIMD "threads" per clock, again 1 per core.

The old chips were organized quite differently. G80/GT200 shader clusters were quite similar to each other and ironically were in some ways more modular - they all shared the same front end.

Now in GF100 the groups of 32 shaders are much more tightly interconnected. That means they can't go "borrow" the 8 missing SPs from another group to reach a SIMD width of 32.

EDIT: A bit more on the G80/GT200 vs GF100 differences: http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/3
 
Last edited:
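A quick back-of-the-envelope model of the per-SM issue rates quoted in the post above (an illustration of the post's numbers only, not a cycle-accurate description of the hardware; the function name and parameters are illustrative):

Code:
# SIMD "threads" retired per clock for one SM, per the figures in the post above.
def simd_threads_per_clock(simd_width, issue_slots, clocks_per_instruction):
    return issue_slots * simd_width / clocks_per_instruction

# G80/GT200 style: one 32-wide SIMD instruction retired over 4 clocks on 8 cores.
old = simd_threads_per_clock(simd_width=32, issue_slots=1, clocks_per_instruction=4)
# GF100 style: dual issue, each 32-wide instruction retired over 2 clocks on 2 x 16 cores.
new = simd_threads_per_clock(simd_width=32, issue_slots=2, clocks_per_instruction=2)

print(old, old / 8)    # 8.0 threads per clock, 1.0 per core
print(new, new / 32)   # 32.0 threads per clock, 1.0 per core

Both organizations end up at one "thread" per core per clock; the difference the post is pointing at is how tightly the cores inside an SM have to be wired together to get there.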

at80eighty

Senior member
Jun 28, 2004
458
5
81
Reading comprehension fail.

Further, video cards are not football teams. I'll let you struggle with that one for a little bit, you might need to break out the crayons and draw a diagram.

I lol'd. Yeah, it was a weird analogy.

Also, this development, if true, is interesting - hopefully NVIDIA will have a v2.0 of Fermi out by the holidays. I'd be very interested to consider them at this rate.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
NVIDIA has been behind AMD in texture fillrate this generation, I'd be interested to see if that helps them out. Still, I think they missed their market setting the price at $250. For $200, this would have been the card a lot of people have been waiting for. At $250, it just adds another "meh" into the mix, and this assumption is based completely on its projected performance, nevermind noise, power consumption, aesthetics, etc. 5850's are available for $250-260 right now, and should significantly outpace these cards. I just don't see them competing well.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
NVIDIA has been behind AMD in texture fillrate this generation, I'd be interested to see if that helps them out. Still, I think they missed their market setting the price at $250. For $200, this would have been the card a lot of people have been waiting for. At $250, it just adds another "meh" into the mix, and this assumption is based completely on its projected performance, nevermind noise, power consumption, aesthetics, etc. 5850's are available for $250-260 right now, and should significantly outpace these cards. I just don't see them competing well.

Every time you guys post prices on cards I get all excited that we have some price drops, and then I look and see $285 for the cheapest 5850.

Or are you saying that once in a while you can find them on sale with Bing cashback and a rebate for $250/260?

Where do you shop?

I don't find a $250 card that's cooler, draws less power, and is quieter (than a GTX 465), and overclocks to be faster than a GTX 470, a "meh" card.

Another thing: as we have been seeing, it's not supposed to compete with anything; it's supposed to fit in between other cards on price and performance, just like the GTX 470 fits in between the 5850/5870 price/performance-wise and the GTX 480 fits in between the 5870 and 5970.
The GTX 465 fits in between the 5830 and 5850.

I have a strong feeling this card will replace the $250 GTX 465 with better performance, thermals and overclocking.

The 768MB model will fit in between the 5770 and 5830 and be priced around $200, unless the 5830's price keeps dropping.

Make sense? No competition, no price war - sucks for us!
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Every time you guys post prices on cards I get all excited that we have some price drops, and then I look and see $285 for the cheapest 5850.

Or are you saying that once in a while you can find them on sale with Bing cashback and a rebate for $250/260?

Where do you shop?
TigerDirect. Two models are available for $290; 13.2% cashback gets you the card for ~$252. There have been deals like this for months.

I don't find a $250 card that's cooler, draws less power, and is quieter (than a GTX 465), and overclocks to be faster than a GTX 470, a "meh" card.
It's slightly faster than the GTX470 (and within the margin of error) after receiving a 33% overclock. Considering you have a 5850 available at the same price that competes with the GTX 470 at stock, and typically gets a 30-40% overclock with no sweat, I fail to see where one would be impressed. What, NVIDIA finally released a card that wasn't a flaming pile of suck (flaming being the operative word there)?

Another thing: as we have been seeing, it's not supposed to compete with anything; it's supposed to fit in between other cards on price and performance, just like the GTX 470 fits in between the 5850/5870 price/performance-wise and the GTX 480 fits in between the 5870 and 5970.
The GTX 465 fits in between the 5830 and 5850.
The "fits in" argument is a joke at best. The performance is so close between all of these cards you could interchange any card in the high end and no one could tell the difference in most games. Add overclocking into the mix and the point becomes even stronger. Price and scalability is all that matters.

I have a strong feeling this card will replace the $250 GTX 465 with better performance, thermals and overclocking.

The 768MB model will fit in between the 5770 and 5830 and be priced around $200, unless the 5830's price keeps dropping.

Make sense? No competition, no price war - sucks for us!
Hopefully this is the first step in NVIDIA making a halfway decent product with the Fermi architecture. Competition is always a good thing.
 

tincart

Senior member
Apr 15, 2010
630
1
0
I don't find a $250 card that's cooler, draws less power, and is quieter (than a GTX 465), and overclocks to be faster than a GTX 470, a "meh" card.

You seem to be extremely certain about the detailed specifications and performance of an unreleased card that no one has reviewed yet based on an unconfirmed leak from one site with one synthetic benchmark. Your powers of inference astound me.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
You seem to be extremely certain about the detailed specifications and performance of an unreleased card that no one has reviewed yet based on an unconfirmed leak from one site with one synthetic benchmark. Your powers of inference astound me.

Detailed specs have been out for days.
It's very easy to speculate: just insert the card into the correct slot of price and performance.
It's been this way since the Fermi launch.

Price/performance:

5970 - $700
GTX 480 - $479
5870 - $390
GTX 470 - $330
5850 - $285
GTX 465 - $250 (soon to be GTX 460 1GB, 256-bit memory, $250)
5830 - $220
GTX 460 768MB, 192-bit memory - $200
5770 - $160
GT 450 - $140
5750 - $120

There it is, in a nice neat organized package, come August.
I bet you I'm damn close.
No price war and no competition, perfect for the gpu companies and sucks for us!
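The speculated ladder above is easy to lay out as data and sort; a minimal sketch using the post's own rumoured prices (none of these are official figures):

Code:
# Rumoured mid-2010 price ladder from the post above - forum speculation, not official pricing.
lineup = [
    ("5970", 700), ("GTX 480", 479), ("5870", 390), ("GTX 470", 330),
    ("5850", 285), ("GTX 465 / GTX 460 1GB 256-bit", 250), ("5830", 220),
    ("GTX 460 768MB 192-bit", 200), ("5770", 160), ("GT 450", 140), ("5750", 120),
]

for card, price in sorted(lineup, key=lambda kv: kv[1], reverse=True):
    print(f"${price:>3}  {card}")

Sorted this way, every NVIDIA part slots in between two AMD parts on price, which is the "no competition, no price war" point being made.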
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I know, but for it to be considered good (re: power consumption) it should at least match the competition. A 5850 will use 130W or less. If this performs less than a 5850, it should use even less power.

Says who? Who really cares that one product uses 170 watts instead of 150?
It's when it's 250 watts+ and heat, noise and power draw become a problem that things start to suck.

The GTX 260/280/285 and 4870/4890/4870X2 were all power-hungry (180 watts+), sometimes loud, hot cards, and no one cared then.
If you do, well, we all know that, because that's all you post about.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
The GTX 260/280/285 and 4870/4890/4870X2 were all power-hungry (180 watts+), sometimes loud, hot cards, and no one cared then.
If you do, well, we all know that, because that's all you post about.

Because there was no alternative?

It isn't the same this time.

And you can bet if the price of the GTX480 was the price of the 5870 and vice-versa and the same for GTX470 and 5850, NVIDIA would receive more praise even consuming more power.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Because there was no alternative?

It isn't the same this time.

And you can bet if the price of the GTX480 was the price of the 5870 and vice-versa and the same for GTX470 and 5850, NVIDIA would receive more praise even consuming more power.

Truthfully, the only time I worry about power and heat is when my case is too small, has bad airflow, and my PSU is too weak.

Other than that, give me the fastest thing I can afford. :)
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
Truthfully, the only time I worry about power and heat is when my case is too small, has bad airflow, and my PSU is too weak.

Other than that, give me the fastest thing I can afford. :)

Oh, that's probably why you still list your 5750 in the sig?