NVIDIA preparing revised GTX 560 Ti with 448 cores


Seero

Golden Member
Nov 4, 2009
1,456
0
0
I don't get worked up over the name so much as the actual cost, feature set, and performance to the consumer. I think the GF110 is coming to a close and NVIDIA has 448-core GF110s in inventory that they can actually sell to the consumer. Why throw them away when they can bring in revenue for the company and value to the consumer?

I also tend to think they're all based on the Fermi architecture, GF114 or GF110, and the major differentiation is the number of cores. I think the naming of the GTX 560 Ti + is mainly aimed at its potential price-point placement more than anything.
Selling what can be sold is okay; bait and switch is not. Like others have said, GTX 570 SE and GTX 565 are more suitable names.

It is all about yield, not design. Unless GF110 has 100% yield, selling what's in the recycle bin is all about profit. It is a good decision to sell those chips, just under a bad name. It would be more ethical to simply put GF114 into EOL and milk GF110; consumers would understand. If they price it right, they will even get the :thumbsup:.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Got an $800 video card budget and a manly PSU as a result of bitcoin mania. Pretty sure these cards aren't getting any of it -- just not exciting enough this late in the product cycle. Unless NV surprises and these end up around $150 street price, that is.

Aside from microstuttering, the other problem with multi-GPU is that not all applications and games support it. I run 2x 5830s, and more than half the time (older games, Linux) they're effectively a single 5830.

Definitely looking forward to the 28nm cards. Bring 'em on! And this time without the 12+ month lag on NVIDIA's part.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Selling what can be sold is okay; bait and switch is not. Like others have said, GTX 570 SE and GTX 565 are more suitable names.

It is all about yield, not design. Unless GF110 has 100% yield, selling what's in the recycle bin is all about profit. It is a good decision to sell those chips, just under a bad name. It would be more ethical to simply put GF114 into EOL and milk GF110; consumers would understand. If they price it right, they will even get the :thumbsup:.

I don't understand "bait and switch" in this context.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I don't understand "bait and switch" in this context.

Yeah, normally bait and switch is promising you more and giving you less, not giving you more under a confusing name. It's pretty much the opposite of bait and switch, assuming clock speeds are reasonable.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,039
540
136
I hate when companies do things like this. People usually base their buying on the available reviews, which, for the 560 Ti, are all of the original spec. Buyers are being lured in based on the performance of the original part and being switched to something else. The revised 560 Ti would have to excel in every metric compared to its predecessor, IMO, or the consumer is being offered a raw deal. It was the same back when ATI changed the specs on the X1900 GT from rev1 to rev2, and it still stands today.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
I don't see what some of you are getting worked up about...

This is not an "ethical" fail by NVIDIA. It will be quite clearly labeled, like the 260 with 216 cores, and very likely priced higher than today's 560 Ti.

You have to understand that NVIDIA wants to capitalize on the "Ti" name, and I fully understand such a move without making a stink about "bait and switch", which, by the way, Seero, shows how completely out of touch you can be at times.


Bring it on, NVIDIA, and put even more pressure on AMD, and hopefully AMD will get the picture and release some cards that give us a real increase over the 5870.

The way I see it, the only thing the 6950 and 6970 have on the 5870 is the standard 2GB of RAM.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Yeah, normally bait and switch is promising you more and giving you less, not giving you more under a confusing name. It's pretty much the opposite of bait and switch, assuming clock speeds are reasonable.
Bait and switch is when one thing is advertised as A but the consumer gets B instead. The GTX 560 Ti is the best crop from GF114 and the GTX 560 (448) is the worst crop of GF110, so obviously the GTX 560 (448) bleeds more than the Ti.

NVIDIA likes to fuse off SMs to increase yield, but how do they determine which SM to fuse off? The answer is they don't. Each SM in a GTX 580 is better than an SM in a GTX 570. That is, even if you could unfuse the disabled SM in a GTX 570, its performance would still be worse than a real GTX 580, and there is a reason for that even when the two chips come from the same wafer. QA simply fires up the chip and measures how much power it needs to function properly, along with how much heat it produces. The chips used in the GTX 580 are likely to consume less power and produce less heat than the ones in the GTX 570. This process is known as binning.

Disabling an SM reduces power consumption and heat, but it doesn't fix the problem. The disabled SM may be perfectly good while the bad ones stay in use (statistics and distributions skipped). The same goes for disabling 2 SMs, except the odds are worse. We know the GTX 460 is far better than the GTX 465 even though it has fewer stream processors. It runs faster, cooler, and with lower power consumption. In fact, people avoided the GTX 465 even though it has one more SLI connector. We have seen this before; it isn't new. The new GTX 560 (448) is going to produce more heat and use more power even if you cut the SPs down to 384. The illusion of a performance gain from the extra SPs will not make the bleeding disappear. In the end it will be more SPs at a lower clock: more power consumption, lower performance per watt.
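
To make the binning idea concrete, here is a toy sketch. The SM counts match the real GF110 configurations, but the power and temperature thresholds are invented purely for illustration:

def bin_die(functional_sms, power_w, temp_c):
    # Toy binning rule: dies are sorted by how many SMs work and by how much
    # power/heat the whole die needs, not by picking out "the bad SM".
    # GF110 has 16 SMs; the numeric thresholds here are made up.
    if functional_sms == 16 and power_w < 245 and temp_c < 90:
        return "GTX 580"        # full die, best power/thermals
    if functional_sms >= 15 and power_w < 260:
        return "GTX 570"        # one SM fused off
    if functional_sms >= 14:
        return "GTX 560 (448)"  # two SMs fused off, leakier silicon
    return "scrap"

print(bin_die(16, 240, 85))  # -> GTX 580
print(bin_die(15, 255, 92))  # -> GTX 570
print(bin_die(14, 270, 95))  # -> GTX 560 (448)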

It isn't like we are paying 600 bucks for it, so that part is okay, but most buyers buy video cards based on word of mouth. When a salesperson says "this is a GTX 560", the buyer will think it is the GTX 560 Ti that everyone gave their thumbs-up to, and not the next generation of the failed GTX 465.
 
Last edited:

Seero

Golden Member
Nov 4, 2009
1,456
0
0
...Seero shows how completely out of touch you can be at times...
You have no idea how picky I am when it comes to hardware. However, your assessment of me is quite valid. I take it as a compliment, since you said "at times" instead of "most of the time".
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
That it's a crappy card for the price, perhaps? Inefficient, loves to suck power, very expensive, etc. Your "castrated" part argument makes little sense, if any. It has no bearing on a product's price/performance, and the GTX 570 is easily the better card of the two because of its performance for the price, heat output, and power consumption.

The purpose of the GTX 580 from the beginning was not only to finally bring a full GF100-class core to the market, but also, since NVIDIA knew AMD would release a card faster than the HD 5870, to retain their single-card crown. It's a high-margin product for NVIDIA to use as a marketing tool so fools can buy it; nothing more.

We need a next gen because it brings significantly higher performance at previous price points as well as lower noise, lower heat output, and lower power consumption. We also need it for so-called enthusiasts to give NVIDIA money so we can have our slightly slower but otherwise much better cards. :cool:

Sure, as soon as you go higher than the Radeon HD 6870 you lose price/performance, but the diminishing returns get HUGE when you look at the GTX 580.




I can't wait till they come out with the new cards so I can buy the castrated card and tell everyone that bought the "full" version they are stupid.

Come to think of it, I'm gonna run out and buy the new V6 Mustang instead of the V8 because the price/performance is better in the 6-cylinder.

I'm sorry, but the GTX 570 is known to be built with inferior components compared to a 580, with far less headroom in overclocking and ultimately performance. A 580 consistently offers 20%+ more performance over a 570 when the GPU is the bottleneck. When using features like 120Hz, 3D, or 2560x1600, this can be a noticeable and appreciated difference. I have the ability to use all those features, hence why I value the GTX 580.

Other people can disagree, as this is my opinion only.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I can't wait till they come out with the new cards so I can buy the castrated card and tell everyone that bought the "full" version they are stupid.

Come to think of it, I'm gonna run out and buy the new V6 Mustang instead of the V8 because the price/performance is better in the 6-cylinder.

I'm sorry, but the GTX 570 is known to be built with inferior components compared to a 580, with far less headroom in overclocking and ultimately performance. A 580 consistently offers 20%+ more performance over a 570 when the GPU is the bottleneck. When using features like 120Hz, 3D, or 2560x1600, this can be a noticeable and appreciated difference. I have the ability to use all those features, hence why I value the GTX 580.

Other people can disagree, as this is my opinion only.
It is true that the reference GTX 570 is not nearly as robust as the GTX 580. That being said, some GTX 580s don't OC worth a crap either. In fact, overclocks can range all over the place for both the GTX 570 and GTX 580 reference cards. My GTX 570 can only go about 7% more on the core before needing additional voltage to be perfectly stable. That's pretty sad considering my stock voltage is already at 1.013, which is not exactly low voltage for a GTX 570. This is the first NVIDIA card I have owned that could not do around 20% or better on stock voltage. In fact, I have never raised the voltage on the cards I've owned.
 
Last edited:

SHAQ

Senior member
Aug 5, 2002
738
0
76
You are missing one thing, and it happens to be a major factor for some of us: microstuttering. I can tell immediately that a system is using dual GPUs. Put me in front of a controlled test and I could pick them out in less time than it took to write this post.

You do have a point, but it doesn't pertain to everyone. The rest of us prefer a single high-powered card.

You can tell the difference even at 120 FPS? Take two systems artificially limited to 120 FPS, one with SLI and one without, and you can see the difference? If your eyesight were that good, I doubt you would be satisfied with gaming at all. There would always be something that bothered you, between ghosting, input lag, mouse sensitivity, etc.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
You can tell the difference even at 120 FPS? Take two systems artificially limited to 120 FPS, one with SLI and one without, and you can see the difference? If your eyesight were that good, I doubt you would be satisfied with gaming at all. There would always be something that bothered you, between ghosting, input lag, mouse sensitivity, etc.


No, I wouldn't be able to tell at 120 fps. The point of SLI would be to run at higher settings or to make the settings you are playing at more playable.

There are numerous games that will slow SLI'd 580s down to the point where microstuttering becomes very relevant.

Metro 2033 will run at way less than 60 fps @ 2560 even without AA.
Crysis still kills video cards.

There are plenty more, but you get the picture.


On top of that, 3D Vision puts a serious hurting on systems even without maxed-out IQ.

I just fired up my brand new SLI'd Galaxy GTX 480s and I was easily able to make them crawl. Best $325 I've ever spent ;)

You guys can say what you want, but I know what's fact and what's not. Look at the video cards in my sig. I have every single one of them at my disposal and have a firm grasp of what each one is capable of.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I can't wait till they come out with the new cards so I can buy the castrated card and tell everyone that bought the "full" version they are stupid.

Come to think of it, I'm gonna run out and buy the new V6 Mustang instead of the V8 because the price/performance is better in the 6-cylinder.

I'm sorry, but the GTX 570 is known to be built with inferior components compared to a 580, with far less headroom in overclocking and ultimately performance. A 580 consistently offers 20%+ more performance over a 570 when the GPU is the bottleneck. When using features like 120Hz, 3D, or 2560x1600, this can be a noticeable and appreciated difference. I have the ability to use all those features, hence why I value the GTX 580.

Other people can disagree, as this is my opinion only.

And here you go and completely missed the point of everything I said. Nice try, but car analogies tend to fail, and this one did. The V8 Mustang is still good bang-for-buck compared to many other cars, so there's not much of an argument to be made there. The point is not getting into hugely diminishing returns, as is the case with the highest-end product (like a 990X or GTX 580). If all I cared about was simply bang-for-buck, I would've said that anything higher than an i5-2500K or a Radeon HD 6870 makes no sense.

There are GTX 570s that are only ~$10 more than the reference versions; those come with 6+2 or 8+2 power phases and have proven to be very reliable. They also run cooler than the reference version.

One, a GTX 580 is most certainly NOT 20% faster than the GTX 570; on average it's 14% faster at 1920x1200 with AA. Two, a single GTX 580 is not enough for most modern games at 2560x1600, OR for 1920x1200 with 4xAA at 120Hz or with 3D enabled. Three, titles that are unplayable at max settings with AA on the GTX 570 are unplayable on the GTX 580 as well. Factor in that it costs over 40% more than the non-reference versions while giving you only 15% more performance, and it looks like a very bad deal, unless you only care about e-peen and showing off. If you think that's acceptable, Newegg has an ASUS MARS II they can sell you.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
No, I wouldn't be able to tell at 120 fps. The point of SLI would be to run at higher settings or to make the settings you are playing at more playable.

There are numerous games that will slow SLI'd 580s down to the point where microstuttering becomes very relevant.

Metro 2033 will run at way less than 60 fps @ 2560 even without AA.
Crysis still kills video cards.

There are plenty more, but you get the picture.


On top of that, 3D Vision puts a serious hurting on systems even without maxed-out IQ.

I just fired up my brand new SLI'd Galaxy GTX 480s and I was easily able to make them crawl. Best $325 I've ever spent ;)

You guys can say what you want, but I know what's fact and what's not. Look at the video cards in my sig. I have every single one of them at my disposal and have a firm grasp of what each one is capable of.

You should have qualified the statement: at below 60, or below 30, or whatever. I got SLI to be able to turn choppy 30 FPS games into smooth 60 FPS games and still have high settings. If people are running a 30" display or 3 monitors, they should be using 3 cards, not 2. That's why I have never upgraded past one 1080p monitor and 2-card SLI; beyond that the cost/benefit ratio stinks. But hey, if someone wants 3 monitors and it runs like crap and they're happy, who am I to argue? I would never pay a lot of money for a setup that runs like garbage. Maybe other people feel differently...
 

Squirtmonster

Member
Jan 25, 2011
43
0
0
That definitely will not happen. All they could do is raise the clock speeds and voltage, and it would raise power consumption far too high (higher-than-GTX-480 high). Why? There's no other GPU core configuration they can use. They're already using the full GF110 core for the GTX 580, one with a single disabled GPU cluster for the GTX 570, and now two disabled GPU clusters for this supposed GTX "560 Ti" or GTX 565. And if all they do is raise clock speeds, they'd be competing against pre-overclocked cards which will probably be priced lower and have a better third-party heatsink solution.

How is this confusing?

The GTX 560 was a revised GTX 460 with higher clocks and about the same power consumption.

They just have to make the new GTX 560 Ti a GTX 570 with one extra cluster disabled and lower clocks.
Make a revised GTX 570 (a.k.a. GTX 575) with one cluster disabled, higher clocks, and about the same power consumption, as they did with the GTX 560 vs. the GTX 460.
Do the same revision enhancements to the GTX 580 to make a GTX 585.
That's 10% more performance in the same power envelope.

EOL the current GTX 560 Ti, GTX 570, and GTX 580, just like they did last round when they EOL'd the GTX 260 (192) and GTX 280 and made the GTX 260 (216), GTX 275, and GTX 285.

How is this any different?
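
For reference, the 448-core figure falls straight out of GF110's layout (16 SMs of 32 CUDA cores each); a quick sketch of the ladder being discussed:

CORES_PER_SM = 32  # Fermi GF110: 32 CUDA cores per SM, 16 SMs on the full die
TOTAL_SMS = 16

gf110_skus = {
    "GTX 580": 0,           # full die
    "GTX 570": 1,           # one SM disabled
    "GTX 560 Ti (448)": 2,  # two SMs disabled -> the rumored part
}

for name, disabled in gf110_skus.items():
    sms = TOTAL_SMS - disabled
    print(f"{name}: {sms} SMs, {sms * CORES_PER_SM} CUDA cores")
# -> GTX 580: 16 SMs, 512 cores; GTX 570: 15 SMs, 480; the new part: 14 SMs, 448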
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
And here you go and completely missed the point of everything I said. Nice try, but car analogies tend to fail, and in this case it did. The V8 Mustang is still good bang-for-buck in comparison to many other cars, so there's not much of an argument to be made here. The point is not getting to hugely diminishing points of return, like is the case of for the highest-end product (like a 990X or GTX 580). If all I cared out was simply bang-for-buck I would've mentioned anything higher than an i5-2500K or a Radeon HD 6870 not making sense.

There's GTX 570s that are only ~$10 more than the reference versions and those come with 6+2 or 8+2 power phase and have shown to be very reliable. They also run cooler than the reference version.

A GTX 580 is most certainly NOT 20% faster than the GTX 570. On average it's 14% faster at 1920x1200 with AA. Two, a single GTX 580 is not enough for most modern games at 2560x1600, OR for 1920x1200 with 4xAA at 120Hz or with 3D enabled. Three, titles that are unplayable at max settings with AA on the GTX 570 are unplayable on the GTX 580 as well. Factor in that it costs over 40% more than the non-reference versions while giving you only 15% more performance and it looks like a very bad deal, unless you only care about e-peen and showing off. If you think that's acceptable Newegg has an ASUS MARS II they can sell you.

Back on topic I guess. I fold my hand. I've noticed quite a trend with you. You are right.

I just want to get my history straight.

Ti 4600s sucked cause Ti 4200s offered the bulk of the performance for less money
9800 XTs sucked cause 9800 Pros offered the bulk of the performance for less money
X800 XTs sucked cause X800 XLs offered the bulk of the performance for less money
6800 Ultras sucked because 6800 GTs offered the bulk of the performance for less money
7800 GTXs sucked cause 7800 GTs offered the bulk of the performance for less money
8800 GTXs sucked cause 8800 GTs offered the bulk of the performance for less money
GTX 280s sucked cause GTX 260 Core 216s offered the bulk of the performance for less money

Same with 5850s compared to 5870s.
GTX 480s must suck because GTX 460s were a good bang for the buck.

Needless to say, I understand.

I have a great idea... how about the GPU developers design the castrated parts to begin with and skip full GPUs altogether?

Hey, Mr. GPU engineering guy... that's way too many shader cores in there. How about you knock a few out? The guys who are going to buy these don't want them.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Back on topic I guess. I fold my hand. I've noticed quite a trend with you. You are right.

I just want to get my history straight.

Ti 4600s sucked cause Ti 4200s offered the bulk of the performance for less money
9800 XTs sucked cause 9800 Pros offered the bulk of the performance for less money
X800 XTs sucked cause X800 XLs offered the bulk of the performance for less money
6800 Ultras sucked because 6800 GTs offered the bulk of the performance for less money
7800 GTXs sucked cause 7800 GTs offered the bulk of the performance for less money
8800 GTXs sucked cause 8800 GTs offered the bulk of the performance for less money
GTX 280s sucked cause GTX 260 Core 216s offered the bulk of the performance for less money

Same with 5850s compared to 5870s.
GTX 480s must suck because GTX 460s were a good bang for the buck.

Needless to say, I understand.

I have a great idea... how about the GPU developers design the castrated parts to begin with and skip full GPUs altogether?

Hey, Mr. GPU engineering guy... that's way too many shader cores in there. How about you knock a few out? The guys who are going to buy these don't want them.

Quite the classic post and I wish I could fit this in my sig. I also see that you have done a bit of housecleaning with your GPUs.
Do those Galaxy 480s have the triple-fan coolers on them?
 
Feb 19, 2009
10,457
10
76
They should have a lot of crap dies sitting around that can't be a full-fledged 570, so why not use them and make some $$?

I have no problem if they price it right... really, more choices are good. Can't be bad. The naming convention was broken a long time ago.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Quite the classic post and I wish I could fit this in my sig. I also see that you have done a bit of housecleaning with your GPUs.
Do those Galaxy 480s have the triple-fan coolers on them?

No triple-slot coolers, but they stay quite cool, relatively speaking of course.
As far as housecleaning goes, my 5870s went to my brother and a friend. I'm sure I will end up with at least one of them back.
 
Last edited:

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Back on topic I guess. I fold my hand. I've noticed quite a trend with you. You are right.

I just want to get my history straight.

Ti 4600s sucked cause Ti 4200s offered the bulk of the performance for less money
9800 XTs sucked cause 9800 Pros offered the bulk of the performance for less money
X800 XTs sucked cause X800 XLs offered the bulk of the performance for less money
6800 Ultras sucked because 6800 GTs offered the bulk of the performance for less money
7800 GTXs sucked cause 7800 GTs offered the bulk of the performance for less money
8800 GTXs sucked cause 8800 GTs offered the bulk of the performance for less money
GTX 280s sucked cause GTX 260 Core 216s offered the bulk of the performance for less money

Same with 5850s compared to 5870s.
GTX 480s must suck because GTX 460s were a good bang for the buck.

Needless to say, I understand.

I have a great idea... how about the GPU developers design the castrated parts to begin with and skip full GPUs altogether?

Hey, Mr. GPU engineering guy... that's way too many shader cores in there. How about you knock a few out? The guys who are going to buy these don't want them.

Again, missing the point entirely. The target is not getting to the level of hugely diminishing returns. I've explained this TWICE to you and you seem not to understand this simple concept. Like I also told you before, if all I cared about in my arguments was bang-for-buck, I wouldn't have mentioned the GTX 570 as being a good buy. Another thing you missed: for a few people a Performance card isn't enough, and for that there's the lower-tier Enthusiast card. What you want to avoid is getting to a point where you end up paying significantly more for a small performance increase that won't even make previously unplayable games playable.

Talking about the HD 5850 and HD 5870: the HD 5870 had 15% higher performance and cost 33% more when introduced. Clearly that's not as bad as the GTX 580 vs. 570, but the HD 5850 was definitely the better buy then. If you're lucky you can get an HD 5870 for $200 or so now, which is excellent.

Your "castrated" GPU argument, again, fails because it simply makes no sense. Both Cayman and Barts have great yields by now. The majority of the dies just have clusters laser-cut even though they were functional, so they can fill demand at the lower price point; that's it. In the case of Barts it was a Performance product to begin with, so there's no premium in price/performance for going for the part with fully enabled GPU clusters (HD 6870).

In the case of CPUs the point of diminishing returns is even worse: with the i7-990X you're getting 4% more performance than the i7-980 for a 70% higher price.
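
To put rough numbers on the diminishing-returns point, using the percentages quoted above:

def relative_perf_per_dollar(perf_gain, price_premium):
    # Performance-per-dollar of the step-up part relative to the cheaper part.
    # Below 1.0 means the pricier part is the worse value per dollar.
    return (1 + perf_gain) / (1 + price_premium)

print(round(relative_perf_per_dollar(0.15, 0.40), 2))  # GTX 580 vs GTX 570 -> 0.82
print(round(relative_perf_per_dollar(0.15, 0.33), 2))  # HD 5870 vs HD 5850 -> 0.86
print(round(relative_perf_per_dollar(0.04, 0.70), 2))  # i7-990X vs i7-980  -> 0.61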
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Again, missing the point entirely. The target is not getting to the level of hugely diminishing returns. I've explained this TWICE to you and you seem not to understand this simple concept. Like I also told you before, if all I cared about in my arguments was bang-for-buck, I wouldn't have mentioned the GTX 570 as being a good buy. Another thing you missed: for a few people a Performance card isn't enough, and for that there's the lower-tier Enthusiast card. What you want to avoid is getting to a point where you end up paying significantly more for a small performance increase that won't even make previously unplayable games playable.

Talking about the HD 5850 and HD 5870: the HD 5870 had 15% higher performance and cost 33% more when introduced. Clearly that's not as bad as the GTX 580 vs. 570, but the HD 5850 was definitely the better buy then. If you're lucky you can get an HD 5870 for $200 or so now, which is excellent.

Your "castrated" GPU argument, again, fails because it simply makes no sense. Both Cayman and Barts have great yields by now. The majority of the dies just have clusters laser-cut even though they were functional, so they can fill demand at the lower price point; that's it. In the case of Barts it was a Performance product to begin with, so there's no premium in price/performance for going for the part with fully enabled GPU clusters (HD 6870).

In the case of CPUs the point of diminishing returns is even worse: with the i7-990X you're getting 4% more performance than the i7-980 for a 70% higher price.


Your point is your point alone and only valid to you. You can keep reiterating it, and judging by your post count/join date ratio, I would say you're going to keep doing so.

I'm glad you enjoy bang-for-the-buck cards; that's great. I enjoy getting the top-end cards so I don't have to settle for overclocking/BIOS flashing and convincing others that my way is best in order to be happy.

Good day, sir.