Saarland University professor says 8800GTX might actually have 160 stream processors!!!!

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Doesn't surprise me... I thought the 8800 GTX was cut down from day 1, since they didn't (and still don't) need anything more... Kinda how they kept the 7800 at 450 core back in the day, always the same "trick"
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Nightmare225
Originally posted by: TecHNooB
fudzilla eh?

ROFL, I didn't realize this was Fud's site. Oh, well, let's just treat this as a really cool rumor, then... ;)

Are you supposed to take this site seriously? It looks really nice and professional, but c'mon... LOL, it sounds like he's making fun of himself.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Complete crap. This is no more than the old story from way back when, when so many idiots believed nVidia purposefully held back their cards with their drivers until a "rainy day" when they'd magically release a new driver that would drastically boost performance.

There is no good reason to release a card in a crippled state if you can do it otherwise. The notion that nVidia is pulling punches in a heavyweight fight is laughable. What's the reasoning? That once R600 comes, nVidia magically boosts the performance of their current cards to match it or keep up? Why not have it done from the beginning, so that there would have been even more reason for people to adopt nVidia's high end? Then, once R600 came, the difference would be less drastic, and AMD would look foolish for taking so long to finally have a merely comparable solution.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
So the extra SPs are there, but nobody as of now has been able to see them? :roll:
That was a rumour spread in the early days, and it was found to have no basis whatsoever...
Architecturally speaking, it seems it would be very easy for them to "fit" those extra 32 SPs in a refresh, or to include the missing MUL... But I find this rumour impossible and utter crap...
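
For a sense of what that missing MUL is worth, here is some rough peak-throughput arithmetic (a back-of-the-envelope sketch, assuming the 8800 GTX's 1.35 GHz shader clock and purely theoretical utilization):

128 SPs x 1.35 GHz x 2 flops/clock (MAD only) = 345.6 GFLOPS
128 SPs x 1.35 GHz x 3 flops/clock (MAD + co-issued MUL) = 518.4 GFLOPS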
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: bunnyfubbles
Complete crap. This is no more than the old story from way back when, when so many idiots believed nVidia purposefully held back their cards with their drivers until a "rainy day" when they'd magically release a new driver that would drastically boost performance.

There is no good reason to release a card in a crippled state if you can do it otherwise. The notion that nVidia is pulling punches in a heavyweight fight is laughable. What's the reasoning? That once R600 comes, nVidia magically boosts the performance of their current cards to match it or keep up? Why not have it done from the beginning, so that there would have been even more reason for people to adopt nVidia's high end? Then, once R600 came, the difference would be less drastic, and AMD would look foolish for taking so long to finally have a merely comparable solution.

Because Nvidia has always done it? :confused:

Or do you think Nvidia, just by some incredible coincidence, always finishes a performance-boosting driver JUST the day before ATI launches their card? That's ridiculous; they obviously wait... maybe to surprise ATI, who knows... The fact is, they do it.

And I wouldn't be surprised if, aside from the 8900, they also released a bug-free, better-performing Vista driver as soon as the R600 hits the shelves... Nvidia is just creepy like that.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
All those people saying this is untrue are probably the ones who thought the R600 would be out a month or two after the G80 and now have their panties in a wad. I hope the news is true... I'd love it!
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
It would be a very bad move for any company to deliberately withhold performance on their product, just in case a rival should release a product that challenges it.

First of all, I'm sure that the higher the performance delta between Nvidia and ATI, the higher their sales figures are going to be. So Nvidia would be trying to get the most out of the card at all times.

Secondly, how would you feel if you picked up a single 8800 GTS or GTX and it didn't quite have the performance you wanted at high resolutions, so you had to pick up a second one and SLI them? Then, all of a sudden, Nvidia says "Taa-daa!! We're giving you a treat! We're posting drivers that will release performance in your card that you should have been able to enjoy from the first day you bought it!" Now you find that your single card would have had enough performance to play at your preferred resolution, but you had to shell out an extra $500 because Nvidia didn't want you to be able to use the full performance until they said so.

Until there's more than just rumors of some professor who supposedly has proof that no one else has seen, I think we can safely file this away as more Fu(a)d.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Uhm... I'm pretty sure that even though I shelled out $500 for my card, I'd be ECSTATIC if it unlocked to 160 stream processors. If I had enough money to buy two, and did so, I'd be ECSTATIC as well. I mean, its performance was already roughly double that of any single ATI card out, so there wasn't really a NEED for it to be any faster.

Edit: Plus, resale value would hold very well if this is true, so you could sell your card for nearly as much as you originally paid, and you would have had your desired performance for however many months you already had your card.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
It's rather ridiculous to even think that Nvidia would bother to do that. What's somewhat more believable, but still not probable, is that they fabbed them with the extra shaders and disabled a block due to yield issues. Thus, the GTS would have two blocks disabled rather than just one. The G80 is a big, complex chip, and I could see them having at least some initial issues with the early silicon. If that were the case, they would have laser-cut the extra shaders, and no driver is going to bring them back. The 8900 could then just be a respin with all of the shaders enabled on the high-end model, and maybe a speed bump.
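
As an aside: if the extra SPs were merely driver-disabled rather than laser-cut, they would in principle be countable in software. A minimal sketch using NVIDIA's CUDA runtime API (in beta around the time of this thread); the eight-SPs-per-multiprocessor factor is specific to G80-class chips:

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    cudaDeviceProp prop;

    /* Query the first CUDA device; on a G80 board this reports
       the number of enabled multiprocessors. */
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "no CUDA-capable device found\n");
        return 1;
    }

    /* G80 groups 8 scalar SPs per multiprocessor: a full 8800 GTX
       reports 16 MPs -> 128 SPs; the rumored part would have to
       report 20 MPs -> 160 SPs. */
    printf("%s: %d multiprocessors -> %d stream processors\n",
           prop.name, prop.multiProcessorCount,
           prop.multiProcessorCount * 8);
    return 0;
}

Of course, if the units are fused off in hardware, no query (and no driver) will ever see them, which is exactly the point above.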
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
It's possible that they planned it with 160 shaders but had heat and power problems, so they disabled some of them... I doubt it, though; in that case they would have just redone the chip.

As for some of the previous posts: companies hold back ALL THE TIME. It steals the spotlight and can boost stock prices quite a bit.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
This whole holding-back-performance thing is the stuff internet forum legends and myths are made of. Can any of you actually name a concrete example where a company intentionally held back a significant amount of performance on a product and then later released it via a driver? Now, I know that drivers tend to improve over time, but I'm talking about a significant margin... Like, if the X2900XT beats the 8800GTX at launch, but the next day, through a driver upgrade, the 8800GTX beats the X2900XT across the board... Has that EVER actually happened, or is it a myth?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
What doesn't surprise me is that the rumor came from Fudzilla. And I still don't believe it.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: nitromullet
This whole holding-back-performance thing is the stuff internet forum legends and myths are made of. Can any of you actually name a concrete example where a company intentionally held back a significant amount of performance on a product and then later released it via a driver? Now, I know that drivers tend to improve over time, but I'm talking about a significant margin... Like, if the X2900XT beats the 8800GTX at launch, but the next day, through a driver upgrade, the 8800GTX beats the X2900XT across the board... Has that EVER actually happened, or is it a myth?

GeForce 3 vs Radeon 8500 is a famous enough example.

Having said that, this rumor is ridiculous. It's possible that a revised version of the G80 may have extra shading units, among other things, available to it, but those who are expecting their current G80 to be magically unlocked somehow and grow extra shaders are in for a disappointment.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Gstanfor
Originally posted by: nitromullet
This whole holding-back-performance thing is the stuff internet forum legends and myths are made of. Can any of you actually name a concrete example where a company intentionally held back a significant amount of performance on a product and then later released it via a driver? Now, I know that drivers tend to improve over time, but I'm talking about a significant margin... Like, if the X2900XT beats the 8800GTX at launch, but the next day, through a driver upgrade, the 8800GTX beats the X2900XT across the board... Has that EVER actually happened, or is it a myth?

GeForce 3 vs Radeon 8500 is a famous enough example.

Having said that, this rumor is ridiculous. It's possible that a revised version of the G80 may have extra shading units, among other things, available to it, but those who are expecting their current G80 to be magically unlocked somehow and grow extra shaders are in for a disappointment.

I remember that driver release; it left far too many impressionable people expecting the same thing with every new release, and some were left scratching their heads when the 8500 pulled away from the GF3 line and was later able to nip at the heels of the GF4.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Gstanfor
Originally posted by: nitromullet
This whole holding-back-performance thing is the stuff internet forum legends and myths are made of. Can any of you actually name a concrete example where a company intentionally held back a significant amount of performance on a product and then later released it via a driver? Now, I know that drivers tend to improve over time, but I'm talking about a significant margin... Like, if the X2900XT beats the 8800GTX at launch, but the next day, through a driver upgrade, the 8800GTX beats the X2900XT across the board... Has that EVER actually happened, or is it a myth?

GeForce 3 vs Radeon 8500 is a famous enough example.

Having said that, this rumor is ridiculous. It's possible that a revised version of the G80 may have extra shading units, among other things, available to it, but those who are expecting their current G80 to be magically unlocked somehow and grow extra shaders are in for a disappointment.

Notice how the article talks about the fact that the driver includes a new OpenGL 1.3 ICD and improved memory access, and, well, sure enough, you see improvements in an OpenGL title and at higher resolutions... It shows an evolutionary improvement over the preceding driver; what it doesn't show is that NV intentionally restrained their card and then unleashed it at the right competitive moment.

Don't get me wrong, I EXPECT NV to increase the performance of the 8800GTX from the day I bought mine (launch day), because the driver is somewhat lacking, but I don't think they are neutering (or have ever neutered) their own flagship card just to unleash its true power later.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Personally, I think you are wrong.
While most of us will be happy with the new Detonator 4 drivers, it will unfortunately steal some thunder from another major announcement coming this week. NVIDIA is playing their cards very well once again; soon it'll be time to see ATI's hand, but don't assume that NVIDIA has shown all this early.

JHH is a master strategist and his business cunning is a major reason why nvidia is where they are today (and why his competitors find themselves where they are).
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: bunnyfubbles
There is no good reason to release a card in a crippled state if you can do it otherwise.

Actually, given the right circumstances, there are many good reasons.

 

chrismr

Member
Feb 8, 2007
176
0
0
Originally posted by: BladeVenom
What next? Will people start saying that CPUs aren't being set at their full potential?

They aren't... Core 2s are able to run at higher frequencies across the board than what they're released at. For example, an E4300 clocked at 1.8GHz runs very happily over 3GHz.

Although that is not exactly in the same league as GPUs being intentionally crippled, it still shows your analogy is just... well, wrong.
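
For the curious, the arithmetic behind that overclock (assuming the E4300's locked 9x multiplier and its stock 200 MHz front-side bus; 333 MHz is a common target, not a guarantee):

9 x 200 MHz = 1.8 GHz (stock)
9 x 333 MHz ≈ 3.0 GHz (with the FSB raised)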
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
JHH is a master strategist and his business cunning is a major reason why nvidia is where they are today (and why his competitors find themselves where they are).

I don't disagree with that statement at all. NVIDIA, in general, is very savvy. However, I would question the wisdom of intentionally crippling your own product with the intention of unleashing it later. Video cards and GPUs probably get reviewed more often in their first month of existence than ever again, and those numbers are the ones people will find when they hit Google to look for benchmarks. That doesn't even take into account the buzz a new GPU has in the community when it launches, and the amount of money spent on launch parties, etc... How many times have people said that the video forum is boring without a launch from ATI...? Yeah, sounds like a great time for NV to unleash the power of an almost-six-month-old GPU (yawn)... The marketing buzz for the 8800 cards is over, and no amount of driver tweaking is going to revive it. Besides, NV doesn't even have SLI working yet for the 8-series under Vista, which I would consider "basic functionality" for an SLI-capable card such as the 8800GTX... NV would make a lot more money if buying two cards were actually a realistic option for people running Vista...

This thread alone is proof of that... Over 24 hours old, and barely halfway down the second page... There is no real benefit to using a strategy that cripples your own product...