Saarland University Professor says 8800GTX might actually have 160 stream processors!!!!


ronnn

Diamond Member
May 22, 2003
3,918
0
71
OMG, fudzilla is now here! Should make for exciting and contradictory bs, but at least use a real site. :p
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
What you are saying is true, and if a product weren't capable of convincingly destroying its competition, you wouldn't do it.

G80 doesn't suffer that problem, though: there is nothing in the consumer market capable of getting within a bull's roar of it, so consumers have already been well and truly convinced of its power. If you can achieve that and still have enough power in reserve to trouble and embarrass the upcoming competition as well as the current competition, that is an awesome situation, and it will definitely add to the reputation of your product and those based on it.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Gstanfor
What you are saying is true, and if a product weren't capable of convincingly destroying its competition, you wouldn't do it.

G80 doesn't suffer that problem, though: there is nothing in the consumer market capable of getting within a bull's roar of it, so consumers have already been well and truly convinced of its power. If you can achieve that and still have enough power in reserve to trouble and embarrass the upcoming competition as well as the current competition, that is an awesome situation, and it will definitely add to the reputation of your product and those based on it.

The thing is, what you are saying just doesn't add up to what we actually see in the real world... True, we do tend to see a gradual increase in performance over time, but that is simply due to driver maturity. However, what we have also seen NV do in the past is enable IQ-reducing optimizations in an effort to boost FPS... Optimizations like these make far more sense for a company seeking a competitive advantage, because release-day reviews tend to be run at default settings and always seem to be done in a hurry. This gets them the highest FPS on the board at launch day, which is when it counts...

Look, I bought an 8800GTX on launch day, and I read, flipped through, and compared benchmarks from about 10-15 different articles on that day. Since then, I haven't given it much thought and I haven't compared 8800GTX benchmarks to anything. The point is that this is a fast, competitive industry and the customer has a short attention span... NVIDIA had my attention on Nov. 8, and subsequently got my money. The only reason anyone has given a single thought to the 8-series drivers is because they are somewhat lacking even 5 months after launch. Fact of the matter is, even if they did pull an additional 30% increase in performance out of the 8800GTX, it's not as if I'd buy another one... because I wouldn't even be able to use both of them...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
If nvidia releases a "performance enhancing" driver that increases performance at the expense of IQ, I'll be as pissed off about it as anyone else -- I really don't believe that is likely to happen, though. It's simply that nvidia hasn't yet shown us everything G80 is capable of. nvidia is always seeking to gain or enlarge a competitive advantage -- if they stop doing that, they risk losing the top spot.

nvidia isn't going to all this trouble to impress the enthusiasts (though it's nice if they are impressed) - it's aimed at normal consumers and is all about strengthening the nvidia brand in their eyes.

If you don't want to see it the way I've presented it, that's fine. Fact of the matter is, most average consumers will get the message nvidia is pushing, and in the end that's all they (nvidia) care about.

Personally I'm hoping that we will see a budget chip from nvidia this generation that will rival the success the GF2MX enjoyed amongst average consumers.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
If they're there, it'd be pretty easy to just look at a blown-up picture of the chip and count the execution units. Anyone want to crack open their 8800GTX and find out for us?
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: zephyrprime
If they're there, it'd be pretty easy to just look at a blown-up picture of the chip and count the execution units. Anyone want to crack open their 8800GTX and find out for us?

Sure, just a minute while I get my chisel and hammer.....
 

SoBizarre

Member
Dec 7, 2000
58
0
0
I'm shocked. Next thing we know, somebody is going to come in and tell us that our brains are not used to their full potential. :Q
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SoBizarre
I'm shocked. Next thing we know, somebody is going to come in and tell us that our brains are not used to their full potential. :Q

That depends, are you using the latest driver?
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: zephyrprime
If they're there, it'd be pretty easy to just look at a blown-up picture of the chip and count the execution units. Anyone want to crack open their 8800GTX and find out for us?

I just did... and guess what? There are actually another 128 stream processors, for a total of 256! I put a little conductive ink where they were laser cut, and now I score higher than a pair of 8800GTXs in SLI in 3DMark!
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: SoBizarre
I'm shocked. Next thing we know, somebody is going to come in and tell us that our brains are not used to their full potential. :Q

Kidding, right? In case you don't know, we only use about 10% of their potential.
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
Originally posted by: ShadowOfMyself
Originally posted by: SoBizarre
I'm shocked. Next thing we know, somebody is going to come in and tell us that our brains are not used to their full potential. :Q

Kidding, right? In case you don't know, we only use about 10% of their potential.
In case you don't know, that's an oft-debunked myth.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I guess no one remembers the original Detonator drivers.

Those put the original TNT/TNT2 cards through the roof.
 

Mallet

Member
Aug 22, 2005
25
0
0
Originally posted by: bunnyfubbles
Complete crap. This is no more than way back when, when so many idiots believed nVidia purposefully held back their cards with their drivers until a "rainy day" when they'd magically release a new driver that would drastically boost performance.

There is no good reason to release a card in a crippled state if you can do otherwise. The notion that nVidia is pulling punches in a heavyweight fight is laughable. What's the reasoning? That once R600 comes, nVidia magically boosts performance of their current cards to match it or keep up? Why not have it done from the beginning, so that there would have been even more reason for people to adopt nVidia's high end, and so that once R600 comes the difference is less drastic and AMD looks foolish for taking so long to finally have a merely comparable solution?

How about their recent drivers that enabled "Transparency anti-aliasing" on GeForce 6 series cards, a feature thought to be supported only on the 7 series.

I wouldn't think they would do it to hide performance for later, but if they were worried about silicon yield, they may have added processors so they could disable a defective group and not have to scrap that huge 400-million-transistor piece of silicon.
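
Mallet's yield point can be made concrete with a toy redundancy model. The sketch below (Python; the defect density, die area, and cluster layout are purely illustrative assumptions, not NVIDIA figures) shows why building spare SP clusters that can be fused off raises the fraction of sellable dies:

```python
import math

def cluster_yield(defect_density, die_area_cm2, n_clusters):
    """Poisson yield of one cluster, assuming defects land uniformly
    across the die (a simplification of real defect statistics)."""
    return math.exp(-defect_density * die_area_cm2 / n_clusters)

def die_yield(defect_density, die_area_cm2, n_clusters, spares=0):
    """Probability that at most `spares` clusters are defective."""
    y = cluster_yield(defect_density, die_area_cm2, n_clusters)
    return sum(
        math.comb(n_clusters, bad) * (1 - y) ** bad * y ** (n_clusters - bad)
        for bad in range(spares + 1)
    )

# Illustrative numbers only: a large ~480 mm^2 die, 0.5 defects/cm^2, and a
# shader array of 10 clusters x 16 SPs = 160 SPs, shipping with 128 enabled.
full    = die_yield(0.5, 4.8, 10, spares=0)  # need all 160 SPs good: ~9%
salvage = die_yield(0.5, 4.8, 10, spares=2)  # may fuse off 2 clusters: ~64%
print(f"all clusters good: {full:.1%}   up to two bad clusters: {salvage:.1%}")
```

The model charges every defect to the shader array and ignores the non-redundant logic around it, so it overstates the benefit, but the direction of the effect is what matters.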

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
How about their recent drivers that enabled "Transparency anti-aliasing" on GeForce 6 series cards, a feature thought to be supported only on the 7 series.

...but where is the competitive advantage in doing this? I mean, they enabled a feature on the 6-series cards a few months before the 8-series launched, well after the 6-series would have any marketing impact. While it's a nice benefit for 6-series owners, I fail to see what positive impact it had on the competition between NVIDIA and ATI. My guess is that NV might have discovered a new or more efficient way to implement trAA in their drivers and realized that it was feasible to apply this method to the 6-series hardware. IMO, the reason trAA wasn't ever implemented during the 6-series' "prime" is that it would have incurred too much of a performance hit. Holding back a feature because of adverse side effects or poor performance is quite different than intentionally hampering high-end hardware with the intention of unleashing its potential at a later date.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Mallet
Originally posted by: bunnyfubbles
Complete crap. This is no more than way back when, when so many idiots believed nVidia purposefully held back their cards with their drivers until a "rainy day" when they'd magically release a new driver that would drastically boost performance.

There is no good reason to release a card in a crippled state if you can do otherwise. The notion that nVidia is pulling punches in a heavyweight fight is laughable. What's the reasoning? That once R600 comes, nVidia magically boosts performance of their current cards to match it or keep up? Why not have it done from the beginning, so that there would have been even more reason for people to adopt nVidia's high end, and so that once R600 comes the difference is less drastic and AMD looks foolish for taking so long to finally have a merely comparable solution?

How about their recent drivers that enabled "Transparency anti-aliasing" on GeForce 6 series cards, a feature thought to be supported only on the 7 series.

I wouldn't think they would do it to hide performance for later, but if they were worried about silicon yield, they may have added processors so they could disable a defective group and not have to scrap that huge 400-million-transistor piece of silicon.

That one was actually pretty annoying. R4xx owners got the capability (such as it is on ATi GPUs) a good 6 to 9 months earlier. It was pretty obvious to most that if R4xx could be made to perform such tricks, then nv4x could as well (nv4x being a far more advanced piece of hardware). nvidia chose to keep us waiting, though. It was a feature that should have been there from the start, as anyone who has had to put up with moire patterns and rainbow arches in textures (the hinged bridge in Mafia being an excellent example) will attest.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: nitromullet
How about their recent drivers that enabled "Transparency anti-aliasing" on GeForce 6 series cards, a feature thought to be supported only on the 7 series.

...but where is the competitive advantage in doing this? I mean, they enabled a feature on the 6-series cards a few months before the 8-series launched, well after the 6-series would have any marketing impact. While it's a nice benefit for 6-series owners, I fail to see what positive impact it had on the competition between NVIDIA and ATI. My guess is that NV might have discovered a new or more efficient way to implement trAA in their drivers and realized that it was feasible to apply this method to the 6-series hardware. IMO, the reason trAA wasn't ever implemented during the 6-series' "prime" is that it would have incurred too much of a performance hit. Holding back a feature because of adverse side effects or poor performance is quite different than intentionally hampering high-end hardware with the intention of unleashing its potential at a later date.

I've never bought the performance hit argument. The GPU makers just need to provide the features -- I'll decide whether I think they are a performance hit or not. (It doesn't really slow nv4x down that much, either.) nvidia held back the feature so they would have a selling point for g7x (nv4x's refresh).
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
You guys are all idiots.
IF there are 160 SPs on the 8800GTX, they will NEVER be activated on the current cards. NV might just be keeping them locked and will sell 160-SP cards as an 8800 Ultra or something like that.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: tanishalfelven
You guys are all idiots.
IF there are 160 SPs on the 8800GTX, they will NEVER be activated on the current cards. NV might just be keeping them locked and will sell 160-SP cards as an 8800 Ultra or something like that.

Uhm... if the extra SPs are there and they AREN'T laser cut... a BIOS flash would enable them.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: allies
Originally posted by: tanishalfelven
You guys are all idiots.
IF there are 160 SPs on the 8800GTX, they will NEVER be activated on the current cards. NV might just be keeping them locked and will sell 160-SP cards as an 8800 Ultra or something like that.

Uhm... if the extra SPs are there and they AREN'T laser cut... a BIOS flash would enable them.

What if they are laser cut, genius? You think nvidia is stupid enough to leave them uncut? I remember what happened with the X800 GTOs, but those were mid-range cards, so it wasn't that bad. This is a high-end card that (if the rumor is true) might even equal the next high end. You think nvidia is gonna allow that?
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
We should all get IL-2 and play it online on a 128-player server: Germany (ATI) vs. Britain/USA (Nvidia).

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Gstanfor
I've never bought the performance hit argument. The GPU makers just need to provide the features -- I'll decide whether I think they are a performance hit or not. (It doesn't really slow nv4x down that much, either.) nvidia held back the feature so they would have a selling point for g7x (nv4x's refresh).

My point is that the feature only really became usable over time because of improvements in the drivers with regard to trAA performance, not because they unlocked some previously hampered part of the hardware. My guess is that G70 is based on NV40, so the actual hardware was there to do trAA, but initially it was abysmally slow. However, once G70 launched, they figured they had the power to enable the feature. If you remember back to when G70 launched, there were a few glitches with trAA as well (white textures in HL2 come to mind) which were only resolved over time. Given that, obviously some development has gone into the trAA feature. I do agree with the idea that they really should let me decide whether a feature is worth enabling based on performance, but it's also possible that trAA wasn't realistically usable when NV40 was in the works because of the state of the drivers at that time.

if nvidia releases a "performance enhancing" driver that increases performance at the expense of IQ i'll be as pissed off about it as anyone else -- I really don't believe that is likely to happen though.

I think in a way NV actually tried to do something sort of like that with their new CSAA modes, which don't look as good as trSSAA but have less of a performance hit. However, they caught the wrath of guys like BFG10K and re-implemented trSSAA. I'm honestly very glad that they re-implemented trSSAA, but I couldn't blame NV for that one. Last year G7x had superior AA to R5xx, but ATI had prettier colors and noticeably better AF, and just about every single review site gave the IQ nod to ATI, while very few mentioned NV's superior AA...
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: tanishalfelven
Originally posted by: allies
Originally posted by: tanishalfelven
You guys are all idiots.
IF there are 160 SPs on the 8800GTX, they will NEVER be activated on the current cards. NV might just be keeping them locked and will sell 160-SP cards as an 8800 Ultra or something like that.

Uhm... if the extra SPs are there and they AREN'T laser cut... a BIOS flash would enable them.

What if they are laser cut, genius? You think nvidia is stupid enough to leave them uncut? I remember what happened with the X800 GTOs, but those were mid-range cards, so it wasn't that bad. This is a high-end card that (if the rumor is true) might even equal the next high end. You think nvidia is gonna allow that?


What if they aren't? It's easy to play that game. What happened with the X800 GTOs? What was bad about them? I don't know WTF you're talking about... the fact that they could be unlocked didn't hurt ATi in any way; rather, it helped them.

Since nvidia has gone through a die shrink, they'll be able to have much higher clock speeds, GDDR4, AND the extra 32 SPs (if they are on the die). So even if the current 8800GTX unlocks, it will still be 10-20% slower. I don't KNOW what nvidia has in mind, and neither do YOU, so I'm not going to assume anything. It would be fantastic if all 8800GTX owners had the opportunity to see whether their theoretical extra SPs are functional, and if not, who cares.

Don't act like you're the sh!t and know everything.
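
The 10-20% figure above is easy to sanity-check with back-of-envelope throughput math. In this sketch, the 8800GTX numbers (128 SPs at a 1350 MHz shader clock, 384-bit bus, 900 MHz GDDR3) are the card's published specs, while the refresh clocks are hypothetical placeholders rather than leaked figures:

```python
def shader_gflops(num_sps, shader_clock_mhz, flops_per_clock=2):
    """Peak MADD throughput: 2 flops per SP per clock (ignores G80's
    co-issued MUL, so these are conservative but comparable numbers)."""
    return num_sps * shader_clock_mhz * flops_per_clock / 1000.0

def bandwidth_gbs(bus_width_bits, effective_mem_clock_mhz):
    """Peak memory bandwidth in GB/s from bus width and effective clock."""
    return bus_width_bits / 8 * effective_mem_clock_mhz / 1000.0

gtx      = shader_gflops(128, 1350)   # stock 8800GTX: 345.6 GFLOPS
unlocked = shader_gflops(160, 1350)   # same card, rumored SPs enabled: 432.0
refresh  = shader_gflops(160, 1600)   # hypothetical die-shrunk refresh: 512.0

print(f"8800GTX bandwidth: {bandwidth_gbs(384, 1800):.1f} GB/s (GDDR3)")
print(f"refresh advantage over an unlocked GTX: {refresh / unlocked - 1:.0%}")
# -> ~19% from the clock bump alone, before any GDDR4 bandwidth gains,
#    which lands right in the 10-20% range claimed above.
```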
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: allies
Originally posted by: tanishalfelven
Originally posted by: allies
Originally posted by: tanishalfelven
You guys are all idiots.
IF there are 160 SPs on the 8800GTX, they will NEVER be activated on the current cards. NV might just be keeping them locked and will sell 160-SP cards as an 8800 Ultra or something like that.

Uhm... if the extra SPs are there and they AREN'T laser cut... a BIOS flash would enable them.

What if they are laser cut, genius? You think nvidia is stupid enough to leave them uncut? I remember what happened with the X800 GTOs, but those were mid-range cards, so it wasn't that bad. This is a high-end card that (if the rumor is true) might even equal the next high end. You think nvidia is gonna allow that?


What if they aren't? It's easy to play that game. What happened with the X800 GTOs? What was bad about them? I don't know WTF you're talking about... the fact that they could be unlocked didn't hurt ATi in any way; rather, it helped them.

Since nvidia has gone through a die shrink, they'll be able to have much higher clock speeds, GDDR4, AND the extra 32 SPs (if they are on the die). So even if the current 8800GTX unlocks, it will still be 10-20% slower. I don't KNOW what nvidia has in mind, and neither do YOU, so I'm not going to assume anything. It would be fantastic if all 8800GTX owners had the opportunity to see whether their theoretical extra SPs are functional, and if not, who cares.

Don't act like you're the sh!t and know everything.

Oh hush.
You were all debating whether nvidia would enable them in a future release. I do know enough to know that IF they're there, nobody will ever see them on today's 8800GTXs.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Gstanfor
If nvidia releases a "performance enhancing" driver that increases performance at the expense of IQ, I'll be as pissed off about it as anyone else -- I really don't believe that is likely to happen, though. It's simply that nvidia hasn't yet shown us everything G80 is capable of. nvidia is always seeking to gain or enlarge a competitive advantage -- if they stop doing that, they risk losing the top spot.

nvidia isn't going to all this trouble to impress the enthusiasts (though it's nice if they are impressed) - it's aimed at normal consumers and is all about strengthening the nvidia brand in their eyes.

If you don't want to see it the way I've presented it, that's fine. Fact of the matter is, most average consumers will get the message nvidia is pushing, and in the end that's all they (nvidia) care about.

Personally I'm hoping that we will see a budget chip from nvidia this generation that will rival the success the GF2MX enjoyed amongst average consumers.

I thought viral marketers weren't allowed on these boards. But I've been gone a while, so either way, I guess it doesn't really affect me much...

The issue of whether graphics card companies "hold back" their cards is a good one.

It does make good business sense for a company to hold back its product somewhat (if it can still maintain an advantage over its competitors, as the 8800GTX has over everything else) and then unleash a performance-improving driver update at a later date to trump a competitor's new product.
----------

But on the other hand, these are very complex cards, and every new generation adds something (like the wider memory bus and unified shaders on G80). Completely optimized drivers are not something that just happens overnight; doesn't everyone remember ATI's memory-controller driver update for the X1800/X1900 that came about 3-6 months after the X1900 was released? It improved performance by about 5-20%.

It makes absolute sense for a company like ATI or Nvidia to wait until the competitor's product is announced to release their "big" performance update, but it's not as if they're just sitting on the thing for months and months. It takes a while to make sure new releases are bug-free, so while they're working on a big improvement to memory bandwidth or whatever, they're doing extensive QA on driver stability.