Nvidia Speculations - "I'll be back!"

Avatar007

Junior Member
Sep 16, 2003
Interesting speculations from Josh - NVIDIA News and Insights - Penstarsys.com

...
There are also some indications that these parts (NV38/NV36) could well be redesigned enough to partially make up the performance discrepancy between NVIDIA parts and ATI parts.
The NV30 was well underway and nearly finalized when the DX9 standard was agreed upon. NVIDIA originally went at DX9 alone, and thought it might be able to force Microsoft to make NVIDIA's own design the basis for DX9. During the initial stages of DX9 development, NVIDIA removed itself from the group developing the technology standard.
Only when DX9 was far into the development stages did NVIDIA rejoin the group. By then DX9 was nearly finalized, and ATI had a very good idea what it would be like (and had been working on the R300 core since the basic inception of DX9).
NVIDIA was left with an underperforming part in floating point fragment programs, and they knew it.
The NV35 development showed that NVIDIA did realize it made some significant mistakes with the NV30, and the NV35 was designed to work around those problems. NVIDIA is a smart company, and when the final specifications for DX9 were made official, NVIDIA knew they would run into problems in the future.
This was over 1.5 years ago, and during that time design changes to the NV36 and NV38 could be implemented to help this situation.

My belief (and it is only a belief) is that the NV36 and NV38 parts will be much better PS 2.0 performers than the NV30, NV31, NV34, and NV35 parts.

Let's assume for once that this is true.
It would mean that:

- NVIDIA will have to admit that its previous NV35-generation boards had DX9 shortcomings (PS 2.0/ARB).
- Existing (expensive) FX 5900 card owners will be... disappointed.
- All previous FX/GeForce boards become obsolete.
- Round 3 could be an NVIDIA round again.
- Half-Life 2, in all its glory, might be best played on an NVIDIA board after all.
- Competition would be back on track.


 

Nebor

Lifer
Jun 24, 2003
The NV38 probably won't beat HL2 to the shelves, but it will delay my purchase of HL2 or a new video card.
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: Avatar007
Interesting speculations from Josh - NVIDIA News and Insights - Penstarsys.com

...
There are also some indications that these parts (NV38/NV36) could well be redesigned enough to partially make up the performance discrepancy between NVIDIA parts and ATI parts.
The NV30 was well underway and nearly finalized when the DX9 standard was agreed upon. NVIDIA originally went at DX9 alone, and thought it might be able to force Microsoft to make NVIDIA's own design the basis for DX9. During the initial stages of DX9 development, NVIDIA removed itself from the group developing the technology standard.
Only when DX9 was far into the development stages did NVIDIA rejoin the group. By then DX9 was nearly finalized, and ATI had a very good idea what it would be like (and had been working on the R300 core since the basic inception of DX9).
NVIDIA was left with an underperforming part in floating point fragment programs, and they knew it.
The NV35 development showed that NVIDIA did realize it made some significant mistakes with the NV30, and the NV35 was designed to work around those problems. NVIDIA is a smart company, and when the final specifications for DX9 were made official, NVIDIA knew they would run into problems in the future.
This was over 1.5 years ago, and during that time design changes to the NV36 and NV38 could be implemented to help this situation.

My belief (and it is only a belief) is that the NV36 and NV38 parts will be much better PS 2.0 performers than the NV30, NV31, NV34, and NV35 parts.

Let's assume for once that this is true.
It would mean that:

- NVIDIA will have to admit that its previous NV35-generation boards had DX9 shortcomings (PS 2.0/ARB).
- Existing (expensive) FX 5900 card owners will be... disappointed.
- All previous FX/GeForce boards become obsolete.
- Round 3 could be an NVIDIA round again.
- Half-Life 2, in all its glory, might be best played on an NVIDIA board after all.
- Competition would be back on track.

Wow, you really summed it up: a newer card from NVIDIA running faster than their old card.
 

Luagsch

Golden Member
Apr 25, 2003
From the same article:
The one thing we can bet on is that the NV40 will not have the problems the NV3x series has now. PS 2.0 performance problems will be a thing of the past with the NV40, as it is supposed to have PS/VS 3.0 hardware.

<IMO> (easier to say it once at the beginning than to add an "I think" or "I guess" to every sentence ;) )
The part about the NV40 is pretty certain.
But the thing about the NV38 and NV36... I don't know if it is "so easy" to implement. It would mean some drastic changes at a fairly basic level...
The improvements from the NV30 to the NV35 weren't much of a PS 2.0 thing, basically just a massive boost in raw speed. It will be much the same for the NV38.
The NV38 might be to the NV35 what the 9800 was to the 9700: small, solid improvements, but nothing drastic.

(Sorry Nebor... you'll have to wait until Q1 2004 to play Half-Life 2 :p )
</IMO>
 

Avatar007

Junior Member
Sep 16, 2003
You're right, Luagsch; I also think it's certainly not so easy to implement, since it is an architecture thing.
Otherwise we would already have had the solution on NV35 boards, since the problem with full-precision FP in PS 2.0 and ARB was already known on NV30 boards.

On the other hand, as Josh from Penstarsys stated, NVIDIA had more time (1.5 years) for the new NV38 board, and I can imagine that some shortcomings of the NV35 could have been solved. The guys at NVIDIA weren't born yesterday; they knew their weaknesses.
E.g. the number of temp registers could have been increased (also a very weak point on the NV35), possibly allowing higher shader program performance, and maybe leaving more room for algorithms that apply FP precision hints (deciding between FP16 and FP32; there's a rough sketch of that trade-off below).
Although I think these hints have to come from the application, and I seriously doubt it can be solved in software without losing quality (see the Det 51.75 beta).
(More proof of the beta Det 51.75's shortcomings: Albatron GeForce FX5900 Turbo review - Nightmare - NVNews.)
But it remains a wild guess.
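To make that FP16/FP32 trade-off concrete, here is a minimal sketch (Python with numpy; the values and the "shader" calculation are made up for illustration, nothing from the drivers or the article) of how much precision a shader-style calculation gives up when it runs at partial precision:

```python
import numpy as np

# Hypothetical shader-style multiply-accumulate, once at full precision
# (FP32) and once at partial precision (FP16), to illustrate the kind of
# rounding error an FP16 hint would trade for speed. Inputs are arbitrary.
base = np.float32(0.73)        # stand-in for a texture sample
light = np.float32(1.0 / 3.0)  # stand-in for a lighting coefficient
ambient = np.float32(0.05)

full = base * light + ambient                   # FP32 path
half = (np.float16(base) * np.float16(light)
        + np.float16(ambient))                  # FP16-style path

print(f"FP32 result: {full:.8f}")
print(f"FP16 result: {np.float32(half):.8f}")
print(f"abs. error:  {abs(np.float32(half) - full):.8f}")
```

On a single operation the error looks tiny, but it compounds over a long PS 2.0 program, which is presumably why the hints have to come from the application: only the developer knows which values can tolerate it.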

I just read some posts about leaked ATI slides coming from 'the competition'
(Forum elitebastards - On the leaked ATI slides - digitalwanderer)
and I'm thinking Luagsch is probably right: the NV38 won't solve it.
If NVIDIA has to resort to such methods, trying to discredit ATI by 'leaking' some sales slides, then something is seriously wrong with this company, and I even doubt whether NV4x and PS 3.0 will be much of an innovation rather than self-made, not-really-working NVIDIA hype meant to keep you from buying an ATI card.

Whatever.
 

Luagsch

Golden Member
Apr 25, 2003
I wonder: I've seen digitalwanderer in a lot of forums, but I haven't seen him here. Was he never here, or was he banned?
 

ndee

Lifer
Jul 18, 2000
OK, the R300 was out long before the NV30, right? So I think the

"The NV30 was well underway and nearly finalized when the DX9 standard was agreed upon. NVIDIA originally went at DX9 alone, and thought it may be able to force Microsoft to make its own design the basis for DX9. During the initial stages of DX9 development, NVIDIA removed itself from the group developing the technology standard.
Only when DX9 was far into the development stages did NVIDIA rejoin the group. By then DX9 was nearly finalized, and ATI had a very good idea what it would be like (and had been working on the R300 core since the basic inception of DX9)."

statement is kinda weak.
 

Finnkc

Senior member
Jul 9, 2003
It's all the same thing...

One is better than the other; the lesser scrambles to become the best; the former best starts all over again, reclaiming its title.

Wash, rinse, repeat.
 

Luagsch

Golden Member
Apr 25, 2003
Originally posted by: ndee
OK, the R300 was out long before the NV30, right? So I think the

"The NV30 was well underway and nearly finalized when the DX9 standard was agreed upon. NVIDIA originally went at DX9 alone, and thought it may be able to force Microsoft to make its own design the basis for DX9. During the initial stages of DX9 development, NVIDIA removed itself from the group developing the technology standard.
Only when DX9 was far into the development stages did NVIDIA rejoin the group. By then DX9 was nearly finalized, and ATI had a very good idea what it would be like (and had been working on the R300 core since the basic inception of DX9)."

statement is kinda weak.
(OK, I'm not sure about the dates here; they could be off by a month or so.)
The problem was that NVIDIA planned a November 2002 introduction for the NV30, two months after the 9700 (and at the same time DX9 was officially released). But with all the problems they had (the 0.13-micron process, bad yields, etc.), it was postponed until February 2003, and it was crap as a graphics card (though maybe it would be good as a doomsday device for Bond movies... think hurricanes and auditory torture :) ). So the NV30 and R300 were in roughly the same time frame development-wise.

EDIT: fixed my Englush, or somesing :)
 

Cuular

Senior member
Aug 2, 2001
Because Microsoft was insisting on using full floating point all the way through, rather than making the DX9 standard match NVIDIA's lower-precision modes.

Basically, MS didn't cave in to their demands for lower precision in the standard, so NVIDIA took their toys and left.
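For reference, the "resolutions" being argued about are shader precision formats. Here is a quick sketch (Python; just arithmetic on the well-known bit layouts, not tied to any API) of the relative precision each format offers:

```python
# Pixel-shader float formats at issue in the DX9 precision argument:
# (sign bits, exponent bits, mantissa bits). FP24 was ATI's R300 format
# and the DX9 "full precision" minimum; FP16 was NVIDIA's fast path.
formats = {
    "FP16 (NVIDIA partial precision)": (1, 5, 10),
    "FP24 (ATI R300 / DX9 minimum)":   (1, 7, 16),
    "FP32 (NVIDIA full precision)":    (1, 8, 23),
}

for name, (sign, exp, mant) in formats.items():
    # Relative precision is set by mantissa width: the smallest step
    # next to 1.0 is 2**-mantissa_bits.
    print(f"{name}: {sign + exp + mant} bits total, "
          f"relative step ~{2.0 ** -mant:.1e}")
```

That 10-bit versus 16-bit mantissa gap is the whole fight: FP16 is about 64 times coarser than the DX9 minimum, which is why Microsoft would not let it count as full precision.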
 

DaveSimmons

Elite Member
Aug 12, 2001
Originally posted by: Cuular
Because Microsoft was insisting on using full floating point all the way through, rather than making the DX9 standard match NVIDIA's lower-precision modes.

Basically, MS didn't cave in to their demands for lower precision in the standard, so NVIDIA took their toys and left.
Just like they left the Futuremark beta program when 3DMark03 showed how badly they had stumbled. nV's current management seems to show a pattern of running away from anything unpleasant, then trying to cover it up with lies, bribes, and lawyers.

Hopefully the NV40 will be good enough that management will stop hiding under their desks and let the coders write decent drivers again.
 

Finnkc

Senior member
Jul 9, 2003
Originally posted by: DaveSimmons
Originally posted by: Cuular
Because Microsoft was insisting on using full floating point all the way through, rather than making the DX9 standard match NVIDIA's lower-precision modes.

Basically, MS didn't cave in to their demands for lower precision in the standard, so NVIDIA took their toys and left.
Just like they left the Futuremark beta program when 3DMark03 showed how badly they had stumbled. nV's current management seems to show a pattern of running away from anything unpleasant, then trying to cover it up with lies, bribes, and lawyers.

Hopefully the NV40 will be good enough that management will stop hiding under their desks and let the coders write decent drivers again.

Yes, we hope... because they have to do something soon...