NVIDIA's Next-Generation Architecture @ firingsquad (NV3x info)

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
http://firingsquad.gamers.com/hardware/cinefx/default.asp


The key message NVIDIA plans to deliver with the NV3x family is the cinema-quality graphics effects it brings to the table. This is something we've been hearing from NVIDIA for quite a while now, and it ultimately reached reality when NVIDIA demonstrated a Quadro GPU rendering scenes from Final Fantasy: The Spirits Within at SIGGRAPH 2001 last summer. As impressive as that demonstration was, NVIDIA feels it has taken an even greater step forward in bringing the world of film to the PC with its next-generation graphics technology. To highlight this, NVIDIA has chosen the moniker "CineFX" to brand the graphics engine of its next generation graphics processor. While NVIDIA hasn't disclosed the complete details of its new CineFX architecture, a few morsels of information have just been revealed.

I like this tasty bit right here

Another new improvement introduced with NV3x is support for 64-bit and 128-bit color (16 or 32-bit floating point for each RGB component). As you can imagine, this results in images with more vibrant color ranges. A common example is fog. Chances are, if you've played lots of games with extensive use of fog or smoke, artifacts were frequently visible. With a wider color range these artifacts are eliminated.

and finally

Other details that we do know are that NV3x will be built on TSMC's 0.13-micron manufacturing process, will fully support AGP 8X, and will support high-speed DDR-II memory.
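To put a rough number on the banding claim in the quoted article, here's a minimal Python sketch (purely illustrative, not from FiringSquad or NVIDIA; numpy assumed) comparing the smallest color step each format can represent near a mid-grey value:

# Illustrative sketch: smallest representable color step near 0.5 for an 8-bit
# integer channel vs. the 16-bit and 32-bit float channels the article describes.
import numpy as np

value = 0.5

step_8bit = 1.0 / 255.0                            # fixed step size of an 8-bit channel
step_fp16 = float(np.spacing(np.float16(value)))   # per-component step of "64-bit color"
step_fp32 = float(np.spacing(np.float32(value)))   # per-component step of "128-bit color"

print(f"8-bit step: {step_8bit:.1e}")   # ~3.9e-03
print(f"fp16 step:  {step_fp16:.1e}")   # ~4.9e-04
print(f"fp32 step:  {step_fp32:.1e}")   # ~6.0e-08

The finer the step, the less likely a smooth fog or smoke gradient snaps to visible bands.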
 

jbond04

Senior member
Oct 18, 2000
505
0
71
Jeez, Adul. Tryin' to put Brandon out of business? ;)

Personally, I'm waiting for the NV30. I like ATi's offering very much, but they don't have the programmability in their pixel shaders that they should (like looping, branching, etc.) for 3D animation. I think that it will be a while before we see DX9 games, but 3D animation/visualization professionals can enjoy super fast rendering right away. I think that these cards, combined with the Cg compiler, will really take the graphics in games to a new level. We may soon have a killer app on our hands...and it's called DX9! :Q

EDIT: Fixed "Bradon's" name. Oops. :p
 

BDawg

Lifer
Oct 31, 2000
11,631
2
0
Originally posted by: Adul
http://firingsquad.gamers.com/hardware/cinefx/default.asp



Another new improvement introduced with NV3x is support for 64-bit and 128-bit color (16 or 32-bit floating point for each RGB component). As you can imagine, this results in images with more vibrant color ranges. A common example is fog. Chances are, if you've played lots of games with extensive use of fog or smoke, artifacts were frequently visible. With a wider color range these artifacts are eliminated.

I don't understand how this increase will make more vibrant colors. Current 32-bit color already produces 16,777,216 colors, plus 8 bits for an alpha channel.

A 1600x1200 monitor can only display 1,920,000 colors (one per pixel) at any given time. How is an increase in color depth going to help when we can't even use all of the colors we have? Is all we're doing increasing the palette of possible colors?

 
Aug 10, 2001
10,420
2
0
Ummm...32-bit per component color (or 128-bit RGBA) would have a palette of 7.92282x10^28 colors. That sounds crazy.
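For anyone who wants to check that arithmetic, a quick plain-Python sketch (illustrative only):

# Sanity-check the numbers in this thread.
colors_24bit = 2 ** 24          # standard 32-bit color: 24-bit RGB + 8-bit alpha
pixels_onscreen = 1600 * 1200   # pixels visible at once on a 1600x1200 monitor
colors_96bit = 2 ** 96          # 32-bit float per RGB component, alpha excluded

print(f"{colors_24bit:,}")      # 16,777,216
print(f"{pixels_onscreen:,}")   # 1,920,000
print(f"{colors_96bit:.5e}")    # ~7.92282e+28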
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91
Originally posted by: jbond04
Jeez, Adul. Tryin' to put Bradon out of business? ;)

Personally, I'm waiting for the NV30. I like ATi's offering very much, but they don't have the programmability in their pixel shaders that they should (like looping, branching, etc.) for 3D animation. I think that it will be a while before we see DX9 games, but 3D animation/visualization professionals can enjoy super fast rendering right away. I think that these cards, combined with the Cg compiler, will really take the graphics in games to a new level. We may soon have a killer app on our hands...and it's called DX9! :Q

Who is Bradon?? :D

Anyway, I get up at 7:30, drive 60 miles to Raleigh everyday to take a class from 9:50 til 11:20. Then drive 60 miles back home to work from 1:00 til 6:00. Needless to say, I'm fuggin' tired at the end of the day :D
 

GT1999

Diamond Member
Oct 10, 1999
5,261
1
71
Originally posted by: NFS4

Who is Bradon?? :D

Anyway, I get up at 7:30, drive 60 miles to Raleigh everyday to take a class from 9:50 til 11:20. Then drive 60 miles back home to work from 1:00 til 6:00. Needless to say, I'm fuggin' tired at the end of the day :D


Heh man I hear ya on that one. I do the same thing, though only a 20 min drive. Wake up at 6:50, class done at 10:30, nap, work 2pm-7pm.. :( Gotta love this "summer break".

Beeh, on topic though: I can't wait for NV30. It'll be interesting to see how it stacks up against ATI's 9700.
 
KnightBreed

Jun 18, 2000
11,187
763
126
Originally posted by: BDawg
A 1600x1200 monitor can only display 1,920,000 colors (one per pixel) at any given time. How is an increase in color depth going to help when we can't even use all of the colors we have?
What does displaying all available colors on the screen have to do with higher color bit depths?
continued...
Is all we're doing increasing the palette of possible colors?
Pretty much, yeah. As fragment shaders get increasingly complex, the availability of more color gradients between shades becomes more important.

On a side note, does anybody know whether or not the R300 will support higher color texture formats? Internal 128-bit FP precision is supported, obviously, but I haven't read anything about the framebuffer or texture formats? Or am I not looking hard enough...
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,801
581
126
Is there a chart for ATI like the one on page 2 of this FiringSquad infobit so we can compare the 9700's stats against it?
 

rahvin

Elite Member
Oct 10, 1999
8,475
1
0
The extra color precision is for blending. It will eliminate banding and pixelation, generating more realistic final colors.
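A tiny illustration of the blending point (plain Python, purely a toy example, not from the thread): average two adjacent 8-bit shades and the true midpoint has nowhere to go.

a, b = 10, 11                    # two adjacent 8-bit channel values
true_blend = (a + b) / 2         # 10.5 -- the mathematically correct result
stored_8bit = round(true_blend)  # an 8-bit framebuffer has to snap to 10 or 11

print(true_blend, "->", stored_8bit)   # 10.5 -> 10 (Python rounds halves to even)
# A floating-point framebuffer keeps 10.5 exactly, so later blends start from
# the correct value instead of an already-rounded one.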
 

jbond04

Senior member
Oct 18, 2000
505
0
71
Originally posted by: KnightBreed
On a side note, does anybody know whether or not the R300 will support higher color texture formats? Internal 128-bit FP precision is supported, obviously, but I haven't read anything about the framebuffer or texture formats? Or am I not looking hard enough...

Well, KnightBreed, Anand says that the R300 will support HDRI images, which are greater than 32-bit color images, so I would imagine that it would work with other >32-bit formats. Now, whether or not it would handle them very well is another matter. When you start using HDRI images in 3D animation, it tends to suck up RAM real fast.
 

Ben50

Senior member
Apr 29, 2001
421
0
0
The new colors will be used for internal calculation. 32-bit color has more than enough colors already for display purposes because we can't even see all of those colors. Having a higher precision will eliminate rounding errors. For example, say one color's value is 10.05 and a second color's value is 10.0. When you add those numbers together you could lose the ending .05 because you could only keep the number accurate to the tenths place. If you have ever taken chemistry or some kind of engineering course, these are called significant digits. I would explain further but I think most of you will be able to get the general idea here. If not, get someone else to explain it.
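Ben50's 10.05 + 10.0 example, written out as a rough Python sketch (illustrative only; the tenths place stands in for a framebuffer's fixed number of bits):

from decimal import Decimal

def keep_tenths(x):
    # round to the tenths place -- the "fixed precision" in the example above
    return x.quantize(Decimal("0.1"))

exact = Decimal("10.05") + Decimal("10.0")   # 20.05
stored = keep_tenths(exact)                  # 20.0 with Decimal's default half-even rounding

print(exact, "vs", stored)                   # 20.05 vs 20.0 -- the trailing .05 is gone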
 
KnightBreed

Jun 18, 2000
11,187
763
126
Originally posted by: Ben50
The new colors will be used for internal calculation. 32-bit color has more than enough colors already for display purposes because we can't even see all of those colors.
I think you're missing the point. The 128-bit FP framebuffer is important for multipass rendering. Say, for example, you have a shader that requires 2 or 3 passes. After each pass, even though the internal precision is 128-bit FP, the pixel will need to be dithered down to 32-bit integer to be written to the framebuffer - losing precision on each pass. Though this won't make much of a difference for your average game, this minor flaw may limit the card's usefulness in professional rendering, where a shader may require dozens or hundreds of passes. Precision here is absolutely critical.

Edit: grammar and typos.
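The multipass point can be simulated with a toy loop (an illustrative sketch with made-up numbers; real hardware and drivers are obviously more involved). Each hypothetical pass adds a contribution smaller than one 8-bit step, so an 8-bit framebuffer rounds every pass away while a floating-point framebuffer accumulates them:

def quantize_8bit(x):
    # snap a [0, 1] channel value to the nearest of 256 levels
    return round(x * 255) / 255

def add_dim_light(x):
    # hypothetical pass: adds less than one 8-bit step (1/255 ~ 0.0039) per pass
    return x + 0.3 / 255

value_int = value_fp = 0.25
for _ in range(100):
    value_int = quantize_8bit(add_dim_light(value_int))  # dithered down to 8 bits every pass
    value_fp = add_dim_light(value_fp)                   # kept in floating point between passes

print(f"8-bit framebuffer after 100 passes: {value_int:.4f}")   # ~0.2510, the passes vanish
print(f"float framebuffer after 100 passes: {value_fp:.4f}")    # ~0.3676, the passes add up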
 

Ben50

Senior member
Apr 29, 2001
421
0
0
Originally posted by: KnightBreed
Originally posted by: Ben50
The new colors will be used for internal calculation. 32-bit color has more than enough colors already for display purposes because we can't even see all of those colors.
I think you're missing the point. The 128-bit FP framebuffer is important for multipass rendering. Say, for example, you have a shader that requires 2 or 3 passes. After each pass, even though the internal precision is 128-bit FP, the pixel will need to be dithered down to 32-bit integer to be written to the framebuffer - losing precision on each pass. Though this won't make much of a difference for your average game, this minor flaw may limit the card's usefulness in professional rendering, where a shader may require dozens or hundreds of passes. Precision here is absolutely critical.

Edit: grammar and typos.

I think you missed my point because everything I wrote essentially agrees completely with what you just said. And my point that 32-bit color is more than enough for display purposes is still correct because you can't see any more variance in color than what 32-bit already provides. Like both you and I said, the higher precision is used for calculation of things like multipass rendering (your example) among many other things. If I am misunderstanding your post, let me know and I'll try to clarify my point. I am pretty sure we are in agreement though.

 
KnightBreed

Jun 18, 2000
11,187
763
126
Originally posted by: jbond04
Well, KnightBreed, Anand says that the R300 will support HDRI images, which are greater than 32-bit color images, so I would imagine that it would work with other >32-bit formats. Now, whether or not it would handle them very well is another matter. When you start using HDRI images in 3D animation, it tends to suck up RAM real fast.
I found a great thread discussing this very subject ->
here.

According to the FireGL X1 specs here, it supports a "128-bit floating point precision frame buffer."

Hmmm...
 
KnightBreed

Jun 18, 2000
11,187
763
126
Originally posted by: Ben50
Originally posted by: KnightBreed
Originally posted by: Ben50
The new colors will be used for internal calculation. 32-bit color has more than enough colors already for display purposes because we can't even see all of those colors.
I think you're missing the point. The 128-bit FP framebuffer is important for multipass rendering. Say, for example, you have a shader that requires 2 or 3 passes. After each pass, even though the internal precision is 128-bit FP, the pixel will need to be dithered down to 32-bit integer to be written to the framebuffer - losing precision on each pass. Though this won't make much of a difference for your average game, this minor flaw may limit the card's usefulness in professional rendering, where a shader may require dozens or hundreds of passes. Precision here is absolutely critical.

Edit: grammar and typos.

I think you missed my point because everything I wrote essentially agrees completely with what you just said. And my point that 32-bit color is more than enough for display purposes is still correct because you can't see any more variance in color than what 32-bit already provides. Like both you and I said, the higher precision is used for calculation of things like multipass rendering (your example) among many other things. If I am misunderstanding your post, let me know and I'll try to clarify my point. I am pretty sure we are in agreement though.
My apologies. I didn't see the part that said "enough for display purposes." Sorry for the misunderstanding. :)

Anyway, it seems the R300 may support 128-bit color for the front buffer, but only 32-bit (10:10:10:2) for the display/backbuffer.
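For reference, a 10:10:10:2 pixel still fits in 32 bits. Here's a sketch of one possible packing in Python (field order assumed for illustration, not taken from ATI documentation):

def pack_1010102(r, g, b, a):
    # r, g, b in [0, 1023]; a in [0, 3]; packed low-to-high as R, G, B, A
    return (r & 0x3FF) | ((g & 0x3FF) << 10) | ((b & 0x3FF) << 20) | ((a & 0x3) << 30)

def unpack_1010102(p):
    return (p & 0x3FF, (p >> 10) & 0x3FF, (p >> 20) & 0x3FF, (p >> 30) & 0x3)

packed = pack_1010102(1023, 512, 0, 3)
print(hex(packed), unpack_1010102(packed))   # 0xc00803ff (1023, 512, 0, 3)
# 1024 shades per color channel instead of 256, but only 4 alpha levels.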
 

goulch

Member
Aug 17, 2001
26
0
0
I think the most important improvement PC graphics needs is a better polygon cruncher.

High-resolution, low-polygon scenes with huge textures are the best the PC can do now relative to a supposedly inferior game platform such as the PS2, whose EE's FPU is 3 times stronger than what a multi-gigahertz P4 offers.

I hope NVIDIA takes note of this in order to maintain its stronghold: crunch out less bs, put less importance on textures or fx, and concentrate more on triangles.

Just my $0.02.

 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
24-bit color is definitely adequate for bitmap images like JPEGs and such. But like the article says, there are some situations in games where we actually can use more colors. The reason is not to actually display more colors per se, but to avoid round-off error. I played Morrowind in 24-bit and Morrowind has a ton of fog. Fog covers everything outdoors. I don't know the specifics of the algorithms involved, but there were places where you could definitely see bands of color and it was somewhat unsightly. However, I think 128-bit color is excessive and 64-bit is more than adequate.

Oh, I also agree with goulch.
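To picture the fog banding described above, here's a toy fog blend in Python (illustrative only; the actual Morrowind fog math isn't known, and numpy is assumed). It counts how many distinct output shades survive in an 8-bit channel versus a 16-bit float channel:

import numpy as np

depth = np.linspace(0.0, 1.0, 1000)       # normalized view distance
surface, fog = 0.20, 0.25                 # a dark surface fading into slightly lighter fog
blended = surface * (1 - depth) + fog * depth

as_8bit = np.round(blended * 255) / 255   # what a 24-bit color framebuffer can store
as_fp16 = blended.astype(np.float16)      # roughly one channel of "64-bit color"

print("distinct shades, 8-bit:", len(np.unique(as_8bit)))   # ~14 -> visible bands
print("distinct shades, fp16: ", len(np.unique(as_fp16)))   # several hundred -> smooth fade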
 

kazeakuma

Golden Member
Feb 13, 2001
1,218
0
0
Originally posted by: goulch
I think the most important improvement PC graphics needs is a better polygon cruncher.

High-resolution, low-polygon scenes with huge textures are the best the PC can do now relative to a supposedly inferior game platform such as the PS2, whose EE's FPU is 3 times stronger than what a multi-gigahertz P4 offers.

I hope NVIDIA takes note of this in order to maintain its stronghold: crunch out less bs, put less importance on textures or fx, and concentrate more on triangles.

Just my $0.02.

Polygons are not the biggest focus anymore simply because current tech can pump out more than enough. The PS2 is/was a polygon beast, but it doesn't make up for its shortcomings. Most games are blurry as hell due to limited texture space, and the PS2 is quite low-res compared to PCs. I don't know why you think the PS2 can pump more polys than a PC (remember, on a PC the video card does all the rendering, NOT the CPU, so your comparison is invalid), because the Xbox, which is based on the GF3, can push far more polys AND textures than the PS2, and PC graphics are already leagues ahead. The reason some PC games lag behind is that they are catering to a broad range of people. You can't shut out half your potential market by requiring them to have a $300 video card. That said, there are still PC games which are far prettier than any PS2 game out there.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,389
8,547
126
Originally posted by: kazeakuma
Originally posted by: goulch
I think the most important improvement PC graphics needs is a better polygon cruncher.

High-resolution, low-polygon scenes with huge textures are the best the PC can do now relative to a supposedly inferior game platform such as the PS2, whose EE's FPU is 3 times stronger than what a multi-gigahertz P4 offers.

I hope NVIDIA takes note of this in order to maintain its stronghold: crunch out less bs, put less importance on textures or fx, and concentrate more on triangles.

Just my $0.02.

Polygons are not the biggest focus anymore simply because current tech can pump out more than enough. The PS2 is/was a polygon beast, but it doesn't make up for its shortcomings. Most games are blurry as hell due to limited texture space, and the PS2 is quite low-res compared to PCs. I don't know why you think the PS2 can pump more polys than a PC (remember, on a PC the video card does all the rendering, NOT the CPU, so your comparison is invalid), because the Xbox, which is based on the GF3, can push far more polys AND textures than the PS2, and PC graphics are already leagues ahead. The reason some PC games lag behind is that they are catering to a broad range of people. You can't shut out half your potential market by requiring them to have a $300 video card. That said, there are still PC games which are far prettier than any PS2 game out there.
Remember that the CPU figures out where the triangles should be and then sends vertex data to the graphics card, so yes, the CPU comparison is valid.
 

kazeakuma

Golden Member
Feb 13, 2001
1,218
0
0
Originally posted by: ElFenix
Originally posted by: kazeakuma
Originally posted by: goulch
I think the most important improvement PC graphics needs is a better polygon cruncher.

High-resolution, low-polygon scenes with huge textures are the best the PC can do now relative to a supposedly inferior game platform such as the PS2, whose EE's FPU is 3 times stronger than what a multi-gigahertz P4 offers.

I hope NVIDIA takes note of this in order to maintain its stronghold: crunch out less bs, put less importance on textures or fx, and concentrate more on triangles.

Just my $0.02.

Polygons are not the biggest focus anymore simply because current tech can pump out more than enough. The PS2 is/was a polygon beast, but it doesn't make up for its shortcomings. Most games are blurry as hell due to limited texture space, and the PS2 is quite low-res compared to PCs. I don't know why you think the PS2 can pump more polys than a PC (remember, on a PC the video card does all the rendering, NOT the CPU, so your comparison is invalid), because the Xbox, which is based on the GF3, can push far more polys AND textures than the PS2, and PC graphics are already leagues ahead. The reason some PC games lag behind is that they are catering to a broad range of people. You can't shut out half your potential market by requiring them to have a $300 video card. That said, there are still PC games which are far prettier than any PS2 game out there.
Remember that the CPU figures out where the triangles should be and then sends vertex data to the graphics card, so yes, the CPU comparison is valid.


Ah true, I had forgotten about that. Nonetheless, PCs are able to push more than enough polys. Who wants to play a game with 100 million polys but only 16MB of textures and no fx? Boring as hell if you ask me.
 

imgod2u

Senior member
Sep 16, 2000
993
0
0
The higher color precision actually has a pretty dramatic effect. Even though your eyes may not be able to see all of the colors, a single .0001 error in a calculation could result in the final color being off by a whole shade. So really, even though a monitor producing 128-bit color would not be noticeable, the use of 128-bit FP precision in internal calculations can produce images that are much more true to the original intention of the developer. A common example is when you're applying gamma correction to an image.
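One way the gamma point shows up, as a toy sketch (plain Python, gamma 2.2 assumed, not tied to any particular hardware): push the 32 darkest display shades through a linear 8-bit buffer and count how many distinct values survive.

GAMMA = 2.2

dark_shades = range(1, 33)   # the 32 darkest non-black display values out of 255
stored_linear_8bit = {round((v / 255) ** GAMMA * 255) for v in dark_shades}

print("distinct dark shades in:", len(dark_shades))         # 32
print("distinct values stored: ", len(stored_linear_8bit))  # 4 -- most dark shades collapse together
# A floating-point buffer keeps every intermediate value distinct, so each shade
# comes back out where the developer intended it.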