The New and Improved "G80 Stuff"


TanisHalfElven

Diamond Member
Jun 29, 2001
3,520
0
76
Originally posted by: lopri
7800 GTX debuted @599 USD. It's crazy how quickly these cards devalue.

wasn't that a year ago? now the 7900GT can be had for $200 AR and is equal to or better than a 7800GTX. it's crazy, I tell you.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Another tidbit leaked from the beta drivers:

  • NVIDIA_G80.DEV_0191.1 = "CompoTech GeForce 8800 GTX"
    NVIDIA_G80.DEV_0192.1 = "CompoTech GeForce 8800 GT"
    NVIDIA_G80.DEV_0193.1 = "CompoTech GeForce 8800 GTS"
Apparently there is another SKU ready and waiting for the right time to launch.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
Well then, you haven't been looking in the right places, have you?
I notice you didn't post any evidence, as usual.

How many times do I have to tell you that I won't do your research for you? If you want to make an ass of yourself in a debate by not having all the facts to hand, that's YOUR problem!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: BFG10K
Well then, you haven't been looking in the right places, have you?
I notice you didn't post any evidence, as usual.

How many times do I have to tell you that I won't do your research for you? If you want to make an ass of yourself in a debate by not having all the facts to hand, that's YOUR problem!

we are trying to tell you that we don't believe you . . . :p

your statements have proven to be worthless . . .


without anything to back you up
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: lopri
Another tidbit leaked from the beta drivers:

  • NVIDIA_G80.DEV_0191.1 = "CompoTech GeForce 8800 GTX"
    NVIDIA_G80.DEV_0192.1 = "CompoTech GeForce 8800 GT"
    NVIDIA_G80.DEV_0193.1 = "CompoTech GeForce 8800 GTS"
Apparently there is another SKU ready and waiting for the right time to launch.

I'm guessing either nVIDIA changed their minds from 3 high-end SKUs to 2, or it's the other way around.

I'm thinking there will be 3 SKUs.

8800GTX is the full G80. 8800GT will be a bit below it, still using the 384-bit memory interface.
The 8800GTS is the lowest high-end part with its 320-bit interface.

But rumours are pointing towards just 2 high-end SKUs from nVIDIA: the GTX and GTS.

Another thing: VCAA might stand for Coverage Sampling AA. It uses a compression method which, however, isn't always compatible.

From jawed B3D:

OK, this sounds to me like the AA is capable of producing as much as the equivalent of 16xMSAA, e.g. where two triangles meet. But, when the triangle count within a pixel increases, the degree of MSAA falls off due to a limit on the number of colour+z value-pairs that need to be stored within each pixel.

So,

* 2 triangles = 16xMSAA
* 3 triangles = 8xMSAA
* 4 triangles = 4xMSAA
* 5 or more triangles = 4xMSAA (since only four of the five triangles can cover a sample and be visible)

So, this means that long edges where two surfaces meet will look really nice. The fiddlesome intersections of 3 or more triangles, because of the multiple colours being blended within the pixel (3 or 4) will tend to look good anyway, even with only 4xMSAA.

Jawed

This clearly points out that nVIDIA will have the option to go 2x, 4x, 8x, and 16x MSAA as well as SSAA on a single GPU setup, which is great. Also add in TRAA (unless they made this feature more efficient). Not to mention that G80 will fully support HDR plus AA, meaning we might be seeing playable FPS using HDR and 16x VCAA (MSAA).
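Jawed's fall-off table can be sketched as a toy function. This is purely an illustration of the description quoted above, not NVIDIA's actual hardware logic; the 16-sample budget and the limit of 4 colour+Z pairs per pixel are taken from the quote:

```python
def effective_aa(triangles_in_pixel, coverage_samples=16, max_pairs=4):
    """Toy model of coverage-sampled AA fall-off: a pixel stores up to
    `max_pairs` colour+Z pairs, so the effective MSAA level halves each
    time an extra triangle claims a pair, bottoming out at 4x."""
    if triangles_in_pixel <= 2:
        return coverage_samples                     # long two-surface edges: full 16x
    pairs = min(triangles_in_pixel, max_pairs)      # storage budget caps at 4 pairs
    return max(4, coverage_samples >> (pairs - 2))  # 3 tris -> 8x, 4+ tris -> 4x

print([effective_aa(n) for n in (2, 3, 4, 5)])  # [16, 8, 4, 4]
```

This matches the bullet list in the quote: quality degrades gracefully only where many triangles meet in one pixel, which is exactly where blending already hides most aliasing.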

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Zenoth
Originally posted by: BFG10K
Will G80 be capable of Non-Angle Dependent Anisotropic-Filtering à-la ATi's High-Quality A-F ?
I haven't seen this mentioned anywhere so I doubt it. In fact I'd expect the AF to be exactly the same as it is now on the G7x series.

Most probably yes. Actually 99% certain.
Based on what?

That's what I was thinking as well.

Of course the official announcement and the G80's technical specifications and capabilities will answer it next week I presume.

But, personally, for this generation, I went with ATi because of HQ A-F (well, not the only single reason, but it played a major role). I use it at 16x in all my games over any amount of A-A (since I personally prefer overall texture quality over simple jaggies; it's a matter of personal preference, really). If G80 isn't capable of the same quality of A-F as that of ATi's HQ, then I will wait until R600 to see if it still has it, and then I will make my decision on which one to go with. I guess I am part of a minority that isn't using A-A much, not because my GPU isn't capable of it, but simply because I couldn't care less about A-A in any form or shape.

However, there is something I am looking forward to with G80: to see whether it'll either improve or still use its great feature called UltraShadow, which significantly reduces the amount of processing needed to render a scene with complex shadows (I remember it was commonly advertised for Doom 3 when it was released). That's something that ATi's past and current GPUs can't match (I am not saying they can't do it "well", but certainly not as efficiently as GeForce 7 can with UltraShadow II), which does occasionally force me to turn down shadow details in games or, rarely, turn them off completely (I am thinking about F.E.A.R. here, even though I do not own it anymore; that wasn't because I couldn't use shadows without playing a slide show, but simply because it bored me to death within a few weeks).

Look at this.

All info is summarized from the PCGH special "GeForce 8800", issue 12/2006, p. 54ff.

Here are a few facts:
- 681 million transistors
- new ROPs with FP32 + MSAA
- 128 stream processors at 1350 MHz (1 scalar op per clock) = 518.4 GFLOP/s
- WUAF
- improved dynamic branching
- improved early-Z

WUAF, huh. What do you think it is? Take a good guess ;)

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Who cares what the likes of apoppin, bfg10k and the rest of the lunATics think? (I think I may retire fanatic in favor of this one - it's certainly an apt description for the AT video forum ATi supporters)

They delight in being wrong and can't handle having their precious ATi/AMD criticized. I'll bet 99% of them are 14 y.o. schoolboys (must be because that's the level ATi's PR is pitched to)...

Just look at the interest they have shown in the puerile virtual jenna thread...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Who cares what the likes of apoppin, bfg10k and the rest of the lunATics think?
I don't know, probably the guy who is constantly quoting their posts and getting into debates with them? *cough* You? *cough*
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Who cares what the likes of apoppin, bfg10k and the rest of the lunATics think? (I think I may retire fanatic in favor of this one - it's certainly an apt description for the AT video forum ATi supporters)

They delight in being wrong and can't handle having their precious ATi/AMD criticized. I'll bet 99% of them are 14 y.o. schoolboys (must be because that's the level ATi's PR is pitched to)...

Just look at the interest they have shown in the puerile virtual jenna thread...



you do . . . more than anyone else :p

WHO can't handle having their precious company criticized?
--without 'losing it' and requiring Mod warnings
:Q

WHAT puerile thread? . . . find my post there if you can

:thumbsdown:

almost forgot

G80's small chip is for NV I/O
OUR FRIENDS from China published very detailed pictures of the Geforce 8800 GTX and GTS cards.

Many of you noticed the smaller chip that accompanies G80 on the GTX board. You can clearly see it on the site.

We popped the question as we wanted to know what it is doing. It turns out that the small chip on the Geforce 8800 cards is an Nvidia NVIO chip. It provides dual 400MHz RAMDACs, dual dual-link DVI output, TV-out and HDCP. Our informative friends call the chip External Video I/O Chip or simply External RAMDAC. We don't have any idea why Nvidia needed that, as the RAMDAC has normally been part of the chip for generations.

The detailed review, in Chinese, is here; you can always translate it, and you can see some cool DX10 rendering pictures and detailed specs there.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Originally posted by: apoppin
G80's small chip is for NV I/O
OUR FRIENDS from China published very detailed pictures of the Geforce 8800 GTX and GTS cards.

Many of you noticed the smaller chip that accompanies G80 on the GTX board. You can clearly see it on the site.

We popped the question as we wanted to know what it is doing. It turns out that the small chip on the Geforce 8800 cards is an Nvidia NVIO chip. It provides dual 400MHz RAMDACs, dual dual-link DVI output, TV-out and HDCP. Our informative friends call the chip External Video I/O Chip or simply External RAMDAC. We don't have any idea why Nvidia needed that, as the RAMDAC has normally been part of the chip for generations.

The detailed review, in Chinese, is here; you can always translate it, and you can see some cool DX10 rendering pictures and detailed specs there.
So that more or less confirms multi-monitor support with SLI, I guess. There is no reason to take the RAMDAC out of the GPU if only for dual-link support. It looks like NV put in a northbridge-like chip to handle all the communication between (GPU-memory) and video output.

R600 has a new checkbox to fill..
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: lopri
Originally posted by: apoppin
G80's small chip is for NV I/O
OUR FRIENDS from China published very detailed pictures of the Geforce 8800 GTX and GTS cards.

Many of you noticed the smaller chip that accompanies G80 on the GTX board. You can clearly see it on the site.

We popped the question as we wanted to know what it is doing. It turns out that the small chip on the Geforce 8800 cards is an Nvidia NVIO chip. It provides dual 400MHz RAMDACs, dual dual-link DVI output, TV-out and HDCP. Our informative friends call the chip External Video I/O Chip or simply External RAMDAC. We don't have any idea why Nvidia needed that, as the RAMDAC has normally been part of the chip for generations.

The detailed review, in Chinese, is here; you can always translate it, and you can see some cool DX10 rendering pictures and detailed specs there.
So that more or less confirms multi-monitor support with SLI, I guess. There is no reason to take the RAMDAC out of the GPU if only for dual-link support. It looks like NV put in a northbridge-like chip to handle all the communication between (GPU-memory) and video output.

R600 has a new checkbox to fill..

There could be other things related to that chip that we dont know about yet.

G80 is going to have 681 million transistors to be more exact.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Adding the following stuff, as well as the NVIO thingy, to the first post. (Thank you Gstanfor and Triniboy from B3D)

  • 681M transistors
    FP32+MSAA (a la HDR+AA)
    128 scalar MADD+MUL - 518 GFLOP/s
    Groups of 16
    4 Texture Address calculators per group - 32 total
    8 TMUs per group - 64 total
    6 ROP clusters (how many ROPs in a cluster? One?) / 4 samples/clock (minimum 4xAA?) Double Z/stencil?
    WUAF -> *Angle-independent AF*
    Improved Dynamic Branching
    New Coverage Sampling AA not universally compatible

We can safely say that NV's implementing at least *less angle-dependent* AF for G80.
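The 518 GFLOP/s figure in the list follows directly from the other numbers, assuming the commonly quoted convention of counting a MADD as 2 flops and the co-issued MUL as 1:

```python
# 128 scalar stream processors at the rumoured 1.35 GHz shader clock,
# each issuing one MADD (2 flops) plus one MUL (1 flop) per clock
sps = 128
shader_clock_ghz = 1.35
flops_per_sp_per_clock = 3  # MADD (2) + MUL (1)

gflops = round(sps * shader_clock_ghz * flops_per_sp_per_clock, 1)
print(gflops)  # 518.4
```

If the MUL turned out not to be generally usable, the same arithmetic with 2 flops per clock would give 345.6 GFLOP/s instead.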
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
I think this multi-monitor SLI support is, if true, very very significant and can change the view on SLI dramatically. Multi-monitor support has been one of the biggest issues of SLI.
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Ah, that's good news then.

If both G80 and R600 have Non-Angle Dependent AF (it's not "Angle-Dependent" but "Non-Angle Dependent", an important detail), then all I have to do is wait for R600 and see them compared.

And then, I will take my decision based either on Price, or Price and Performance.

Time will tell.
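The distinction being drawn here can be sketched with a toy model. The falloff curve below is purely illustrative (not measured from any GPU): angle-dependent hardware applies its full anisotropy only on roughly axis-aligned surfaces, while angle-independent (HQ/WUAF-style) filtering honours the requested level at every surface angle:

```python
import math

def effective_aniso(requested, surface_angle_deg, angle_independent):
    """Toy model of AF angle dependence: angle-dependent hardware cuts
    the applied anisotropy on surfaces tilted away from the 0/90 degree
    axes (the classic 'flower' pattern seen in AF tester tunnels)."""
    if angle_independent:
        return requested  # full requested AF at any surface angle
    # illustrative falloff: full AF at 0/90 degrees, minimal near 45
    falloff = abs(math.cos(math.radians(2 * surface_angle_deg)))
    return max(2, round(requested * falloff))

print(effective_aniso(16, 45, angle_independent=False))  # 2
print(effective_aniso(16, 45, angle_independent=True))   # 16
```

In other words, on a 45-degree surface an angle-dependent design may effectively deliver only a fraction of the 16x you asked for, which is why it is visible as texture shimmering in games with sloped geometry.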
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Zenoth
Ah, that's good news then.

If both G80 and R600 have Non-Angle Dependent AF (it's not "Angle-Dependent" but "Non-Angle Dependent", an important detail), then all I have to do is wait for R600 and see them compared.

And then, I will take my decision based either on Price, or Price and Performance.

Time will tell.

He said, "Angle-independent"...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Want to see G80 in action? How about some crysis footage with G80 doing the rendering? (sure to trigger a 'crysis' in at least one lunATic... :D)

crysis-online.com
Flythrough
Gameplay

Crysis DirectX10 Videos on the G80 - October 30, 2006
I've received some exclusive videos (who knows how long they'll stay exclusive for, though) and information from what you could call "an inside man". Besides these being the first public videos of the G80 actually running a game (as far as I know), these are also the first real-time DX10 Crysis videos. You read it here first :)

This journalist who shall remain anonymous was not allowed to say too much, but he may have already breached his NDA by telling me about the G80. The NDA is lifted on the 8th of November, yet you are hearing and seeing this all now :)

The videos were taken at a G80 presentation in Paris. When I asked for more information from mr. anonymous, he said the only other thing he could say was that the G80 will definitely be worth buying. The same person also said he had a chat with the lead designer of Crysis, who I believe is Bernd Diemer. Apparently, everyone there was drunk, so I'm sure there was some wild information flying around :).

Below are the high quality videos from the event. The videos were taken with a still camera, but they're still pretty damn good. I've listed some of the things to look out for in each video.

Flythrough Video:

# At the end of the fly through, you'll see what appears to be an alien craft on the left..
# You'll see a shark at the beginning of the clip.
# You can see some of the shadows in the distance fade in. This is to improve performance.
# You'll notice in this video and in some others that.
# There's a lot of vegetation. Heaps of grass and plenty of interactive shrubs.

Gameplay Video:

# The seas and waves are all 3D.
# The trees have lovely transparent shading.
# You should notice in this video, as well as some of the other videos, that the windscreen of the car has glare and reflects the dashboard like in real life. You'll notice many fine touches like that.
# The water's surface refracts the light onto the bottom of the ocean floor like in real life. Another great touch.
# The HUD is blue and yellow instead of green, which it has been in all the other videos. The color of the HUD may change depending on what MP team you are on.
# The character's shadow underneath the water is not animated? This will most likely be changed by the time of release.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Looks like the 8800GTX is going to be a powerhouse. I can't wait to see all the official benches/pics/reviews on Nov 8th. I definitely won't be one of the people shelling out $800 for one, though. My 512mb X1800XT can carry me along well enough until the R600 is out so I can choose between the two. Not to mention the price drops once the 8800GTX has some competition.

But until then, it looks like the 8800GTX is going to be sitting pretty as King of the Hill.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Creig
Looks like the 8800GTX is going to be a powerhouse. I can't wait to see all the official benches/pics/reviews on Nov 8th. I definitely won't be one of the people shelling out $800 for one, though. My 512mb X1800XT can carry me along well enough until the R600 is out so I can choose between the two. Not to mention the price drops once the 8800GTX has some competition.

But until then, it looks like the 8800GTX is going to be sitting pretty as King of the Hill.

BFG and PNY jump the guns on 8800 GTX ...Only $800

Nvidia's G80's I/O chip has implications for GPU futures

it does look like it will be 50% faster than the X1950 XTX . . . better than that +30% nonsense. :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: apoppin
BFG and PNY jump the guns on 8800 GTX ...Only $800

Nvidia's G80's I/O chip has implications for GPU futures

it does look like it will be 50% faster than the X1950 XTX . . . better than that +30% nonsense. :p

Try $623

http://www.atacom.com/program/atacom.cg...RDS=8800&Disp_item2.x=0&Disp_item2.y=0

link is still active at atacom for $799-$809

yours is dead

i just added five to my cart:
BFG GEFORCE 8800GTX 768MB PCI-E 2DVI HDTV TV-OUT(BFGR88768GTXE) VIDR_BFGX_80_2G $799.95 5 $3999.75

:D

oops . . . i don't even have pcie
:Q

:D
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin
BFG and PNY jump the guns on 8800 GTX ...Only $800

Nvidia's G80's I/O chip has implications for GPU futures

it does look like it will be 50% faster than the X1950 XTX . . . better than that +30% nonsense. :p

Try $623

http://www.atacom.com/program/atacom.cg...RDS=8800&Disp_item2.x=0&Disp_item2.y=0

link is still active at atacom for $799-$809

yours is dead

i just added five to my cart:
BFG GEFORCE 8800GTX 768MB PCI-E 2DVI HDTV TV-OUT(BFGR88768GTXE) VIDR_BFGX_80_2G $799.95 5 $3999.75

:D

oops . . . i don't even have pcie
:Q

:D

And they'll be on your doorstep tomorrow, won't they? :disgust:

You can either quit the pricing nonsense now, or have the favor returned when R600 is about to launch - your choice...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin
BFG and PNY jump the guns on 8800 GTX ...Only $800

Nvidia's G80's I/O chip has implications for GPU futures

it does look like it will be 50% faster than the X1950 XTX . . . better than that +30% nonsense. :p

Try $623

http://www.atacom.com/program/atacom.cg...RDS=8800&Disp_item2.x=0&Disp_item2.y=0

link is still active at atacom for $799-$809

yours is dead

i just added five to my cart:
BFG GEFORCE 8800GTX 768MB PCI-E 2DVI HDTV TV-OUT(BFGR88768GTXE) VIDR_BFGX_80_2G $799.95 5 $3999.75

:D

oops . . . i don't even have pcie
:Q

:D

And they'll be on your doorstep tomorrow, won't they? :disgust:

You can either quit the pricing nonsense now, or have the favor returned when R600 is about to launch - your choice...
Nov 8th ... i cancelled . . . no pcie slot :(

you mean IF i quit the "pricing nonsense" now, we WON'T have the favour returned at R600's launch.
:Q

i think . . . not

so . . . :p

you are not giving me a 'choice'...
it's called an ultimatum

and i am calling your bluff

:D
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Come on, you two. Initial price gouging has nothing to do with either manufacturer; it has to do with retailers. How the price goes from there will depend on MSRP, supply, performance and popularity.