The New and Improved "G80 Stuff"


lopri

Elite Member
Jul 27, 2002
13,209
594
126
Originally posted by: Skott
Sorry to pull the subject away from the political side of things, but does anyone know what the 8800GTX will draw in amps? Kinda curious what's the minimum amperage needed on a 12V rail for one of these beasts. Or for two beasts in SLI? This inquiring mind wants to know.

Thanks,

Skott
Under the current configuration, the most power the GTX could possibly draw is 225W (75W from the PEG slot, plus 75W from each of the two 6-pin connectors). So the actual max draw should be somewhere around 150~200W.

 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Yeah, I knew about that wattage. I was trying to find out the amps. I know there's a formula to figure it out; I just don't know it. I hear up to 18 amps, but was wondering if that was correct or not.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Amperage? Since video cards draw from the 12V rail, just divide the wattage by 12.

W = V x A

So 18A on 12V would be 216W.
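
For anyone who wants to plug in their own numbers, here is a minimal Python sketch of that formula, using only the wattage figures quoted in this thread:

```python
# P = V * I, so I = P / V. Figures below are the ones quoted
# in this thread, not official specs.

RAIL_VOLTAGE = 12.0  # the 12V rail video cards draw from

def amps_from_watts(watts: float, volts: float = RAIL_VOLTAGE) -> float:
    """Current drawn at a given power: I = P / V."""
    return watts / volts

# Worst case the connectors can deliver to one GTX:
# 75W from the PEG slot + 75W from each of two 6-pin plugs.
max_delivery_w = 75 + 75 + 75
print(amps_from_watts(max_delivery_w))   # 18.75A absolute ceiling

# The 150~200W realistic draw estimated above:
print(amps_from_watts(150), amps_from_watts(200))  # 12.5A to ~16.7A

# And the 18A figure Skott heard, converted back to watts:
print(18 * RAIL_VOLTAGE)                 # 216.0W
```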
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: ShadowOfMyself
Originally posted by: Cookie Monster
With SLI, could we see 20K being broken in 3DMark06?
With a heavily OCed Kentsfield, most probably
Gibbo of OcUK says more like 16k with 8800GTX SLI; no mention of the processor, but I am assuming he is talking about the Core 2 Extreme X6800. With a quad core overclocked to hell, I think 20k would definitely be possible. :)
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Is there any information regarding the little chip next to the IHS'ed G80 on the 8800 GTX? I wonder if it's something like the eDRAM on Xenos? I know we're getting a new method of AA (VCAA), so could it be that the little chip is there for Virtually Cost-free Anti-Aliasing? :D
 

nrb

Member
Feb 22, 2006
75
0
0
Originally posted by: lopri
Is there any information regarding the little chip next to the IHS'ed G80 on the 8800 GTX? I wonder if it's something like the eDRAM on Xenos? I know we're getting a new method of AA (VCAA), so could it be that the little chip is there for Virtually Cost-free Anti-Aliasing? :D
It's a RAMDAC. There is some speculation that it might contain hardware MPEG4 acceleration too, but that's just speculation so far.

 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Ulfhednar
Originally posted by: ShadowOfMyself
Originally posted by: Cookie Monster
With SLI, could we see 20K being broken in 3DMark06?
With a heavily OCed Kentsfield, most probably
Gibbo of OcUK says more like 16k with 8800GTX SLI; no mention of the processor, but I am assuming he is talking about the Core 2 Extreme X6800. With a quad core overclocked to hell, I think 20k would definitely be possible. :)

Can 3DMark even use 4 threads?
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Someone on B3D suggested the mysterious chip next to the gigantic G80 core (well, its IHS, to be exact) could be there to make multi-monitor SLI support possible. I think it's brilliant and makes a lot of sense.

Discuss. :D
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
A question for anyone who surely knows more about G80 than I do ...

Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?

Just curious. It might be a strong factor in my decision if I ever intend to buy G80, since, for me, AF is much more important than AA.

Thanks.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Zenoth
A question for anyone who surely knows more about G80 than I do ...

Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?

Just curious. It might be a strong factor in my decision if I ever intend to buy G80, since, for me, AF is much more important than AA.

Thanks.

Most probably yes. Actually 99% certain.
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: Cookie Monster
Originally posted by: Zenoth
A question for anyone who surely knows more about G80 than I do ...

Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?

Just curious. It might be a strong factor in my decision if I ever intend to buy G80, since, for me, AF is much more important than AA.

Thanks.

Most probably yes. Actually 99% certain.

99% certain because it was mentioned somewhere, or ... ?

I don't want to make it sound like you don't know; I just want to be sure myself. Of course the best I can do is wait until G80 is officially announced (which will be November 8th, right?). I was just curious whether it was ever mentioned somewhere. I haven't followed the buzz around G80 much.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
I'm just rofl'ing at the nvidiots salivating over the fantasy of R600 being the last GPU from ATI .. :laugh:
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?
I haven't seen this mentioned anywhere so I doubt it. In fact I'd expect the AF to be exactly the same as it is now on the G7x series.

Most probably yes. Actually 99% certain.
Based on what?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?
I haven't seen this mentioned anywhere so I doubt it. In fact I'd expect the AF to be exactly the same as it is now on the G7x series.

Most probably yes. Actually 99% certain.
Based on what?

Well then, you haven't been looking in the right places, have you?
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: BFG10K
Will G80 be capable of non-angle-dependent anisotropic filtering, à la ATi's high-quality AF?
I haven't seen this mentioned anywhere so I doubt it. In fact I'd expect the AF to be exactly the same as it is now on the G7x series.

Most probably yes. Actually 99% certain.
Based on what?

That's what I was thinking as well.

Of course the official announcement of G80's technical specifications and capabilities will answer that next week, I presume.

But, personally, for this generation I went with ATi because of HQ AF (well, not the only reason, but it played a major role). I use 16x AF in all my games over any amount of AA, since I prefer overall texture quality over smoothing simple jaggies; it's a matter of personal preference, really. If G80 isn't capable of the same AF quality as ATi's HQ mode, then I will wait for R600, see if it still has it, and then make my decision on which one to go with. I guess I am part of a minority that doesn't use AA much, not because my GPU isn't capable of it, but simply because I couldn't care less about AA in any form or shape.

However, there is one thing I am looking forward to with G80: seeing whether it will improve on, or at least keep, its great UltraShadow feature, which significantly reduces the amount of processing needed to render a scene with complex shadows (I remember it being commonly advertised for Doom 3 when that came out). That's something ATi's past and current GPUs can't do as well (I am not saying they can't do it "well", but certainly not as efficiently as GeForce 7 can with UltraShadow II), and it occasionally forces me to turn shadow detail down in games, or rarely to turn it off completely (I am thinking of F.E.A.R. here, even though I no longer own it; not because I couldn't use shadows without playing a slideshow, but simply because the game bored me to death within a few weeks).
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
I'm fairly certain that this time around NV will come out on top in IQ (well, at least until R600 shows up). Why? Look around at the review sites that have started benching ATI's default quality against NV's High Quality, to put them on the same starting line. In all honesty, I don't think NV would let that happen unless G80 settled any question regarding NV's IQ. Besides, it's a good selling point for the otherwise so-so (relatively speaking) performance of G80.

I know I have this bad habit of looking at things from an overly sarcastic point of view, but I just can't believe that NV all of a sudden gave up their wrath over media exposure.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Umm.. conspiracy theories aside, let's talk about why NV would take the RAMDAC (a huge one at that) out of the GPU. I think multi-monitor SLI support explains it quite nicely.
 

Polish3d

Diamond Member
Jul 6, 2005
5,501
0
0
$500 for the midrange model now? LOL, are you fking serious? And nearly $700 for the GTX? Last year it was $499 for the GTX. I'm not paying $700 for a video card. That's probably the point where I say forget it.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
The 7800 GTX debuted at $599 USD. It's crazy how quickly these cards devalue.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Early adopters will always pay over the odds, especially with something as complex as G80.

Two large dies (one enormous), 12 RAM chips, and a very complex PCB don't exactly come cheap.

If you want to play, you've gotta pay (or wait), and those who think AMD's R600 will be much cheaper are likely in for a rude shock as well.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Some info from 3DCenter.de (a highly respected site). Unfortunately, an exact link to the info wasn't provided in the post.

All info is summarized from the PCGH special "GeForce 8800", issue 12/2006, p. 54ff.

A few facts:
- 681 million transistors
- new ROPs with FP32 + MSAA
- 128 stream processors at 1350 MHz (1 scalar op per clock) = 518.4 GFLOP/s
- WUAF (angle-independent AF)
- improved dynamic branching
- improved early-Z

Stream processors:
Until now, GPUs (vector processors) computed a whole RGBA value at once. But since all four components aren't always needed, the ALUs no longer have to be split 3:1 or 2:2, which increases efficiency. A scalar is essentially a single-channel vector, so the shader compiler has an easier job.
For cache-management reasons, the 128 stream processors are grouped into blocks of 16 ALUs. Each of these ALUs can execute one MADD and one MUL per clock.
Counting pure MADD throughput alone, that works out to 345.6 GFLOP/s.

Texture units:
Eight bilinear texture units are integrated per group, but only 4 texture address processors per group. In total, 64 bilinear texels can therefore be computed per clock, but only 32 addressed. The remaining TMUs can sample additional data via a coordinate offset if at least one dimension matches the first (e.g. 2:1 bilinear AF).
The TMUs have their own address unit; the pixel shader no longer has to be used for addressing, which takes the edge off TMU latency.

ROPs:
There are 6 ROP clusters, each connected via 64 bits at 900 MHz, hence the 384-bit bus. The GTS has one cluster fewer. Each ROP handles four pixels with color and texture info. Z and stencil ops have been tuned further.

AA:
There will be 4x/8x/16x MSAA. New is an optional "Coverage Sampling AA". This is a compression scheme, though reportedly not always compatible. The coverage of the 16 randomly distributed subsamples is stored as a boolean value (true/false), and the result is compressed down to 4x or 8x MSAA.
128 scalar MADD+MUL units - 518.4 GFLOP/s
Groups of 16
4 texture address calculators per group - 32 total
8 TMUs per group - 64 total
6 ROP clusters (how many ROPs in a cluster? One?) / 4 samples per clock (minimum 4xAA?) Double Z/stencil?
New Coverage Sampling AA not universally compatible

Perhaps someone who speaks German can summarize the post better than the mess Google Translate outputs.
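
The headline numbers in that quote can be sanity-checked with a few lines of arithmetic. Here is a minimal Python sketch, assuming a MADD counts as two flops and a MUL as one, and that the 900 MHz GDDR3 transfers data twice per clock; the bandwidth figure is my own derivation from the quoted bus numbers, not something stated in the article:

```python
# Re-deriving the headline G80 numbers from the quoted PCGH specs.

SP_COUNT = 128           # scalar stream processors
SHADER_CLOCK_GHZ = 1.35  # 1350 MHz shader clock

# Assumes each ALU issues one MADD (2 flops) plus one MUL (1 flop)
# per clock, matching the quoted figures.
madd_mul_gflops = SP_COUNT * SHADER_CLOCK_GHZ * (2 + 1)  # 518.4
madd_only_gflops = SP_COUNT * SHADER_CLOCK_GHZ * 2       # 345.6

# Six ROP clusters, each on a 64-bit channel, give the 384-bit bus.
ROP_CLUSTERS = 6
BITS_PER_CLUSTER = 64
MEM_CLOCK_MHZ = 900      # quoted memory clock

bus_width_bits = ROP_CLUSTERS * BITS_PER_CLUSTER          # 384
# Assumed double data rate; /8 for bytes, /1000 for GB/s.
bandwidth_gb_s = bus_width_bits / 8 * MEM_CLOCK_MHZ * 2 / 1000

print(round(madd_mul_gflops, 1), round(madd_only_gflops, 1))  # 518.4 345.6
print(bus_width_bits, bandwidth_gb_s)                         # 384 86.4
```

Both GFLOP/s figures come out matching the article exactly, which suggests the "1 scalar op per clock" line really does mean one MADD+MUL issue per ALU per clock.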
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: lopri
Someone on B3D suggested the mysterious chip next to the gigantic G80 core (well, its IHS, to be exact) could be there to make multi-monitor SLI support possible. I think it's brilliant and makes a lot of sense.

Discuss. :D

i think it's there . . . for a 'conversation piece'

:Q

actually it has no purpose other than to make people speculate about it :p

it's probably only a couple of bucks . . . and well worth it from what i can see

:D

... psst . . . in case you didn't realize . . . slightly over 11 days to know "for sure".

:laugh: