8800 GT INFO / Updates in bold / MUST READ OP AND SEE LINKS FOR BENCHMARK PICS


alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
I couldn't be bothered waiting for the 8800GT anymore. I bought the 8800GTS Fatal1ty on TigerDirect for $315 shipped (AR). So I guess that means I spent $40 more than what the 8800GT would be. Hell, if it's 95% as fast as an 8800 GTX at under 1920x1200 res for most games, I think it might even be faster than the 8800GT.

The thing that pushed me over the edge was Valve and id saying they aren't interested in DX10 yet because of the poor performance associated with it. If developers are not too keen on moving to DX10, then I'm sticking with higher bandwidth over more shader power--even if it means I have to put up with a louder fan :p
 

MichaelD

Lifer
Jan 16, 2001
31,529
3
76
I'm actually a little disappointed in the single-slot solution; I like having all the hot exhaust air blown out of the case, a la a dual-slot card.

It's not like anyone actually uses PCI slots in a gaming rig anymore. And if you do use them, there's always 1 or 2 open down at the other end of the board. :)

Additionally, has there been any mention of power requirements? Single 6-pin? Dual 6-pin? 8-pin? Decent 550W enough?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
80W for the lower end (8800GT 256 or 2950GT-type cards) up to as high as 130W for 2950 XT-type units. IIRC, they are all PCIe 2.0 cards, so they won't even need 6-pin power at stock speeds. I'm sure they'll have the plugs, however :(
 

aiya24

Senior member
Aug 24, 2005
540
0
76
Originally posted by: bryanW1995
Originally posted by: Skott
In about 12 days we'll know what the real 8800GT performance differences are and I guess about a week or two after that we'll see how the new 8800GTS looks. I hope to build a new rig about mid-November, so these new cards come out at a good time for me. I can't wait to see how they do.
yeah, your rig is clearly outdated. Every time I get 22fps on my 1950xt, I think, "Well, at least I don't have skott's rig."

lol good one bryan :laugh:

i'm kinda glad i'm not playing games very often as of late, so i don't really need a new card till the holidays.
 

MichaelD

Lifer
Jan 16, 2001
31,529
3
76
Originally posted by: bryanW1995
80W for the lower end (8800GT 256 or 2950GT-type cards) up to as high as 130W for 2950 XT-type units. IIRC, they are all PCIe 2.0 cards, so they won't even need 6-pin power at stock speeds. I'm sure they'll have the plugs, however :(

80W? That's nothing. My 7900GTO surely must pull more than that. How can they increase the power of the card but decrease the electrical requirements? Die shrink? Smaller manufacturing process...65nm vs 90nm or whatever....?
 

Rusin

Senior member
Jun 25, 2007
573
0
0
MichaelD:
The GeForce 7900 GTX consumed something like 84W at full load. I think your 7900 GTO pulled less than 80W since its memory is clocked lower (and wasn't there a voltage difference between the GTO and GTX?)

But:
that 84W for the 7900 GTX was its actual power consumption under load. The 80W for the HD2950 Pro and 110W for the 8800 GT are TDP numbers -> real consumption will be far less.

For example:
the 8800 GTS 640MB has a TDP of 143W, but in real life it typically consumes only around 105W at full load.

The 8800 GTX has a TDP of 185W, but typically consumes around 130W; the highest peak values seen in tests are something like 145W.
 

Nanobaud

Member
Dec 9, 2004
144
0
0
Originally posted by: Cookie Monster
Originally posted by: aka1nas
Is the G92 supposed to be double-precision capable?

Yes but not for consumer level cards.

Excuse me if this is obvious, but what does that mean?

I was hoping for double precision so I could finally do some interesting things with GPGPU/CUDA (supposed to work on all G8x and up GPUs), but not if I have to buy a new Quadro.

I don't know that much about hardware impact on gaming experience, but it seems like double-precision SPs (coupled with the appropriate drivers, of course) could have a major impact on game performance when using HDR. Why would they put it on the die then somehow restrict it to single-precision (and how?) for GeForce cards? Plus, wouldn't double-precision on gaming cards then benefit from hi-res textures, creating even more reason to buy larger-memory, more profitable, cards?
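
For the CUDA side of that question, here's a minimal sketch (my own illustration, using only the standard CUDA runtime API) of how one might check a device's double-precision support at runtime; double precision is only exposed on devices reporting compute capability 1.3 or higher.

// Sketch: query each device's compute capability with the CUDA runtime API.
// Double precision requires compute capability >= 1.3.
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        int has_fp64 = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        printf("Device %d: %s (compute %d.%d), double precision: %s\n",
               i, prop.name, prop.major, prop.minor, has_fp64 ? "yes" : "no");
    }
    return 0;
}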

nBd
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
It sounds like they will restrict it to the Teslas, just like some CAD features only run on Quadros. Might be fixable with the right drivers and BIOS images if we're lucky. I don't think game devs are all that interested in double-precision floats at this point. It is a must-have feature for a lot of HPC applications, though.

I'm doing my thesis on some CUDA-related work, so I'd be willing to trade up for G92 GTS models if they support more CUDA features.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: aka1nas
It sounds like they will restrict it to the Teslas, just like some CAD features only run on Quadros. Might be fixable with the right drivers and BIOS images if we're lucky. I don't think game devs are all that interested in double-precision floats at this point. It is a must-have feature for a lot of HPC applications, though.

I'm doing my thesis on some CUDA-related work, so I'd be willing to trade up for G92 GTS models if they support more CUDA features.

You're right. What I meant was that the upcoming G92-based cards, e.g. the 8800GT, i.e. consumer-level cards, won't have DP enabled, while it will be enabled for the professional-level cards, i.e. workstation/Quadro/Tesla etc.

 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: Rusin
http://www.tcmagazine.com/comm...shownews=16480&catid=2
Foxconn lists 8800 GT 512MB. Nothing new here. Simple mistake there..it says that 8800 GT would use G96-chip, but pictures have already confirmed that it uses G92-A2 revision.

How do we know for a fact the pictures of the G92 chip were for the GT? We don't. I want it to be, don't get me wrong. I want 112SPs, 512mb for $250. Steal if you ask me. I just don't see it happening.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: PC Surgeon
Originally posted by: Rusin
http://www.tcmagazine.com/comm...shownews=16480&catid=2
Foxconn lists 8800 GT 512MB. Nothing new here. Simple mistake there..it says that 8800 GT would use G96-chip, but pictures have already confirmed that it uses G92-A2 revision.

How do we know for a fact the pictures of the G92 chip were for the GT? We don't. I want it to be, don't get me wrong. I want 112SPs, 512mb for $250. Steal if you ask me. I just don't see it happening.
Well, we have seen the 8800 GT, and we have seen the heatsink coming off, and there was a G92-A2 core inside.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: Rusin
Originally posted by: PC Surgeon
Originally posted by: Rusin
http://www.tcmagazine.com/comm...shownews=16480&catid=2
Foxconn lists 8800 GT 512MB. Nothing new here. Simple mistake there..it says that 8800 GT would use G96-chip, but pictures have already confirmed that it uses G92-A2 revision.

How do we know for a fact the pictures of the G92 chip were for the GT? We don't. I want it to be, don't get me wrong. I want 112SPs, 512mb for $250. Steal if you ask me. I just don't see it happening.
Well, we have seen the 8800 GT, and we have seen the heatsink coming off, and there was a G92-A2 core inside.

Where?
 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Originally posted by: bryanW1995
Originally posted by: Skott
In about 12 days we'll know what the real 8800GT performance differences are and I guess about a week or two after that we'll see how the new 8800GTS looks. I hope to build a new rig about mid-November, so these new cards come out at a good time for me. I can't wait to see how they do.
yeah, your rig is clearly outdated. Every time I get 22fps on my 1950xt, I think, "Well, at least I don't have skott's rig."

I never said mine was outdated. Just said I was building a new one. More accurately another system for a special project. I'm looking forward to what the new cards bring to the table. If they bring anything that is. /shrug

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
sorry, forgot the :) I'm just jealous that you have "special projects" that require new system builds...
 

math20

Member
Apr 28, 2007
190
0
0
Will the 256 bit memory start to perform worse than 320 bits at 1920x1200? Also, what do the bits mean? :)
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
I think it will depend on the game. Higher res is probably affected by frame buffer size more than anything, which a 512MB part should handle easily (better than a 320MB GTS), but I can imagine that in RPGs where brute force seems to be the better option, like Oblivion or NWN2, you might get better performance with a wider memory bus.
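
As a rough back-of-the-envelope on the frame-buffer point (my own sketch, assuming 4 bytes per colour sample and 4 bytes per depth sample; textures and render targets come on top, so it's a lower bound):

/* Raw colour + depth storage at 1920x1200 with 4xAA. */
#include <stdio.h>

int main(void)
{
    const double w = 1920.0, h = 1200.0, bytes_per_sample = 4.0, msaa = 4.0;
    double color_mb = w * h * bytes_per_sample * msaa / (1024.0 * 1024.0);
    double depth_mb = w * h * bytes_per_sample * msaa / (1024.0 * 1024.0);
    printf("1920x1200 @ 4xAA: %.0f MB colour + %.0f MB depth = %.0f MB before textures\n",
           color_mb, depth_mb, color_mb + depth_mb);
    return 0;
}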
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: math20
Will the 256 bit memory start to perform worse than 320 bits at 1920x1200? Also, what do the bits mean? :)

Wikipedia :)

I'll start you off: 1 byte = 8 bits.
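
To spell it out a bit more (my own rough sketch; the effective memory clocks below are ballpark figures for a 256-bit GT-class card and a 320-bit GTS-class card, not confirmed specs): bandwidth is just the bus width in bytes times the effective memory clock.

/* What the bus-width "bits" translate to in memory bandwidth. */
#include <stdio.h>

static double bandwidth_gb_s(int bus_bits, double effective_mhz)
{
    /* bytes moved per transfer * transfers per second, reported in GB/s */
    return (bus_bits / 8.0) * effective_mhz * 1e6 / 1e9;
}

int main(void)
{
    printf("256-bit @ 1800 MHz effective: %.1f GB/s\n", bandwidth_gb_s(256, 1800.0));
    printf("320-bit @ 1600 MHz effective: %.1f GB/s\n", bandwidth_gb_s(320, 1600.0));
    return 0;
}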
 

math20

Member
Apr 28, 2007
190
0
0
Originally posted by: Cookie Monster
Originally posted by: math20
Will the 256 bit memory start to perform worse than 320 bits at 1920x1200? Also, what do the bits mean? :)

Wikipedia :)

I'll start you off: 1 byte = 8 bits.

Yeah, but what does that mean functionally in terms of GPU memory?
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
These benchmarks are all over the place.

The one interesting thing I noted, though, was the low SM3.0 score on the GT vs the Ultra's score, despite the fact that shader output is very close between the two cards (~505 vs ~575 gigaflops?). The score differential is more than one would expect (it's closer to the GTS vs GTX differential, though the GTS's shader output is only in the 300 GFLOPS range). It seems a bit odd that with so much SP power it struggles with SM3.0 (and frame buffer size is not an issue at 1280x1024 with 4xAA/4xAF and 512MB of RAM). I'm guessing there will be places where the limitations of the 256-bit bandwidth will definitely be felt.
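
(For reference, those gigaflop figures fall out of SPs x shader clock x 3 ops per clock, the usual MADD+MUL counting; here's a quick sketch, keeping in mind the GT's 112 SPs @ ~1500MHz are still rumoured numbers at this point.)

/* Theoretical shader throughput: MADD counts as 2 flops, the co-issued MUL as 1. */
#include <stdio.h>

static double gflops(int sps, double shader_mhz)
{
    return sps * shader_mhz * 3.0 / 1000.0;
}

int main(void)
{
    printf("8800 GT (rumoured): 112 SPs @ 1500 MHz -> %.0f GFLOPS\n", gflops(112, 1500.0));
    printf("8800 Ultra:         128 SPs @ 1512 MHz -> %.0f GFLOPS\n", gflops(128, 1512.0));
    return 0;
}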

But it seems like good news for OCers: if the GPU-Z specs are anything close to reality, a 300MHz core and 750MHz shader OC! That's insane.

EDIT: After checking out Tom's Hardware VGA charts, I see that AA/AF is very bandwidth intensive--at higher-res AA/AF in 3DMark06 the 7950GX2 jumps up and handily beats the 8800GTS.

This makes me even more curious about the supposed new benchmarks where the GT is beating all the GTSes and sometimes surpassing the GTX/Ultra, because all those gaming benchmarks are with 4xAA/8xAF enabled. You would think bandwidth would be a huge issue at those settings. It's possible the GT they are benching is in fact an OCed one.