8800GTX preview

Page 6

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Less hype, less markup - that's what I'm hoping for in the GTS. The GTX is likely to see some of that simply because it's the flagship and will be the fastest single card available.
 

ForumMaster

Diamond Member
Feb 24, 2005
7,792
1
0
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the G80 not using a ridiculous amount of power; even my 430 watt PSU could easily handle it.

I agree, but why the dual 6-pin PCIe plugs if it only draws 4% more juice than an X1950XT?

To distribute the load across two rails. It allows a weaker PSU to handle such a monster.
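
For some rough context on why two plugs matter: the 75 W slot and connector limits below are the standard PCI Express figures, while the card's total draw is just an assumed ballpark, so treat this as a back-of-the-envelope sketch rather than a spec sheet.

```python
# Rough power-budget arithmetic. The 75 W slot and 6-pin limits are the usual
# PCI Express figures; the card's total draw is an assumed ballpark, not a
# measured number.

SLOT_LIMIT = 75        # watts the PCIe x16 slot can supply
SIX_PIN_LIMIT = 75     # watts per 6-pin PCIe power connector
CARD_DRAW = 150        # assumed worst-case board power (watts)

one_plug = SLOT_LIMIT + SIX_PIN_LIMIT        # 150 W total - no headroom at all
two_plugs = SLOT_LIMIT + 2 * SIX_PIN_LIMIT   # 225 W total - comfortable margin

print(f"one 6-pin plug:  {one_plug} W available, {one_plug - CARD_DRAW} W to spare")
print(f"two 6-pin plugs: {two_plugs} W available, {two_plugs - CARD_DRAW} W to spare")
```

With one plug the budget would sit right at the assumed draw, so a second plug buys margin and, as noted above, lets the load be spread over two rails.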
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with its rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.

I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.

Click the link.

My arms are broken, forgive me.
All sarcasm aside, that was a poorly worded paragraph from the link.

If it's 80nm, that's all the better.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
Link

Someone has picked up an 8800GTX and is benching it!!

Quick results -

Fear 1600x1200, all in-game settings maxed out with soft shadows enabled.
FEAR bench one
Min: 41
Avg: 83
Max: 197

Note - 100% of frames over 40 FPS.

F.E.A.R.

All possible settings maxed. 4xAA, 16xAF, 1600x1200, everything maximum, soft shadows on.

Fear Bench Two

http://s2.supload.com/image.php?get=fear1600x1_b7c75e2203a3dca51.jpg
Min: 40
Avg: 76
Max: 175

Note - SOFT SHADOWS :Q (I thought soft shadows can't be done with AA?)

Fear Bench Three

FEAR, 1600x1200, 4xAA, 16xAF, all settings maxed except soft shadows.
Min: 41
Avg: 81
Max: 185

Fear Bench Four

FEAR maxed out at 1600x1200 with 16xAF, 16xAA:

Min: 16
Avg: 34
Max: 84

Pretty impressive.

How the control panel looks using the 8800GTX


3DMark06 on an A64 4000+
~6500, but check out the SM 3.0 and SM 2.0 scores. Impressive.

3DMark05

~12000 on the A64 4000+

Some Oblivion performance:
Anyways, here's some Oblivion at 1600x1200, default in-game settings, HDR on, AA off, distant landscape, buildings, and trees enabled in the control panel:

60 FPS average in the foliage area, 43 FPS average in the Oblivion gate area.

Okay, at 1600x1200 in Oblivion with all the sliders and options maxed and HDR on (no AA), it gets on average:

30 FPS at the Oblivion gate

42 FPS in the foliage area


Pretty damn good if I do say so myself.

Final note - all of this is using beta drivers never intended for the public. I guess the shipping/review drivers are a lot different from this one.
:confused:

What happened to the claims of HDR+AA?

Also, what is the AF? If they left it on application-controlled in the driver control panel for their FEAR benches, I'm willing to bet that they didn't think it was much different from the 7 series' AF and just left it on 16x anisotropic. Otherwise, there would probably be a setting in the control panel for any new AF feature that they would want to turn on.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
ShaderMark v2.1 test

8800GTX
Resolution: 1600x1200

shader 2 1860
shader 3 1426
shader 4 1375
shader 5 1178
shader 6 1455
shader 7 1303
shader 8 967
shader 9 2188
shader 10 1984
shader 11 1335
shader 12 808
shader 13 929
shader 14 1152
shader 15 720
shader 16 630
shader 17 995
shader 18 101
shader 19 446
shader 20 168
shader 21 201
shader 22 192
shader 23 192
shader 24 192
shader 25 255
shader 26 257

Could anyone with an X1900XTX/XT or 7950GX2 bench ShaderMark 2.1 for comparison?

It would be great if someone could bench it.

Note - not sure if the CPU affects the score.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
Josh wait for the reviews.

Do you really think that guy knows everything about the card? :)

I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh wait for the reviews.

Do you really think that guy knows everything about the card? :)

I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.

Just an average joe that bought the card from his closest retailer. ;)

It's the retailer that shouldn't be selling the cards...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh wait for the reviews.

Do you really think that guy knows everything about the card? :)

I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.

Just an average joe that bought the card from his closest retailer. ;)

It's the retailer that shouldn't be selling the cards...

Lucky SOAB!! :p

Still, he obviously knows the importance of benches, and for him to gloss over the fact that the Oblivion test on a G80 used HDR with no AA and still say that it's "pretty damn good" is pretty lame. The lack of AF information also leaves me with a sense of doubt that they changed much from the 7 series, but we'll see.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Looks like the 8800GTS at $449 might be the star of this month's release. Similar to the 6800GT and the 7800GT at their launches alongside the flagship models.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.

???

If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.
 

BlizzardOne

Member
Nov 4, 2006
88
0
0
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.

???

Those results were with Cat A.I. set to Standard. Setting it to Advanced would seem to force AFR (and may or may not also enable some shader replacements), resulting in up to a 130% improvement for the CF'd setup, as seen here...

Results with Cat A.I. = Advanced

Cheers :beer:

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Cookie Monster
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh wait for the reviews.

Do you really think that guy knows everything about the card? :)

I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.

Just an average joe that bought the card from his closest retailer. ;)

It's the retailer that shouldn't be selling the cards...

An average joe has an IGP in a Dell. He's not someone who has a next-gen, top-of-the-line video card that isn't even on sale anywhere yet. The average joe isn't someone who would go on sites like this and post information about said video card. I too would like to see if the G80 actually supports HDR+AA. Hopefully it's not too long before the launch, so I guess we will just have to see.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.

???

If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.

Wow, did anyone miss the fact that Nvidia's cards can't use the Chuck patch? :disgust:

Before you guys start jumping on what Nvidia might have screwed up, think the situation through thoroughly... :(
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Originally posted by: ForumMaster
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the G80 not using a ridiculous amount of power; even my 430 watt PSU could easily handle it.

I agree, but why the dual 6-pin PCIe plugs if it only draws 4% more juice than an X1950XT?

To distribute the load across two rails. It allows a weaker PSU to handle such a monster.

I thought this as well, but jonnyguru said that's not how PSUs work (I don't know this stuff well); most split the 12V rail between all the PCI-e connectors. I guess PSUs with multiple 12V rails (e.g. ones with four) would be OK?
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: gramboh

I thought this as well, but jonnyguru said that's not how PSUs work (I don't know this stuff well); most split the 12V rail between all the PCI-e connectors. I guess PSUs with multiple 12V rails (e.g. ones with four) would be OK?

Even if both PCI-e connectors are on the same rail, two connectors on the card are better than one.

A good deal of voltage drop is caused by the resistance through the actual wires. The greater the amperage being delivered through the wires, the more the voltage is going to drop. So if you have two connectors on two different sets of cables, you will have less of a drop in voltage at the end of each connector.
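
To put rough numbers on that: the wire resistance and current figures below are assumptions for illustration, not measurements of any actual card or PSU, so read this as a minimal Ohm's-law sketch.

```python
# Ohm's law sketch: V_drop = I * R. Splitting the load across two cable sets
# halves the current in each one, so each connector sees roughly half the drop.
# The resistance and current values are assumptions for illustration only.

WIRE_RESISTANCE = 0.02   # ohms, assumed round-trip resistance of one cable set
CARD_CURRENT = 10.0      # amps of 12V current drawn by the card (assumed)

def voltage_drop(amps, ohms=WIRE_RESISTANCE):
    """Voltage lost in the cabling at the given current."""
    return amps * ohms

single = voltage_drop(CARD_CURRENT)       # all current through one cable set
dual = voltage_drop(CARD_CURRENT / 2)     # load shared by two cable sets

print(f"one connector:  {single:.2f} V lost in the cable run")
print(f"two connectors: {dual:.2f} V lost in each cable run")
```

With those assumed figures the single-cable case loses about 0.2 V and the dual-cable case about 0.1 V per run, which is the effect described above.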


 

CP5670

Diamond Member
Jun 24, 2004
5,666
765
126
Those performance increases look awesome, but as others have said there is something off about the X1950XTX benchmarks they have given for comparison. I suppose we'll know the real story soon enough.

Someone just posted this on the General Forum:

http://www.ewiz.com/detail.php?name=MSI-880GTX

$643.75 @ eWiz.

A mite pricey.

If you go to it through Froogle, you actually get a slightly lower price ($625). Those prices are quite good if they're indicative of what all the other retailers are going to do. I was expecting a lot more gouging.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Nightmare225
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.

???

If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.

Wow, did anyone miss the fact that Nvidia's cards can't use the chuck patch. :disgust:

Before you guys start jumping on what Nvidia might have screwed up, think the situation through thoroughly... :(
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?

If he's using beta G80 drivers, quite possibly the drivers that will be used for the card's launch in 4 days, why couldn't the AA be forced through the driver's control panel and HDR turned on in the game? I mean, are nVidia's latest drivers, the betas for the G80, behind ATi's 6.10s since they don't allow HDR+AA?
 

thilanliyan

Lifer
Jun 21, 2005
12,059
2,272
126
Originally posted by: josh6079
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?

Who knows if that's the case, but I'd love to see "certain" people's reactions if it isn't supported officially.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: josh6079
Originally posted by: Nightmare225
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.

???

If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.

Wow, did anyone miss the fact that Nvidia's cards can't use the chuck patch. :disgust:

Before you guys start jumping on what Nvidia might have screwed up, think the situation through thoroughly... :(
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?

If he's using beta G80 drivers, quite possibly the drivers that will be used for the card's launch in 4 days, why couldn't the AA be forced through the driver's control panel and HDR turned on in the game? I mean, are nVidia's latest drivers, the beta's for the G80, behind ATi's 6.10's since they don't allow HDR+AA?

Considering it's their first driver that could implement it, you should give them a break. I mean, how long did it take ATi to integrate it into their drivers, eh?

I don't really care if it's a hack or part of the drivers, just that I can get my hands on it and that it works. If that means a few months of it being a hack, then fine; I'll just expect the same pettiness from ATi supporters as we've seen from nVidia supporters.