Dethfrumbelo
Golden Member
Less hype, less markup - that's what I'm hoping for in the GTS. The GTX is likely to see some of that simply because it's the flagship and will be the fastest single card available.
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using a ridiculous amount of power; even my 430 watt psu would easily handle it.
I agree, but why the dual 6-pin PCIe power plugs if it only draws 4% more juice than an X1950XT?
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.
Link
While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.
The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.
This could explain the lower power consumption.
I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.
Click the link.
Originally posted by: Cookie Monster
Link
Someone has picked up an 8800GTX and is benching it!!
Quick results -
FEAR, 1600x1200, all in-game settings maxed out with soft shadows enabled.
FEAR bench one
Min: 41
Avg: 83
Max: 197
Note - 100% over 40 FPS.
F.E.A.R.
All possible settings maxed. 4xAA, 16xAF, 1600x1200, everything maximum, soft shadows on.
Fear Bench Two
http://s2.supload.com/image.php?get=fear1600x1_b7c75e2203a3dca51.jpg
Min: 40
Avg: 76
Max: 175
Note - SOFT SHADOWS :Q (I thought SS can't be done with AA?)
Fear Bench Three
FEAR, 1600x1200, 4xAA, 16xAF, all settings maxed except soft shadows.
Min: 41
Avg: 81
Max: 185
Fear Bench Four
FEAR maxed out at 1600x1200 with 16xAF, 16xAA:
Min: 16
Avg: 34
Max: 84
Pretty impressive.
What the control panel looks like using the 8800GTX
3DMark06 on an A64 4000+
~6500, but check out the SM3.0 and SM2.0 scores. Impressive.
3DMark05
~12000 on the A64 4000+
Some Oblivion performance:
Anyways, here's some Oblivion at 1600x1200, default settings in game, HDR on, AA off, distant landscape, buildings, and trees in control panel:
60FPS average in the foliage area, 43FPS average in the Oblivion gate area.
Okay, at 1600x1200 in Oblivion with all the sliders and options maxed, and HDR (no AA) it gets on average:
30 FPS at Oblivion gate
42 FPS in foliage area
Pretty damn good if I do say so myself.
Final note - All this is using beta drivers never intended for the public. I guess the shipping/review drivers are a lot different from this one.
Originally posted by: Cookie Monster
Josh, wait for the reviews.
Do you really think that guy knows everything about the card?
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh, wait for the reviews.
Do you really think that guy knows everything about the card?
I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.
Originally posted by: Cookie Monster
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh, wait for the reviews.
Do you really think that guy knows everything about the card?
I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.
Just an average joe that bought the card from his closest retailer.
It's the retailer that shouldn't be selling the cards...
Originally posted by: wanderer27
Someone just posted this on the General Forum:
http://www.ewiz.com/detail.php?name=MSI-880GTX
$643.75 @ eWiz.
A mite pricey
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.
???
Originally posted by: Cookie Monster
Originally posted by: josh6079
Originally posted by: Cookie Monster
Josh, wait for the reviews.
Do you really think that guy knows everything about the card?
I'd hope he'd know how to use HDR+AA on a card that has been said to be able to do it. How does he have the card already anyways? He must know something about hardware and what features upcoming technology is said to support if he's already using a G80.
Just an average joe that bought the card from his closest retailer.
It's the retailer that shouldn't be selling the cards...
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.
???
If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.
Originally posted by: ForumMaster
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using a ridiculous amount of power; even my 430 watt psu would easily handle it.
I agree, but why the dual 6-pin PCIe power plugs if it only draws 4% more juice than an X1950XT?
To distribute the load across two rails; it allows a weaker PSU to handle such a monster.
Originally posted by: gramboh
I thought this as well, but Jonnyguru said that's not how PSUs work (I don't know this well) - most split the 12V rail between all the PCI-e connectors. I guess PSUs with multiple 12V rails (e.g. ones with 4) would be ok?
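For what it's worth, here's a rough back-of-the-envelope sketch of the power budget behind the dual 6-pin question. The 75 W figures are the commonly cited limits for the PCIe slot and each 6-pin connector; the 145 W board draw is purely an assumed, illustrative number (the thread only says roughly 4% more than an X1950XT), not a measurement.

```python
# Rough power-budget sketch for a dual 6-pin card (illustrative numbers only).
# Assumes the commonly cited limits: ~75 W from the PCIe slot, ~75 W per 6-pin plug.

SLOT_LIMIT_W = 75            # power available through the PCIe x16 slot
SIX_PIN_LIMIT_W = 75         # power available per 6-pin PCIe connector
ASSUMED_BOARD_DRAW_W = 145   # assumed G80 board power, for illustration only

def amps_at_12v(watts):
    """Convert a 12 V load in watts to amps."""
    return watts / 12.0

# Total headroom with two 6-pin connectors vs. one
headroom_two_plugs = SLOT_LIMIT_W + 2 * SIX_PIN_LIMIT_W   # 225 W
headroom_one_plug = SLOT_LIMIT_W + SIX_PIN_LIMIT_W        # 150 W

# Splitting the connector load across two plugs/rails keeps per-rail current low.
per_connector_draw = (ASSUMED_BOARD_DRAW_W - SLOT_LIMIT_W) / 2   # ~35 W each

print(f"Headroom with two 6-pin plugs: {headroom_two_plugs} W")
print(f"Headroom with one 6-pin plug:  {headroom_one_plug} W")
print(f"Per-connector draw at the assumed figure: {per_connector_draw:.0f} W "
      f"(~{amps_at_12v(per_connector_draw):.1f} A at 12 V per connector)")
```

If those assumptions are anywhere near right, the second plug looks like it's about connector and rail headroom (and load sharing on multi-rail PSUs) rather than the card actually needing 225 W.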
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?
Originally posted by: Nightmare225
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.
???
If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.
Wow, did anyone miss the fact that Nvidia's cards can't use the Chuck patch? :disgust:
Before you guys start jumping on what Nvidia might have screwed up, think the situation through thoroughly...
Originally posted by: josh6079
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?
Originally posted by: josh6079
So nVidia doesn't "Officially" support HDR+AA then if it has to use what staunch nV supporters like Crusader claimed was a "hack"?
Originally posted by: Nightmare225
Originally posted by: Dethfrumbelo
Strange... in some of those tests a single X1950XT outperforms CF'ed cards.
???
If Nvidia didn't make an allowance for HDR+AA on G80, after this being such a major selling point against Nvidia in the G7x series vs. ATI, I'd have to say that they're the biggest fvckwits on the planet. In other words, possible, but highly unlikely.
Wow, did anyone miss the fact that Nvidia's cards can't use the Chuck patch? :disgust:
Before you guys start jumping on what Nvidia might have screwed up, think the situation through thoroughly...
If he's using beta G80 drivers, quite possibly the drivers that will be used for the card's launch in 4 days, why couldn't the AA be forced through the driver's control panel and HDR turned on in the game? I mean, are nVidia's latest drivers, the betas for the G80, behind ATi's 6.10s since they don't allow HDR+AA?