
The REAL DEAL-Anand's HL2 benches!

Page 3
Lots of AA + AF = ATI > Nvidia. It's been that way for years.

However, since NVIDIA is pretty close in this game, and far better in OpenGL games, I think they are still a tad better overall.
 
Originally posted by: Tylanner
"Back in May '04, NVIDIA introduced the GeForce 6800 Ultra Extreme Edition to counter the launch of ATi's Radeon X800 XT Platinum Edition, and gamers the world over took notice."


Show me a link on NV site for the card. Or a link of NV launching the card.

 
Can someone explain to me in simple terms why the 12-pipe 6800nu is losing to the (I can't remember the number of pipes) 6600GT?
The 6600GT has 8 pipes vs 12 on the 6800 but the 6800 has much less memory bandwidth.
 
Originally posted by: BFG10K
Can someone explain to me in simple terms why the 12-pipe 6800nu is losing to the (I can't remember the number of pipes) 6600GT?
The 6600GT has 8 pipes vs 12 on the 6800 but the 6800 has much less memory bandwidth.

Core speed has a lot to do with it as well, 8 pipes x 500MHz > 12 pipes x 325MHz.
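A quick sketch of that arithmetic (pipe counts and core clocks are the commonly reported specs for these cards; figures are approximate):

```python
# Rough theoretical pixel fillrate = pixel pipelines x core clock.
def fillrate_mpixels(pipes, core_mhz):
    """Peak pixel throughput in Mpixels/s."""
    return pipes * core_mhz

gf6600gt = fillrate_mpixels(8, 500)   # 4000 Mpixels/s
gf6800nu = fillrate_mpixels(12, 325)  # 3900 Mpixels/s
# The 6600GT's higher clock slightly outweighs the 6800NU's extra pipes.
```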
 
Originally posted by: BFG10K
Can someone explain to me in simple terms why the 12-pipe 6800nu is losing to the (I can't remember the number of pipes) 6600GT?
The 6600GT has 8 pipes vs 12 on the 6800 but the 6800 has much less memory bandwidth.

You're wrong about that one BFG.

The 6600GT has a 128-bit bus, so even though it runs its RAM at a higher frequency (500MHz), it has lower memory bandwidth than the 6800NU (which has a 256-bit bus).

The 6800NU has roughly 22.4GB/s of bandwidth and the 6600GT roughly 16GB/s.

http://anandtech.com/video/showdoc.aspx?i=2195&p=5
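For reference, those theoretical bandwidth figures fall out of bus width times effective memory clock. A minimal sketch, using commonly cited stock specs (numbers are approximate):

```python
# Theoretical memory bandwidth = (bus width in bytes) x effective memory clock.
def bandwidth_gb_s(bus_width_bits, effective_mem_mhz):
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return (bus_width_bits / 8) * (effective_mem_mhz * 1e6) / 1e9

gf6800nu = bandwidth_gb_s(256, 700)   # 256-bit bus, 350MHz DDR  -> 22.4 GB/s
gf6600gt = bandwidth_gb_s(128, 1000)  # 128-bit bus, 500MHz GDDR3 -> 16.0 GB/s
```

So despite the 6600GT's faster memory chips, the 6800NU's wider bus gives it the bandwidth edge, which is why it holds up better once AA is applied.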
 
Has anyone tried running the benchies on a Dell 6800 GTO yet?

I'm going to buy HL2 anyhow, just wondering if anyone can give me an idea of what framerates they're getting (if possible, both stock 12/5 350/900 and unlocked/overclocked at 16/6 375/1000).

I'm assuming the stock/locked results will fall between the 6800NU and 6800GT, though I'm curious how it performs at the unlock/overclock I just mentioned, as that is technically clocked slightly higher than the GT.

If no one posts, then I'll do it once I can afford HL2... lol


edit: please list the rest of your system specs too... sorry, forgot that bit... hehe
 
Well, Nvidia, historically, are the wizards of driver releases and squeezing surprisingly more performance from their cards as time goes on, so, I suspect, over time, we'll see the 6800 series pull markedly ahead.
 
Can someone explain to me in simple terms why the 12-pipe 6800nu is losing to the (I can't remember the number of pipes) 6600GT? Also would unlocking the other 4-pipes and the 6th vertex shader have any drastic effect on this?

Chances are this game is pixel-shader limited. Thus the higher clock but fewer pipes of the 6600GT might allow it to push past the 6800NU when memory bandwidth is not the bottleneck. As you will notice, once you start to apply AA the 6600GT crumbles under its smaller memory and lower bandwidth.

Show me a link on NV site for the card. Or a link of NV launching the card.

So what is worse? Having an unofficial extreme card sold by a couple of OEMs? Or having an official card that nobody can get nearly 6 months after release?



 
Originally posted by: HeroOfPellinor
Well, Nvidia, historically, are the wizards of driver releases and squeezing surprisingly more performance from their cards as time goes on, so, I suspect, over time, we'll see the 6800 series pull markedly ahead.

I seriously doubt it. The speculation was that back when nVidia enjoyed a commanding lead they had no need to streamline their drivers; today they have to do as much as possible right off the bat in order to maintain a lead or just keep up, as competition is far fiercer now than it was back then.

PLUS, if you want to play the performance-from-drivers game, ATI can do it too; just look at what they did with the 8500, which was a complete mess with early drivers, performing well under a GF3, whereas now it's right on the heels of the GF4 Tis.
 
Originally posted by: ronnn
For more flamebait: HardOCP just updated their conclusion.

"Update: We have gone back and updated our pages with a couple of graphs that show Maximum IQ settings in terms of AA and AF. Without a doubt the ATI Radeon X800XT-PE did by far the best job at delivering a playable gaming experience. Of course it is up to the end user to determine if turning these options on gives you any tangible gaming returns, but without a doubt if you want to run "ultra high quality settings," the ATI Radeon X800XT-PE gives a much better return than NVIDIA's solution."


:beer:


Isn't that a 180-degree turn?




 
Originally posted by: bunnyfubbles
Originally posted by: HeroOfPellinor
Well, Nvidia, historically, are the wizards of driver releases and squeezing surprisingly more performance from their cards as time goes on, so, I suspect, over time, we'll see the 6800 series pull markedly ahead.

I seriously doubt it. The speculation was that back when nVidia enjoyed a commanding lead they had no need to streamline their drivers; today they have to do as much as possible right off the bat in order to maintain a lead or just keep up, as competition is far fiercer now than it was back then.

Remember that we're looking at one specific game that is supposed to run best on ATI cards and the NVidias are generally on par...and the game just released a couple days ago. These benchmarks are a big deal because this was ATI's trump card but they're ending up having to split the pot.

But, ya, I guess we'll see.
 
I think it is obvious what he is saying.

Fill out a form and you get a PE in 2 weeks.

Never mind that ATI's own website has 0 in stock with no mention of when you can expect them in, lol!

 
Originally posted by: Genx87
I think it is obvious what he is saying.

Fill out a form and you get a PE in 2 weeks.

Never mind that ATI's own website has 0 in stock with no mention of when you can expect them in, lol!


I asked because I've already checked, and BB shows zero anywhere and the item cannot be ordered via fulfillment.
 
Regarding the above results that ATI cards are faster when all quality/speed tradeoff settings are set to max quality:

this is a useless benchmark unless you actually inspect the quality on the screen.

Both NVidia and ATI have optimizations in their drivers and cards that you cannot turn off.

Both NVidia and ATI have optimizations, speeding things up behind the back of the game, that are not actually visible.

So the above test is useless unless you also establish that the picture quality is actually higher.
 
Originally posted by: HeroOfPellinor
Originally posted by: bunnyfubbles
Originally posted by: HeroOfPellinor
Well, Nvidia, historically, are the wizards of driver releases and squeezing surprisingly more performance from their cards as time goes on, so, I suspect, over time, we'll see the 6800 series pull markedly ahead.

I seriously doubt it. The speculation was that back when nVidia enjoyed a commanding lead they had no need to streamline their drivers; today they have to do as much as possible right off the bat in order to maintain a lead or just keep up, as competition is far fiercer now than it was back then.

Remember that we're looking at one specific game that is supposed to run best on ATI cards and the NVidias are generally on par...and the game just released a couple days ago. These benchmarks are a big deal because this was ATI's trump card but they're ending up having to split the pot.

But, ya, I guess we'll see.

I don't see how this is a big deal anymore... AFAIK, HL2 was only being pushed as "the ATI game" in the days of the GF 5xxx series. After the 6800 came out, and it was proven how wickedly amazing it was at anything DirectX and shader-heavy, it shouldn't be a surprise how well it plays HL2.

OTOH, if HL2 had actually come out in 2003... then yeah, ATI with their 9700 and 9800's would be dancing over the 59xx's.


Also, with regards to squeezing perf. out of drivers... ATI has been doing an amazing job at that over the last 4 months. My Doom3 performance has only been going *up* with each Catalyst release since 4.8.
 
Originally posted by: MartinCracauer
Regarding the above results that ATI cards are faster when all quality/speed tradeoff settings are set to max quality:

this is a useless benchmark unless you actually inspect the quality on the screen.

Both NVidia and ATI have optimizations in their drivers and cards that you cannot turn off.

Both NVidia and ATI have optimizations, speeding things up behind the back of the game, that are not actually visible.

So the above test is useless unless you also establish that the picture quality is actually higher.


Nope, it's not useless. If you don't understand their (AT's) results, you're a moron.
 
Xbit actually included comparison screenshots. I didn't inspect them, but they said all the cards checked out (except for some minor-sounding distance fog issue with nVidia).
 
Originally posted by: LocutusX
Originally posted by: HeroOfPellinor
Originally posted by: bunnyfubbles
Originally posted by: HeroOfPellinor
Well, Nvidia, historically, are the wizards of driver releases and squeezing surprisingly more performance from their cards as time goes on, so, I suspect, over time, we'll see the 6800 series pull markedly ahead.

I seriously doubt it. The speculation was that back when nVidia enjoyed a commanding lead they had no need to streamline their drivers; today they have to do as much as possible right off the bat in order to maintain a lead or just keep up, as competition is far fiercer now than it was back then.

Remember that we're looking at one specific game that is supposed to run best on ATI cards and the NVidias are generally on par...and the game just released a couple days ago. These benchmarks are a big deal because this was ATI's trump card but they're ending up having to split the pot.

But, ya, I guess we'll see.

I don't see how this is a big deal anymore... AFAIK, HL2 was only being pushed as "the ATI game" in the days of the GF 5xxx series. After the 6800 came out, and it was proven how wickedly amazing it was at anything DirectX and shader-heavy, it shouldn't be a surprise how well it plays HL2.

OTOH, if HL2 had actually come out in 2003... then yeah, ATI with their 9700 and 9800's would be dancing over the 59xx's.


Also, with regards to squeezing perf. out of drivers... ATI has been doing an amazing job at that over the last 4 months. My Doom3 performance has only been going *up* with each Catalyst release since 4.8.

Not really. If you run it in DX8 mode, in which AT's review only showed real differences in the water, the 5900U would be every bit as fast (and only missing shiny water, from the looks of things).
 
Well, I don't know if you've actually played the game, but water is a pretty recurring theme as you'll spend at least X hours of gameplay on or around water. I don't know the exact figure (since I haven't finished) but I'd say I've spent about 2-3 hours on or around water. Having the extra eye candy there for free is a good thing.

You are correct though - assuming the 5900U scales from the 5900XT as it usually does - DX8 path performance of the 5900U will give you roughly the same fps as the 9800Pro on the DX9 path.

(But why you are comparing the 5900U to the 9800Pro rather than its price-point equivalent - the 5900XT - is beyond me)
 
I'm playing the game now. If the water was DX8, I don't know that I'd really care. It doesn't look real in DX9 IMO. Been playing it on my unlocked 6800NU at 10X7 2X4X, all high and at 11X8 4X8X on my 6800GT.
Both work well, but that water is cheesy IMO. (as are the ungodly load times to start it and all that "Steam" BS)

The game is a lot fun and very well done other than those quibbles.

The 5900XT was never meant to compete against the 9800Pro; it was a budget card. The 5900U has never been as cheap as a 9800Pro, but it was made to compete against it nonetheless (9800XTs didn't exist when 5900Us were released). That is why I compare them the way most review sites always have.
 