
HL2: X800 40% Faster than 6800

Originally posted by: jrphoenix
There are a couple of pages of benchmarks at X-bit on HL2 & the new cards: X-bit Review.

It looks like ATI wins when it comes to high resolutions with all the eye candy turned on 🙂
Note that no one else "benched" HL2. Now where would X-bit get it... :Q
 
Originally posted by: jagec
Originally posted by: GTaudiophile
True? Start your hunt here.

actually, I wouldn't be a bit surprised if it was 40% faster in HL2. Different games run better on different hardware, just look at Homeworld 2 and Wolfenstein: ET on x800 versus 6800.
Not really relevant, since both Homeworld 2 and Wolfenstein are pretty old, and it isn't very important how many frames above 100 they get there.
 
From the leaked version that the Russians put together.

btw, 3dfx was where it was at for the original HL, SLI was untouchable back then with AMD's 3DNow! optimizations.
 
Originally posted by: Czar
Originally posted by: jagec
Originally posted by: GTaudiophile
True? Start your hunt here.

actually, I wouldn't be a bit surprised if it was 40% faster in HL2. Different games run better on different hardware, just look at Homeworld 2 and Wolfenstein: ET on x800 versus 6800.
Not really relevant, since both Homeworld 2 and Wolfenstein are pretty old, and it isn't very important how many frames above 100 they get there.

my point was that different engines "prefer" different cards :roll:

so I was saying that it's very possible the x800 would be 40% faster in HL2, but that doesn't mean much if you want to predict performance for Doom 3.
 
Originally posted by: jagec
Originally posted by: GTaudiophile
True? Start your hunt here.

actually, I wouldn't be a bit surprised if it was 40% faster in HL2.

yep...i agree with that. The more PS2.0 code they use, the more of an advantage the X800 has. In high resolutions (1600 and up) the X800 is A LOT faster in FarCry, and the same goes for TRAOD and UT2K4. 40% is very realistic in my opinion. Call me a fanboi, whatever...but NV40 shows an advantage on 'old' code/engines...the X800 on engines that lean on heavy PS2.0 shaders, i.e. most/all newer games.
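
To be clear about what "PS2.0 code" means here: DX9 games query the card's capability bits and pick a shader path accordingly. A minimal sketch against the plain Direct3D 9 API (a generic illustration, nothing from Valve's actual engine):

[code]
// Minimal sketch: how a DX9 game can check for Pixel Shader 2.0 support.
// Plain Direct3D 9 API only; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Direct3D 9 is not available.\n");
        return 1;
    }

    // Query the capability bits of the primary display adapter.
    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs major/minor; D3DPS_VERSION builds a
        // comparable value, so this is true for PS 2.0 and anything newer.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
            std::printf("Card supports Pixel Shader 2.0 or higher.\n");
        } else {
            std::printf("Card is limited to pre-2.0 pixel shaders.\n");
        }
    }

    d3d->Release();
    return 0;
}
[/code]

Engines choose their rendering path off checks like this, which is why a card that's strong at PS2.0 can pull way ahead once a game actually leans on those shaders.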
 
Originally posted by: ViRGE
Originally posted by: jrphoenix
There are a couple of pages of benchmarks at X-bit on HL2 & the new cards: X-bit Review.

It looks like ATI wins when it comes to high resolutions with all the eye candy turned on 🙂
Note that no one else "benched" HL2. Now where would X-bit get it... :Q

I'm not sure how they did it but you can read up more here: Highly Anticipated Games

We managed to obtain a copy of a highly-anticipated DirectX 9.0 game utilizing modern technologies like Pixel Shaders and Vertex Shaders 2.0. Fortunately, the title allows benchmarking and we could not miss the opportunity to benchmark the RADEON and GeForce FX graphics cards under hardcore load. Here we go with our own X-bit labs' graphics performance preview of a very highly-anticipated DirectX 9.0 game that is presumably coming out in the foreseeable future.

(Maybe they were the hackers :Q)

:laugh:
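
As for the headline "40% faster" itself, that number is just the ratio of two average-FPS results minus one. A toy calculation with invented frame rates (not X-bit's data):

[code]
// Toy illustration of how a "40% faster" figure falls out of two
// average-FPS results. The frame rates below are made up for the
// example, not X-bit labs' measurements.
#include <cstdio>

int main() {
    const double fpsX800 = 70.0;  // hypothetical X800 average FPS
    const double fps6800 = 50.0;  // hypothetical 6800 average FPS

    // Percent advantage = (faster / slower - 1) * 100.
    const double advantage = (fpsX800 / fps6800 - 1.0) * 100.0;
    std::printf("X800 advantage: %.0f%%\n", advantage);  // prints 40%
    return 0;
}
[/code]

Which is also why a 40% gap at 1600x1200 with the eye candy on can coexist with a near-tie at lower resolutions.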
 
Originally posted by: Bar81
From the leaked version that the Russians put together.
Then how relevant are these results? They're based on a leaked pre-alpha build that's 8 months old; while I'd like these numbers to be true (just to settle arguments), something like this is nearly irrelevant.
 
It is obvious that Gabe is paid to make the 40% claim, just as his dev team is paid to make it a reality. The real question is whether or not Gabe is paid to maintain his geeky "full figured" appearance.
 
Originally posted by: ViRGE
Originally posted by: Bar81
From the leaked version that the Russians put together.
Then how relevant are these results? They're based on a leaked pre-alpha build that's 8 months old; while I'd like these numbers to be true (just to settle arguments), something like this is nearly irrelevant.

Better than guesswork and speculation... Of course I imagine we will see some difference when the product is finalized. I do expect the final version to run faster on ATI for a couple of reasons:

1. The test version did better on ATI with the eye candy on
2. ATI is bundling this game with their cards, so I would expect some possible code optimizations for ATI cards/drivers
3. The X800 looks like it runs quicker in most of the modern PS 2.0 games

I wouldn't count nvidia out though 😛
 
Originally posted by: NFS4
Did someone say Gabe????? 😛


Not work safe 😛


hahahahaha..at first I thought that was a real picture of him with the KFC bucket.

And I don't think you can believe Gabe at this point. They already have a financial relationship with ATI, so there's no way he'll say something bad about the product. I wouldn't doubt it if they made ATI-specific optimizations either. It's too bad that games are being tied to specific gfx card makers. I just want to buy a good gfx card and play games, not have to know who's in bed with who.


This doesn't look any different from the footage shown last year...like they are playing back a demo.
 
Originally posted by: flexy
Originally posted by: jagec
Originally posted by: GTaudiophile
True? Start your hunt here.

actually, I wouldn't be a bit surprised if it was 40% faster in HL2.

yep...i agree with that. The more PS2.0 code they use, the more of an advantage the X800 has. In high resos (1600 and up) the X800 is A LOT aster in FarCry, so in TRAOD and in UT2K4. 40% is very realistic in my opinion. Call me a fanboi whatever....but NV40 shows an advantage on 'old' code/engines....X800 on heavy PS2.0 shader using engines, ie. most/all newer games.

This is not true. The 6800 is very good at mid resolutions and getting better at high ones without AF. Of course AF is good to have at high resolutions, but things are not as dramatic as they seem with HL2.

Regardless of that fact, WTF?? Am I the only one that still anticipates HL2? 😀
 
Originally posted by: PorBleemo
He was an ATI fanboy from the start. The code is probably optimized heavily for ATI cards.

Hope that is the case. I am sick of games optimized for Nvidia with their logo plastered all over the box and the game.
 
How much money is ATi paying Valve this time? I bought a Radeon 9800 Pro last September since it was "the best videocard" for HL2. Should have saved my $300. You can get a Radeon 9800 Pro for around $200 now, and still no HL2. Apparently, it's not the best videocard for HL2 anymore.
 
Originally posted by: Pocatello
How much money is ATi paying Valve this time? I bought a Radeon 9800 Pro last September since it was "the best videocard" for HL2. Should have saved my $300. You can get a Radeon 9800 Pro for around $200 now, and still no HL2. Apparently, it's not the best videocard for HL2 anymore.
They've taught you a valuable lesson, that PC hardware is not frozen eternally in time, unchanging until entropy itself is no more.

And to never buy hardware to play a game before the game is released. All those people who upgraded to a Pentium 2 233 MHz to play Duke Nukem Forever are deeply, deeply disappointed.
 
Originally posted by: jim1976
Originally posted by: SickBeast
You guys are all talk now but you all know deep down that you will buy this game as soon as it comes out. 😛

Amen bro :beer: 😉

But that's never going to happen. And really, I don't care when it comes out. I'm more eager for Half-Life 3 vouchers.
 
Originally posted by: Dacalo
Originally posted by: PorBleemo
He was an ATI fanboy from the start. The code is probably optimized heavily for ATI cards.

Hope that is the case. I am sick of games optimized for Nvidia with their logo plastered all over the box and the game.
It's either going to be "The Way It's Meant To Be Played" or "Get In The Game", neither company is going to stop that tactic any time soon.
 
Originally posted by: Pocatello
How much money is ATi paying Valve this time? I bought a Radeon 9800 Pro last September since it was "the best videocard" for HL2. Should have saved my $300. You can get a Radeon 9800 Pro for around $200 now, and still no HL2. Apparently, it's not the best videocard for HL2 anymore.
Are you gonna get an x800?

That way you can have TWO HL2 coupons. 😉

:roll:

Originally posted by: ViRGE
Originally posted by: Dacalo
Originally posted by: PorBleemo
He was an ATI fanboy from the start. The code is probably optimized heavily for ATI cards.

Hope that is the case. I am sick of games optimized for Nvidia with their logo plastered all over the box and the game.
It's either going to be "The Way It's Meant To Be Played" or "Get In The Game", neither company is going to stop that tactic any time soon.
there IS a simple solution to this . . . with PCI-E you'll be able to run TWO video cards, one ATI for HL2 and the GF for DIII . . .

simple, if a bit expensive . . . kinda like having GC/X-b/PS2 but in a single high end PC.

:roll:

practical, not. 😉

😀

(i'd better get some sleep . . . . aloha)
 