Half-Life 2 Performance: Breaking News


sep

Platinum Member
Aug 1, 2001
2,553
0
76
I have to agree with everyone who owns a newer video card (9700 Pro, 9800 Pro, 5900 Ultra, 5900): the fact that you'd have to go back down to 1024x768 and still get less than 60fps blows King Kong with two fingers in the rear! I could expect this from a GF4, 5600 or lower, but not these so-called extreme cards... by the way, I just bought a 9800 Pro!

I played the original HL single-player. Looking back, it was a good game, but I never tried it online. If I had, it might have clicked, but I never did and it didn't stick. However, after watching the big video at my last LAN party, I'm impressed and would like to give this game another try.

To all of those writing articles about what to expect from your video cards, I've got one word for you... DEMO! I'm not going to accept any of these speculations without seeing it with my own eyes. I've noticed that a demo helps sell games: so far I've never *purchased* a game without first playing a demo or copy and liking it. So Valve... get us a DEMO NOW!

Welcome back Slippery Sam!
-JC
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
And I'm PROUD that my RIG will jizz on your rig while playing HL2.
LOL It's an accomplishment on your part, for sure.

Your rig won't play HL2 NEARLY as well as mine, Vian. I have an Asus P4PE/P4 2.53/512MB Crucial PC2700/Sapphire 9800 Pro/Audigy 2. (i.e. while your motherboard is approximately equal, my video and sound cards blow yours away.) So I guess you'll have to be happy that you're faster than someone else, if it takes a pathetic thing like that to make you happy. Personally, I couldn't care less if my computer is faster than yours or anyone's.

5150ATI Employee: I really couldn't care less whether you believe I bought a 9800 Pro. It wasn't that big a deal. My wife and I are both working professionals, so I didn't exactly have to save up for a year to buy the thing.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Lucky I went with my 9700NP on the basis that it offered better value/performance than the Nvidia counterpart. It'll be interesting to see how the NV40 does in HL2. I'm sure lots of ATI/Nvidia owners here will still be upgrading again when HL2 is out; I know I am :).
 

Aftermath

Golden Member
Sep 2, 2003
1,151
0
0
Reading over some of these posts, I think the GeForce 4/9700 generation and every video card following it spoiled a lot of people. Suddenly, cranking 1280x1024 or better with max details, and likely some AA and AF in there, became standard, and a lot of people forgot what it was like to play a new game on a video card where you actually had to lower your resolution and/or detail levels to play it normally. As 5150Joker said, does anyone remember playing Quake 3 Arena on, say, a TNT2? You likely had it at 800x600, 32-bit color (the latest and greatest feature), and most of the details on medium to play it smoothly (50+ fps). Which is how it should be: brand new games shouldn't scream at high resolutions on ANY video card; they should be that new and have that much detail and quality.
 

Tennoh

Member
Jan 30, 2000
116
0
76
For people who bought an ATI Radeon 9700 Pro when it first came out in Sept/Oct 2002, my hat's off to you. It's probably the most future-proof video card released for its time. Most people recommend buying video cards for the present-day gaming situation, but the R300 proved otherwise. To date the original R300 is still awesome, as HL2, a DX9-focused game, can attest.
 

Ferocious

Diamond Member
Feb 16, 2000
4,584
2
71
Yeah, I got my 9700 Pro last year... and I can safely say it will go down as the best video card ever released, IMO.
 

Tates

Elite Member
Super Moderator
Jun 25, 2000
9,079
10
81
OK, I'm the proud owner of a Radeon 9800 Pro, but I don't think I'll be willing to pay $9.99 a month to play Half-Life 2.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: Tates
OK, I'm the proud owner of a Radeon 9800 Pro, but I don't think I'll be willing to pay $9.99 a month to play Half-Life 2.

Where does the $9.99/month come from?
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Vonkhan
of course Half Life 2 is going to perform better on ATI hardware because it's been optimised and probably produced with ATI hardware in mind

another nVIDIOT :rolleyes:
HL2 was made for DX9 specs, not optimized for ATi... don't blame Valve if nvidia didn't follow the DX9 guidelines; read up before you make an ass out of yourself.

I own an Inno3D Geforce FX 5600 Ultra, Inno3D Geforce FX 5200 Ultra, XFX Geforce FX 5200, Gainward GF4 Ti4600, MSI GF4 Ti4200, Sapphire R9500 Pro, Sapphire R9200, Inno3D Xabre400 and a Videologic VividXS Kyro2. How am I an nVidiot, you asshole?
 

Vonkhan

Diamond Member
Feb 27, 2003
8,198
0
71
How am I an nVidiot, you asshole?

Through this statement: "of course Half Life 2 is going to perform better on ATI hardware because it's been optimised and probably produced with ATI hardware in mind". See my reply to the same. READ UP on the articles that have been published so far before making lame remarks, will ya :p
 

gaurav311

Junior Member
Aug 28, 2003
5
0
0
Hi, dudes, some stuff to think about:

These early benchmarks at 1024x: I'm fairly sure they're with 4X AA turned on. Maybe the game defaults to it with appropriate hardware or something. Gabe mentioned that he runs the game on his R9800 Pro at 1600x1024 or something (widescreen) with 2X AA, so I'm almost *certain* my first sentence holds.

Okay, now nVidia cards had issues running with AA on, right? (there was news on this a few weeks back)

And they said they'd have to write some pixel shader wrapper thingy to make AA work, right?

Wrappers slow things down. It's as if they're "emulating AA" on the nV hardware using shader bandwidth. That explains it, and it'd be nice if someone pointed it out.

So the main reason nV hardware is sucking is the wrapper Valve wrote to enable AA.

Valve may not have optimised for ATi etc., but their marketing tactics here (not disclosing the technical details that explain nV's crap performance, and disclosing only certain benchmarks from which thousands are already drawing big conclusions) are fairly sad.
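
If the wrapper theory above is right, the cost is easy to sketch. Here is a toy C model, purely illustrative: shade() stands in for an arbitrary pixel shader, the sample offsets are made up, and nothing here is Valve's or nVidia's actual code. It only shows where the "shader bandwidth" cost would come from:

/* Toy model: if AA can't use the hardware multisample path, a wrapper
 * that supersamples in the shader multiplies per-pixel shading work
 * by the sample count. shade() stands in for any pixel shader. */
#include <stdio.h>

#define WIDTH   1024
#define HEIGHT  768
#define SAMPLES 4   /* 4X AA */

static long shader_invocations = 0;

static float shade(float u, float v) {
    shader_invocations++;   /* count the expensive work */
    return u * v;           /* stand-in for real shading math */
}

/* Hardware MSAA: shader runs once per pixel; the extra samples are
 * resolved by dedicated hardware at little cost. */
static void render_msaa(void) {
    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            shade((float)x / WIDTH, (float)y / HEIGHT);
}

/* "Emulated AA": the wrapper shades every sample, so the same frame
 * costs SAMPLES times as many shader invocations. */
static void render_emulated(void) {
    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            for (int s = 0; s < SAMPLES; s++)
                shade((x + s * 0.25f) / WIDTH, (float)y / HEIGHT);
}

int main(void) {
    render_msaa();
    long msaa = shader_invocations;
    shader_invocations = 0;
    render_emulated();
    printf("MSAA: %ld shader runs, emulated AA: %ld (%.1fx the work)\n",
           msaa, shader_invocations, (double)shader_invocations / msaa);
    return 0;
}

On real 2002-2003 hardware the gap wouldn't be exactly 4x, since memory bandwidth and ROPs matter too; the sketch only isolates the extra shader work a wrapper would add.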
 

Tates

Elite Member
Super Moderator
Jun 25, 2000
9,079
10
81
Originally posted by: GTaudiophile

--------------------------------------------------------------------------------
Originally posted by: Tates
OK, I'm the proud owner of a Radeon 9800 Pro, but I don't think I'll be willing to pay $9.99 a month to play Half-Life 2.
--------------------------------------------------------------------------------

Where does the $9.99/month come from?

OK, I'm clear now. Looks like it's $9.99/mo for Steam and access to all other Valve titles.

There will be three SKU versions of Half-Life 2.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
This whole thread is proof that some people just HATE NVIDIA to the exclusion of all reason AND in defiance of any claims the Fanboys make about wanting "competition". When Nvidia ruled the roost, people swore that 3dfx, then ATI, had to be kept alive to prevent Nvidia's "monopoly" (fess up, that's what it was called) from keeping prices high and quality low.

But now that ATI is king, these same people are shoveling dirt on Nvidia and cackling about how they're the smartest people in the world for contributing to its "downfall". How many posts have declared that there is NO WAY Nvidia will ever come back with anything good? Do these delusional Fanboys remember that ATI was the b*tch until the 9700 came out, or are they just not admitting it?
Nvidia owned ATI from 1998 to 2002 - that's five years. ATI's been comparable for a year, and you'd think they invented the transistor for all the giggling from the Fanboys.

ATI and Nvidia are the Ferrari and Lamborghini of the video world: while one may be faster than the other, it's not like the second-place car is a hooptie, unless, that is, you're a rabid fan of whichever card ISN'T an Nvidia. ATI fans used to brag about 2D quality ("Who cares if my card is 40% slower and has broken drivers that bork all the games I play? My Excel looks SHARP!") when they couldn't compete on speed. Now that ATI can fuel their d*ck-measuring egos, they're just that much more obnoxious.

Finally, does anyone think it's odd that in most benchmarks the top cards from both finish within 10 percent of each other, but suddenly the Nvidia part is HALF as fast? Combined with the reach-around that ATI and Valve are giving each other AND the fact that the event was put on by ATI, the lack of critical questions being asked as to whether Nvidia is getting smeared shows just how uninterested in FACTS the Fanboys are. Who cares if it's untrue, so long as it supports their beloved ATI?

1. ATI staged the event.

2. ATI and Valve are seriously in bed with each other.

3. Valve has been leaking comments that ATI was better. (Remember when Carmack used to trash ATI drivers for not following specs? At least he's independent.)

4. Nvidia has a new driver revision pending and tried to get the betas used for comparison, but Valve refused, making vague accusations of "out-of-hand optimizations" without specifying what those optimizations do.

The last point is key: While the Fanboys wank themselves furiously and yelp that Nvidia has been busted for cheating, what's to say that the major crime the new drivers commit is BEING AS FAST AS THE ATI DRIVERS? God forbid Valve has to back down from all their smacktalk about Nvidia in support of their cash masters at ATI.

What's been missing from all of this is SPECIFIC DETAILS about what the heck is going on. All we get are vague murmurs about how crappy Nvidia's parts and drivers are and how ATI r00lz cuz they follow the DX9 spec, but HOW DO WE KNOW THAT? Just because a video company and a game maker who are in bed together say so?

INDEPENDENT investigation is needed, and so far all I've seen are the Valve-supplied benchmarks that were tailored to show ATI in the best light. Aren't you Fanboys the slightest bit interested in the truth, or is your Nvidia hatred so all-consuming that you can't help yourselves?
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
Jesus Christ, it's just a benchmark release, and everyone is jumping all over each other arguing about nothing but BS, from "my computer is better than yours" to "you've got a small dick"...
 

Crapgame

Member
Sep 22, 2001
151
0
0
A pixel shader 2.0 written according to the recommendations from ATi, or created with the current DX9 SDK HLSL compiler, unfortunately highlights all those problems. Such shaders use many temp registers and have texture instructions packed in a block. A shader of version 1.4 uses the same principles.

nVidia will have to tackle these problems, the earlier the better. Microsoft is working on an SDK update which will include a new version of the compiler that can produce more "NV30-friendly" code with fewer temp registers and paired texture ops.

To put it less technically: the 4x2 pipe does have limitations using 2.0 shaders, and ATI's 8x1 works better, but it is not hopeless at all.

Det 50 is no panacea for the issue, but it will help (btw, 51.75 is three weeks old). The problem lies in the fact that current shader code favors ATI's pipe layout (ATI 8x1, nV 4x2): the nV pipe hits multiple wait states, and the shader runs as a strict FIFO process at one op per pass. The "length" of ATI's 8x1 is a great help, as the wait states are held internally and there are fewer interruptions, allowing the FIFO output to run faster and smoother. The modified compiler will allow the removal of some of the wait states and the use of paired textures per pass; in other words, nV can't use both pipes effectively right now, or with the current drivers at all.

This is what is causing the slowdowns in PS 2.0 for nV. Once paired textures are able to travel the pipe, some of the "length" will still hinder nV, but full use of the second pipe will greatly alleviate this.
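
To illustrate what that compiler change amounts to, here is a rough sketch in C standing in for ps_2_0 assembly. The tex() function, the register counts, and the "friendly"/"hostile" labels are illustrative assumptions, not actual NV30 code; both versions compute the same result, only the scheduling shape differs:

/* Two shapes of the same pixel computation. tex() stands in for a
 * texture fetch, the arithmetic for ALU ops. */
#include <stdio.h>

static float tex(int unit, float coord) { return coord * (unit + 1); }

/* Current HLSL-compiler shape (fine on ATI's 8x1): all fetches packed
 * in one block, four temporaries live at once. On a 4x2 part with a
 * small register file, this is the shape said to stall the pipe. */
static float pixel_blocked(float uv) {
    float t0 = tex(0, uv);
    float t1 = tex(1, uv);
    float t2 = tex(2, uv);
    float t3 = tex(3, uv);   /* 4 temps live at this point */
    return (t0 + t1) * (t2 + t3);
}

/* "NV30-friendly" shape the updated compiler is supposed to emit:
 * texture ops paired with ALU ops and temporaries reused, so at most
 * two temps are live at any point. Same math, same result. */
static float pixel_paired(float uv) {
    float a = tex(0, uv);
    a += tex(1, uv);         /* pair the fetch with the add */
    float b = tex(2, uv);
    b += tex(3, uv);         /* peak of 2 live temps */
    return a * b;
}

int main(void) {
    printf("%f %f\n", pixel_blocked(0.5f), pixel_paired(0.5f));
    return 0;
}

The point is only the scheduling: fewer simultaneously live temporaries, and fetches interleaved with math rather than packed in a block, which is what the post says the SDK's new compiler will produce.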

Reports of the Tomb Raider 2 slideshow becoming smooth with just the driver implementation are fabulous to me; add that to the DX modifications, and we (nV'ers) are back in business even with PS 2.0.

Three days ago the 5900 flicker/scroll noise issue was an unfixable hardware issue; new drivers (45.33) fixed it.
 

gaurav311

Junior Member
Aug 28, 2003
5
0
0
No way, this stuff's great. I'm staying up till midnight for the benchies.

It's like a technical soap opera, except you replace the characters (who have strengths and weaknesses) with graphics cards. And the story has lots of interesting twists, like some characters cheating and stuff.

Actually, come to think of it, I need some sleep.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
/clap @ DefRef

Well said.

What do you think happens when you build a game specifically for one card, then try to code for another later?

You get results like this.

Yes, ATI and Valve are having a circle jerk, and yes, Gabe Newell is the pivot man.

The same thing is going to happen with nvidia and Doom 3, and then the shoe will be on the other foot.

This is bad for us: quality is going to be sacrificed for speed now.
 

Crapgame

Member
Sep 22, 2001
151
0
0
Rage, there is a hole in the FX line concerning PS 2.0; it's not made up. Eidos would have to be in on it as well, and I'm no Democrat, so massive conspiracies don't sit well with me. :p The hole isn't some great abyss that is going to cause nV to fold; half the ATI guys are running nForce motherboards. ;)
 

CyberZenn

Senior member
Jul 10, 2001
462
0
0
Originally posted by: Rage187


What do you think happens when you build a game specifically for one card, then try to code for another later?

You get results like this.

This is why I don't really have a problem with video card companies putting game-specific optimizations into their drivers. As long as they aren't "cheating" (lowering image quality, etc.), it's kind of necessary to level the playing field when questionable alliances occur between companies to the exclusion of others. As you said, I would expect Doom3 to favor the NV cards:

Early D3 benches


Remember when ATI leaked the Doom3 alpha earlier this year? Carmack was EXTREMELY upset about this (according to his postings at /.), and I wouldn't be surprised if this is the result.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: DefRef
1. ATI staged the event.

2. ATI and Valve are seriously in bed with each other.

3. Valve has been leaking comments that ATI was better. (Remember when Carmack used to trash ATI drivers for not following specs? At least he's independent.)

4. Nvidia has a new driver revision pending and tried to get the betas used for comparison, but Valve refused, making vague accusations of "out-of-hand optimizations" without specifying what those optimizations do.

The last point is key: While the Fanboys wank themselves furiously and yelp that Nvidia has been busted for cheating, what's to say that the major crime the new drivers commit is BEING AS FAST AS THE ATI DRIVERS? God forbid Valve has to back down from all their smacktalk about Nvidia in support of their cash masters at ATI.

What's been missing from all of this is SPECIFIC DETAILS about what the heck is going on. All we get are vague murmurs about how crappy Nvidia's parts and drivers are and how ATI r00lz cuz they follow the DX9 spec, but HOW DO WE KNOW THAT? Just because a video company and a game maker who are in bed together say so?

INDEPENDENT investigation is needed, and so far all I've seen are the Valve-supplied benchmarks that were tailored to show ATI in the best light. Aren't you Fanboys the slightest bit interested in the truth, or is your Nvidia hatred so all-consuming that you can't help yourselves?

READ Anand's snippet on exactly *why* nVidia is slower.

In regard to your silly list where ATI and Valve are in bed together and refuse any nVidia optimizations: go get your head checked! nVidia went out and got a bunch of games to carry their "The Way It's Meant to Be Played" logo, and nobody says jack. Now, in response to that, ATI comes up with a package deal with Valve, sweetened by the fact that ATI hardware is better equipped to run HL2 (read the postings all over the internet for why). But no, this just can't be. ATI must be influencing Valve to make the game perform like ass on nVidia platforms, and Valve is more than happy to alienate over half of its market just to please ATI.

what's to say that the major crime the new drivers commit is BEING AS FAST AS THE ATI DRIVERS

Again: READ ANAND'S SNIPPET (and search the web). Valve had to design a specific mixed DirectX 8/9 codepath to get HL2 to run acceptably on nVidia hardware. Apparently they're willing to work their asses off to get it running acceptably on nVidia, but you still have some pipe dream that nVidia has some "ether" in their cards that will magically make them run as fast as ATI in an apples-to-apples comparison. Please!

Edit: The exact same thing happened with Doom3 about six months ago: John Carmack announced that nVidia performed like crap with the default ARB2 codepath, so a special NV-specific codepath had to be invented. This also involved dropping the FP precision from 32-bit to 16-bit (since Nvidia doesn't support the DX9-spec 24-bit, which ATI does). John Carmack decided not to make a big stink of it, in part because he really respects Nvidia and their hardware (which he says is incredibly reliable and has some of the most stable OpenGL drivers around). But it's no longer really an apples-to-apples comparison with that game...
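
As a sketch of what such a mixed-codepath selection might look like, here is a short C illustration. The enum values, caps fields, and function names are hypothetical, not Valve's or id's actual code; the precision facts (FP24 as the DX9 minimum, NV3x preferring FP16 partial precision) come from the posts above:

/* Hypothetical caps-based codepath selection, in the spirit of the
 * DX8/DX9 mixed mode and NV-specific paths described above. */
#include <stdio.h>

enum codepath { PATH_DX9_FULL, PATH_DX9_PARTIAL, PATH_DX8_MIXED };

struct gpu_caps {
    int ps20;              /* supports pixel shader 2.0 at all       */
    int fast_fp24_or_more; /* full-precision PS 2.0 at usable speed  */
    int fp16_hint;         /* benefits from partial-precision hints  */
};

static enum codepath pick_path(struct gpu_caps c) {
    if (c.ps20 && c.fast_fp24_or_more)
        return PATH_DX9_FULL;    /* R300: FP24 everywhere, full DX9  */
    if (c.ps20 && c.fp16_hint)
        return PATH_DX9_PARTIAL; /* NV3x: PS 2.0 with FP16 hints; no */
                                 /* longer apples-to-apples          */
    return PATH_DX8_MIXED;       /* fall back to DX8-class shaders   */
}

int main(void) {
    struct gpu_caps r300 = {1, 1, 0}, nv35 = {1, 0, 1}, gf4 = {0, 0, 0};
    printf("R300 -> %d, NV35 -> %d, GF4 -> %d\n",
           pick_path(r300), pick_path(nv35), pick_path(gf4));
    return 0;
}

The fallback order mirrors what both Valve (the mixed DX8/DX9 mode) and id (the NV-specific Doom3 path) reportedly did: keep the spec-precision path as the default and drop to vendor-specific or older-generation paths only where the hardware demands it.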
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Vonkhan
How am I an nVidiot, you asshole?

Through this statement: "of course Half Life 2 is going to perform better on ATI hardware because it's been optimised and probably produced with ATI hardware in mind". See my reply to the same. READ UP on the articles that have been published so far before making lame remarks, will ya :p

UH NO! :p