
Half-Life 2 Performance: Breaking News


nRollo

Banned
Jan 11, 2002
10,460
0
0
rollo, gimme the house and the boat and you can move into my parents' basement, curfew is at 3:30 am sharp
Do you want the mortgage and payment too, or do I have to keep those? And what does your mom look like? ;)

BTW- my point in saying I bought that stuff wasn't to make myself out as "rich" like 5150 thinks, just to point out that at this stage of the game, a vid card isn't a major purchase anymore. I'm just a middle class joe in the suburbs, making payments and watching my IRAs devalue.
 

peter7921

Senior member
Jun 24, 2002
225
0
0
Originally posted by: Ferocious
Originally posted by: peter7921
Originally posted by: NavJitsU4
nVidia should've never kept those "jelly doughnut eating" 3dfx engineers.

Amen!! All ex-3DFX engineers should be lobotomized and all their work destroyed before it corrupts another company. Everything about the 5800 and 5900 stinks like 3DFX.

Damn you 3DFX!!!!!!

heh...it was ex-3dfx engineers (the first wave prior to V3) that helped build Nvidia and got their GeForce out the door.

You're right, all the good engineers left before the infamous Voodoo 3. lol. I remember when the Voodoo 3 came out, all my friends went and got one because of the Voodoo brand, while my lowly TNT2 outperformed their V3's and even supported 32-bit colour. Voodoo 3 was the beginning of the end for 3DFX; everything they released after that got worse and worse.
 

Vonkhan

Diamond Member
Feb 27, 2003
8,198
0
71
Originally posted by: NYHoustonman
Originally posted by: shady06
Originally posted by: NYHoustonman
ARE THE BENCHMARKS BEING POSTED AT MIDNIGHT PST OR EST OR WHAT???

I asked the same question in the Forum Issues forum. Looks like 12 EST / 9 PST.

Thanks. Finally somebody answered :)...

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAARGH! HURRY UP ... MOVE ON THERE U STUPID CLOCK!
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: rbV5
ATI is the clear performance leader under Half-Life 2 with the Radeon 9800 Pro hitting around 60 fps at 10x7

Yikes that sucks. "winner" looks like "less of a loser" to me:Q

Remember, this is the highest level of DX9, not some old DX8 engine with DX9 features thrown in. That is very taxing on any card. The major focus of HL2 is singleplayer anyway, with multiplayer coming along behind the AI construction and story experience.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: NYHoustonman
Hey, to me 1280x1024 @ 40 fps would be good anyway, but also note that it seems that this was with the Radeon running the "special" FX codepath, rather than the default codepath.

no...they are saying the Radeon at default codepath = 60fps and nvidia = 10fps.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Genx87
This is interesting, a game company with an agenda. Looks like Valve has a possible problem on their hands.

If the top end card is getting 60 FPS at 1024X768 and all the rest are horrible then who is going to buy the game?!?!?!?!?!?

I just got a 5900 and no way in hell am I going to be buying a new card to play a 60 dollar game. And there are lots of people out there who are in the same boat.
Will be interesting to see what happens here.

well, it seems the 9600 can get around 40fps...
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Crapgame
Well I haven't seen any part of it myself but I heard it looks like butt compared to Doom3.

Doom3 has no substance. Doom3's storyline is a joke and its models are low-polygon. All it has is textures and lighting effects. HL2 uses a lot of better features IMO and doesn't need to be dark and rely on shadows for effect.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: NYHoustonman
Good news for NVidia users:

Here.

This is NVIDIA's official statement: "The optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparisons based on the 45 series drivers are invalid. The NVIDIA 50 series of drivers will be available well before the release of Half-Life 2."

This better not turn into what it looks like it's on path to...


EDIT-And, of course, I, as an ATI user, can't wait for Cat3.8 :)...I hope it improves this (although I am not that bad off as it is).

Looks like Nvidia is turning off some features again. I remember reading somewhere that Valve said Nvidia's drivers turn off fog in a particular map just to get a decent framerate. That is one of their "optimizations".
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Rollo
nVIDIA got owned on THE ONE that really mattered, even from a business POV for the upcoming season

I really don't care about Serious Sam or any of his cousins, HL2 baby!

LOL, literally.

"Ooohh HL2! It's gonna be cool! <spank spank spank> It's gonna be sooo much fun! <spank spank spank> Ohhhhhh Freeman is back, bay-bee! <spank spank spank spank spank spank spank spank spank spank> moan.......

As far as nVidia getting owned where it matters, I guess that depends on your perspective. FPS games are one genre, there are many others.

I think you underestimate the power of marketing. FPS games get huge marketing online and elsewhere, and they have a huge following with Counter-Strike. Other genres don't get as much respect, and no other genre uses the latest APIs to the fullest like FPS games do. They always push the edge. That is why this is a big deal.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Chord21
I started an upgrade in anticipation of the HL2 release. Spent 2 weeks poring over the forums debating if I was going to switch camps from Nvidia to ATI; must say that I was impressed with the overall product line that ATI is putting out now. But when all was said and done I found the GeForce 5900nu for $250 bucks, took all of 5 min to sell me.
Well now, bam, pre-production benchmarks are starting to show and clearly ATI is in the lead. Hmm, should I worry, should I care? Naa... remember this is pre-production, and Nvidia has stated that they will release drivers that will fill in the performance gap. Will ATI still be in the lead... who cares! As long as the game is playable I will be completely happy with my 250 dollar purchase.
Some of you will not agree, but that's what makes these forums fun. Also, why worry, all our cards are gonna be junk in 6 months anyway.
Well all, have fun :)

Again I point to Valve stating that Nvidia's optimizations turn off fog and other shader effects to get better results. Dumbing down the graphics is not an option for me.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: gaurav311
Hi, dudes, some stuff to think about:

These early benchmarks at 1024x - I'm fairly sure it's with 4X AA turned on. Maybe the game defaults to it with appropriate hardware or something. Gabe mentioned that he runs the game on his R9800Pro at 1600x1024 or something (widescreen) with 2X AA, so I'm almost *certain* my first sentence holds.

Okay, now nVidia cards had issues running with AA on, right? (there was news on this a few weeks back)

And they said they'd have to write some pixel shader wrapper thingy to make AA work, right?

Wrappers slow things down. It's as if they're "emulating AA" on the nV hardware using shader bandwidth. That explains it, and it'd be nice if someone pointed it out.

So the main reason nV hardware is sucking is the wrapper Valve wrote to enable AA.

Valve may not have optimised for ATi etc, but their marketing tactics here (not disclosing technicals that explain nV's crap performance, and disclosing only certain benchmarks on which thousands are already drawing big conclusions) are fairly sad.


the problem is this.
1) You can force AA off in the driver, so that's not an excuse. 2) Running a game fine means no stuttering, and a solid 30fps would make a singleplayer game perfect. You don't need 60+ fps every second to get a solid singleplayer experience. Obviously for MP you want a lot more.
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: Crapgame
Well I haven't seen any part of it myself but I heard it looks like butt compared to Doom3.


Doom3 is basically DX7-level graphics with shaders (it's OpenGL, of course). John Carmack has acknowledged that the ARB2 path can be run in full precision on ATI R3x0 boards; he also stated that the NV3x (incl. the NV35) has to be coded with a mixture of FX12 and FP16, aka EXACTLY what Gabe Newell had to do for HL2. If you put your brand loyalty aside you would see that NVidia can only "optimize", i.e. dumb down APIs to previous versions, in order to remain competitive; they're trying to polish a turd.

FYI:


Update 09/05: We emailed id Software guru John Carmack about his experience with NV3x hardware on pixel shading performance, and this was his reply:

GD: John, we've found that NVIDIA hardware seems to come to a crawl whenever pixel shaders are involved, namely PS 2.0.

Have you witnessed any of this while testing under the Doom3 environment?

"Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."

John Carmack
As per the interview Carmack did with GD.
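For anyone wondering what "a separate NV30 back end" means in practice, here is a minimal, hypothetical sketch of how an OpenGL renderer of that era could pick between the generic ARB2 path and a vendor-specific NV30 path by probing the extension string. The extension names are real; the enum and function names are illustrative only, not id's actual code.

```cpp
// Sketch only: selecting a fragment-program back end from the GL extension
// string. Assumes a valid OpenGL context has already been created.
#include <cstring>
#include <GL/gl.h>

enum class BackEnd { ARB2, NV30, Fixed };

// Naive substring match; good enough for a sketch.
static bool hasExtension(const char* extensions, const char* name) {
    return extensions && std::strstr(extensions, name) != nullptr;
}

BackEnd chooseBackEnd() {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    // NV30 path: NV_fragment_program lets most operations be specified as
    // 12-bit fixed or 16-bit half precision, which is where the speed comes from.
    if (hasExtension(ext, "GL_NV_fragment_program"))  return BackEnd::NV30;
    // ARB2 path: generic ARB_fragment_program, run at full precision on R3x0.
    if (hasExtension(ext, "GL_ARB_fragment_program")) return BackEnd::ARB2;
    return BackEnd::Fixed;  // fall back to fixed-function rendering
}
```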
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Rage187
/clap @ defref


well said.




What do you think happens when you build a game specifically for one card, then try and code for another later?

You get results like this.


Yes ATI and Valve are having a circle jerk, and yes Gabe Newell is the pivot man.

Same thing is going to happen w/ nvidia and DOOM, then the shoe will be on the other foot.

This is bad for us, quality is going to be sacrificed for speed now.


Nvidia is much better at OpenGL than at DX9 games. That accounts for the 5900's better performance in Doom3.
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
From the man who created the skuzzy interface and is now a game dev.

"I don't have time to read this entire thread, but will respond to those who are complaining about Vlave being the problem.

First, Valve is not the problem. You do not understand how code in DX9 works if you say that.

Here is the quick and dirty. In the game code, at init time, you ask the card what level of shaders (pixel and vertex) it supports. The card comes back and says PS 2.0 and VS 2.0, and you go, "Cool, we can use DX9 shaders." At this point in the code, you have no idea what video card is really out there, unless you specifically test BEFORE starting DX9 up.

It's not Valve's fault that the NV3x family of parts performs very badly using the shaders they claim to support.

For Valve to fix this problem, they would have to disable all shaders. Well, the user community has wanted dynamic code in and have been bitching for years, "why can't game devs support the high end cards?"
Now we are doing it and you are bitching. Folks, the NV3x cards suck at DX9 shaders. That is the simple truth. Deal with it. It's not Valve's problem and I, for one, am glad they are taking the stance they are. Why?

Well, it might just make the farkin video card companies take notice that we, the devs, are not going to go quietly into the night anymore and take heat from gamers about features that are not being used. Maybe, just maybe, it might make the video card companies stand up and take notice, that if you put a piece of crap out in the market, we will expose it."



The ARB2 codepath on ATI is of higher IQ and runs faster with the optimized Cats than the NV3x.

rogo
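The capability query described in the quote above is just the standard Direct3D 9 caps check. As a rough, illustrative sketch (not Valve's actual code), it looks something like this:

```cpp
// Sketch only: at init time, ask D3D9 what shader versions the installed card
// claims to support. The caps say nothing about how FAST those shaders run.
#include <d3d9.h>

bool SupportsDX9Shaders(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // "The card comes back and says PS2.0 and VS2.0" happens right here;
    // at this point the engine has no idea which vendor's GPU is answering.
    return caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0) &&
           caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
}

// Usage: IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
//        if (d3d && SupportsDX9Shaders(d3d)) { /* enable the DX9 shader path */ }
```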
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: rogue1979
Originally posted by: sman789
ahhh, 9600 pro made me happy again

- The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath;
- the Radeon 9600 Pro performs very well; it is a good competitor of the 5900 Ultra.


Gee, according to the article the 5900 wasn't doing very well. So if the 9600Pro has about equal performance in the same game, it doesn't sound like something to be happy about.

If you check Beyond3D for benchmarks you'll see that the 9600 is much better than Nvidia's cards at the default codepath.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: DefRef
Nvidia's response to all this nonsense (from Gamers Depot):

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATIs Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel.50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers.

Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

Derek Perez
Director of Public Relations
NVIDIA Corp.


Short form summation: Valve chose to f*** over Nvidia by refusing to use the latest drivers.

If the Det50s deliver the performance Nvidia claims it will - will there be a backlash against Valve and ATI for this high-stakes chicanery? Doesn't the README of EVERY game mention making sure the user has the LATEST drivers to guarantee good performance? When the users are told to keep their drivers up-to-date, why is a graphics card company and a game developer deliberately staging a PR event that purposely disadvantages the non-partnering company?!?

Looks like the ball may be in the Valve/ATI court now.

Reducing from 32-bit to 16-bit is not acceptable IMO.
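For a sense of what that 32-bit to 16-bit conversion means numerically, here is a rough, illustrative sketch. It only rounds a float's mantissa from 23 bits down to the 10 bits a half carries, ignoring half's smaller exponent range, so it approximates the quantization rather than doing a full IEEE-754 conversion.

```cpp
// Sketch only: approximate the precision lost when a 32-bit shader value is
// stored as a 16-bit half. Normal-range values only; NaN/Inf and half's
// exponent limits are ignored.
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <cmath>

float quantize_to_half_precision(float x) {
    uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);
    const uint32_t drop = 23 - 10;          // mantissa bits a half cannot keep
    bits += 1u << (drop - 1);               // round to nearest
    bits &= ~((1u << drop) - 1);            // discard the low mantissa bits
    float y;
    std::memcpy(&y, &bits, sizeof y);
    return y;
}

int main() {
    float v = 0.123456789f;                 // a color/lighting-style value
    float h = quantize_to_half_precision(v);
    // Relative error is on the order of 1 part in 2048 (2^-11), which is why
    // the argument is about whether the difference is visible on screen, not
    // about whether any precision is lost at all.
    std::printf("fp32: %.9f  ~fp16: %.9f  rel. error: %.2e\n",
                v, h, std::fabs(v - h) / v);
    return 0;
}
```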

 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Ginfest
Seems to get muddier by the hour. Supposedly going along with the statement quoted above by DefRef

NFI:GPURW

is this part of the e-mail sent to certain nVidia employees (this was not posted at the given link):

We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPU's. And until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.
You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.




Mike G


I read that HL2 will be bundled with later ATI cards...this is the deal I suppose.
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: cmdrdredd
Originally posted by: Rage187
/clap @ defref


well said.




What do you think happens when you build a game specifically for one card, then try and code for another later?

You get results like this.


Yes ATI and Valve are having a circle jerk, and yes Gabe Newell is the pivot man.

Same thing is going to happen w/ nvidia and DOOM, then the shoe will be on the other foot.

This is bad for us, quality is going to be sacrificed for speed now.


Nvidia is much better at OpenGL than at DX9 games. That accounts for the 5900's better performance in Doom3.

As per ATI and Valve: coding a full DX9 game that won't run acceptably on a high-end "DX9+" (as per NV) card is not the ISV's fault at all; the IHV should have had the features WORKING properly in silicon. It's that simple. I'm sure the people who insist that a fully DX9 game must be an obvious slant or a marketing partnership just need some way to justify their $300-500 investment to themselves.

Yes, sure, NV is faster at OGL specially coded for their hardware with FX12/FP16 precision, versus ATI's FP24, which doesn't need any special attention to work around inefficiencies. That's a no-brainer.
 

rickn

Diamond Member
Oct 15, 1999
7,064
0
0
Originally posted by: Rogodin2
From the man who created the skuzzy interface and now a game dev.

"I don't have time to read this entire thread, but will respond to those who are complaining about Vlave being the problem.

First, Valve is not the problem. You do not understand how code in DX9 works if you say that.

Here is the quick and dirty. In the game code, at init time, you ask the card what level of shaders (pixel and vertex) it supports. The card comes back and says PS 2.0 and VS 2.0, and you go, "Cool, we can use DX9 shaders." At this point in the code, you have no idea what video card is really out there, unless you specifically test BEFORE starting DX9 up.


It's not Valve's fault that the NV3x family of parts performs very badly using the shaders they claim to support.

For Valve to fix this problem, they would have to disable all shaders. Well, the user community has wanted dynamic code in and have been bitching for years, "why can't game devs support the high end cards?"
Now we are doing it and you are bitching. Folks, the NV3x cards suck at DX9 shaders. That is the simple truth. Deal with it. It's not Valve's problem and I, for one, am glad they are taking the stance they are. Why?

Well, it might just make the farkin video card companies take notice that we, the devs, are not going to go quietly into the night anymore and take heat from gamers about features that are not being used. Maybe, just maybe, it might make the video card companies stand up and take notice, that if you put a piece of crap out in the market, we will expose it."



The ARB2 codepath on ATI is of higher IQ and runs faster with the optimized Cats than the NV3x.

rogo


If you're a game developer, you sure don't talk like one. I used to run a very successful gaming website, made lots of contacts within the PC/console world, and still keep in touch with one who works at Destination Games, and none of them would talk that way. If a game doesn't run right on particular hardware, a game developer would want to fix it. That would mean going to the hardware manufacturer for assistance.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
to obtain the performance benefit of this mode with no image quality degradation.

1. If the IQ isn't diminished, then it doesn't matter, unless you want us to believe that you can tell what precision level is being used on the shaders that make that HORDE OF ALIENS TRYING TO KILL YOU look all shiny.

2. This isn't the same as the difference between 16 and 32-bit color depth for TEXTURES. No one is talking about reducing that.
 

modedepe

Diamond Member
May 11, 2003
3,474
0
0
Another 10 mins :D
I hope that 9 PST thing was right... I don't feel like staying up till 12.
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
Originally posted by: modedepe
Another 10 mins :D
I hope that 9 PST thing was right... I don't feel like staying up till 12.

And I, 3. If my mom found out I was still on the computer right now, she'd probably kill me :(...