
Half-life 2 Performance: Breaking news


Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Shanti

My first impression after looking at the benches was:
OMG, that f'ing sucks.
All this talk about how well HL2 would "scale"
60 fps at 10x7x32 with NO AA or AF on the current top of the line video card is f'ing pathetic.
If scaling for a GF3 means running at 640x480 with 16 bit color and low textures just to get 20 or 30 fps, then NO, it does NOT scale well at all.

I would have expected 60 fps at 16x12x32 with full AA and AF on a Radeon 9800 Pro.

And yes, most of us could afford to buy whatever video card we wanted.
But for those of us with families and other priorities, we cannot justify spending $350 just so our games will look better.

So if performance sucks on GF3's and Radeon 8500's, I think HL2 sales will definitely suffer.


I agree. After the PC Gamer scoop with all that talk of scalability and minimum requirements similar to UT2k3's, I was expecting much better performance on DX7 hardware and MUCH MUCH better performance on the latest DX9 parts. Perhaps Valve has upped the graphical punch since then, but I certainly don't see how this is a very scalable engine under these circumstances.

For example, BF1942 is one of the most scalable games I've ever seen. It runs smoothly at 1024x768x16 and medium effects on my bud's 1 GHz AMD T-Bird/GeForce4 MX 420/256 MB SDRAM - the game looks quite good for such low-end hardware. Personally, I'd like to see tech like this more often.
 

Cesar

Banned
Jan 12, 2003
458
0
0
Originally posted by: Insomniak
My my there's a ton of foolish people in this thread. Let's see, who here would like to buy 5900 Ultras and 9800 Pros for much less money than they cost now? I thought so. The bottom line here is that competition is a GOOD THING! It means lower prices for all of us, and increased options for all of us to choose from. All you moron Nvidiots and ATimbeciles need to get over this petty, immature, pathetic "my card is better than yours!" crap. Are you all 14 years old? Seriously. It seems to me like half the people participating in this discussion have a serious problem with their self esteem and have to use a video card and internet forums to feel better about themselves. I think we have a classic case of what is commonly known as "dick fear". People are terrified of inadequacies...(most commonly compared to a certain anatomical part)...and so compete with each other to feel better about themselves. That's what all the jackass posturing and bullcrap tough talking in high school and bars and on sports fields is all about....it's called DICK FEAR. As far as I can tell, this video card argument is just a mutated version of the same thing. So in closing, aside from the 3 or 4 of you who have managed to discuss this topic with something resembling civility, organization, and intelligence, you're all pathetic. Grow up, get out, and become better human beings. It doesn't mean you have to give up gaming - hell, I'm a hardcore gamer in my own right - but you need to realize that that's all it is....games. Christ. - Som

I think you have a good point here! I would like to apologize for my ATimbeciles and what you call dickness :beer:
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Nvidia's response to all this nonsense (from Gamers Depot):

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATIs Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers.

Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
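The precision trade-off NVIDIA describes can be illustrated numerically. The sketch below is not shader code; it uses NumPy's CPU float16/float32 types as stand-ins for the GPU precisions being discussed, to show why demoting a single operation to 16-bit is invisible in an 8-bit framebuffer while long dependent chains of operations drift apart:

```python
import numpy as np

# A color channel value, stored at full and partial precision.
color = np.float32(0.7)
half = np.float16(color)

# One rounding step loses far less than an 8-bit framebuffer can
# display (one display step is 1/255, about 0.0039).
err = abs(float(half) - float(color))
print(f"single-op error: {err:.6f}")

# Errors compound across dependent operations: 200 chained multiplies
# drift noticeably further in float16 than in float32.
x16, x32 = np.float16(1.001), np.float32(1.001)
for _ in range(200):
    x16 = np.float16(x16 * np.float16(1.001))
    x32 = np.float32(x32 * np.float32(1.001))
print(f"after 200 ops: fp16={float(x16):.6f}  fp32={float(x32):.6f}")
```

Whether a given shader falls on the safe or unsafe side of that trade-off is exactly the per-game tuning NVIDIA's statement attributes to its developer technology team.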

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

Derek Perez
Director of Public Relations
NVIDIA Corp.


Short form summation: Valve chose to f*** over Nvidia by refusing to use the latest drivers.

If the Det50s deliver the performance Nvidia claims they will - will there be a backlash against Valve and ATI for this high-stakes chicanery? Doesn't the README of EVERY game mention making sure the user has the LATEST drivers to guarantee good performance? When users are told to keep their drivers up-to-date, why are a graphics card company and a game developer deliberately staging a PR event that purposely disadvantages the non-partnering company?!?

Looks like the ball may be in the Valve/ATI court now.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
> Short form summation: Valve chose to f*** over Nvidia by refusing to use the latest drivers.

Or Valve chose not to use Det50 because of catching more cheats, and nVidia did it to themselves by cheating.

nVidia claims the lack of fog is just a driver bug, but they have also denied past cheats and lawyered Futuremark into accepting that clipping planes are a legitimate "optimization." And notice nV didn't address the screen-capture cheating.

Right now I can't trust nVidia if they tell me that the sun rises in the East.

And no, my current card is not ATI it's a geforce 3 and given a good card with honest drivers I'd much prefer nVidia over ATI.
 

1ManArmY

Golden Member
Mar 7, 2003
1,333
0
0
Why doesn't Nvidia finish the damn beta version of their Rel. 50 drivers and be done with it?
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
It's a BETA, that is key. That would probably be why they haven't been used. I have a feeling that Anandtech's stuff tonight will include them, though...


IMPORTANT-Is the midnight posting time EST, or PST, or what?
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Perfect examples of the Kool-Aid guzzling fanaticism of the ATI Fanboys, gentlemen: Valve and ATI, companies that you readily admit are seriously 69ing each other, conspire to screw the odd man out and when they protest, you automatically believe the perpetrators of the fraud!

Wow. There is no reasoning with you people, is there?
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
when the det.50s show performance on par with or superior to the 9700 or 9800 class, then the nvidia and ati camps can talk. right now, only the ati fanboys can talk since they have the numbers on their side. as far as valve and ati 69ing each other: fine, sure. finally ati is getting a following after years of nvidia bopping every company with their 'the way it's meant to be played' logo; which really means 'the way (a developer and its consumers) are meant to be played'.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Yay for more typed out receipts!
5150, you are beyond hope of talking to.
Why you think I didn't buy the 9800 is unfathomable to me.

I had to think about it when I bought a new truck this year, I had to think about it when I bought a new boat last year, and I had to think about it when I bought a new house the year before. But I never thought twice about buying a lousy $300 video card. When nVidia ignored my emails about the 5800, I thought, "Screw you, if this is how you treat customers, I'll support your competitor".

I actually bought it partly to annoy you, if you want to know the truth. You are such a pest, I thought it would be amusing to take away your ability to put down my 5800, and the money was trivial.
 

sandorski

No Lifer
Oct 10, 1999
70,785
6,345
126
Originally posted by: DefRef
Perfect examples of the Kool-Aid guzzling fanaticism of the ATI Fanboys, gentlemen: Valve and ATI, companies that you readily admit are seriously 69ing each other, conspire to screw the odd man out and when they protest, you automatically believe the perpetrators of the fraud!

Wow. There is no reasoning with you people, is there?



So far every DX9 test or game using PS2 has shown the weakness of the FX. Unless the Det 50s redeem (performance and quality) this feature for Nvidia, then every DX9 game using PS2 *will* suffer the same issue with the FX. Valve has gone out of its way to make the FX cards perform reasonably well; it's NVidia who mucked up, not a Valve/ATI conspiracy.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
Yay for more typed out receipts!
5150, you are beyond hope of talking to.
Why you think I didn't buy the 9800 is unfathomable to me.

I had to think about it when I bought a new truck this year, I had to think about it when I bought a new boat last year, and I had to think about it when I bought a new house the year before. But I never thought twice about buying a lousy $300 video card. When nVidia ignored my emails about the 5800, I thought, "Screw you, if this is how you treat customers, I'll support your competitor".

I actually bought it partly to annoy you, if you want to know the truth. You are such a pest, I thought it would be amusing to take away your ability to put down my 5800, and the money was trivial.

I think everything you say is a lie and you're likely some little teenager living at home in mom and dad's basement. Until I see proof otherwise, I won't buy it. Just like your supposed two bachelors degrees, what were they in, b.s. and more b.s.? :D

P.S. Since you're so rich, why don't you go out and buy a digital camera and take a picture of your so-called cards, boats and what not. Typing out receipts on a forum is nothing short of a good laugh.
 

waylman

Diamond Member
Apr 4, 2003
3,473
0
0
Originally posted by: gururu
when the det.50s show performance on par with or superior to the 9700 or 9800 class, then the nvidia and ati camps can talk. right now, only the ati fanboys can talk since they have the numbers on their side. as far as valve and ati 69ing each other: fine, sure. finally ati is getting a following after years of nvidia bopping every company with their 'the way it's meant to be played' logo; which really means 'the way (a developer and its consumers) are meant to be played'.

spoken like a true nVIDIOT. having a hard time coping with your sh!tty hardware are we? LOL.LOL.
 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
Seems to get muddier by the hour. Supposedly this goes with the statement above quoted by DefRef.

NFI:GPURW

This is part of the e-mail sent to certain nVidia employees (this was not posted at the given link):

We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPUs, and until a week ago we had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.
You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.




Mike G

 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
spoken like a true nVIDIOT. having a hard time coping with your sh!tty hardware are we? LOL.LOL.


waylman, :D , dude,

FYI, I own a 9700pro; last nv card was a TNT. my post concerned earlier comments meant to trash ati supporters.

don't make ati supporters sound like morons;)

gururu
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.


I wonder if this is the JIT compiler I read about today?!?!?!?!?!?!?

 
Apr 17, 2003
37,622
0
76
Originally posted by: gururu
when the det.50s show performance on par with or superior to the 9700 or 9800 class, then the nvidia and ati camps can talk. right now, only the ati fanboys can talk since they have the numbers on their side. as far as valve and ati 69ing each other: fine, sure. finally ati is getting a following after years of nvidia bopping every company with their 'the way it's meant to be played' logo; which really means 'the way (a developer and its consumers) are meant to be played'.

IQ will prolly suck even more than it does now
 

EglsFly

Senior member
Feb 21, 2001
461
0
0
After reading this article:
http://www.tomshardware.com/business/20030911/index.html

There is one thing that I have not seen mentioned in this thread that really caught my attention in the article above.
The note about 32bit shaders in future games.
Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision.

Basically the nVidia performance stinks either way, IMHO. If the new 5x.xx drivers fix it, then so be it; that will be great for those cards, and then they can run future Valve games.

However, the new ATI cards only have 24bit shaders!
So would that leave ALL current ATI cards without any way to run future Valve titles? :Q

Perhaps I do not understand the technology fully, can someone elaborate on this?
 

CyberZenn

Senior member
Jul 10, 2001
462
0
0
However, the new ATI cards only have 24bit shaders!
So would that leave ALL current ATI cards without any way to run future Valve titles? :Q

Perhaps I do not understand the technology fully, can someone elaborate on this?

As I understand it you are correct. I don't think future dx9 titles will call for anything higher than 24bit precision b/c dx9 itself specifies 24bit as the minimum for full precision (with a partial-precision hint allowing 16bit). From what I've read, it was initially going to be a 32bit standard (and thus nvidia designed its cards around it), but relationships between M$ and NV have been a bit strained of late *cough*xbox*cough* and relatively late in development M$ settled the standard on 24bit. If someone has heard differently, please feel free to correct me.
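The fp16/fp24/fp32 question can be made concrete with a small simulation. The sketch below is plain Python, not actual GPU or DX9 behavior: it rounds a value to a given number of mantissa bits, roughly mimicking the mantissa widths of the formats discussed (about 10 bits for fp16, 16 for fp24, 23 for fp32); treat it as an illustration of why each format loses progressively less precision, not a statement about DX9 internals.

```python
import math

def quantize(x: float, mantissa_bits: int) -> float:
    """Round x to the nearest value representable with the given
    number of mantissa bits (ignoring exponent-range limits)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)            # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

value = 0.123456789
for bits, name in [(10, "fp16"), (16, "fp24"), (23, "fp32")]:
    q = quantize(value, bits)
    print(f"{name}: {q:.9f}  (error {abs(q - value):.2e})")
```

Each extra mantissa bit halves the worst-case rounding error, which is why a shader that looks fine at fp24 can still break down at fp16 once many operations are chained.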

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I think everything you say is a lie and you're likely some little teenager living at home in mom and dad's basement. Until I see proof otherwise, I won't buy it. Just like your supposed two bachelors degrees, what were they in, b.s. and more b.s.?

P.S. Since you're so rich, why don't you go out and buy a digital camera and take a picture of your so-called cards, boats and what not. Typing out receipts on a forum is nothing short of a good laugh.

5150:
I would give everything I have to be a teenager living in my dad's basement again. I'm a 40 year old man. When I lift weights now, my joints are sore afterward. I can't play tennis any more at all. I gave up power volleyball because I tore the ligaments in my ankles up too much. I have to send freaking Sallie Mae almost $400/month to pay for my degrees, because I put myself through school working part time and borrowing money. I have to go to work every day, 49 weeks of the year. I would freaking LOVE to be driving my 74 Camaro again and worrying if they'll take my fake id.

The only reason I have no digital camera is because I take no pictures, and I haven't been able to convince my wife to part with her apparently beloved 35mm.

If you don't want to believe I have a 9800 for whatever reason, that's fine. It doesn't change the fact that I do, so I'm not compromised by your disbelief. My point in buying it was to say "fu*k you" to nVidia for the most part anyway. Having the same card as you was just a little added benefit.
 

cessna152

Golden Member
Feb 10, 2002
1,009
0
0
There is a "standard" for a reason and from my shallow understanding Nvidia chose not to follow it exactly in order to do better in certain benchmarks. I am neither a fanboy of Nvidia or ATi being a student on a budget. I read reviews and make a decision on which card will give me the best bang for the buck. I have gone from GF2 MX400 -> GF4 Ti-4200 -> ATi 9700 Pro. I respect both companies. But my views on Nvidia have been tarnished lately due to their issues regarding benchmarks and this recent event. I love my Ti-4200 and it still runs great! The 9700 Pro is also great and I have not experienced any problems with drivers yet. The only reason I chose the 9700 Pro was price and availability. I will continue to base my buying decisions on best bang for the buck. It would be interesting if a reputable sight did an in-depth look into how closely Nvidia and ATi follow the DX9 Standard...