
Half-Life 2 Performance: Breaking News


Kongzi

Member
Jul 6, 2003
50
0
0
Originally posted by: cmdrdredd
the benchmark graphs are borked! They all show the bug demo graph.

Heh I noticed that too. We can probably figure out what the rest of the graphs look like though. Something along the lines of ATI kicking ass.

 

JSSheridan

Golden Member
Sep 20, 2002
1,382
0
0
Graphs are fixed. Notice how the 9600Pro doesn't experience much of a drop compared to the other cards when the resolution is increased, despite the 128-bit mem bus and 4x1 architecture. How is that possible? Also, a nice plug for the upcoming Athlon64 review, though HL2 won't be part of it.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Originally posted by: RogerAdam
Probably what's happening to NV right now, their reputation is being flogged, they're always optimizing "bugs" and if something is awry and exposed it's a "driver bug" (i.e. the fog removal in HL2, shader replacement in 3DM03, etc...) - Funny (and sad) thing about it is that those who BLINDLY defend NV now always fall back to the "drivers are better" BS in light of a company that doesn't want to admit its broken h/w, and say it's the DRIVERS' fault. IMO NV makes BOTH arguments presented by their enthusiasts moot. They're in obvious damage control, it's not too hard to see that, and ATI doesn't have to do anything PR-wise; it's as if NV has been hired to do that for them. NVidia better pray there isn't one of those "out of nowhere" hits coded in DX9; it would KILL them, rather than hurt them for a bit as is happening today.
It's astounding how so many people are willing to take a rigged demo and extrapolate it into failed hardware and failed drivers. My god, you'd think FX cards cause cancer and acne from the way you carry on about things. A bug that Nvidia readily acknowledges (missing fog) is what you consider proof of failure?!? What about all the application fixes that the Catalyst drivers require? A friend bought a 9700Pro and right after he told me its 3DM2K3 score, he listed the games he played that had rendering bugs. Once again, the ATI double-standard rears its ugly head.
Originally posted by: jiffylube1024
Well that's it. I'm done posting in this thread about why ATI is just plain better than nVidishit at DX9, and I'm done sugarcoating my words about nVidia. Good riddance to NV3x, good riddance to all the nVidiots in here who rain on everyone's parade, and hand out a "fanATIc" sticker to everyone at the door for making the right choice. And enjoy Doom3. That game looks very fun (I'm really looking forward to it), but you better find something else to do after the paper-thin single-player campaign wears thin.

By the way - to anyone who hasn't seen the 600MB HL2 demo, please download it. I really haven't been more impressed by a game before. Perhaps Halo at E3 '99 and Metal Gear Solid 2 at its first E3. That's about it. HL2 looks to set a new paradigm in gaming. God bless Valve!
Yes, the demo does look great, but that doesn't mean your first paragraph was anything other than Fanboy at its best/worst.

Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website was 80% Democrats flaming and threadcrapping everywhere. Someone posted a link to a Guru3D thread discussing the new AquaMark benches, how Nvidia was a close 2nd in those tests, and how that called the Shader Day charade into question. Now, there was some trolling, but the conversation was much more mature than a lot of posts here, primarily because they were more interested in trying to figure out WHY the disparities exist, as opposed to the Neanderthal chest-thumping of the post quoted above - there are at least five slurs in one sentence. You'd think ATI was giving him cookies and Teletubbies videos in exchange for his street team posts.

Originally posted by: CyberZenn
Originally posted by: DefRef
to obtain the performance benefit of this mode with no image quality degradation.

1. If the IQ isn't diminished, then it doesn't matter, unless you want us to believe that you can tell what precision level is being used on the shaders that make that HORDE OF ALIENS TRYING TO KILL YOU look all shiny.

2. This isn't the same as the difference between 16 and 32-bit color depth for TEXTURES. No one is talking about reducing that.

You may, however, notice things like "seams" between textures or polygons as they are rendered ever-so-slightly out of place. In my experience (admittedly limited to college CG courses), the difference between a polygon rendered using a 32-bit number (say, a float) and a 64-bit number (a double) is visually noticeable, especially after the number rounding you are having to do propagates through your code thousands and thousands of times (w/o deliberate correction, at least).
Once again, what are you talking about?!? The release said that:

Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

Please note that he's talking about SHADERS, not POLYGONS. If the polys make up a box, the textures act as the wrapping paper and the shaders act as a shiny cellophane outer wrap; even if the cellophane is ugly, the box will still be a box. Understand what I mean, or are we talking past each other?

The way I interpret the first part is that old pixel shaders don't get better looking just because they're "updated" (promoted) to PS 2.0. This is like trying to blow up an image taken from a web site and make a clean printout. If the JPEG is 72 dpi and 200 pixels square, it's not gonna look any better than blurry Legos when you interpolate it to 300 dpi and make it 8 x 10. (Note: Anand refers to it as "banding" in the look, so it's the cellophane, not the box.)
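For anyone who wants to put rough numbers on what dropping to 16-bit shader precision can do, here's a toy C sketch (my own back-of-the-envelope, nothing to do with Valve's or Nvidia's actual shader code; the fp16 emulation just keeps 10 mantissa bits). It shows a smooth gradient collapsing onto fewer distinct steps - the banding Anand mentions - and how a low-precision running total drifts when the same small operation is repeated thousands of times, which is the rounding propagation mentioned above:

#include <stdio.h>
#include <math.h>

/* Crude stand-in for 16-bit float precision: keep only 10 mantissa bits.
   (Real FP16 also has a narrower exponent range; this toy ignores that.) */
static float to_fp16ish(float x)
{
    int e;
    float m = frexpf(x, &e);            /* x = m * 2^e, with 0.5 <= |m| < 1 */
    m = roundf(m * 1024.0f) / 1024.0f;  /* keep ~10 bits of mantissa */
    return ldexpf(m, e);
}

int main(void)
{
    /* 1) Banding: a smooth 0..1 ramp quantized to fp16-ish precision
          collapses nearby shades onto the same step. */
    int distinct32 = 0, distinct16 = 0;
    float prev32 = -1.0f, prev16 = -1.0f;
    for (int i = 0; i < 4096; i++) {
        float v32 = i / 4095.0f;
        float v16 = to_fp16ish(v32);
        if (v32 != prev32) { distinct32++; prev32 = v32; }
        if (v16 != prev16) { distinct16++; prev16 = v16; }
    }
    printf("distinct shades, full precision: %d\n", distinct32);
    printf("distinct shades, fp16-ish:       %d\n", distinct16);

    /* 2) Drift: repeat one tiny add thousands of times and watch the
          low-precision running total wander off (and eventually stall). */
    float full = 0.0f, low = 0.0f;
    for (int i = 0; i < 10000; i++) {
        full += 0.1f;
        low = to_fp16ish(low + 0.1f);
    }
    printf("after 10000 adds of 0.1: full=%.1f  fp16-ish=%.1f\n", full, low);
    return 0;
}

None of this says whose shaders actually show it in HL2 - that's exactly what everyone is arguing about - it just shows why "same math, fewer bits" isn't automatically free.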
Originally posted by: SickBeast
Ahhh...the return of the Deaf Ref. It's too bad you and the nvidia boys aren't "jerking each other into a frenzy" right now, huh?
Why would we be? We're waiting for some solid facts, not rigged tests by a couple of companies who counted on their legions of mindless fanboys to regurgitate whatever they pour in their troughs.

Anand's piece is balanced and he's interested in why there's such a big discrepancy in performance. I'll trust the guy who wants to find the truth over someone emotionally wrapped up in a piece of silicon and metal any day.

Even with new drivers, the FX will probably be slower than the Radeons, but if we're talking 160 mph vs. 175 mph, it's no big deal. However, if the gap is 50%-70%, whoa Nelly! Continuing the car analogy, Valve deliberately demanded that the Nvidia cards be run with watery low-octane gas and put premium in the ATI tank. The fact that they suddenly cut Nvidia off from development recently and then blind-sided them with claims and charges that Nvidia had to read about in the papers (so to speak) shows that Valve may be souring the whole HL2 experience in their quest for cash. What sucks the most about that is I was dumb enough to stick up for them when people freaked out without reason about the price plans they had. Maybe they are greedy after all.

Final note: Someone farther up this thread mentioned that one of the reasons for HL's success was it ran pretty well on everyone's machines at the time - it even had a SW renderer option. (Which I see UT2K4 will as well.) Now, they're about to release a game that's "playable" (ahem) on lower-end systems but only looks its best on the pimpest of the pimp rigs, which are strictly in Alienware/Falcon/Voodoo territory. Even my new 3200+/5900/1GB system may not be enough. Yeep! But, do you know why The Sims (and its endless expansion packs) are the biggest sellers ever? A: Because it runs pretty well on any machine.

Any game that requires a $1500 upgrade (video, CPU, the works) to be playable isn't going to sell. People will wait for the Xbox version and just spend the $50 and not care about the pretty DX9 stuff under the hood.
 

MrJim

Member
Jan 10, 2003
38
0
66
DefRef>Good post, but if nvidia fixes its drivers so ATI and nvidia get the same FPS (with top-of-the-line cards), it still uses the mixed codepath with lower quality. I'm using Nvidia cards because I do a lot of 3D work, but I still want to play my games with the best image quality (shaders etc). This is good for future consumers; it will be interesting to see more pure DX9 games and more clarifying responses from Nvidia.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: SickBeast
Ahhh...the return of the Deaf Ref. It's too bad you and the nvidia boys aren't "jerking each other into a frenzy" right now, huh?
Why would we be? We're waiting for some solid facts, not rigged tests by a couple of companies who counted on their legions of mindless fanboys to regurgitate whatever they pour in their troughs.

Anand's piece is balanced and he's interested in why there's such a big discrepancy in performance. I'll trust the guy who wants to find the truth over someone emotionally wrapped up in a piece of silicon and metal any day.

Even with new drivers, the FX will probably be slower than the Radeons, but if we're talking 160 mph vs. 175 mph, it's no big deal. However, if the gap is 50%-70%, whoa Nelly! Continuing the car analogy, Valve deliberately demanded that the Nvidia cards be run with watery low-octane gas and put premium in the ATI tank. The fact that they suddenly cut Nvidia off from development recently and then blind-sided them with claims and charges that Nvidia had to read about in the papers (so to speak) shows that Valve may be souring the whole HL2 experience in their quest for cash. What sucks the most about that is I was dumb enough to stick up for them when people freaked out without reason about the price plans they had. Maybe they are greedy after all.

Final note: Someone farther up this thread mentioned that one of the reasons for HL's success was it ran pretty well on everyone's machines at the time - it even had a SW renderer option. (Which I see UT2K4 will as well.) Now, they're about to release a game that's "playable" (ahem) on lower-end systems but only looks its best on the pimpest of the pimp rigs, which are strictly in Alienware/Falcon/Voodoo territory. Even my new 3200+/5900/1GB system may not be enough. Yeep! But, do you know why The Sims (and its endless expansion packs) are the biggest sellers ever? A: Because it runs pretty well on any machine.

Any game that requires a $1500 upgrade (video, CPU, the works) to be playable isn't going to sell. People will wait for the Xbox version and just spend the $50 and not care about the pretty DX9 stuff under the hood.

Those were some pretty bold statements, stating that HL2 will not sell well. You are underestimating the impact that the high end of the graphics market has on the mid and low ends. You are also underestimating the tried-and-true skill of the engineering department at ATi. Finally, you seem to be forgetting that this game will run on DirectX 8-based graphics cards.

Anand may be interested to know the reason behind the poor scores on the Nvidia front. The answer may be relatively simple. See, Nvidia decided to adopt one of Intel's many underhanded business practices: they chose to deploy a scheme where their cards required Cg optimizations for games to run at their best, à la SSE2. Brute force wins this round. Bring on Doom III.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: MrJim
DefRef>Good post, but if nvidia fixes its drivers so ATI and nvidia get the same FPS (with top-of-the-line cards), it still uses the mixed codepath with lower quality. I'm using Nvidia cards because I do a lot of 3D work, but I still want to play my games with the best image quality (shaders etc). This is good for future consumers; it will be interesting to see more pure DX9 games and more clarifying responses from Nvidia.

How quick is an Nvidia-based workstation card in CAD 3D and 3DSMax? How many polygons does it take before it gets bogged down? I would argue that a solid Direct3D/OpenGL based graphics card can come pretty close to a workstation card.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: DefRef
Originally posted by: jiffylube1024
Well that's it. I'm done posting in this thread about why ATI is just plain better than nVidishit at DX9, and I'm done sugarcoating my words about nVidia. Good riddance to NV3x, good riddance to all the nVidiots in here who rain on everyone's parade, and hand out a "fanATIc" sticker to everyone at the door for making the right choice. And enjoy Doom3. That game looks very fun (I'm really looking forward to it), but you better find something else to do after the paper-thin single-player campaign wears thin.

By the way - to anyone who hasn't seen the 600MB HL2 demo, please download it. I really haven't been more impressed by a game before. Perhaps Halo at E3 '99 and Metal Gear Solid 2 at its first E3. That's about it. HL2 looks to set a new paradigm in gaming. God bless Valve!
Yes, the demo does look great, but that doesn't mean your first paragraph was anything other than Fanboy at its best/worst.

Meh. I'm never going to convince you otherwise DefRef, so there's no point. And it's just too easy for you to write off anyone you're arguing with as a fanboy. Personally, I have absolutely no preference as to who has the faster running card. I merely stated my opinion of which company is doing things "the right way" so to speak, and for DX9, it's not Nvidia. Nvidia's "use a special codepath and let developers waste 5X more development resources coding this path" method is not good business practice. If I was Valve, I would just tell Nvidia to go suck a lemon, and optimize for DirectX 9. Not ATI, not anyone else. Just how the spec was defined, and intended to be coded for.

Nvidia is trying to force unfair standards on the 3d market, which has been seen before. This seems to be 3dfx's curse, or "legacy" - an overwhelming desire by the top company to create unfair standards. In hindsight, thank god the Voodoo 5 failed. Otherwise, we would have never seen the GeForce 3 (which is essentially the granddaddy of DX9).

Do note, however, that I have never wished ill on Nvidia the company. I wish them the best, and desire a two-horse race to the bitter end. We've seen ATI on top once before, and it wasn't pretty (see: Rage IIC). ATI needs another company to be the barometer for where the future heads. I just think that Nvidia has put everyone through enough BS with NV3x. Bring on NV4x (so we can get some really playable framerates in HL2). And bring on R400 too, baby!
 

EglsFly

Senior member
Feb 21, 2001
461
0
0
We still got the issue where Valve states:
"future shaders will be more complex and will thus need full 32-bit precision."
This will be a problem for all ATI R3xx cards (since they are 24-bit max) in the future.
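To put rough numbers on that precision gap (assuming the usual DX9-era layouts - FP16 with a 10-bit mantissa, ATI's FP24 with a 16-bit mantissa, FP32 with a 23-bit mantissa; this is just a toy C calculation, not vendor data):

#include <stdio.h>
#include <math.h>

/* Relative precision of the three shader formats being argued about.
   Mantissa widths are the commonly cited ones for these DX9-era formats. */
int main(void)
{
    struct { const char *name; int mantissa_bits; } fmt[] = {
        { "FP16 (NV partial precision)", 10 },
        { "FP24 (R3xx pipeline)",        16 },
        { "FP32 (full precision)",       23 },
    };
    for (int i = 0; i < 3; i++) {
        double step = pow(2.0, -fmt[i].mantissa_bits);   /* relative rounding step */
        printf("%-28s ~%.1e relative error, ~%.1f decimal digits\n",
               fmt[i].name, step, fmt[i].mantissa_bits * log10(2.0));
    }
    return 0;
}

So 24-bit sits a fair way above 16-bit and a fair way below 32-bit; whether future shaders actually need those last couple of digits is the open question.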

Where are the AA and Anisotropic filtering results?
Why were these left out?

Is it because, with the quality settings turned on (which is the big deal these days), even a 9800Pro would likely drop to the 30 FPS range at 1280x1024 on a top-of-the-line system?
Anybody with that kind of system is going to want to run the game with all the eye candy, quality and resolution cranked up.
This is crazy marketing: producing a game that can only run acceptably on a very limited set of hardware.

Also, whether it's the result of poor design on nVidia's part or not, the fact that this game performs like crap on nVidia hardware matters: Valve has got to realize that a large share of computer users have nVidia hardware or, worst case, integrated graphics <barf>.

This seems to really cap the potential revenue that Valve can make on this game.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Well, we don't have to worry about 32-bit until at least DX10, as DX9 is just 24-bit, but yeah, they will eventually have to dumb down shaders for the ATI cards as well. As for the scores in the benchmarks, I think they represent worst-case scenarios and framerates will be respectable while playing at those settings.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
It will be interesting to see if the 50.0 series Dets bring anything to the table as far as improving DX9 scores, and whether nVidia can improve without driver cheats. It would also be interesting to see HL2 performance with special code for ATI cards. Why try to make improvements only for nVidia hardware so they look better in this bench? That is the real question here.

It would be real funny to let nVidia and the fanboy drones whine for a few days about this benchmark release, and then let loose with an ATI-coded benchmark suite of numbers. Wouldn't that be fair? Oh, of course not
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: TheSnowman
Well, we don't have to worry about 32-bit until at least DX10, as DX9 is just 24-bit, but yeah, they will eventually have to dumb down shaders for the ATI cards as well. As for the scores in the benchmarks, I think they represent worst-case scenarios and framerates will be respectable while playing at those settings.

The difference is, when DX10 games come online this line of RVxx cards will not be the flagship of ATI... This is Nvidia's best offering, supposedly marketed as DX9 cards... Evidently not, since they chose not to follow the requirements of DX9...

 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website was 80% Democratics flaming and threadcrapping everywhere.


You can say that again... nvnews.net is the biggest joke on the Net.
 

Crapgame

Member
Sep 22, 2001
151
0
0
Originally posted by: solofly
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website was 80% Democratics flaming and threadcrapping everywhere.


You can say that again... nvnews.net is the biggest joke on the Net.

You've obviously never seen the Democratic Underground site.
 

Crapgame

Member
Sep 22, 2001
151
0
0
Also-

I'm surprised that Anand mentioned nothing about the comparison between 4x2 and 8x1 pipelines. Does he even know that MS is working to include paired textures with simultaneous wait states for the nV architecture? You see, the DX9 SDK was developed with only one path in mind, and since each texture has a defined FIFO during the pass, the second pipe in the nV is dormant until the first pipe's FIFO operation is complete; with paired textures in the pipe using synchronous wait states, this 'problem' will be greatly relieved.
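For what it's worth, here's a toy C model of why texture pairing matters to a 4x2 layout (purely illustrative - it says nothing about what the DX9 SDK or either chip actually does with FIFOs or wait states):

#include <stdio.h>

/* Very rough model of the 8x1 vs 4x2 argument: per clock an 8x1 part
   pushes 8 pixels with one texture lookup each, while a 4x2 part pushes
   4 pixels with two lookups each.  A pipe needs an extra clock for every
   extra "round" of texture fetches, so the second unit on a 4x2 pipe
   only earns its keep when textures arrive in pairs.  Toy numbers, not
   a real simulation of either chip. */

static double pixels_per_clock(int pipes, int tmus_per_pipe, int textures)
{
    int rounds = (textures + tmus_per_pipe - 1) / tmus_per_pipe; /* ceiling */
    return (double)pipes / rounds;
}

int main(void)
{
    for (int textures = 1; textures <= 4; textures++)
        printf("%d texture(s)/pass:  8x1 -> %.1f px/clk   4x2 -> %.1f px/clk\n",
               textures,
               pixels_per_clock(8, 1, textures),
               pixels_per_clock(4, 2, textures));
    return 0;
}

On that toy math the 4x2 part only matches the 8x1 part when texture fetches come in even pairs, which is roughly the situation the pairing described above is supposed to make more common.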
 

sodcha0s

Golden Member
Jan 7, 2001
1,116
0
0
Valve deliberately demanded that the Nvidia cards be run with watery low-octane gas and put premium in the ATI tank. The fact that they suddenly cut Nvidia off from development recently and then blind-sided them with claims and charges that Nvidia had to read about in the papers (so to speak) shows that Valve may be souring the whole HL2 experience in their quest for cash.

Not exactly. Valve insisted on using drivers that were currently available to end users, and in fact bent over backwards to make sure that the game will be playable on nvidia cards, as they should. Sadly, the only current nvidia card that gives acceptable performance costs ~$500. We will have to wait and see if the Det 50's will give nvidia a much-needed performance boost.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Originally posted by: Crapgame
Not exactly. Valve insisted on using drivers that were currently available to end users
I could see the point if the game was available to end users as well.
BINGO! That's what's so lame about this Valve/ATI put-up job: They demanded that OLD drivers be used on the basis that those were what was available to the end user, when the game itself isn't going to be available to end users for a couple of months. (Anyone think they're gonna get it done by Sept. 30th now? Me neither.) I'm surprised they didn't specify that the 3.68 drivers from Dec. 1999 be used, because it takes several years to know if drivers are stable.

This wasn't a fair comparison because they made sure apples couldn't be compared to apples and only offered vague libels as the reason. Nvidia's in a tough spot of having to prove a negative - that Valve is lying about the bugs and that they can follow through. Unfortunately, the ATI FUD Troopers have been working overtime to make sure no one THINKS about what's going on with all their "ATI=r0x0rZ! Nvidia=SuX0rZ!" yapping.

Most ironic about all this back and forth is that the only entities clearly lying are Valve and ATI, yet Nvidia is getting it in the neck. Think of it this way: if Valve was in bed with Micro$oft, would you believe everything they said? I think not. Yet, since it's the makers of your beloved, manhood-enhancing video cards behind the scenes, you're totally credulous.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Since when are the most recent production drivers considered "OLD"? I'm sure ATI also has newer/faster drivers in BETA. They were not used either.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
it simply doesn't make sense to complain or argue about the benchmarks. it bewilders me how some look at the writing on the wall, and are so close-minded that they still (STILL) deny, deny, deny. the fact is that these benches are data. DATA. who does not understand data? or the fact that data do not lie?
would it be better to NOT know this information? Particularly for all those waiting to get a 5900 right before hl2 comes out? How could anyone not want to know this information? It's ridiculous to even conceive of that notion.

are the numbers cooked? anand, hardocp, gamersdepot, and others don't think so. are they in bed with ati? probably not.
Det 50's. valve didn't 'demand' that the benches be done with an obsolete driver. They insisted that, should a site DESIRE to run benches, the benches be done with publicly available drivers. How is that unfair?
why would you want buggy, unavailable dets to be used, particularly when they omit GPU-eating scenery like fog?
ridiculous.

all i'm saying is don't complain about data or new information. embrace it. when new data comes out, perhaps vindicating nv, then so be it. it's better to get information than to be without it.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: DefRef
Originally posted by: Crapgame
Not exactly. Valve insisted on using drivers that were currently available to end users
I could see the point if the game was available to end users as well.
BINGO! That's what's so lame about this Valve/ATI put-up job: They demanded that OLD drivers be used on the basis that those were what was available to the end user, when the game itself isn't going to be available to end users for a couple of months. (Anyone think they're gonna get it done by Sept. 30th now? Me neither.) I'm surprised they didn't specify that the 3.68 drivers from Dec. 1999 be used, because it takes several years to know if drivers are stable.

One of the reasons Valve insisted that benchmarks be done using current drivers is that the Detonator 50.xx's conveniently skip rendering key elements in HL2 in order to improve performance.
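As a rough illustration of why dropping fog is worth frames (hypothetical C, not anything from HL2 or any driver): classic per-pixel fog is an extra blend between the lit color and the fog color for every single pixel drawn, so skipping it skips that work on every fragment.

#include <stdio.h>

/* Toy linear fog blend: lit color fades toward the fog color with depth.
   Skipping this step saves a handful of multiplies and adds per pixel,
   every frame - which is exactly why "forgetting" fog looks like a free
   performance win in a benchmark. */

typedef struct { float r, g, b; } color_t;

static color_t apply_linear_fog(color_t lit, color_t fog,
                                float depth, float fog_start, float fog_end)
{
    float f = (fog_end - depth) / (fog_end - fog_start);  /* 1 = clear, 0 = fully fogged */
    if (f < 0.0f) f = 0.0f;
    if (f > 1.0f) f = 1.0f;
    color_t out = {
        lit.r * f + fog.r * (1.0f - f),
        lit.g * f + fog.g * (1.0f - f),
        lit.b * f + fog.b * (1.0f - f),
    };
    return out;
}

int main(void)
{
    color_t lit = { 0.8f, 0.6f, 0.2f }, fog = { 0.5f, 0.5f, 0.5f };
    color_t c = apply_linear_fog(lit, fog, 700.0f, 100.0f, 1000.0f);
    printf("fogged pixel: %.2f %.2f %.2f\n", c.r, c.g, c.b);
    return 0;
}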

Benchmarks of Tomb Raider AOD show the same thing as the HL2 benchies in regards to Nvidia's DX9 performance, so I guess Core is out to get nVidia as well.


 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Crapgame
Originally posted by: solofly
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website was 80% Democratics flaming and threadcrapping everywhere.


You can say that again... nvnews.net is the biggest joke on the Net.

You've obviously never seen the Democratic Underground site.

No, I didn't, but this is fun enough to read...

Click here and/or here...
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: gururu
it simply doesn't make sense to complain or argue about the benchmarks. it bewilders me how some look at the writing on the wall, and are so close-minded that they still (STILL) deny, deny, deny. the fact is that these benches are data. DATA. who does not understand data? or the fact that data do not lie?
would it be better to NOT know this information? Particularly for all those waiting to get a 5900 right before hl2 comes out? How could anyone not want to know this information? It's ridiculous to even conceive of that notion.

are the numbers cooked? anand, hardocp, gamersdepot, and others don't think so. are they in bed with ati? probably not.
Det 50's. valve didn't 'demand' that the benches be done with an obsolete driver. They insisted that, should a site DESIRE to run benches, the benches be done with publicly available drivers. How is that unfair?
why would you want buggy, unavailable dets to be used, particularly when they omit GPU-eating scenery like fog?
ridiculous.

all i'm saying is don't complain about data or new information. embrace it. when new data comes out, perhaps vindicating nv, then so be it. it's better to get information than to be without it.

^ Bingo! The whole point of the matter is to inform us, not deceive us.

As for those of you bitching and moaning about "Old Drivers", "not using Det 50.xx" etc., put your pacifiers back in, be happy about this advance warning about slow NV3x performance, and just wait for the final gold master of the game. Nvidia will surely have something newer than 50.75 out by then, so even that will be obsolete.

Just don't say we didn't warn you when you try to run the game with full eye candy and it comes to a crawl.