HL2 screenshot differences


NEVERwinter

Senior member
Dec 24, 2001
766
0
71
I don't know about this.
HL2 is a fast-paced action game, so when playing I'll focus on the enemies and 'waiting for something to happen', basically paying less attention to little differences in the background/environment.
If it were a real-world RPG like Morrowind, I'd enjoy walking around the world map and taking in the environment.
 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
Originally posted by: jjjayb
Read John Carmack's latest comments here:

http://english.bonusweb.cz/interviews/carmackgfx.html

Here is what it says for those too lazy to click the link:

Hi John,

No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?


Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack

Does that help clear things up? ;)

Well that certainly does hurt, and almost needs a thread of its own. The one hope that those backing Nvidia had to hang their hat on was the fact that Carmack likes the current Nvidia hardware. Everyone kept pointing at him and saying that
"Valve must have horrible programmers or they're in bed with ATI. Look at Carmack. He likes Nvidia hardware."
For him to say that the reason he likes it is that he doesn't use DX, and that Nvidia hardware does in fact suck with DX9, is the final nail in the coffin.

Yes, there will be games that use the D3 engine. But there will also be a lot of DX9 games. And if the 5x00 line can't support them as well as ATI, then they're screwed for now.
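As an aside, for anyone wondering what a "custom back end" looks like in practice: the engine queries the GPU at startup and picks a renderer path, and only the NV30 path asks for the lower precisions Carmack mentions. Below is a minimal C sketch of that kind of dispatch; the path names follow Carmack's published .plan notes, but the detection strings and logic are illustrative, not id Software's actual code.

```c
/* Minimal sketch of per-GPU renderer-path dispatch, in the spirit of
 * Doom3's ARB/NV20/R200/ARB2/NV30 back ends.  The detection strings
 * here are made up for illustration. */
#include <stdio.h>
#include <string.h>

typedef enum { PATH_ARB, PATH_NV20, PATH_R200, PATH_ARB2, PATH_NV30 } render_path;

static render_path choose_path(const char *gl_renderer)
{
    if (strstr(gl_renderer, "GeForce FX")) return PATH_NV30; /* drops to FP16/FX12 where it can */
    if (strstr(gl_renderer, "RADEON 9"))   return PATH_ARB2; /* standard full-precision fragment programs */
    if (strstr(gl_renderer, "RADEON 8"))   return PATH_R200;
    if (strstr(gl_renderer, "GeForce3") || strstr(gl_renderer, "GeForce4 Ti"))
        return PATH_NV20;
    return PATH_ARB;                       /* lowest common denominator */
}

int main(void)
{
    printf("GeForce FX 5900 -> path %d\n", choose_path("GeForce FX 5900")); /* NV30 */
    printf("RADEON 9700 PRO -> path %d\n", choose_path("RADEON 9700 PRO")); /* ARB2 */
    return 0;
}
```

The point of Carmack's quote is that only the first branch is fast on the GF-FX; force it down the standard ARB2 path like everyone else and the performance drops.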
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
JJ-

Do you work for Nvidia? You are posting half truths and outright lies. Just throw in a couple technical terms and nobody will be the wiser huh?

What lies or half truths? Point them out. As far as throwing out technical terms goes, Pete can quite easily call me on it if I step outside the lines. It isn't like I'm posting shader code and stating why it compiles less than optimally here (not that I could anyway ;) ).

I'll cover Carmack's comments in the thread for it.

Pete

As far as the 5200 and Longhorn, the bank knows no difference ;)

As for the 5800, I'm mainly interested in it because there is no reason to drop a 5900 to FX12 over using FP16; if no performance gap is present, then their optimizations/nVidia's drivers have some huge room for improvement.

I'm sure Freelancer "requires" DX9 mainly to force people to update, and not for D3D9 specifically. Heck, the game has been in development since probably before DX9 was even conceived!

How many games do you think are going to utilize heavy pixel shaders? That is the issue on the performance front. What games are pushing shader tech? The most dramatic usage of pixel shaders we have seen in any upcoming title is Doom3; without the shaders the game would look like sh!t. We heard about the shader revolution with PS1.4, and I'm still waiting to see its huge edge come close to widespread use. You can point to certain cases, as you could at launch, but where is this huge influx of PS1.4 games we started hearing about close to two years ago? nVidia had the same thing with Dot3. Close to launch they could point to Evolva, Giants and, through a patch, Sacrifice (two of which were excellent games). By the time games started using it regularly, we were talking about the DX9 class parts. Somehow, despite the comparatively minuscule installed base of R3x0 core boards, this time will be different.

IQF

Carmack has said the quality is slightly lower with the NV3x path. Obviously some tradeoffs must have been made for performance, however small.

Carmack stated there was no discernible quality difference. If you want to talk in the theoretical sense, then Carmack said the R3x0 line is incapable of running the game at the highest quality settings (it can't run FP32).

The NV3x lineup *needs* FX12 to run optimally, so developers like Valve have had to make concessions in their standard DX9 path for integer precision.

Not true for the NV35, and reportedly won't be true for the NV36 or NV38 either.

So you're disagreeing with my statement that the NV3x *needs* special considerations outside of the standard APIs? I'm not just talking about Doom3. My assertion is that the NV3x can't handle standard code acceptably. Forget about taking advantage of additional features not exposed in the APIs - the NV3x suffers *too much* without architecture-specific optimizations.

It depends greatly on what you are talking about. There is one area of serious weakness in NV3X performance: PS 2.0 shader performance. Reread my comments; I haven't been debating whether that is true or not. What besides PS 2.0 does the FX need "special consideration" for?

Why? Was the Doom3 engine a standard that ATI had input on?

In a manner of speaking, yes, it was. The Doom3 engine is going to be the standard for OGL gaming for years to come.

Don't even bring up the Doom3 benchmark, which was just a one-time PR stunt put on by Nvidia that ATI didn't even know about or optimize for. And ATI still won the high-quality tests.

The first time ATi got their hands on Doom3 it ended up on Kazaa. As for it being a one-time PR stunt, why didn't Valve wait until nV had their optimized drivers done? For the benchmark numbers, look at what the then-latest Cat drivers had for performance. They had to resort to older drivers to run better than the 5200, IIRC (it may have been quite a bit slower than that, actually).

What DX9-exclusive features does Freelancer use? Would you say it's a worthy demonstration of DX9 on the basis of DX9 content, or does it just have token DX9 effects?

It doesn't use PS 2.0, which seems to be your sole focus.

Silver

Also to even imply that the GF FX is more Dx9 complaint in the face of the Halflife 2 numbers is so absurd its laughable. What part of 'ATi's DX9 parts take Nvidia out behind the woodshed and manhandle them do you have problems understanding?' Time for some remedial reading skills Ben.

Compliant. Don't be ashamed if you don't know the meaning of a word :)
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
He's mistaking PERFORMANCE for COMPLIANCE. By his reasoning, a Ford Focus isn't a compliant automobile because it's not as fast or stylish as a Ferrari Testarossa.
 

IQfreak

Junior Member
Sep 15, 2003
12
0
0
Wow, Ben, you're a slippery one. That's a compliment. :)


Originally posted by: BenSkywalker

I'm sure Freelancer "requires" DX9 mainly to force people to update, and not for D3D9 specifically. Heck, the game has been in development since probably before DX9 was even conceived!

How many games do you think are going to utilize heavy pixel shaders? That is the issue on the performance front. What games are pushing shader tech? The most dramatic usage of pixel shaders we have seen in any upcoming title is Doom3; without the shaders the game would look like sh!t. We heard about the shader revolution with PS1.4, and I'm still waiting to see its huge edge come close to widespread use. You can point to certain cases, as you could at launch, but where is this huge influx of PS1.4 games we started hearing about close to two years ago? nVidia had the same thing with Dot3. Close to launch they could point to Evolva, Giants and, through a patch, Sacrifice (two of which were excellent games). By the time games started using it regularly, we were talking about the DX9 class parts. Somehow, despite the comparatively minuscule installed base of R3x0 core boards, this time will be different.


I think *a lot* of games are going to push heavy pixel shader usage in the short-term future. Doom3 is certainly *not* the most dramatic, as it doesn't rely on many shaders to improve the visuals. I've read that the Doom3 shaders mainly reduce the number of passes needed for the lighting. It won't use anywhere near as many advanced shaders (PS1.4+ level) as Half-Life 2, which is the true showcase of pixel shading.

This time, there *is* a shader revolution, because of HLSL. Shaders were extremely difficult to program in DX8. With DX9, the technology is being brought to market quickly. Who among gamers would've guessed that 9 months after DX9 became publicly available, DX9 games would start appearing?

Halo PC just went gold and will be using DX9 shaders that take advantage of the latest hardware.


Carmack has said the quality is slightly lower with the NV3x path. Obviously some tradeoffs must have been made for performance, however small.

Carmack stated there was no discernible quality difference. If you want to talk in the theoretical sense, then Carmack said the R3x0 line is incapable of running the game at the highest quality settings (it can't run FP32).

For Doom3, the quality difference may not be noticeable, but that doesn't change the fact that FX12 must be used as much as possible to get acceptable performance. Artists will make the appropriate compromises to ensure FX12 doesn't look bad. Imagine if the NV3x had full-speed FP32. The game would probably look noticeably better, because the artists would have fewer restrictions and could take advantage of what FP32 can do.

As for the R3x0, that's true! It's incapable of running at the highest quality. But it can run at slightly lower quality (a much less discernible difference than FP24 vs FP16/FX12) at double the speed of the NV3x's FP32, with no need for architecture-specific optimizations. What's the point of having theoretical capabilities that gamers can't use in practice?

In Half-Life 2, there's reportedly a very noticeable difference between FP24 and FP16/FX12.

ATI made the best compromise between quality and performance. Always full precision ("enough"). Always full speed. No fuss.
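To put some numbers behind the precision argument: FX12 is a 12-bit signed fixed-point format (range roughly [-2, 2) in steps of 1/1024), while FP16, FP24 and FP32 are floating-point formats carrying 10, 16 and 23 mantissa bits respectively. Here's a small self-contained C sketch that quantizes one value to each format under a simple round-to-nearest model; it ignores exponent range and denormals, so it's illustrative only.

```c
/* Compare the rounding error of the shader precisions being argued
 * about: FX12 (fixed point), FP16 (10 mantissa bits), FP24 (16),
 * against an FP32 reference. */
#include <stdio.h>
#include <math.h>

/* Quantize to FX12: round to the nearest 1/1024, clamp to [-2, 2). */
static float to_fx12(float x)
{
    float q = roundf(x * 1024.0f) / 1024.0f;
    if (q < -2.0f) q = -2.0f;
    if (q > 2047.0f / 1024.0f) q = 2047.0f / 1024.0f;
    return q;
}

/* Round a float's mantissa to `bits` explicit bits to mimic FP16 (10)
 * or FP24 (16).  Models precision only, not range or denormals. */
static float round_mantissa(float x, int bits)
{
    if (x == 0.0f) return 0.0f;
    int e;
    float m = frexpf(x, &e);              /* x = m * 2^e, 0.5 <= |m| < 1 */
    float s = ldexpf(1.0f, bits + 1);     /* implicit leading bit + `bits` */
    return ldexpf(roundf(m * s) / s, e);
}

int main(void)
{
    float v = 0.123456f;  /* e.g. a normalized lighting term */
    float p24 = round_mantissa(v, 16), p16 = round_mantissa(v, 10), fx = to_fx12(v);
    printf("FP32: %.7f\n", v);
    printf("FP24: %.7f (err %.2g)\n", p24, fabsf(v - p24));
    printf("FP16: %.7f (err %.2g)\n", p16, fabsf(v - p16));
    printf("FX12: %.7f (err %.2g)\n", fx, fabsf(v - fx));
    return 0;
}
```

On a typical normalized lighting term like this, the FX12 error comes out roughly an order of magnitude larger than FP16's, and FP16's a couple of orders larger than FP24's; that is the gap the HL2 screenshot comparisons are arguing about.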


The NV3x lineup *needs* FX12 to run optimally, so developers like Valve have had to make concessions in their standard DX9 path for integer precision.

Not true for the NV35, and reportedly won't be true for the NV36 or NV38 either.

I don't know whether the NV35+ can run FP16 as well as FX12, but it doesn't matter anyway, because all NV3x's will be using the same path that includes FX12. Or do you expect developers to make TWO custom DX9 paths for NV3x?


So you're disagreeing with my statement that the NV3x *needs* special considerations outside of the standard APIs? I'm not just talking about Doom3. My assertion is that the NV3x can't handle standard code acceptably. Forget about taking advantage of additional features not exposed in the APIs - the NV3x suffers *too much* without architecture-specific optimizations.


It depends greatly on what you are talking about. There is one area of serious weakness in NV3X performance: PS 2.0 shader performance. Reread my comments; I haven't been debating whether that is true or not. What besides PS 2.0 does the FX need "special consideration" for?


PS2.0 is the killer feature of DX9.

Okay, so let's distinguish the PS2.0-heavy games from the rest. The NV3x needs special consideration for games using advanced DX9-level shaders. Most if not all of the upcoming AAA DX9 titles will use PS2.0.


Why? Was the Doom3 engine a standard that ATI had input on?

In a manner of speaking, yes, it was. The Doom3 engine is going to be the standard for OGL gaming for years to come.


A standard that ATI had input on? Doom3 was built around Nvidia hardware.


Don't even bring up the Doom3 benchmark, which was just a one-time PR stunt put on by Nvidia that ATI didn't even know about or optimize for. And ATI still won the high-quality tests.

The first time ATi got their hands on Doom3 it ended up on Kazaa. As for it being a one-time PR stunt, why didn't Valve wait until nV had their optimized drivers done? For the benchmark numbers, look at what the then-latest Cat drivers had for performance. They had to resort to older drivers to run better than the 5200, IIRC (it may have been quite a bit slower than that, actually).


Valve waited until the final month before their game's expected release! Why should Valve wait for Nvidia when they had already spent so much time optimizing for Nvidia's hardware? Nvidia was expecting to have their beta drivers used, but Valve determined them to be unacceptable. No more waiting.

Contrast that with the Doom3 benchmark preview. ATI has to wait for an indefinite period of time for a chance to set the record straight.


What DX9-exclusive features does Freelancer use? Would you say it's a worthy demonstration of DX9 on the basis of DX9 content, or does it just have token DX9 effects?

It doesn't use PS 2.0, which seems to be your sole focus.

Oh right, advanced shaders are not the main focus of DX9.
*rolls eyes*
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Doom3 is certainly *not* the most dramatic, as it doesn't rely on many shaders to improve the visuals. I've read that the Doom3 shaders mainly reduce the number of passes needed for the lighting. It won't use anywhere near as many advanced shaders (PS1.4+ level) as Half-Life 2, which is the true showcase of pixel shading.

How many pixels aren't shaded in Doom3? Shut all of the shaders in HL2 off and the game still looks great; the shaders add to the overall impact of the title, but it doesn't rely on them the way Doom3 does. Shut all of the shaders off in Doom3 and the game would look comparable to Quake2.

This time, there *is* a shader revolution, because of HLSL. Shaders were extremely difficult to program for in DX8. With DX9, the technology is being quickly brought to market. Who would've guessed (among gamers) that 9 months after DX9 became publicly available, DX9 games would start appearing?

Why are we still able to count the upcoming pixel-shader-heavy games on our fingers?

Halo PC just went gold and will be using DX9 shaders that take advantage of the latest hardware.

I'm waiting to see the difference for myself on that one. I know you have TR:AOD (a complete crap game), Aquanox2 (not sure how good it will be), Halo (kicks @ss), HL2 (will kick @ss, I'm sure) and Doom3 (likely mediocre, but I'm hoping for good at least). How many of them show much of a difference? HL2 with HDR enabled shows a decent difference in visuals; for the rest, I haven't seen anything convincing yet in terms of the big PS 2.0 advantage. I know in theory how much more the shaders are capable of; I'm waiting to see something besides Dawn and Monkeys show us that ;)

ATI made the best compromise between quality and performance. Always full precision ("enough").

ATi's accuracy is 'good enough' and nVidia's is either 'not enough' or 'too much'? I can easily agree with too slow, but not on the accuracy front.

I don't know whether the NV35+ can run FP16 as well as FX12, but it doesn't matter anyway, because all NV3x's will be using the same path that includes FX12. Or do you expect developers to make TWO custom DX9 paths for NV3x?

The NV35 only has FP pipes; there are no INT units on the chip. As for two custom code paths: how many 5800s do you think shipped? If the performance is such that the 5900 needs special coding, then the lower-end parts are going to need to run without the shaders anyway.

A standard that ATI had input on? Doom3 was built around Nvidia hardware.

Not exactly; nVidia hardware was built around Doom3. Carmack has been saying what Doom3 would need in hardware since 1999, which is why so many people saw the specs on the NV3X line and knew it would be extremely good at running the game.

Valve waited until the final month before their game's expected release! Why should Valve wait for Nvidia when they had already spent so much time optimizing for Nvidia's hardware? Nvidia was expecting to have their beta drivers used, but Valve determined them to be unacceptable. No more waiting.

Valve released the bench prior to nV having their new drivers ready; by any reasonable assessment, it appears they did so because they knew nV would have them ready soon. ATi was still going to show superior performance, just not by quite as much. If Valve had released the game for benching at the same time as, say, the R9800XT launch, then I could see why they did so. Releasing it as they did, in the window before nV had their new series of Dets ready but within a couple weeks of the bench's public release, seems a bit odd to say the least.

Contrast that with the Doom3 benchmark preview. ATI has to wait for an indefinite period of time for a chance to set the record straight.

ATi had first shot at Doom3, and it ended up on Kazaa. Doom3's first real public display was run on ATi hardware, and what record do you expect them to set straight? The rendering techniques in Doom3 are no secret, and ATi already has a build of the game to work with. What are they supposed to be setting straight?

Oh right, advanced shaders are not the main focus of DX9.

Everyone's focus is on Pixel Shaders, ignoring Vertex Shaders. Why?
 

Alkali

Senior member
Aug 14, 2002
483
0
0
These JPEGs of a wall and a bit of water are no basis for judgments like "DX9 looks too bright" or "I want DX8.1 back".

Let's wait and see the movies that show the difference.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Alkali
These JPEGs of a wall and a bit of water are no basis for judgments like "DX9 looks too bright" or "I want DX8.1 back".

Let's wait and see the movies that show the difference.


Then go download some. There are 8 or more HL2 movies out, and they're running DX9.

They're not impressive at all IMO - certainly nothing to make me upgrade from a DX8 GPU.

A little overbright and better specular highlighting isn't going to make me go gaga.

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
You haven't seen comparative screenshots of DX9 w/HDR vs. DX8 w/o yet, so hold your horses until the cart arrives. Only ten more days now....
 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
Originally posted by: Insomniak
Then go download some. There are 8 or more HL2 movies out, and they're running DX9.

They're not impressive at all IMO - certainly nothing to make me upgrade from a DX8 GPU.

A little overbright and better specular highlighting isn't going to make me go gaga.
Yet another Nvidia fanatic, bitter about his video card not making the grade.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: BoberFett
Originally posted by: Insomniak
Then go download some. There are 8 or more HL2 movies out, and they're running DX9.

They're not impressive at all IMO - certainly nothing to make me upgrade from a DX8 GPU.

A little overbright and better specular highlighting isn't going to make me go gaga.
Yet another Nvidia fanatic, bitter about his video card not making the grade.


What the hell?

I didn't say a single frickin' thing about ATi or Nvidia in that post. Your statement has to be one of the most dipsh1tted yet. God, you fanboys are insecure.... I say I don't care for an instruction set that your card supports, say nothing at all about the card itself, and suddenly I'm an "nVidiot"? Take a long hard look in the mirror if you want to see a fanboy.

I own 3 rigs - one with a GeForce4 Ti, one with a GF4 MX 420, and one with a 9600 non-Pro.

I've looked at the screens and the movies, and IMO DX9 does NOT look significantly better than DX8.1. Yes, the overbrightening and specularity are improved, but they're nothing we haven't seen before, and they are most certainly NOT worth paying $250 or more for a new ATi OR Nvidia card.

You are pathetic. Every single time anyone makes a post expressing an opinion different from yours, you have to turn it into an NV vs. ATi flame war.

I could come in here and say "I prefer my coffee with one cream and one sugar over straight black" and you'd reply with "Face it! Nvidia's coffee makers are slower than ATi's! It's because of the heating plate architecture!"

Seriously, graphics cards are not the end of the world. Get over this whole fiasco; ATi is ahead right now, Nvidia will probably retake the crown at some point in the future only to have ATi steal it again, etc. That's the way of the IT market, so climb off your high horse and take your wang out of your Raddy's digital port. That's where the monitor plug goes.

Sheesh...



 

DaveBaumann

Member
Mar 24, 2000
164
0
0
Shut all of the shaders off in Doom3 and the game would look comparable to Quake2.

There are approximately two shaders in Doom3 that pertain to the per-pixel lighting model; its requirements are Dot3, multitexturing/multipassing, and stencil. "Shut off the pixel shaders" and you drop back to the NV10 / ARB path. There will be quality differences, but fundamentally it will be doing the same thing (and still nothing like Q2!).
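For anyone who hasn't followed the terminology: "Dot3" per-pixel lighting just means each normal-map texel encodes a surface normal, and the diffuse term is the clamped dot product of that normal with the light direction. Here's a minimal C sketch; the 8-bit encoding convention and the function names are illustrative, not id's code.

```c
/* Sketch of a Dot3 per-pixel diffuse term: decode a normal-map texel,
 * dot it with the light direction, clamp at zero. */
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

/* Expand an 8-bit-per-channel normal-map texel from [0,255] to [-1,1]. */
static vec3 decode_normal(unsigned char r, unsigned char g, unsigned char b)
{
    vec3 n = { r / 127.5f - 1.0f, g / 127.5f - 1.0f, b / 127.5f - 1.0f };
    return n;
}

/* Diffuse term: max(N . L, 0) -- this is the whole "Dot3" operation. */
static float dot3_diffuse(vec3 n, vec3 l)
{
    float d = n.x * l.x + n.y * l.y + n.z * l.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void)
{
    /* A texel pointing straight "up" in tangent space (the 128,128,255
     * flat-normal convention), lit from directly above. */
    vec3 n = decode_normal(128, 128, 255);
    vec3 l = { 0.0f, 0.0f, 1.0f };
    printf("diffuse = %f\n", dot3_diffuse(n, l));
    return 0;
}
```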
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Seriously, graphics cards are not the end of the world. Get over this whole fiasco; ATi is ahead right now, Nvidia will probably retake the crown at some point in the future only to have ATi steal it again, etc. That's the way of the IT market, so climb off your high horse and take your wang out of your Raddy's digital port. That's where the monitor plug goes.

Holy crap. The voice of reason. Another human being who thinks Shader Day is no reason for 9/10 ATers to spit in the face of anyone who says nV30/35s aren't that bad, DX9 doesn't look that spectacular, or R350s don't walk on water...

I didn't think there were any others out there (besides Ben Skywalker, Def Ref, and Gorillaman, anyway).

And here I was, ready to turn myself in to the Canadian Mounties because I bought a 4200 for my three-year-old's box.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"Shut off the pixel shaders" and you drop back to the NV10 / ARB path - there will be quality differences, but fundamentally it will be doing the same (and still nothing like Q2!).
"

I said shut off the shaders, not the pixel shaders in particular ;) Without Dot3 functionality, at the least, Doom3 would look extremely poor; I'd say comparable to Quake2 (perhaps even worse; IIRC, D3 won't have a lightmap fallback, will it?) based on the underlying geometry, if you removed the lighting effects as well (based on everything I've read, take away Dot3 and cube maps and your lighting model outside of shadows is completely gone).
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I've looked at the screens and the movies, and IMO DX9 does NOT look significantly better than DX8.1. Yes, the overbrightening and specularity are improved, but they're nothing we haven't seen before, and they are most certainly NOT worth paying $250 or more for a new ATi OR Nvidia card.

You are pathetic. Every single time anyone makes a post expressing an opinion different from yours, you have to turn it into an NV vs. ATi flame war.

Boy, some of the voices of reason at Anandtech sure use some crass language and an abundance of ad-homs. Cause for praise? I think not. If you have to "sink" to a level to make your point, it's probably not worth making.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Pete:
He's part of a frustrated minority here that gets jumped on for not bowing to ATI's might. It's sad to me that the guys here who buy 5900s are regaled with snide comments like:
"Hope you don't like AA/AF"
"You'll be screwed when HL2 comes out!"
" Your IQ will suck due to nVidia's cheater drivers"
etc ad infinitum

There's no end or recourse to it. You'd think the 5900 guys bought Parhelias, or if you want to go totally logical, that there could be something "wrong" with buying ANY video card.

IMO, it's a pretty small man who needs to get his jollies trying to one-up a fellow ATer's video card.

So, crass or not, he gets my praise. It's easy enough to feel like the whole board wants to lynch you and burn a cross in your yard if you're not echoing the "Hail ATI/Valve/DX9" propaganda.



 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: Rollo
Pete:
He's part of a frustrated minority here that gets jumped on for not bowing to ATI's might. It's sad to me that the guys here who buy 5900s are regaled with snide comments like:
"Hope you don't like AA/AF"
"You'll be screwed when HL2 comes out!"
" Your IQ will suck due to nVidia's cheater drivers"
etc ad infinitum

There's no end or recourse to it. You'd think the 5900 guys bought Parhelias, or if you want to go totally logical, that there could be something "wrong" with buying ANY video card.

IMO, it's a pretty small man who needs to get his jollies trying to one-up a fellow ATer's video card.

So, crass or not, he gets my praise. It's easy enough to feel like the whole board wants to lynch you and burn a cross in your yard if you're not echoing the "Hail ATI/Valve/DX9" propaganda.



I have to 100% totally agree. I thought the AT boards were built to HELP people, but over the past year or so it has gotten WAY outta hand and turned into another "beyond3d" board. I don't even visit much anymore.

I PROUDLY bought an NV 5900 Ultra (Gainward) and I don't regret the hardware, I only regret the tactics NV is using. As for the purchase, I was up and running in UNDER 5 minutes; I didn't even switch drivers from my old GF3 Ti200.

I don't follow the crowd, I don't follow propaganda... I follow my experiences, and NV hasn't let me down yet. However, ATI has... TWICE. I even loved my Diamond S220 (Rendition chipset), but unfortunately they went under. My IndyCar Racing 2 was ahead of its time with the S220; it was as beautiful as F1 2002 is today.

It might be a bit slower, but I was up and running in under 5 minutes, a fact that matters because I am a game administrator for HomeLAN. I couldn't afford to be down 3 days because the drivers sucked, or because I needed to clean my installation, etc. I went with what works, and my NV WORKS... and yes, EVEN with AA/AF on!

*EDIT*

And as for the pics of DX8 and DX9: in the DX9 shot, look at the rock in the middle of the stream. It doesn't look like a reflection of that rock; it looks like another rock UNDER the water!
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Pete
I've looked at the screens and the movies, and IMO DX9 does NOT look significantly better than DX8.1. Yes, the overbrightening and specularity are improved, but they're nothing we haven't seen before, and they are most certainly NOT worth paying $250 or more for a new ATi OR Nvidia card.

You are pathetic. Every single time anyone makes a post expressing an opinion different from yours, you have to turn it into an NV vs. ATi flame war.

Boy, some of the voices of reason at Anandtech sure use some crass language and an abundance of ad-homs. Cause for praise? I think not. If you have to "sink" to a level to make your point, it's probably not worth making.



Believe me, if it were possible to communicate my sentiments to these gentle-persons (uh...) without the use of such comments and dialogue, I would, but I don't think they'd be able to comprehend what I was saying in the least. Thus, since the rant was aimed at them and not you sane folk, I chose to put it in language they could understand.

Know your audience and whatnot.

Now, if you wish to discuss the current make-up of online GPU forums and the effects of the enthusiast gamer market on them, I would be happy to oblige.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Understood, I just don't think it helps to add to the mindless chatter. No need to encourage it, as more flaming probably won't do much to educate those quick to talk and slow to think.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Yeah, I see your point, but I sort of felt something needed to be said. His remark was completely off base from what mine was about.
 

DaveBaumann

Member
Mar 24, 2000
164
0
0
I said shut off the shaders, not the pixel shaders in particular. Without Dot3 functionality, at the least, Doom3 would look extremely poor

Dot3 was really only a per-pixel operation, although it is rolled in as a subset of fragment shaders under PS1.0 and above. However, what you were replying to was fundamentally correct: Doom3 is not designed as a fragment-shader title. JC has already stated that it is built to utilise the featureset of the GeForce256/2, and the various paths are there to harness different levels of shader functionality and to roll up the 8 passes that NV10-class products require to produce the per-pixel lighting.

The level of shader utilisation in HL2 appears to be unprecedented. Although the title was started before DX9, they have clearly managed to integrate it within the engine very effectively, no doubt helped a lot by HLSL. HDR is an example of an effect integrated into the engine that clearly wouldn't have been available on pre-DX8 hardware.
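For concreteness on the HDR point: the idea is that intermediate light values may exceed 1.0 and are only compressed to the displayable range at the very end, where a fixed-point DX8-era pipeline would simply clip them. Below is a minimal C sketch using one common compression curve (Reinhard-style); Valve hasn't published their actual operator, so treat this as a stand-in.

```c
/* HDR in one line: render with values above 1.0, then compress to
 * [0,1) for display.  tonemap() is the simple Reinhard curve x/(1+x);
 * HL2's real operator is not public, so this is illustrative only. */
#include <stdio.h>

static float tonemap(float hdr) { return hdr / (1.0f + hdr); }

int main(void)
{
    /* Anything over 1.0 would clip to 1.0 in a non-HDR pipeline;
     * with HDR the detail survives until this final mapping. */
    float samples[] = { 0.25f, 1.0f, 4.0f, 16.0f };
    for (int i = 0; i < 4; i++)
        printf("%6.2f -> %.3f\n", samples[i], tonemap(samples[i]));
    return 0;
}
```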
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You are looking at it from the technology end, in terms of which particular operations are used to pull off a given effect; I'm talking about the end visual impact of the effects. Dot3 falls within the scope of what is considered a shader, and without it Doom3 would look extremely poor. I think the fact that there isn't a significant rift in the visuals between D3 and HL2 speaks very poorly for the DX9 implementations we have seen to date, looking at the fragment shaders anyway (on the dev front, not the hardware). Of everything that has been released on HL2, there isn't a single fragment shader effect I have seen that looks as good as what we have seen in Doom3, and in many cases even earlier titles (Rallisport comes to mind).

Now, you have had the advantage of seeing the game running first hand, and have seen it running with HDR (as opposed to most of us, who have seen a low-res vid clip at best), so perhaps you can comment on the visuals.

Without HDR, the fragment shaders utilized in HL2 pale in comparison to many DX8 games we have seen to date in terms of visual impact. I think Valve did a lot of incredible things with their engine: the animation and skinning of the characters are by far the most impressive part on the visual front to me, the physics engine looks incredible, and I'm sure the gameplay will be stellar, but I haven't seen anything on the fragment shader side that is very impressive. Compared to the uber register-combiner game, it doesn't scream out how great the new shader tech is. From what we have seen of HDR, it appears it will aid the title greatly, but from what I've seen it still doesn't look comparable to Doom3 in terms of jaw-dropping visuals.

I know you state that the HDR tech isn't possible on pre-DX8 hardware (obviously), but does it allow HL2 to approach, compete with, or surpass Doom3 in terms of the 'wow' factor? I have absolutely no doubt HL2 is going to be the vastly superior game, but from what I've seen, I don't see a good reason not to run the game in DX7 mode; the edge in visuals doesn't seem to be there.
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
This is nothing new for ATI vs. NVidia, or any other rivalry. For some reason, neither side is willing to give in and admit that they are wrong, or whatever. Then there are the smart people who don't pay attention to the name but rather to the benchmarks, which always speak for themselves...
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
I think it was OldFart who stated that we should not be judging IQ from a few lousy pics, without seeing DX9 in action, or in animation. I tend to agree with him.

From what Anand states after playing the game demos, there is a significant and noticeable change in IQ between 8.1 and 9.0.