
Official Half Life 2 thread. - AnandTech, Gamers Depot, Extreme Tech, and now The Tech Report Benches are UP!

Hate to interrupt the fanboys and flames, but I think this is the most interesting issue. SOMETHING is limiting the 10x7 scores, and that something is exactly the same at 12x10? Doesn't make sense.

I read the article, and if I understand correctly the scores can be attributed to the fact that the game relies more on raw computational power than on memory bandwidth. Higher resolutions require more memory bandwidth, and if the game is not bandwidth-limited then the scores are probably on point.
 
Originally posted by: KnightBreed
There is something monumentally wrong with the 9600Pro benches. Here are the scores from the 3 different benchmarks for the 9600.

Techdemo:
1024x768 - 44.5 fps
1280x1024 - 44.5 fps

Bugbait:
1024x768 - 55.5 fps
1280x1024 - 55.4 fps

City 17:
1024x768 - 41 fps
1280x1024 - 40.9 fps

Can somebody explain these scores to me? The system isn't CPU limited, given the fact that the 9700Pro and 9800Pro beat these scores by an appreciable amount. The card cannot be shader limited, because you would have seen a significant drop in performance by raising the resolution. The scores for all intents and purposes are identical for both resolutions for each test. Is this is a bug in the drivers, the game, or Anand's tests?

Read the article. Anand explains it all
 
Originally posted by: rickn
Originally posted by: Czar
Just a thought: isn't it the job of game developers to make games that work with Direct3D, and isn't it the job of hardware developers to make their cards work with Direct3D? So isn't it their own fault when one of them breaks this rule and performance goes down on specific hardware?

not necessarily. Game developers don't have access to a lot of functions within DX, and for very good reasons. So that has to be left up to the hardware manufacturer to address, and thus we see optimizations. Optimization is not always a bad thing. It could be that an optimization was one the game developer simply did not have access to.

You're obviously not a programmer. You have no idea what you're talking about and just made that up. Admit it.
 
^^^^^ ROFL

this was posted in the 8-page session last night.. but seeing as how this is THE thread now (hey, it's official 🙂 )

omg i just spent about 3 hrs reading up on this whole controversy instead of doing my paper for Modern Political Discussion in English Seminar class....

all i have to say is this:

-one cannot know if ATi + Valve have been *in bed* with one another to the extent that Valve would be willing to cripple nvidia, but look at it this way: Nvidia users number 100+ million GPUs... Now, as a company, you cannot cripple some 100 million *potential* customers in the hope that everyone will run out and buy a $350 GPU... does this make some sense?

-Now, what about Nvidia being in bed w/ iD? Same issue, right? Well, for one, the ATi counterparts to the nvidia cards still hold up reasonably well w/out extensions, and (from FiringSquad, I believe) there was a quote from Carmack stating that the native DX9 pixel shader 2.0 path runs horrendously slow on the nvidia FX series, and instead of native 32-bit precision he had to use a backdoor customization that runs at 12- and 16-bit precision instead of 32... now if this doesn't point to bad hardware then i dont know what else can

-the 3DMark '03 scandal. Nvidia has had a bad history w/ special *optimizations*...

-Why did Valve say no Det 50.xx? Well, perhaps b/c they found that certain optimizations included outputting higher-quality images whenever a screen capture was detected, or not rendering certain things, or blurring certain features?

-Look at the Lara Croft benchmarks, AquaMark3, Halo 1.5... this isn't only a HL2-specific problem. Can it be that all game developers just randomly decided to jump on the ATi bandwagon, or is there something inherently wrong hardware-wise?

-*just wait till the next series of drivers*... to do what? cheat? get rid of certain things so that only a select few will ever notice?
-Keep in mind, folks: the majority of ppl find 800x600x16 playable. Get out of your elitist box. Companies (i include ati here) know that they can sacrifice quality for higher fps b/c ONLY a select few will ever notice/bother.
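The 12/16-bit vs 32-bit precision point above can be made concrete with a rough Python sketch. This is illustrative only: the 16-bit round-trip uses IEEE half floats, and the 12-bit fixed format is approximated with simple quantization (sign + 1 integer + 10 fractional bits); exact NV30 behavior differed.

```python
# Sketch of precision loss when a shader value is stored at reduced
# precision, roughly the trade-off Carmack described for the FX path.
import numpy as np

def quantize_fixed(x, frac_bits):
    """Round to a fixed-point grid with `frac_bits` fractional bits."""
    scale = 2.0 ** frac_bits
    return round(float(x) * scale) / scale

x = np.float32(0.123456789)        # a typical normalized shader value
half = np.float32(np.float16(x))   # round-trip through 16-bit float
fx12 = quantize_fixed(x, 10)       # ~12-bit fixed: sign + 1 int + 10 frac

print(f"fp32: {x:.9f}")
print(f"fp16: {half:.9f}  (err {abs(float(x) - float(half)):.2e})")
print(f"fx12: {fx12:.9f}  (err {abs(float(x) - fx12):.2e})")
```

The errors are tiny per value, but they accumulate across the many operations in a long pixel shader, which is why reduced precision can become visible in-game.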

Please read the above statements, take them to heart, and stop this idiotic flame war. What is... is what is. The future lies ahead, so stop bashing one another so much.

oh and woot i'm super 133t for owning a 9700 PRo ...hehe i couldn't resist
 
I have a 9500 PRO running at 325/300 - anyone have any idea how that will hold up? I imagine pretty good...
 
should perform about the same as a 9700 pro considering the gpu clock.. apparently ram bandwidth doesnt matter, so your lower-than-9700-pro ram frequency and half bus width don't matter either!
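For reference, the bandwidth math behind that jab, using the commonly quoted default bus widths and memory clocks (treat the figures as approximate):

```python
# Peak DDR memory bandwidth: bytes per clock * clock * 2 (DDR).
def mem_bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s for DDR memory."""
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000

cards = {
    "9500 Pro (128-bit @ 300 MHz)": (128, 300),
    "9600 Pro (128-bit @ 300 MHz)": (128, 300),
    "9700 Pro (256-bit @ 310 MHz)": (256, 310),
}
for name, (bus, clk) in cards.items():
    print(f"{name}: {mem_bandwidth_gbs(bus, clk):.1f} GB/s")
# 9500/9600 Pro land around 9.6 GB/s; the 9700 Pro's 256-bit bus
# roughly doubles that, near 19.8 GB/s.
```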
 
Originally posted by: KDOG
I have a 9500 PRO running at 325/300 - anyone have any idea how that will hold up? I imagine pretty good...

should do just as well as a 9600 if not better
 
Originally posted by: NFS4
Originally posted by: KnightBreed
There is something monumentally wrong with the 9600Pro benches. Here are the scores from the 3 different benchmarks for the 9600.

Techdemo:
1024x768 - 44.5 fps
1280x1024 - 44.5 fps

Bugbait:
1024x768 - 55.5 fps
1280x1024 - 55.4 fps

City 17:
1024x768 - 41 fps
1280x1024 - 40.9 fps

Can somebody explain these scores to me? The system isn't CPU limited, given the fact that the 9700Pro and 9800Pro beat these scores by an appreciable amount. The card cannot be shader limited, because you would have seen a significant drop in performance by raising the resolution. The scores for all intents and purposes are identical for both resolutions for each test. Is this is a bug in the drivers, the game, or Anand's tests?

Read the article. Anand explains it all
Mind pointing me to the quote? Because if it's the part about being "computationally limited" then Anand is WRONG.

Shader performance scales with resolution. More pixels on the screen means more pixels to run through the fragment program. If you are SHADER limited at 1024x768, then you should see a huge performance drop when upping the resolution. Why the benchmarks don't show this is what I want to know.
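A quick back-of-the-envelope check of that argument, using the 9600 Pro techdemo score quoted above: if the card were purely shader (fill) limited, frame time would scale with pixel count.

```python
# If perfectly shader-limited, fps at the higher resolution should
# drop by the ratio of pixel counts.
def expected_fps(fps_low, res_low, res_high):
    """Predicted fps at res_high if perfectly shader-limited at res_low."""
    pixels = lambda r: r[0] * r[1]
    return fps_low * pixels(res_low) / pixels(res_high)

fps_1024 = 44.5  # 9600 Pro, techdemo, from the scores quoted above
predicted = expected_fps(fps_1024, (1024, 768), (1280, 1024))
print(f"pixel ratio: {1280 * 1024 / (1024 * 768):.2f}x")   # ~1.67x
print(f"predicted fps at 1280x1024: {predicted:.1f}")      # ~26.7, not 44.5
```

Since the benches show 44.5 fps at both resolutions instead of a drop to roughly 27, something other than shader throughput is capping the 9600 Pro's score.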
 
Originally posted by: hdeck
Originally posted by: KDOG
I have a 9500 PRO running at 325/300 - anyone have any idea how that will hold up? I imagine pretty good...

should do just as well as a 9600 if not better

lol, i think it holds up better than a 9600 pro at default speeds
 
I just watched the Source_HDR02 Demo and I was blown away, and that was just to show off the lighting!! Could anyone give me their opinion on how my Radeon 9700 NON-Pro would do?? My core clock runs at 276MHz, which is slower than the 9600Pro, but it has more pipelines, etc. I get about 15,920 on 3DMark2001SE and around 4450 on 3DMark2003. Of course none of the graphs in Anand's review show a 9700np card, but it seems like the game will rely more on chip architecture and raw GPU speed. I also have a new rig with an ABIT IC7/ P4 3Ghz/ 1GB RAM/ Audigy 2, etc. When I put my rig together I wasn't really into gaming, so that's why I limited my budget for video to around $200. So now I'm wondering how this game will play on my rig??

Anyone's opinions?
 
man I really struck gold with my 9700 Pro purchase. I never would've thought that a $300 video card bought more than a year ago would still be one of the top performers for HL2. Hands down the best video card purchase I've made.
 
Originally posted by: Shad0hawK
And just how do you know that? You have Det 50's and HL2 to test this? If so, please supply a link with the bench results.


ushh..anandtech.com? they say there is a 40% increase in perf.

Uhh, sorry to burst your bubble, but if you're referring to
Here you can see the 40% performance boost NVIDIA gets from the special NV3x code path

then you've got things wrong. This performance increase is not due to nVidia's drivers, but to optimizations already present in Valve's code that try to minimize the GeForce FX's difficulties with HL2.
 
Originally posted by: kuk
Originally posted by: Shad0hawK
And just how do you know that? You have Det 50's and HL2 to test this? If so, please supply a link with the bench results.

ushh..anandtech.com? they say there is a 40% increase in perf.

Uhh, sorry to burst your bubble, but if you're referring to
Here you can see the 40% performance boost NVIDIA gets from the special NV3x code path

then you've got things wrong. This performance increase is not due to nVidia's drivers, but to optimizations already present in Valve's code that try to minimize the GeForce FX's difficulties with HL2.
Anand did not test the Det 50s. As above, that gain was Valve's doing. The game is running in "mixed mode" (a blend of DX8.1 and partial-precision DX9 shaders) to get the speedup. Even with the reduced quality, it is still way behind ATi @ full quality.
 
Originally posted by: Shiva112
man I really struck gold with my 9700 Pro purchase. I never would've thought that a $300 video card bought more than a year ago would still be one of the top performers for HL2. Hands down the best video card purchase I've made.

I just checked my 9700Pro box and the receipt says Sept. 16, 2002. 😀

I remember I wanted to wait for NV30 when I bought it, but decided to take a chance... Glad I did.
 
Originally posted by: thatsright
I just watched the Source_HDR02 Demo and I was blown away, and that was just to show off the lighting!! Could anyone give me their opinion on how my Radeon 9700 NON-Pro would do?? My core clock runs at 276MHz, which is slower than the 9600Pro, but it has more pipelines, etc. I get about 15,920 on 3DMark2001SE and around 4450 on 3DMark2003. Of course none of the graphs in Anand's review show a 9700np card, but it seems like the game will rely more on chip architecture and raw GPU speed. I also have a new rig with an ABIT IC7/ P4 3Ghz/ 1GB RAM/ Audigy 2, etc. When I put my rig together I wasn't really into gaming, so that's why I limited my budget for video to around $200. So now I'm wondering how this game will play on my rig??

Anyone's opinions?


Just pick a middle point between 9700pro/9600pro benchies 😉


And wow @ the 9600pro - this Christmas, if the 9600pro can hit ~$100 ($110 @ most) I'm sure as hell snagging one and replacing my aging MX420
 