Half-Life 2 Performance: Breaking News

Page 10

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
Originally posted by: cmdrdredd
Originally posted by: rogue1979
Originally posted by: sman789
ahhh, 9600 pro made me happy again

The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra;


Gee, according to the article the 5900 wasn't doing very well. So if the 9600Pro has about equal performance in the same game, it doesn't sound like something to be happy about.

If you check Beyond3D for benchmarks you'll see that the 9600 is much better than nVidia at the default DX9 codepath

I don't think I need to check benchmarks to see that this biased article says in the same sentence that the 5900 Ultra is "horrendously slower", then goes on immediately to say the 9600 Pro "performs quite well", all the while saying that the performance in Half-Life 2 is close. What does that mean? I don't think there are gonna be many benchmarks where a 9600 Pro, overclocked or not, beats the 5900 Ultra. I own a Radeon 9500, a 9700 Pro, and three fast GeForce4 Ti 128MB cards, so I don't really care who is faster. But it is hard not to comment on the heavily biased statement quoted above and the incorrect conclusion from sman789 that the 9600 Pro is actually in direct competition with the 5900 Ultra.

 

sodcha0s

Golden Member
Jan 7, 2001
1,116
0
0
This wasn't a fair comparison because they made sure apples couldn't be compared to apples and only offered vague libels as the reason. Nvidia's in the tough spot of having to prove a negative - that Valve is lying about bugs and that they can follow through. Unfortunately, the ATI FUD Troopers have been working overtime to make sure no one THINKS about what's going on with all their "ATI=r0x0rZ! Nvidia=SuX0rZ!" yapping.

Most ironic about all this back and forth is that the only entities clearly lying are Valve and ATI, yet Nvidia is getting it in the neck. Think of it this way: if Valve were in bed with Micro$oft, would you believe everything they said? I think not. Yet, since it's the makers of your beloved, manhood-enhancing video cards behind the scenes, you're totally credulous.



So you really think Valve is out to get nVidia? That is ridiculous. It is in their best interest that their software run well on as many different graphics cards as possible. What would they have to gain by trying to make the largest GPU maker in the world look inept? Do you really believe that any kind of "kickback" ATI could offer them would be better than selling software to the largest group of video card owners? gururu said it best: these benchmarks are hard DATA. The data indicates nVidia is having problems with their DX9 implementation. So because of that, Valve and ATI are lying to us all and are just trying to make mean ol' nVidia look bad. Puh-leez!
 

sandorski

No Lifer
Oct 10, 1999
70,696
6,257
126
Originally posted by: rogue1979
Originally posted by: cmdrdredd
Originally posted by: rogue1979
Originally posted by: sman789
ahhh, 9600 pro made me happy again

The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra;


Gee, according to the article the 5900 wasn't doing very well. So if the 9600Pro has about equal performance in the same game, it doesn't sound like something to be happy about.

If you check Beyond3D for benchmarks you'll see that the 9600 is much better than nVidia at the default DX9 codepath

I don't think I need to check benchmarks to see that this biased article says in the same sentence that the 5900 Ultra is "horrendously slower", then goes on immediately to say the 9600 Pro "performs quite well", all the while saying that the performance in Half-Life 2 is close. What does that mean? I don't think there are gonna be many benchmarks where a 9600 Pro, overclocked or not, beats the 5900 Ultra. I own a Radeon 9500, a 9700 Pro, and three fast GeForce4 Ti 128MB cards, so I don't really care who is faster. But it is hard not to comment on the heavily biased statement quoted above and the incorrect conclusion from sman789 that the 9600 Pro is actually in direct competition with the 5900 Ultra.

Context. The 5900U is meant to compete with the 9800 Pro and is priced accordingly. The 9600 Pro is meant to compete with the 5600 and is priced accordingly. So when the 2x-price 5900U gets beaten by the 1/2x-price 9600 Pro, how can one not be unimpressed with the 5900U and impressed by the 9600 Pro?
 

KGB1

Platinum Member
Dec 29, 2001
2,998
0
0
Hmmm, the GF4 Ti4600 seems to run the game decently. I got a GF4 Ti4200 128MB 8X... seems I should be seeing the same performance as a 5600. :D
I'm all smiles man... I don't play games above 1024x768 because my 15" monitor doesn't support it. I don't turn on AA/FSAA etc... and I'm running a Celery 1.3 w/256MB SDRAM. I'm not looking at 40 fps... more like 25... It'll hold me over until the Opteron/A64 is widely available in Nov '04.
PCI Express will be standard... the R400/NV40 will be out and I can really push the limits to 100 fps :Q :D

On to other business:
I don't think there is a bias... some games run GREAT on some cards and others do not. It's how it's developed, and it's kind of ironic how the first Half-Life couldn't run decently on a Radeon GPU and ran perfectly well on a GF. Tables have turned considerably, but the game will do well and nVidia will learn ONCE AGAIN not to FVCK with Microsoft, claim they have full DirectX 9 compliance when they don't, and follow the leader... NV shouldn't be making it difficult to make games, it should be trying to make it easier.

The statement was that programmers spent 5x more time preparing the NV code than the native DirectX 9 code... that means the game could have come out on Aug 22 and we'd be enjoying the game running fine; instead we had to wait for the programmers to fine-tune the engine for NV performance. This is utter bull.
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
As I've said in another thread, the NV3X is a hunk of overclocked garbage when compared to the R3xx.

rogo
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
As I've said in another thread, the NV3X is a hunk of overclocked garbage when compared to the R3xx.
You've been nipping at the bean brew too long Rogo. The 5900 is more of an achievement than anything you or I will ever do in our lives, including passing some worthless philosophy courses so you can spout logical fallacies to people who don't know what the He!! you're talking about.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
As I've said in another thread, the NV3X is a hunk of overclocked garbage when compared to the R3xx.
You've been nipping at the bean brew too long Rogo. The 5900 is more of an achievement than anything you or I will ever do in our lives, including passing some worthless philosophy courses so you can spout logical fallacies to people who don't know what the He!! you're talking about.

Since when are graphics cards measured against a consumer's achievements? You come up with some of the most inane reasons to defend nVidia. You must go to sleep with an nVidia pacifier and blanket at night.
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
As you probably noticed, since you've taken a few college courses in your life, I said "when compared to the R3xx".

Why don't you use real logic, Rollo, by asking me for my sources when I make claims like the one above? Why do you attack me personally and make fun of my education instead of discussing my proposition like rational men?

Rogo
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
What's completely ironic is that logic is the basic building block of binary code (the core of a GPU) - hardly worthless, Rollo, if you like to learn instead of being a sardonic 'know-it-all'.

rogo
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rogodin2
What's completely ironic is that logic is the basic building block of binary code (the core of a GPU) - hardly worthless, Rollo, if you like to learn instead of being a sardonic 'know-it-all'.

rogo

I wouldn't put much worth into what Rollo says. I think the guy is a compulsive liar.
 

SectorZero

Member
Sep 29, 2002
96
0
0
LOL, this is the funniest thread I've seen in quite a while.

Nvidiots, go play in that corner. ATImbiciles, you play in the other corner.

Now, Nvidiots, try to understand how the ATImbiciles are feeling right now. They've lived with inferior hardware and drivers for years. Rage Fury Maxxes they couldn't take with them when they upgraded to Win2K or XP. They're feeling good right now, and deservedly so. Let them enjoy their time in the sun. God knows they've waited long enough to feel this good about their purchase decision.

ATImbiciles, the Nvidiots are going through a really tough time right now. They spent a lot of money on video cards they thought were future-proof. They're having a hard time handling the fact that their cards are mediocre DX9 performers. It's not their fault; they trusted a company that never gave them a reason to doubt before. This is why they're seeing conspiracies and plots everywhere.

Ok, ATImbiciles, don't be scared and stop crying. Yes, you have the better DX9 hardware, no matter what the Nvidiots tell you.

Nvidiots, stop pouting; holding your breath and stamping your feet won't change anything. Valve didn't optimise for the ATImbiciles, they optimised for DX9. The cold hard truth is Valve optimised as much as they could for your non-DX9-standard hardware.

C'mon Nvidiots, cheer up. You're gonna smoke 'em in Doom 3.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: SectorZero
LOL, this is the funniest thread I've seen in quite a while.

Nvidiots, go play in that corner. ATImbiciles, you play in the other corner.

Now, Nvidiots, try to understand how the ATImbiciles are feeling right now. They've lived with inferior hardware and drivers for years. Rage Fury Maxxes they couldn't take with them when they upgraded to Win2K or XP. They're feeling good right now, and deservedly so. Let them enjoy their time in the sun. God knows they've waited long enough to feel this good about their purchase decision.

ATImbiciles, the Nvidiots are going through a really tough time right now. They spent a lot of money on video cards they thought were future-proof. They're having a hard time handling the fact that their cards are mediocre DX9 performers. It's not their fault; they trusted a company that never gave them a reason to doubt before. This is why they're seeing conspiracies and plots everywhere.

Ok, ATImbiciles, don't be scared and stop crying. Yes, you have the better DX9 hardware, no matter what the Nvidiots tell you.

Nvidiots, stop pouting; holding your breath and stamping your feet won't change anything. Valve didn't optimise for the ATImbiciles, they optimised for DX9. The cold hard truth is Valve optimised as much as they could for your non-DX9-standard hardware.

C'mon Nvidiots, cheer up. You're gonna smoke 'em in Doom 3.


What about those of us who had nVidia hardware prior to the release of the R300 and are now ATi fans?
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
Most of us have owned both companies' cards.

I've owned a TNT2, G2 GTS, G2 MX400, G4 Ti4400, G4 Ti4200, and GFX 5600U, and from the ATi camp a Radeon 8500, Radeon 9700 Pro, and Radeon 9800 Pro (128MB).

All I can say is that ATi is making cards that are much, much better than nVidia's, and has been since the 8500 - which was meant to compete with the G3 (original, not the Ti) and kicked the shat out of it.

You're not providing any epiphanies for us.

Rogo
 
Apr 17, 2003
37,622
0
76
Originally posted by: SectorZero
LOL, this is the funniest thread I've seen in quite a while.

Nvidiots, go play in that corner. ATImbiciles, you play in the other corner.

Now, Nvidiots, try to understand how the ATImbiciles are feeling right now. They've lived with inferior hardware and drivers for years. Rage Fury Maxxes they couldn't take with them when they upgraded to Win2K or XP. They're feeling good right now, and deservedly so. Let them enjoy their time in the sun. God knows they've waited long enough to feel this good about their purchase decision.

ATImbiciles, the Nvidiots are going through a really tough time right now. They spent a lot of money on video cards they thought were future-proof. They're having a hard time handling the fact that their cards are mediocre DX9 performers. It's not their fault; they trusted a company that never gave them a reason to doubt before. This is why they're seeing conspiracies and plots everywhere.

Ok, ATImbiciles, don't be scared and stop crying. Yes, you have the better DX9 hardware, no matter what the Nvidiots tell you.

Nvidiots, stop pouting; holding your breath and stamping your feet won't change anything. Valve didn't optimise for the ATImbiciles, they optimised for DX9. The cold hard truth is Valve optimised as much as they could for your non-DX9-standard hardware.

C'mon Nvidiots, cheer up. You're gonna smoke 'em in Doom 3.

Funniest post EVAR
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You may, however, notice things like "seams" between textures or polygons as they are rendered ever-so-slightly out of place. In my experience (admittedly limited to college CG courses), the difference between a polygon rendered using a 32bit number (say, a double) and a 64bit number

The 16-bit they are talking about is 64-bit (16 bits per color channel + alpha).

No current board has a full DX9 feature set; that requires PS/VS 3.0.

The FP16/FP32 standards that the nV boards use predate the DX9 spec(they are IEEE standards).

ATi's DX10 part will be using FP32 as all boards will require it for combining the VS and the PS.

The nV drivers boosting pixel shader performance have been known about since the launch of the 5800. nVidia's pixel shaders are much 'deeper' than ATi's and require significantly greater driver-level optimizations. Carmack commented on this some time ago.

nVidia won't catch ATi in HL2 performance looking at comparable parts. They placed their focus on shadows and lighting with the NV3X architecture over pixel shaders; their odd pipeline configuration is good evidence of this. That direction obviously paid off for Doom 3 but cost them in HL2. Their performance will improve considerably with the Det 50s, but not enough to close the gap. Games that use Doom 3-style rendering techniques will likely continue to be faster on nV hardware, while those comparable to HL2 will be faster on ATi hardware.
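For anyone wondering what the FP16-vs-FP32 argument actually means in practice, here's a minimal sketch (illustrative only - not from Valve's or anyone's actual shader code): it checks the 4 x 16 = 64-bit pixel arithmetic mentioned above, then accumulates many small lighting contributions at half precision versus full precision, using Python's `struct` module ('e' format) to round to IEEE 754 half floats.

```python
import struct

# 4 channels (R, G, B, A) x 16 bits each = the 64-bit "16-bit" framebuffer format.
BITS_PER_CHANNEL = 16
CHANNELS = 4
BITS_PER_PIXEL = BITS_PER_CHANNEL * CHANNELS  # 64

def to_fp16(x: float) -> float:
    """Round a double to IEEE 754 half precision and back (struct's 'e' format)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Hypothetical shader-style workload: sum 4096 small lighting contributions.
contrib = 1.0 / 4096.0  # exactly representable in both formats
fp16_sum = 0.0
fp32_sum = 0.0
for _ in range(4096):
    fp16_sum = to_fp16(fp16_sum + contrib)  # round every intermediate to FP16
    fp32_sum += contrib                     # double stands in for FP32 here

# The full-precision sum lands on 1.0; the FP16 sum stalls once its 11-bit
# significand can no longer resolve the tiny increment against the running total.
print(BITS_PER_PIXEL, fp32_sum, fp16_sum)
```

The half-precision accumulator stalls well short of 1.0 - the kind of error that shows up on screen as banding or seams, and why forcing FP16 partial precision for speed costs image quality.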
 

SectorZero

Member
Sep 29, 2002
96
0
0
Ok, my last post was meant to be a little sarcastic. Not all the posts in this monster of a thread are from fanboys who could teach religious fanaticism in Iran.

By Nvidiots and ATImbiciles I mean the people who won't listen to reason and seem ready to fight to the death in defense of their video card choice, or who seem to have their noses in the air and their chests puffed out because their video card scores better on a benchmark.

It's all so silly. Makes for interesting reading though, and a good laugh.

 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
Ben

Why don't you post on real dev boards and see what they say (and post the link)? Most of that is over our heads, but it sounds like BS to me because no one at b3d or anand or dig or id is posting the line you just did (and they are the ones making the game, and the ones that are sent the betas and alphas).

What I've read from the actual game devs, the 3D gurus at Beyond3D, and even Carmack himself is that the ARB2 codepath has better IQ and runs much faster than the FX codepath.

Rogo
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I wouldn't put much worth into what Rollo says. I think the guy is a compulsive liar.
I think you're Gary Coleman, strung out on pop wine and crystal meth, bought with profits from your glory days on "Diff'rent Strokes".
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
Nice pop culture reference rollo.

But you know that using rather arcane (old geezer ;)) pop culture references is only indicative of a person who has had too much time on their hands and hasn't taken any philosophy courses.

rogo
 

orion7144

Diamond Member
Oct 8, 2002
4,425
0
0
I just bought a 9800 Pro and have been an nVidia fan for a long time (still am). I am sure the Ti4200 I replaced would play the game just fine with no stuttering, but with reduced graphics quality. So who cares, it's just a game. It may look better on some systems, but in the end everyone will finish the game the same way.

Nobody can say that Valve is not in cahoots with ATI. I mean, come on, they are bundling the game with the 9800 Pro. It's not ATI bundling the game with their card; Valve even mentioned it at that conference. I think the article at Tom's gives nVidia a chance at a rebuttal, and nVidia did a great job of explaining their point.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This is the concrete proof we need to show that even a 9600 Pro totally dominates nVidia's DX9 hardware. Even a Ti4600 is better for Half-Life 2 than any of nVidia's newer cards.

nVidia really dropped the ball with this generation.
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Rogodin2
Most of us have owned both companies' cards.

I've owned a TNT2, G2 GTS, G2 MX400, G4 Ti4400, G4 Ti4200, and GFX 5600U, and from the ATi camp a Radeon 8500, Radeon 9700 Pro, and Radeon 9800 Pro (128MB).

All I can say is that ATi is making cards that are much, much better than nVidia's, and has been since the 8500 - which was meant to compete with the G3 (original, not the Ti) and kicked the shat out of it.

You're not providing any epiphanies for us.

Rogo

The ATI 8500 was a POS and so were the drivers. What made ATi big was the 9700 Pro, nothing else before that. It took ATi 17 or 18 years to get where nVidia got in half that time. ATi has a better product ATM, but the battle is far from over.