
Half-Life 2 Performance: Breaking news


cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: RogerAdam
Originally posted by: cmdrdredd
Originally posted by: Rage187
/clap @ defref


well said.




What do you think happens when you build a game specifically for one card, then try to code for another later?

You get results like this.


Yes ATI and Valve are having a circle jerk, and yes Gabe Newell is the pivot man.

Same thing is going to happen w/ nvidia and DOOM, then the shoe will be on the other foot.

This is bad for us, quality is going to be sacrificed for speed now.


Nvidia is much better at OpenGL than at DX9; that accounts for the 5900's better performance in Doom3.

As for ATI and Valve: coding a full DX9 game that won't run acceptably on a high-end "DX9+" (as NV bills it) card is not the ISV's fault at all; the IHV should have had the features WORKING properly in silicon. It's that simple. I'm sure some people believe that any game developed as fully DX9 must reflect an obvious slant or an ISV partnership; how else can they justify their $300-500 investment to themselves?

Sure, NV is faster in OpenGL code specially written for their hardware's FX12/FP16 precision, versus ATI's FP24, which needs no special attention to work around inefficiencies. That's a no-brainer.

And what happens when some unknown company comes up with a DX9 game and doesn't do any special optimization for anything: just generic DX9 calls and shader code? That would be disastrous for Nvidia, I think.

Let's say that game does very well and gains a huge following for its features or something. Being a small company, they didn't write any vendor-specific code from the start because they couldn't afford the man-hours. The game becoming very popular then exposes some problems. I don't know the technicalities of the code, but I do believe we would get a better understanding of why stuff runs slowly on certain hardware.
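(An aside to make the precision argument above concrete: below is a minimal Python/numpy sketch, not anything from the thread itself, that accumulates a small value at FP16 and at FP32 precision. numpy has no FP24 type, so FP32 stands in for the higher-precision path; the 0.001 step and the 10,000 iterations are arbitrary illustrative choices.)

```python
import numpy as np

# Accumulate a small increment 10,000 times at half vs single precision.
step16, step32 = np.float16(0.001), np.float32(0.001)
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(10_000):
    acc16 = np.float16(acc16 + step16)  # re-rounded to FP16 every step
    acc32 = np.float32(acc32 + step32)  # re-rounded to FP32 every step

# FP16 stalls once the gap between representable values (the ulp) grows
# past twice the step: each add then rounds straight back to the old total.
print("exact: 10.0")
print(f"FP32:  {float(acc32):.4f}")  # lands very close to 10.0
print(f"FP16:  {float(acc16):.4f}")  # stalls far short, around 4.0
```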
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
Come on...They said tomorrow, it's 11:59, in one minute it's the day after tomorrow...Gimme benchies!!! I don't want to have wasted all this time...12:00 is here...POO, NOTHING YET...

EDIT-AHHH COME ON!!! OMG A MINUTE LATE!!!11
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: NYHoustonman
Come on...They said tomorrow, it's 11:59, in one minute it's the day after tomorrow...Gimme benchies!!! I don't want to have wasted all this time...12:00 is here...POO, NOTHING YET...

EDIT-AHHH COME ON!!! OMG A MINUTE LATE!!!11

LOL, I caught that too... "more tomorrow", but they really mean more in 2 days.
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: cmdrdredd
And what happens when some unknown company comes up with a DX9 game and doesn't do any special optimization for anything: just generic DX9 calls and shader code? That would be disastrous for Nvidia, I think.

Let's say that game does very well and gains a huge following for its features or something. Being a small company, they didn't write any vendor-specific code from the start because they couldn't afford the man-hours. The game becoming very popular then exposes some problems. I don't know the technicalities of the code, but I do believe we would get a better understanding of why stuff runs slowly on certain hardware.

That's probably what's happening to NV right now: their reputation is being flogged. They're always optimizing around "bugs", and when something awry gets exposed, it's a "driver bug" (i.e. the fog removal in HL2, shader replacement in 3DMark03, etc.). The funny (and sad) thing is that those who BLINDLY defend NV now always fall back on the "drivers are better" line, in defense of a company that won't admit its hardware is broken and blames the DRIVERS instead; IMO NV makes both of the arguments presented by their enthusiasts moot. They're in obvious damage control, it's not too hard to see that, and ATI doesn't have to do anything PR-wise; it's as if NV had been hired to do it for them. Nvidia had better pray there isn't one of those "out of nowhere" hits coded in pure DX9; it would KILL them, rather than hurt them for a bit as is happening today.

 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
What if it's just the same stuff that's been put up on other websites? Oh, what a disappointment that would be... GOD DAMMIT COME ON, I'M RISKING MY LIFE FOR THIS...
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yeah, I mean it could happen. Some game comes up that looks cool, the demo is just awesome, and it sells 1 million in 2 weeks. I dunno how likely it is, but you never know with the vast number of different developers out there.
 

Vonkhan

Diamond Member
Feb 27, 2003
8,198
0
71
dammmmnit Anand ... come on dude! my girl wants me in bed but i want u(r) on the bench, so hurry up!

:D
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
That's it, I give up, it's not worth the risk... Good day, I guess I'll have to wait till tomorrow... :(:(:(:( You guys let me down, WTF in god's name happened???
 

Vonkhan

Diamond Member
Feb 27, 2003
8,198
0
71
NEWSFLASH: NVIDIA HAS KIDNAPPED ANAND... He's being held with a dustbuster at his throat; they want him to fudge the benchies!!!

what the heck, I'm bored
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Well that's it. I'm done posting in this thread about why ATI is just plain better than nVidishit at DX9, and I'm done sugarcoating my words about nVidia. Good riddance to NV3x, good riddance to all the nVidiots in here who rain on everyone's parade, and hand out a "fanATIc" sticker to everyone at the door for making the right choice. And enjoy Doom3. That game looks very fun (I'm really looking forward to it), but you better find something else to do after the paper-thin single-player campaign wears thin.

By the way: to anyone who hasn't seen the 600MB HL2 demo, please download it. I don't think I've ever been more impressed by a game. Perhaps Halo at E3 '99 and Metal Gear Solid 2 at its first E3; that's about it. HL2 looks to set a new paradigm in gaming. God bless Valve!
 

CyberZenn

Senior member
Jul 10, 2001
462
0
0
Originally posted by: DefRef
to obtain the performance benefit of this mode with no image quality degradation.

1. If the IQ isn't diminished, then it doesn't matter, unless you want us to believe that you can tell what precision level is being used on the shaders that make that HORDE OF ALIENS TRYING TO KILL YOU look all shiny.

2. This isn't the same as the difference between 16 and 32-bit color depth for TEXTURES. No one is talking about reducing that.

You may, however, notice things like "seams" between textures or polygons as they are rendered ever-so-slightly out of place. In my experience (admittedly limited to college CG courses), the difference between a polygon rendered using a 32-bit number (a single-precision float) and a 64-bit number (a double) is visually noticeable, especially after the rounding you are having to do propagates through your code thousands and thousands of times (without deliberate correction, at least).
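(That compounding-error point is easy to demonstrate outside a renderer. Below is a small Python/numpy sketch, mine rather than the poster's, that applies a tiny 2D rotation to a unit-length vertex thousands of times, the kind of repeated transform a renderer performs, and measures how far the vertex's length drifts from exactly 1.0 at 32-bit versus 64-bit precision. The function name length_drift and the 10,000-step count are just illustrative choices.)

```python
import numpy as np

def length_drift(dtype, steps=10_000):
    """Rotate a unit vector `steps` times at the given precision and
    return how far its length has drifted from the exact value 1.0."""
    theta = dtype(2.0 * np.pi / steps)
    c, s = np.cos(theta), np.sin(theta)      # numpy keeps these in `dtype`
    x, y = dtype(1.0), dtype(0.0)
    for _ in range(steps):
        x, y = c * x - s * y, s * x + c * y  # 2D rotation, rounded each step
    return abs(float(np.hypot(x, y)) - 1.0)

print(f"32-bit (float):  {length_drift(np.float32):.1e}")  # visibly larger drift
print(f"64-bit (double): {length_drift(np.float64):.1e}")  # vanishingly small
```

Small per-operation rounding, compounded over thousands of steps, is exactly the "propagates through your code" effect described above.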
 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
I think I'd laugh my ass off seeing some GF4 benchmarks in HL2. I can see it all already:
"These benchmarks were done on a P4 2.8C, IC7-G, Corsair 3500 XMS 1 gig (2x512) with the GeForce4 Ti lineup. Here are the results:

GeForce4 Ti 4200 128MB - No results were recorded; it crashed immediately.
GeForce4 Ti 4400 128MB - We couldn't find one of these in stock.
GeForce4 Ti 4600 128MB - It actually loaded, we saw an ATI logo, and then it said "WTF, you're not with ATI!" *crash*"

funnyz0r.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: DefRef
This whole thread is proof that some people just HATE NVIDIA to the exclusion of all reason AND in defiance of any claims the Fanboys...yada yada...you can't help yourselves?

Ahhh...the return of the Deaf Ref. It's too bad you and the nvidia boys aren't "jerking each other into a frenzy" right now, huh?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: batmang
GeForce4 Ti 4600 128MB - It actually loaded, we saw an ATI logo, and then it said "WTF, you're not with ATI!" *crash*

Nah, actually the Ti 4600 keeps up pretty well with a 5900U. ;)