Originally posted by: cmdrdredd
the benchmark graphs are borked! They all show the bug demo graph.
Heh I noticed that too. We can probably figure out what the rest of the graphs look like though. Something along the lines of ATI kicking ass.
Originally posted by: cmdrdredd
the benchmark graphs are borked! They all show the bug demo graph.
It's astounding how so many people are willing to take a rigged demo and extrapolate it into failed hardware and failed drivers. My god, you'd think FX cards cause cancer and acne from the way you carry on about things. A bug that Nvidia readily acknowledges (missing fog) is what you consider proof of failure?!? What about all the application fixes that the Catalyst drivers require? A friend bought a 9700 Pro and, right after he told me its 3DM2K3 score, he listed the games he played that had rendering bugs. Once again, the ATI double standard rears its ugly head.
Originally posted by: RogerAdam
Probably what's happening to NV right now: their reputation is being flogged. They're always optimizing "bugs," and if something is awry and exposed it's a "driver bug" (i.e. the fog removal in HL2, shader replacement in 3DM03, etc...). Funny (and sad) thing about it is that those who BLINDLY defend NV now always fall back to the "drivers are better" BS in light of a company that doesn't want to admit its hardware is broken and says it's the DRIVERS' fault. IMO NV makes BOTH arguments presented by their enthusiasts moot. They're in obvious damage control, it's not too hard to see that, and ATI doesn't have to do anything PR-wise; it's as if NV has been hired to do that for them. Nvidia better pray there isn't one of those "out of nowhere" hits coded in DX9. It would KILL them, rather than hurt them for a bit as is happening today.
Yes, the demo does look great, but that doesn't mean your first paragraph was anything other than fanboyism at its best/worst.
Originally posted by: jiffylube1024
Well that's it. I'm done posting in this thread about why ATI is just plain better than nVidishit at DX9, and I'm done sugarcoating my words about nVidia. Good riddance to NV3x, good riddance to all the nVidiots in here who rain on everyone's parade, and hand out a "fanATIc" sticker to everyone at the door for making the right choice. And enjoy Doom3. That game looks very fun (I'm really looking forward to it), but you better find something else to do after the paper-thin single-player campaign wears thin.
By the way - to anyone who hasn't seen the 600MB HL2 demo, please download it. I really haven't been more impressed by a game before. Perhaps Halo at E3 '99 and Metal Gear Solid 2 at its first E3. That's about it. HL2 looks to set a new paradigm in gaming. God bless Valve!
Once again, what are you talking about?!? The release said that:
Originally posted by: CyberZenn
Originally posted by: DefRef
to obtain the performance benefit of this mode with no image quality degradation.
1. If the IQ isn't diminished, then it doesn't matter, unless you want us to believe that you can tell what precision level is being used on the shaders that make that HORDE OF ALIENS TRYING TO KILL YOU look all shiny.
2. This isn't the same as the difference between 16 and 32-bit color depth for TEXTURES. No one is talking about reducing that.
You may, however, notice things like "seams" between textures or polygons as they are rendered ever-so-slightly out of place. In my experience (admittedly limited to college CG courses), the difference between a polygon rendered using a 32-bit number (a float) and a 64-bit number (a double) is visually noticeable, especially after the rounding you have to do propagates through your code thousands and thousands of times (without deliberate correction, at least).
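The error-propagation point above can be sketched numerically. This is a minimal illustration (assuming NumPy is available), not a measurement of any GPU: it just repeatedly accumulates a value that binary floating point can't represent exactly, at 32-bit and 64-bit precision, and compares the drift.

```python
import numpy as np

def accumulate(dtype, n=100_000, step=0.1):
    """Add `step` to a running total n times at the given precision.

    0.1 has no exact binary representation, so every addition rounds;
    at lower precision the rounding error compounds much faster.
    """
    total = dtype(0.0)
    inc = dtype(step)
    for _ in range(n):
        total = dtype(total + inc)
    return float(total)

exact = 10_000.0  # the mathematically exact value of 0.1 * 100,000
err32 = abs(accumulate(np.float32) - exact)
err64 = abs(accumulate(np.float64) - exact)
print(f"32-bit drift: {err32:.3g}, 64-bit drift: {err64:.3g}")
```

The 32-bit total drifts visibly off the exact answer while the 64-bit total stays far closer, which is the same compounding effect the post describes for per-vertex math run thousands of times per frame.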
Why would we be? We're waiting for some solid facts, not rigged tests by a couple of companies who counted on their legions of mindless fanboys to regurgitate whatever they pour in their troughs.
Originally posted by: SickBeast
Ahhh...the return of the Deaf Ref. It's too bad you and the nvidia boys aren't "jerking each other into a frenzy" right now, huh?
Anand's piece is balanced, and he's interested in why there's such a big discrepancy in performance. I'll trust the guy who wants to find the truth over someone emotionally wrapped up in a piece of silicon and metal any day.
Even with new drivers, the FX will probably be slower than the Radeons, but if we're talking 160 mph vs. 175 mph, it's no big deal. However, if the gap is 50%-70%, whoa Nelly! Continuing the car analogy, Valve deliberately demanded that the Nvidia cards be run with watery low-octane gas while putting premium in the ATI tank. The fact that they suddenly cut Nvidia off from development recently and then blind-sided them with claims and charges that Nvidia had to read about in the papers (so to speak) shows that Valve may be souring the whole HL2 experience in their quest for cash. What sucks the most about that is I was dumb enough to stick up for them when people freaked out without reason about the pricing plans they had. Maybe they are greedy after all.
Final note: Someone earlier in this thread mentioned that one of the reasons for HL's success was that it ran pretty well on everyone's machines at the time - it even had a SW renderer option. (Which I see UT2K4 will have as well.) Now, they're about to release a game that's "playable" (ahem) on lower-end systems but only looks its best on the pimpest of the pimp rigs, which are strictly in Alienware/Falcon/Voodoo territory. Even my new 3200+/5900/1GB system may not be enough. Yeep! But, do you know why The Sims (and its endless expansion packs) is the biggest seller ever? A: Because it runs pretty well on any machine.
Any game that requires a $1500 upgrade (video, CPU, the works) to be playable isn't going to sell. People will wait for the Xbox version and just spend the $50 and not care about the pretty DX9 stuff under the hood.
Originally posted by: MrJim
DefRef> Good post, but if Nvidia fixes its drivers so ATI and Nvidia get the same FPS (with top-of-the-line cards), it still uses the mixed codepath with lower quality. I'm using Nvidia cards because I do a lot of 3D work, but I still want to play my games with the best image quality (shaders etc.). This is good for future consumers; it will be interesting to see more pure DX9 games and more clarifying responses from Nvidia.
Originally posted by: TheSnowman
well, we don't have to worry about 32-bit until at least DX10, as DX9 is just 24-bit, but yeah, they will eventually have to dumb down shaders for the ATI cards as well. as for the scores in the benchmarks, i think they represent worst-case scenarios and framerate will be respectable while playing at those settings.
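The precision tiers being argued about here can be made concrete with a back-of-the-envelope sketch. The mantissa widths below are the commonly cited ones for the era's shader formats (fp16 for NV3x partial precision, fp24 for ATI's R300 and the DX9 full-precision floor, fp32 for NV3x full precision); machine epsilon gives the relative rounding step at each tier.

```python
import math

# Commonly cited mantissa widths: fp16 = s10e5, fp24 = s16e7, fp32 = s23e8.
mantissa_bits = {"fp16": 10, "fp24": 16, "fp32": 23}

for name, bits in mantissa_bits.items():
    eps = 2.0 ** -bits             # relative rounding step (machine epsilon)
    digits = bits * math.log10(2)  # roughly equivalent decimal digits
    print(f"{name}: epsilon ~ {eps:.2e}, ~{digits:.1f} decimal digits")
```

Roughly: fp16 carries about 3 decimal digits, fp24 about 4.8, and fp32 about 6.9, which is why fp16 shader math can show visible banding where fp24/fp32 don't.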
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website were 80% Democrats flaming and threadcrapping everywhere.
You can say that again... nvnews.net is the biggest joke on the Net.
Originally posted by: solofly
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website were 80% Democrats flaming and threadcrapping everywhere.
You can say that again... nvnews.net is the biggest joke on the Net.
You've obviously never seen the Democratic Underground site.
Valve deliberately demanded that the Nvidia cards be run with watery low-octane gas while putting premium in the ATI tank. The fact that they suddenly cut Nvidia off from development recently and then blind-sided them with claims and charges that Nvidia had to read about in the papers (so to speak) shows that Valve may be souring the whole HL2 experience in their quest for cash.
Not exactly. Valve insisted on using drivers that were currently available to end users.
BINGO! That's what's so lame about this Valve/ATI put-up job: They demanded that OLD drivers be used on the basis that those were what was available to the end user, when the game itself isn't going to be available to end users for a couple of months. (Anyone think they're gonna get it done by Sept. 30th now? Me neither.) I'm surprised they didn't specify that the 3.68 drivers from Dec. 1999 be used, because it takes several years to know if drivers are stable.
Originally posted by: Crapgame
I could see the point if the game was available to end users as well.
Not exactly. Valve insisted on using drivers that were currently available to end users.
Originally posted by: DefRef
Originally posted by: Crapgame
Most ironic about all this back and forth is that the only entities clearly lying are Valve and ATI, yet Nvidia is getting it in the neck.
eh?
Originally posted by: Crapgame
Originally posted by: solofly
Originally posted by: DefRef
Originally posted by: RogerAdam
Over at NVNews, their boards are totally overrun by raving ATI nutcases - it's as if the Republican Party website were 80% Democrats flaming and threadcrapping everywhere.
You can say that again... nvnews.net is the biggest joke on the Net.
You've obviously never seen the Democratic Underground site.
No, I haven't, but this is fun enough to read...
Click here and/or here...
Originally posted by: gururu
it simply doesn't make sense to complain or argue about the benchmarks. it bewilders me how some look at the writing on the wall and are so close-minded that they still (STILL) deny, deny, deny. the fact is that these benches are data. DATA. who does not understand data? or the fact that data do not lie?
would it be better to NOT know this information? Particularly for all those waiting to get a 5900 right before HL2 comes out? How could anyone not want to know this information? It's ridiculous to even conceive of that notion.
are the numbers cooked? anand, hardocp, gamersdepot, and others don't think so. are they in bed with ati? probably not.
Det 50's. Valve didn't 'demand' that the benches be done with an obsolete driver. They insisted that, should a site DESIRE to run benches, the benches be done with publicly available drivers. How is that unfair?
why would you want buggy, unavailable Dets to be used, particularly when they omit GPU-eating scenery like fog?
ridiculous.
all i'm saying is don't complain about data or new information. embrace it. when new data comes out, perhaps vindicating nv, then so be it. it's better to get information than to be without it.
