Originally posted by: VirtualLarry
Originally posted by: jiffylube1024
I don't see what is inherently bad or wrong with this, either. NV and ATI made hardware choices several years ago; both companies had the opportunity to include or exclude features and take advice from software developers, so it's not like ATI or Nvidia "didn't know" how they would perform. Slower performance in D3 is just something ATI will have to live with this generation, just like NV's poor PS 2.0 performance last generation.
I think that you totally missed the point. This isn't about hardware support at all - both brands of hardware support the feature, but because that feature is not yet part of the OpenGL base standard, it has to be accessed through vendor-specific extensions. Both ATI and NV are (apparently) squabbling over something trivial on the API side of things. Carmack doesn't want to be forced to waste his time coding for two arbitrarily different APIs just to achieve the same thing in hardware on two different brands of card. At least, that's how I read it.
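To put some code behind that: here's a rough C sketch of what "two arbitrarily different APIs for the same thing" looks like from the engine side. I'm using two-sided stencil purely as an illustration of the pattern (the thread doesn't say which feature the current squabble is actually over); the extension names and the GL entry points mentioned in the comments are real, but the helper names and engine structure are just mine.

/* Assumes a current OpenGL context already exists.  Two-sided stencil is
 * only an example: one hardware capability, exposed by each vendor
 * through a different extension with different entry points, so the
 * renderer ends up with two code paths that do the same thing. */
#include <string.h>
#include <GL/gl.h>

typedef enum {
    STENCIL_PATH_NONE,          /* fall back to two stencil passes        */
    STENCIL_PATH_EXT_TWO_SIDE,  /* GL_EXT_stencil_two_side (NVIDIA)       */
    STENCIL_PATH_ATI_SEPARATE   /* GL_ATI_separate_stencil (ATI)          */
} stencil_path_t;

/* Naive substring check against the extension string - sloppy, but it's
 * what most engines of this era did. */
static int has_extension(const char *ext)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all && strstr(all, ext) != NULL;
}

stencil_path_t pick_stencil_path(void)
{
    if (has_extension("GL_EXT_stencil_two_side"))
        return STENCIL_PATH_EXT_TWO_SIDE;  /* this path would call glActiveStencilFaceEXT */
    if (has_extension("GL_ATI_separate_stencil"))
        return STENCIL_PATH_ATI_SEPARATE;  /* this path would call glStencilOpSeparateATI */
    return STENCIL_PATH_NONE;
}

Multiply that by every feature that only exists as vendor extensions and you can see why Carmack would rather the two of them agree on a single ARB extension.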
Is this at all surprising to anyone, this petty squabbling between ATI and Nvidia? Their press releases should be enough indication of both companies' "tactics" toward each other. Remember Cg, anyone?
All of this "Carmack prefers NVidia" fanboy-ism is quite far from the mark, I'm afraid.
It's not "Carmack prefers Nvidia fanboy-ism", it's
Carmack currently works on Nvidia hardware, and then makes his code work on other platforms . No fanboyism, no opinion, just fact. He has been developing on Nvidia hardware for years now. It's not a bad thing, it's not a good thing, but it is what it is, and it's a bit unfortunate for ATI because if there's a feature that Carmack likes on the NV cards that ATI doesn't have then they are SOL (for example ultrashadow, which, however is not used in D3 but serves my point as a feature Carmack would like to use in the future).
Originally posted by: Rollo
The fact of the matter is, even when ATI was "ahead" at Tomb Raider: Angel of Sloppy Code, nVidia was still ahead in the checkbox features (32-bit precision, much longer shader instruction limits). If you're a developer and want to use that stuff, plus SM3, you're left with nVidia.
Please don't yell "Tomb Raider was key! MS set the standard then, not ATI." I know, I know.
Rollo, you're a knob. You are the only person on planet earth who still talks about Tomb Raider: AOD. Nobody gives a sh!t, and you can stop referencing it as the "only" game ATI's R300 series whooped the NV3x series in. Stop trying to spin "Nvidia had a plan all along" out of the NV3x having some key failures and ATI having its long day in the sun with the R3x0 series.
And this petty, xenophobic, semi-serious "comedy" about Carmack hating Canadians because of an errant moose, your grudge against New Zealand and New Zealanders because of BFG10K, etc. has to stop. It is so juvenile it's absurd. When your son is old enough to think for himself and is surfing the AT archives, do you think he will be proud of the endless tripe you have posted online? Of the miraculous revelations and flip-flops you have made on just about every point regarding the R300 and then the NV40?
Do you even care at all about setting a good example for him, or is petty name-calling and insulting Canadians and New Zealanders fair game in your house?
Regardless, it's always fun to come on here and see how much of an @ss you are being ("Tomb Raider: Angel of Sloppy Code" for the 100th time) or how easily BFG10K picked through your "arguments" in your latest post.
I'm sure you'll dignify my post with your usual "meh, not worth my time" response, which is fine.
You haven't contributed anything meaningful to the boards in quite a long time, aside from your helpful 6800nu/softmod vs 6800GT thread. Lately it's been the "R300 was overrated" greatest hits, with pro-Nv fanboy Rollo as your narrator.