Carmack's QuakeCon speech

AnandTech forums, page 2.

Drayvn

Golden Member
Jun 23, 2004
Originally posted by: stnicralisk
Originally posted by: Drayvn
Soooo... they use nVidia extensions, then don't bother with ATi's? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to think: oh, SM3 is better, so I'll use that and not bother with SM2.0b...

And so in the end we see nVidia powering ahead, because they're using nVidia-only extensions ATi can't run. Hey, ATi brought out 3Dc, but wait, it's open, so why not use that?

Carmack goes for open-source development software, cool, then uses non-open-source GPU extensions only one company can do?

You're a quack. Crytek is adding ATi's new compression method in the next patch. I bet you see conspiracy everywhere.


Really? I was being sarcastic. Though in 1.2 they didn't have 3Dc, with the new patch they will.
 

Drayvn

Golden Member
Jun 23, 2004
Originally posted by: AnnoyedGrunt
Drayvn,

If you'd read the whole article, it looks like you would have found the answer to your question (nice troll comment, though):

JC:
"The issues right now with the hardware are, the NV40 has a few things that make development easier for me. It has floating-point blending, which saves me some passes for what I've been doing. We'll certainly have fallback positions, so anything we do with blending we can do with an additional render and another texture-copy pass, to work on NV30- and R300-class hardware. It's nice, and there's also the pretty much unlimited instruction count on the NV40. There are times I'm writing these large fragment programs and it's nice to keep tacking more and more things in there as I look at it, but I know full well I'll eventually have to segment these into something that can run on R300-class hardware."


From that excerpt, it sounds like DEVELOPMENT is easier on the NV40 because of a couple of its added features (not because id was "flooded" with cards), but at the end he says he knows he'll eventually have to support all the cards anyway. I think his main point on the shadow buffers you quoted (which is what he is playing around with for the next engine) is that since he already uses NV cards for development, he is also using their extensions. Obviously he is hoping to encourage the video card makers to standardize on their interface or translator or whatever it is he's working with, but I guess we'll see how that goes on the next engine. It also sounds like he knows that no matter how the engine ends up, it will have to run on hardware from both vendors if it is to be successful.

-D'oh!

Yeah, I did read that bit, but that's what I'm talking about. OK, for him the nV cards are easier to code for, but I don't see why he has to exclude other cards and architectures just because he wants an easier life. If I buy the game he makes, I want him to do his best for every possibility, don't you? I don't want to come along and then find out the other company's card runs better because he couldn't be bothered to code anything for mine. A bit selfish, really?
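For what it's worth, the fallback Carmack describes in the quote above boils down to a capability check: one pass where the hardware has floating-point blending, extra render/copy passes where it doesn't. Here's a toy sketch of that logic; this is not id's code, and every name and limit value in it is an illustrative assumption.

```python
# Toy sketch (not id Software code) of the fallback Carmack describes:
# use floating-point blending in one pass where the hardware supports it
# (NV40-class), otherwise fall back to an extra render and texture-copy
# pass (NV30/R300-class). All names and numbers here are illustrative.

def plan_passes(caps):
    """Return the list of render passes needed for one blended effect."""
    if caps.get("float_blending"):
        # NV40-class path: blend directly in a floating-point target
        return ["render_with_fp_blend"]
    # Fallback path: render to a texture, copy it, composite manually
    return ["render_to_texture", "copy_texture", "composite_pass"]

nv40 = {"float_blending": True}
r300 = {"float_blending": False}

print(plan_passes(nv40))  # ['render_with_fp_blend']
print(plan_passes(r300))  # three fallback passes
```

The point of the sketch is just that the engine carries both paths and picks at runtime, which is why Carmack can develop on the convenient hardware and still ship on everything.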

 

Nebor

Lifer
Jun 24, 2003
Originally posted by: Drayvn
And from what his statement said, he's going to use only nV extensions, and it sounds like he will leave out any ATi extensions because he doesn't want to worry about them? Umm, if I buy his games, I expect him to think of all possibilities...

I love the indignant tone that some ATers get, like the computing world OWES them something. Fine, don't buy his games. I'm sure he'll cry a little tear for you as he drives his Ferrari to the bank with a trunkful of cash and a $5k hooker in the passenger seat. All the while we're all talking about how awesome the newest id game is, and you're saying, "OMG it's not optimized I could do a better job!!!"

:roll:
 

BFG10K

Lifer
Aug 14, 2000
ATI doesn't have the custom extensions Nvidia has.
And vice versa.

It is a no-brainer if you are doing development work.
How did you figure that?

Also as noted, he ran into the limits of the R300 core when coding Doom3.
And as we've previously discussed your interpretation of those comments was rather lacking to say the least.

nVidia was still ahead in the check box features. (32bit precision, much longer instruction sets)
Great things to produce even shinier pipes at even bigger slideshows. It's actually quite ludicrous to suggest the R300 was limiting Carmack but the NV30 was giving him freedom, but then that's the sort of pro-nv propaganda you typically spread so I'm not really that surprised.
 

nRollo

Banned
Jan 11, 2002
quote:
Also as noted, he ran into the limits of the R300 core when coding Doom3.


And as we've previously discussed your interpretation of those comments was rather lacking to say the least.


quote:
nVidia was still ahead in the check box features. (32bit precision, much longer instruction sets)


Great things to produce even shinier pipes at even bigger slideshows. It's actually quite ludicrous to suggest the R300 was limiting Carmack but the NV30 was giving him freedom, but then that's the sort of pro-nv propaganda you typically spread so I'm not really that surprised.

I don't know about New Zealand, but here in the States we generally take a first-person account of a situation over some unknown guy's opinion of it, BFG:
What Carmack himself said

For developers doing forward looking work, there is a different tradeoff -- the NV30 runs fragment programs much slower, but it has a huge maximum instruction count. I have bumped into program limits on the R300 already.

Let's see:
"forward looking" In New Zealand-ese, does that mean coding for the "future" or "past"?

"tradeoff...slower...more instructions...hit limit" does that mean to you he was trying to say the nV30 has "less" instructions, and that he "didn't" hit the limit of the R300?

Yeah, my "interpretations" are really stretching credibility. Why don't you tell us what he REALLY meant, BFG? I speak English; it seems fairly straightforward to me? :roll:

 

oldfart

Lifer
Dec 2, 1999
Originally posted by: Rollo
quote:
but here in NewRollo Land we generally take a first person account of a situation over some unknown guy's opinion of it IF it suits our pro nVidia agenda
If it doesn't, it is dismissed as "Shady Days" or something else. Pro nVidia is the only truth. Anything else is fabrication and lies.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: oldfart
Originally posted by: Rollo
quote:
but here in NewRollo Land we generally take a first person account of a situation over some unknown guy's opinion of it IF it suits our pro nVidia agenda
If it doesn't, it is dismissed as "Shady Days" or something else. Pro nVidia is the only truth. Anything else is fabrication and lies.


LOL Old Fart that's pretty funny- especially the "New RolloLand" part. Thanks for the laugh. :):beer:
 

ronnn

Diamond Member
May 22, 2003
Carmack is just another promoter and he feels this type of flaming will attract attention for his next really scary game. Must take all this type of hype with a grain of salt as neither company is going away quickly and games must run on both platforms to be successful. Expect better ATI support for future games on this engine. Especially anything with decent online play. Or maybe Carmack has thrown in the towel and has accepted that hl2 is the stronger engine for online play? :shocked:
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: ronnn
Carmack is just another promoter and he feels this type of flaming will attract attention for his next really scary game. Must take all this type of hype with a grain of salt as neither company is going away quickly and games must run on both platforms to be successful. Expect better ATI support for future games on this engine. Especially anything with decent online play. Or maybe Carmack has thrown in the towel and has accepted that hl2 is the stronger engine for online play? :shocked:

Especially if you dont use any of the high end features like CS:S :p
 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: Acanthus
Originally posted by: ronnn
Carmack is just another promoter and he feels this type of flaming will attract attention for his next really scary game. Must take all this type of hype with a grain of salt as neither company is going away quickly and games must run on both platforms to be successful. Expect better ATI support for future games on this engine. Especially anything with decent online play. Or maybe Carmack has thrown in the towel and has accepted that hl2 is the stronger engine for online play? :shocked:

Especially if you dont use any of the high end features like CS:S :p

That's true... there's some nice PS2.0 lighting effects... but the shadows aren't very good... there's only one light source that casts shadows... the sun... and somehow it even casts a shadow under the bridge in de_dust :D
 

Rage187

Lifer
Dec 30, 2000
Originally posted by: Jeff7181
Originally posted by: Acanthus
Originally posted by: ronnn
Carmack is just another promoter and he feels this type of flaming will attract attention for his next really scary game. Must take all this type of hype with a grain of salt as neither company is going away quickly and games must run on both platforms to be successful. Expect better ATI support for future games on this engine. Especially anything with decent online play. Or maybe Carmack has thrown in the towel and has accepted that hl2 is the stronger engine for online play? :shocked:

Especially if you dont use any of the high end features like CS:S :p

That's true... there's some nice PS2.0 lighting effects... but the shadows aren't very good... there's only one light source that casts shadows... the sun... and somehow it even casts a shadow under the bridge in de_dust :D



he was being sarcastic.
 

oldfart

Lifer
Dec 2, 1999
Originally posted by: Jeff7181
there's only one light source that casts shadows... the sun... and somehow it even casts a shadow under the bridge in de_dust :D
I remember when the shadow feature came out for Quake2. Even the explosions would cast a shadow! :confused:
 

Rage187

Lifer
Dec 30, 2000
I remember the explosions cast two different polygon shadows.

The cylinder and then the sphere.
 

ronnn

Diamond Member
May 22, 2003
Everyone can make fun of cs, but more people (kids probably) play it online than doom 3. They will still be playing it when doom 3 is a distant memory.
 

Childs

Lifer
Jul 9, 2000
Originally posted by: ronnn
Everyone can make fun of cs, but more people (kids probably) play it online than doom 3. They will still be playing it when doom 3 is a distant memory.

And more people eat at McDonald's than at In-N-Out, but that says nothing about the quality of the food. I actually liked CS better when so many people didn't play it. Now there's nothing but cheaters and jerks playing, and there hasn't been a significant change to any of the core maps in years. It wouldn't be so bad, but that's all the servers run nowadays.
 

Rage187

Lifer
Dec 30, 2000
Originally posted by: ronnn
Everyone can make fun of cs, but more people (kids probably) play it online than doom 3. They will still be playing it when doom 3 is a distant memory.

That's because CS is supposed to be a multiplayer game.


Doom 3 is single-player, with the ability to have 4-way deathmatch.
 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: ronnn
Everyone can make fun of cs, but more people (kids probably) play it online than doom 3. They will still be playing it when doom 3 is a distant memory.

Oh I'm definitely playing the beta... and I'm definitely going to buy the game when it's released, just like Doom 3. But that doesn't change the fact that the shadows are anything but spectacular. They're a step above blob shadows...
 

VirtualLarry

No Lifer
Aug 25, 2001
Originally posted by: tk109
Well he wants to code for Nvidia cards. He feels that's the best route. If he doesn't want the headaches and time wasted coding for other companies, then that's what he wants to do. Survival of the fittest, I guess, eh?

I think that by making his stance public, he is trying to pressure ATI into supporting NV's method, or forcing an API compromise, so that he (and, by extension, every other OpenGL-based game developer) doesn't have to do a lot of extra work supporting two separate APIs just to do the same thing, because of some petty vendor squabbling.

I think Carmack should chair the OpenGL review board, personally, but perhaps that's just me. :)

I don't really think that this has anything at all to do with him playing fanboy and cheerleader for NV, as some have tried to insinuate. However, it's pretty well-known that NV has had slightly better OpenGL support, with more up-to-date vendor extensions to support their new hardware features. I do believe that he started coding Doom3 on ATI boards (remember the E3 Doom3 alpha code leak?), and that is my evidence that Carmack is not some blind NV fanboy, unlike some in this forum that wish to paint this issue as such.
 
bbomb

Apr 21, 2004
id Software always has and always will prefer Nvidia cards regardless of the current technology of the time. Yes, it does suck for us ATI owners, but there is nothing we can do about it. He seems to program for Nvidia cards pretty extensively and just scratches something together that will work on ATI cards.

He thinks he can get all graphics card makers to produce the same product using open standards and still be competitive, which is not going to happen. It's like telling AMD and Intel to both use the exact same instruction sets. Each company thinks their way is better, and it's all about marketing and the money being thrown around as to who gets their version accepted.
 

jiffylube1024

Diamond Member
Feb 17, 2002
If anyone read John Carmack's blogs years ago during Doom3's development he basically spelled out why he developed an Nvidia preference.

It was back in the 3dfx/Nvidia days, and then in the GeForce 1/2 vs ATI Radeon days when everyone but NV's OpenGL drivers were, for lack of a better word, lacking in quality. He said something to the effect of "things work right on Nvidia the first time" and "when something is broken on NV hardware, I assume it's my fault and I try to fix it; when something doesn't work on other hardware I assume I am right."

It's not a very uncommon situation - brand loyalty. People do it all the time with a car that gives them a good or bad experience ("I'll never buy another Hyundai again!") or ("this Accord has lasted me over 15 years; I'd recommend Honda to everyone I know!!").

Because of this favourable relationship, and because Nvidia essentially custom tailored NV3x (and to a lesser extent, NV4x) to Doom3 under Carmack's guidance, he has a strong preference for NV hardware.

I don't see what is inherently bad or wrong with this, either. NV and ATI made hardware choices several years ago; both companies had the opportunity to include or exclude features and take advice from hardware companies, so it's not like ATI or Nvidia "didn't know" how they would perform. Slower performance in D3 is just something ATI will have to live with this generation, just like NV's poor PS 2.0 performance last generation.
 

Marsumane

Golden Member
Mar 9, 2004
Originally posted by: bbomb
id Software always has and always will prefer Nvidia cards regardless of the current technology of the time. Yes, it does suck for us ATI owners, but there is nothing we can do about it. He seems to program for Nvidia cards pretty extensively and just scratches something together that will work on ATI cards.

He thinks he can get all graphics card makers to produce the same product using open standards and still be competitive, which is not going to happen. It's like telling AMD and Intel to both use the exact same instruction sets. Each company thinks their way is better, and it's all about marketing and the money being thrown around as to who gets their version accepted.

I don't think this is the case at all. As the article said, he programs for the NV cards because it makes his job EASIER! He then makes a similar path for the ATI cards after he gets a working NV code path. I can see the logic behind his decision: less work, more efficiency, quicker game production, more profit. I'd personally do the same thing. Code without worries early on, and mess with the technicalities of breaking the code up into multiple smaller chunks later.
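The "segment it later" step Carmack mentions — breaking one long fragment program into pieces that each fit a per-pass hardware limit — is mechanically just chunking a list. Here's a minimal illustrative sketch; the 64-instruction limit used below is a stand-in, not an exact R300 figure.

```python
# Illustrative sketch of "segmenting" a long fragment program: split an
# instruction list into chunks that each fit a per-pass hardware limit.
# The limit value (64) is a stand-in, not the exact R300 number.

def segment_program(instructions, max_per_pass):
    """Split one long program into multiple passes of bounded length."""
    passes = []
    for i in range(0, len(instructions), max_per_pass):
        passes.append(instructions[i:i + max_per_pass])
    return passes

long_program = [f"op{i}" for i in range(150)]  # too big for one pass
passes = segment_program(long_program, 64)
print(len(passes))  # 3
```

In a real engine each chunk also needs intermediate results passed between passes (render targets, texture reads), which is exactly the extra work Carmack says he postpones while developing on hardware with a huge instruction count.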
 

VirtualLarry

No Lifer
Aug 25, 2001
Originally posted by: jiffylube1024
I don't see what is inherently bad or wrong with this, either. NV and ATI made hardware choices several years ago; both companies had the opportunity to include or exclude features and take advice from hardware companies, so it's not like ATI or Nvidia "didn't know" how they would perform. Slower performance in D3 is just something ATI will have to live with this generation, just like NV's poor PS 2.0 performance last generation.

I think that you totally missed the point. This isn't about the hardware support at all - both brands of hardware support the feature, but because that feature is not yet part of the OpenGL base standard, it has to be accessed using vendor-specific extensions. Both ATI and NV are (apparently) squabbling over something trivial on the API side of things. Carmack doesn't want to be forced to waste his time, coding for two arbitrarily-different APIs, just to achieve the same thing in hardware, on two different-brand cards. At least, that's how I read it.

All of this "Carmack prefers NVidia" fanboy-ism is quite far from the mark, I'm afraid.
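The vendor-extension problem VirtualLarry describes is concrete in OpenGL: an app queries the extension string and has to pick a different back end per vendor when no common extension exists. The extension names below are real, but the selection logic is an illustrative sketch, not code from any actual engine.

```python
# Minimal sketch of why vendor-specific extensions force duplicate code
# paths. An OpenGL app reads the GL_EXTENSIONS string and picks a back
# end; the extension names are real, the selection logic is illustrative.

def pick_fragment_path(extensions_string):
    """Choose a fragment-program back end from a GL_EXTENSIONS string."""
    exts = set(extensions_string.split())
    if "GL_ARB_fragment_program" in exts:  # common standard, if exposed
        return "ARB"
    if "GL_NV_fragment_program" in exts:   # NVIDIA vendor extension
        return "NV"
    if "GL_ATI_fragment_shader" in exts:   # ATI vendor extension
        return "ATI"
    return "fixed_function"                # no programmable path at all

print(pick_fragment_path("GL_NV_fragment_program GL_NV_depth_clamp"))  # NV
```

Each branch after the ARB check means a separate shader implementation to write and maintain, which is exactly the duplicated effort Carmack is pushing the vendors to eliminate by standardizing.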
 

BFG10K

Lifer
Aug 14, 2000
I don't know about New Zealand, but here in the States we generally take a first person account of a situation over some unknown guy's opinion of it BFG:
"forward looking" In New Zealand-ese, does that mean coding for the "future" or "past"?
If you don't stop this racist bullshit Rollo I'm going to report you to the mods. You've been an idiot for years but now you're crossing the line.

For developers doing forward looking work, there is a different tradeoff -- the NV30 runs fragment programs much slower, but it has a huge maximum instruction count. I have bumped into program limits on the R300 already.
Uh, so the tradeoff is longer instructions in exchange for ... even bigger slideshows.

So what exactly has the NV30 given him than the R300 didn't? Where has he taken advantage of this "feature" on the NV30?

Not to mention your past claim was that features are meaningless as long as the cards are the same speed. Well the R300 is faster so using your past logic that would make it better, right? Of course with your present logic features are everything so now the NV30 is better.

does that mean to you he was trying to say the nV30 has "less" instructions, and that he "didn't" hit the limit of the R300?
It means that the extra instructions produce an even more unusable experience on the NV30 compared to standard instruction lengths and if you had even half a brain you'd know that.

I speak English, seems fairly straightforward to me?
As straightforward as your 9800 Pro to 5800 "upgrade"?
Or how about as straightforward as your shiny pipes comments?
Or how about the "features don't matter, oh wait a minute, now they do"?
Or how about "nobody needs shaders, wait a minute, now everyone needs them"?

But that's okay because you have a family, right? :roll:
 

jiffylube1024

Diamond Member
Feb 17, 2002
Originally posted by: VirtualLarry
Originally posted by: jiffylube1024
I don't see what is inherently bad or wrong with this, either. NV and ATI made hardware choices several years ago; both companies had the opportunity to include or exclude features and take advice from hardware companies, so it's not like ATI or Nvidia "didn't know" how they would perform. Slower performance in D3 is just something ATI will have to live with this generation, just like NV's poor PS 2.0 performance last generation.

I think that you totally missed the point. This isn't about the hardware support at all - both brands of hardware support the feature, but because that feature is not yet part of the OpenGL base standard, it has to be accessed using vendor-specific extensions. Both ATI and NV are (apparently) squabbling over something trivial on the API side of things. Carmack doesn't want to be forced to waste his time, coding for two arbitrarily-different APIs, just to achieve the same thing in hardware, on two different-brand cards. At least, that's how I read it.

Is this at all surprising to anyone, this petty squabbling between ATI and Nvidia? Their press releases should be enough indication of both companies' "tactics" towards each other. Remember Cg, anyone?

All of this "Carmack prefers NVidia" fanboy-ism is quite far from the mark, I'm afraid.

It's not "Carmack prefers Nvidia" fanboy-ism; it's that Carmack currently works on Nvidia hardware and then makes his code work on other platforms. No fanboyism, no opinion, just fact. He has been developing on Nvidia hardware for years now. It's not a bad thing, it's not a good thing, but it is what it is, and it's a bit unfortunate for ATI, because if there's a feature Carmack likes on the NV cards that ATI doesn't have, then they are SOL (for example UltraShadow, which is not used in D3 but serves my point as a feature Carmack would like to use in the future).

Originally posted by: Rollo

The fact of the matter is, even when ATI was "ahead" at Tomb Raider Angel of Sloppy Code, nVidia was still ahead in the check box features. (32bit precision, much longer instruction sets) If you're a developer and want to use this stuff, and SM3, you're left with nVidia.

Please don't yell "Tomb Raider was key! MS set the standard then not ATI". I know, I know.

Rollo, you're a knob. You are the only person on planet Earth that still talks about Tomb Raider: AOD. Nobody gives a sh!t and you can stop referencing it as the "only" game ATI's R300 series whooped the NV3x series in. Stop trying to spin "Nvidia had a plan all along" out of the NV3x having some key failures and ATI having its long day in the sun with the R3x0 series.

And this petty xenophobic semi-serious "comedy" about Carmack hating Canadians because of an errant moose, your grudge against New Zealand and New Zealanders because of BFG10K, etc. has to stop. It is so juvenile it's absurd. When your son is old enough to think for himself and is surfing the AT archives, do you think he will be proud of the endless tripe you have posted online? Of the miraculous revelations and flip-flops you have made on just about every point regarding the R300 and then NV40?

Do you even care at all about setting a good example for him or is petty name calling and insulting Canadians and New Zealanders fair game in your house?

Regardless, it's always fun to come on here and see how much of an @ss you are being ("Tomb Raider: Angel of Sloppy Code" for the 100th time) or how easily BFG10K picked through your "arguments" in your latest post.

I'm sure you'll dignify my post with your usual "meh, not worth my time" response, which is fine.

You haven't contributed anything meaningful to the boards in quite a long time, aside from your helpful 6800nu/softmod vs 6800GT thread. Lately it's been the "R300 was overrated" greatest hits, with pro-Nv fanboy Rollo as your narrator.