Carmack's QuakeCon speech

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Here's the transcript of it, but first here's a little tidbit of what Carmack said in it:

Some of that is due to hardware API issues. Right now I'm using the OpenGL p-buffer and render-to-texture interface, which is a GOD AWFUL interface; it has far too much inheritance from bad design decisions back in the SGI days, and I've had some days where it's the closest I'd ever been to switching over to D3D, because the APIs were just that appallingly bad. Both ATI and Nvidia have their preferred direction for doing efficient render-to-texture, because the problem with the existing APIs is not only are they crummy bad APIs, they also have a pretty high performance overhead, because they require you to switch OpenGL rendering contexts, and for shadow buffers that's something that has to happen hundreds of times per frame, and it's a pretty big performance hit right now. So, both ATI and Nvidia have their preferred solutions to this, and as usual they're not agreeing on exactly what should be done on it, and it's over stupid, petty little things. I've read both the specs, and I could work with either one; they both do the job, and they're just silly syntactic things, and I have a hard time empathising why they can't just get together and agree on one of these. I am doing my current work on Nvidia based hardware, so it's likely I will be using their extension.

Hmm, Carmack is using nVidia stuff, so he will use their extensions. Hmm, why not code for ATi stuff? Oh well, that's what happens when one company floods you with their cards....

Here is the rest of it. It's pretty long, so hopefully you'll get through it all: http://www.gamedev.net/communi...ic.asp?topic_id=266373
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Drayvn
Hmm, Carmack is using nVidia stuff, so he will use their extensions. Hmm, why not code for ATi stuff? Oh well, that's what happens when one company floods you with their cards....

Here is the rest of it. It's pretty long, so hopefully you'll get through it all: http://www.gamedev.net/communi...ic.asp?topic_id=266373


Do you honestly think a multimillionaire like Carmack thinks "Wow. nVidia sent me all these cards. I guess I'll use them rather than buy the ones I'd rather use." Or that ATI and the other chip companies don't send their hardware as well?


I think it's pretty safe to assume Carmack uses nVidia because he wants to use nVidia.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
He uses nVidia because he prefers OpenGL. If he coded in Direct3D he'd be using ATI.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
ATI doesn't have the custom extensions Nvidia has.

It is a no-brainer if you are doing development work.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Genx87
ATI doesn't have the custom extensions Nvidia has.

It is a no-brainer if you are doing development work.
I've read both the specs, and I could work with either one, they both do the job, and they're just silly syntactic things and I have a hard time empathising why they can't just get together and agree on one of these. I am doing my current work on Nvidia based hardware so it's likely I will be using their extension

What does this have to do with custom extensions? It is more a personal choice than any particular feature.
He is talking about a feature which both manufacturers have, and he is choosing one version over the other (Nvidia over ATi).
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
He is talking about forward feature sets. I am talking about the current situation.

 


Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Soooo.... what, use nVidia extensions and don't bother with ATi's then? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to say, oh, SM3 is better, so I'll use that and not bother using SM2.0b...

And so in the end, we see nVidia powering ahead because they are using nVidia extensions which ATi can't do, because they don't have nVidia-only extensions. Hey, ATi brought out 3Dc, but wait, it's open source, so why not use that?

Carmack goes for open source development software, cool, then uses a non-open-source GPU-specific extension only one company can do?
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: Rollo
Do you honestly think a multimillionaire like Carmack thinks "Wow. nVidia sent me all these cards. I guess I'll use them rather than buy the ones I'd rather use." Or that ATI and the other chip companies don't send their hardware as well?


I think it's pretty safe to assume Carmack uses nVidia because he wants to use nVidia.
A few graphics cards + millions of $$$$ = a lot of wants to. :D

 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
I'm pretty sure Carmack is doing fine on the money front.

The mofo is building a spaceship.
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Actually, in the next paragraph he lists some reasons why it makes more sense for him to use NV. He mentions things like instruction length that he doesn't have to worry about. There's also a couple of other things, but I can't remember them right now.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: Marsumane
Actually, in the next paragraph he lists some reasons why it makes more sense for him to use NV. He mentions things like instruction length that he doesn't have to worry about. There's also a couple of other things, but I can't remember them right now.

I agree, nV stuff is possibly easier not to worry about, but in reality shouldn't he be considering the other cards out there? Loads of other companies include different optimization paths for different cards, so why can't he?

 

FallenHero

Originally posted by: Drayvn
Originally posted by: Marsumane
Actually, in the next paragraph he lists some reasons why it makes more sense for him to use NV. He mentions things like instruction length that he doesn't have to worry about. There's also a couple of other things, but I can't remember them right now.

I agree, nV stuff is possibly easier not to worry about, but in reality shouldn't he be considering the other cards out there? Loads of other companies include different optimization paths for different cards, so why can't he?

Look at his target market...gamers. Gamers tend to use ATI or NV. Simple as that.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: FallenHero
Originally posted by: Drayvn
Originally posted by: Marsumane
Actually, in the next paragraph he lists some reasons why it makes more sense for him to use NV. He mentions things like instruction length that he doesn't have to worry about. There's also a couple of other things, but I can't remember them right now.

I agree, nV stuff is possibly easier not to worry about, but in reality shouldn't he be considering the other cards out there? Loads of other companies include different optimization paths for different cards, so why can't he?

Look at his target market...gamers. Gamers tend to use ATI or NV. Simple as that.

And from what his statement said, he's gonna use only nV extensions, and it sounds like he will leave out any ATi extensions because he doesn't want to worry about them? Umm, if I buy his games, I expect him to think of all possibilities...

 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
Basically he is saying, "Look, you can't decide which of these near-identical options to go with, so I'll decide for you," which will probably force the other guy to comply, since they aren't losing anything by complying anyway.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I read that once when Carmack was travelling in Manitoba, he was bit by a bear, and since then he's sworn off all things Canadian!

:roll:

Carmack has always used nVidia.

As noted, he uses OpenGL, nVidia has always had better OpenGL.

Also as noted, he ran into the limits of the R300 core when coding Doom3.






What's more surprising about this thread is that I don't think there are many big game developers using ATI at all (look at nVidia's NV40 testimonial page at the roll call of every developer you've ever heard of saying the NV40 gives them the tools they need to code for the future), but no one is saying "Why isn't that Sweeney rascal using an X800 to code the next Unreal engine?!?! It's not fair!".

The fact of the matter is, even when ATI was "ahead" at Tomb Raider: Angel of Sloppy Code, nVidia was still ahead in the checkbox features (32-bit precision, much longer instruction limits). If you're a developer and want to use this stuff, and SM3, you're left with nVidia.

Please don't yell "Tomb Raider was key! MS set the standard then not ATI". I know, I know.
 

T9D

Diamond Member
Dec 1, 2001
5,320
6
0
Well, he wants to code for nVidia cards. He feels that's the best route. If he doesn't want the headaches and time wasted coding for other companies, then that's what he wants to do. Survival of the fittest, I guess, eh.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Drayvn
Soooo.... what, use nVidia extensions and don't bother with ATi's then? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to say, oh, SM3 is better, so I'll use that and not bother using SM2.0b...

And so in the end, we see nVidia powering ahead because they are using nVidia extensions which ATi can't do, because they don't have nVidia-only extensions. Hey, ATi brought out 3Dc, but wait, it's open source, so why not use that?

Carmack goes for open source development software, cool, then uses a non-open-source GPU-specific extension only one company can do?

The patch to the CryEngine added both SM2.0+ and SM3.0 support... if I recall correctly, the SM2.0+ path was the reason for the recall of the patch; there were bugs.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
GOD AWFUL
bad design decisions
appallingly bad.
crummy bad APIs,
stupid, petty little things
just silly syntactic things


what a whiner. nvidia can keep him! ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Acanthus
Originally posted by: Drayvn
Soooo.... what, use nVidia extensions and don't bother with ATi's then? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to say, oh, SM3 is better, so I'll use that and not bother using SM2.0b...

And so in the end, we see nVidia powering ahead because they are using nVidia extensions which ATi can't do, because they don't have nVidia-only extensions. Hey, ATi brought out 3Dc, but wait, it's open source, so why not use that?

Carmack goes for open source development software, cool, then uses a non-open-source GPU-specific extension only one company can do?

The patch to the CryEngine added both SM2.0+ and SM3.0 support... if I recall correctly, the SM2.0+ path was the reason for the recall of the patch; there were bugs.

There are no problems with the SM3; I've used that patch many times. They didn't recall it because of nVidia.



 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
Originally posted by: Drayvn
Soooo.... what, use nVidia extensions and don't bother with ATi's then? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to say, oh, SM3 is better, so I'll use that and not bother using SM2.0b...

And so in the end, we see nVidia powering ahead because they are using nVidia extensions which ATi can't do, because they don't have nVidia-only extensions. Hey, ATi brought out 3Dc, but wait, it's open source, so why not use that?

Carmack goes for open source development software, cool, then uses a non-open-source GPU-specific extension only one company can do?

You're a quack. Crytek is developing ATi's new compression method for the next patch. I bet you see conspiracies everywhere.
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Drayvn,

If you'd read the whole article, it looks like you would have found the answer to your question (nice troll comment though):

JC:
"The issues right now with the hardware are, the NV40 has a few things that make development easier for me. It has floating point blending, which saves me some passes for what I've been doing; we'll certainly have fallback positions, so anything we do with blending we can do with an additional render and another texture copy pass on there, to work for NV30 and R300 class hardware. It's nice, and there's also the pretty much unlimited instruction count on the NV40, where there are times I'm writing these large fragment programs and it's nice to keep tacking more and more things in there as I look at it, but I know full well I'll eventually have to segment these into something that can run on R300 class hardware."


From that excerpt, it sounds like DEVELOPMENT is easier on the NV40 because of a couple of its added features (not because he was "flooded" with cards), but at the end he says he knows he'll eventually have to support all the cards anyway. I think his main point on the shadow buffers that you quoted (which is what he is playing around with for the next engine) is that since he already uses NV cards for development, he is also using their extensions (since that's what he's developing on). Obviously he is hoping to encourage the vid card people to standardize on their interface or translator or whatever it is he's working with, but I guess we'll see how that goes with the next engine. It also sounds like he knows that no matter how the engine ends up, it will have to run on hardware from both vendors if it is to be successful.

-D'oh!
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: Acanthus
Originally posted by: Drayvn
Soooo.... what, use nVidia extensions and don't bother with ATi's then? If I remember correctly, other lesser-known companies do this... *cough cough* Crytek *cough cough*

They seem to say, oh, SM3 is better, so I'll use that and not bother using SM2.0b...

And so in the end, we see nVidia powering ahead because they are using nVidia extensions which ATi can't do, because they don't have nVidia-only extensions. Hey, ATi brought out 3Dc, but wait, it's open source, so why not use that?

Carmack goes for open source development software, cool, then uses a non-open-source GPU-specific extension only one company can do?

The patch to the CryEngine added both SM2.0+ and SM3.0 support... if I recall correctly, the SM2.0+ path was the reason for the recall of the patch; there were bugs.

Yeah, I know, I was being sarcastic...