Carmack's QuakeCon speech


ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Rollo
I read that once when Carmack was travelling in Manitoba, he was bit by a bear, and since then he's sworn off all things Canadian!

Ha Ha, very good. :beer: Actually he sort of looks like a guy running from a bear, very lean and intense. Experience tells me that when you come face to face with a bear, you move much quicker than you thought possible. Anyways, now that I have seen some HL2 and played Doom 3, I think he needs to worry most about the Germans, or whoever coded Far Cry. Much nicer graphics than either, and it works fine on both platforms. Maybe OpenGL is a thing of the past and Microsoft wins (as usual) with Direct3D.
 

skace

Lifer
Jan 23, 2001
14,488
7
81
Wow, I'm just going to flat out say it. You guys are fvcking stupid. Carmack has been writing graphics engines, doing .plan updates, and giving graphics lectures since before nVidia had a commercial video card. I bet if you guys were playing Quake 1 right now you would be calling Carmack a 3dfx fanboy simply because those were the cards he coded around at the time. Well, he also coded for another card type, but I don't recall the name anymore. When Carmack was developing Quake 3 he had a lot of updates about developing on the Mac and how he enjoyed using it. I'm sure he was a fanboy for the Mac back then also, right?

The funny thing is that you people would rather drag this down into an nVidia vs ATi mud-slinging fest, which it obviously isn't. The real point here is that there should be standardization, and there isn't. And that's because ATi and nVidia can't get their sh!t together. They are constantly trying to screw each other over and make things harder than they have to be.

Carmack started with Glide and moved to OpenGL. He still hasn't gone to DirectX, though he did mention he was extremely tempted to. He used 3dfx cards when he programmed for Glide and he uses nVidia now that he programs for OpenGL.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
What you guys have said recently is very good. Carmack designed his engine around the NV cards and then made it work on other cards. That is not a bad thing, because above all he wants the game to perform well while still having great graphics.

NV allowed him to do that. It was a decision he made, but it left a lot of other cards in the dark because they could not run those extensions. I just hope that when I buy a game he makes the engine for, I'm paying for him doing his best instead of taking a shortcut.

ATi isn't allowed to see (or really can't see) the NV hardware, so it can't make anything like it, and vice versa, and they both take different routes to give the amazing performance they have now. I just wish devs would be nice enough, especially a company like id, to take the time to do their best on the different top dogs of the graphics world.

And as I have said before, he used a lookup table for anything that uses AF (or was it AA?), when he could have added a branch so that when the game detects an ATi card or driver, it switches to a path that replaces the lookup table with a math calculation. I would expect that for the hard-earned money I pay for the game, and I bet it would have been damn easy for him to code, since the guys who found this tweak have practically no clue how to code compared to Carmack. I would just like to humbly ask why this couldn't have been implemented. Carmack even shrugged it off, and he's not going to implement it in his patches. Why?
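(For illustration only: a minimal C++ sketch of the kind of vendor branch being described here, assuming a standard OpenGL context. The selectRenderPath function and PATH_* names are hypothetical, not id's actual code.)

// Pick a shader path based on the reported GL vendor string.
#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_DEFAULT, PATH_ATI_MATH, PATH_NV_LOOKUP };

RenderPath selectRenderPath()
{
    const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    if (vendor && std::strstr(vendor, "ATI"))
        return PATH_ATI_MATH;   // swap the lookup table for direct math
    if (vendor && std::strstr(vendor, "NVIDIA"))
        return PATH_NV_LOOKUP;  // keep the dependent texture lookup
    return PATH_DEFAULT;
}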

I read that they have even fixed the image quality problems too, as stated at Elite Bastards, where they tested this on the Pro. They found no artifacts at all, as have many other sites which have tested this Humus tweak. If Carmack had spent the time figuring out ATi's current architecture, he probably could have come up with a way of increasing the FPS. He even stated he hit the limit of the hardware, but as the Humus tweak showed, he never did!

Skace, if both companies started working together on standards, what's the point in having two companies? There would be no variation in performance, as they would match each other with these standards, and then there would be no competition, and card prices would soar. Also, the companies don't know what the other one is making; they never get to see the inner workings of each other's cards, so they can't make it harder on each other. All they can do is go down a route they know and hope to God someone makes a game for that route, though from what I can see ATi really didn't lose out that much, especially with the tweaks out there that can increase the FPS without any visual quality loss.
 

skace

Lifer
Jan 23, 2001
14,488
7
81
I think it is funny that you probably didn't understand 10% of his speech, yet you have the balls to say he isn't doing his best work providing an engine that runs and is optimized on as much hardware as possible.

And standardization of one small part of an API is not a monopoly. It wouldn't make the cards magically the same. It would only make it easier for game developers. This isn't the first time Carmack has pointed out ridiculous scenarios that simply make it harder for game developers to support multiple video cards. And it won't be the last. He voices problems as he sees them and hopes they get the attention they deserve.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: skace
I think it is funny that you probably didn't understand 10% of his speech, yet you have the balls to say he isn't doing his best work providing an engine that runs and is optimized on as much hardware as possible.

And standardization of one small part of an API is not a monopoly. It wouldn't make the cards magically the same. It would only make it easier for game developers. This isn't the first time Carmack has pointed out ridiculous scenarios that simply make it harder for game developers to support multiple video cards. And it won't be the last. He voices problems as he sees them and hopes they get the attention they deserve.

LOL!

He isn't doing his best work (no one is) when he says he wants OTHER people to make HIS work easier!

DirectX and OpenGL are standards; they do make it easier. Ever heard of XNA?

Carmack is a great guy, no doubt, and he does voice problems when he sees them. But wait: he uses nV-specific extensions and says he hit the limits of the R300/400 design, yet he didn't use ANY of the ATi extensions. He could have used the 3Dc compression tech... in fact, it's an open source design.
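(For context on what 3Dc actually does: it compresses only the X and Y components of a tangent-space normal map and leaves the shader to rebuild Z from the unit-length constraint. A minimal C++ sketch of that reconstruction; in a real engine this runs per fragment in the pixel shader.)

// 3Dc stores normal.x and normal.y; Z is rebuilt from x^2 + y^2 + z^2 = 1.
#include <algorithm>
#include <cmath>

float reconstructNormalZ(float x, float y)
{
    // Clamp so compression error can't push x^2 + y^2 past 1.
    float zz = std::max(0.0f, 1.0f - x * x - y * y);
    return std::sqrt(zz);
}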

Also, OpenGL is the standard for graphics card companies to adhere to, and that's what devs build their game engines around. ATi has an alright OGL implementation anyway; nVidia has a better one. But again, nVidia got the upper hand because Carmack used their extensions. OK, again I say: he wanted to get the best out of his engine, good going, but please don't say he hit the limits of the R300 design. Again I'll state, the Humus tweak showed that. In fact, I think he probably didn't even bother to test the limits, or really do as much testing as he did with an nV card.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
He isn't doing his best work (no one is) when he says he wants OTHER people to make HIS work easier!

Wanting standards is being lazy now? Carmack has used nV mainly because ATi's drivers suck, particularly their piss-poor OpenGL implementation. Another reason is that he tells everyone what he wants to see in hardware and nVidia gives it to him. This isn't secret documentation; look at his .plan files from late '99/early 2000 and he gives the blueprint for the NV3x. If I'm a game developer and a company gives me everything I ask for, I would tend to use them. This isn't anything like some other developers who come and publicly support an IHV after they are given millions of dollars to do so- Carmack told everyone what his priorities were years ago (his driver complaints have been numerous over the years- for some time ATi's parts couldn't render the D3 console properly) and nV filled them. Sweeney has recently made clear what his needs are for the U3 engine- whoever fills those requirements best is going to be his lead development platform. If ATi fits those needs best they will be in. If they ignore it and go their own route then they likely won't be- same with nV. I don't see why people have an issue with someone saying years in advance what they want and then sticking with it.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
BFG:
If you don't stop this racist bullshit Rollo I'm going to report you to the mods. You've been an idiot for years but now you're crossing the line.
LOL- grow up. Being from New Zealand implies no racial slur that I'm aware of. I was implying that Carmack's comments were so straightforward that you must speak a different form of English if you think they need "interpretation".
You should be ashamed of yourself for hissing "racism" when you know full well what I meant. You do a disservice to anyone who's ever been a victim of racism by equating my jovial nationalism with something that is actually a crime here and pretty much universally reviled. :roll:

As straightforward as your 9800 Pro to 5800 "upgrade"?
The rest of it I'm not going to bother with. BFG, no one else in the world cares why I traded my 9800P for a 5800U. Otherwise I'd tell you yet again. (And probably next week... and the week after that... and the week after that...)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Jiffy:

Rollo, you're a knob. You are the only person on planet Earth who talks about Tomb Raider: AOD anymore
The point was that for a real long time the only place you could see DX9 R300>nV40 ownage was at TRAOD. BTW- many sites still bench it, need links?
BTW- now we're calling names like kids on a playground?

And this petty xenophobic semi-serious "comedy" about Carmack hating Canadians because of an errant moose,
Do you know what a "joke" is? I have nothing against Canada, or anywhere else for that matter.

your grudge against New Zealand and New Zealanders because of BFG10K, etc. has to stop. It is so juvenile it's absurd.
Yeah, I've declared war on New Zealand. :roll: I don't even have any ill will toward BFG, let alone his fine country. :)

You haven't contributed anything meaningful to the boards in quite a long time, aside from your helpful 6800nu/softmod vs 6800GT thread. Lately it's been the "R300 was overrated" greatest hits, with pro-Nv fanboy Rollo as your narrator.

Hmmm. Why don't you point me at your big contributions Jiffy? I posted benches of 6800NUs before anyone had them. I posted benches of Doom3 on 3 cards a week before most had the game. I benched many games on the 6800NU and 6800GT and posted results. Twice, at ATer request, I underclocked the GT and posted requested comparison benches. Another ATer wanted to see some high res video frames dropped, I spent a fair amount of time doing that. I posted SM3 benches at Far Cry long before the patch was made public. Etc etc etc.

So why don't you link me to the work you're doing lately that gives you the right to spit on me?

Seems to me you spend half your time lately trying to argue with me about 5800s.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo

Hmmm. Why don't you point me at your big contributions Jiffy?

Sure thing Rollo. Then we can whip 'em out and see whose is longer.

It isn't a spitting contest as you seem to view it ("your big contributions"); it's just your general attitude towards everything (particularly regarding ATI, which seems to be quite the hot button with you).

 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
Drayvn, you need to read that speech again and try to understand what Carmack is saying (this goes for you too, BFG).

When he says he hits the limits of the R300 Architecture, he is specifically talking about the length of fragment programs.

quote:

--------------------------------------------------------------------------------
For developers doing forward looking work, there is a different tradeoff --
the NV30 runs fragment programs much slower, but it has a huge maximum
instruction count. I have bumped into program limits on the R300 already.
--------------------------------------------------------------------------------

The reason this is good for development work (even though it runs poorly in a game environment) is that hardware advancements outpace software development by a significant margin. This means that if one company offers features that will be prevalent in a couple of years (but much faster than when they first become available), then developers can use that card while working on their game, knowing that by the time the game comes out, most of the high-end cards will support those new features at better framerates.
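(Aside: the instruction limits being referred to are queryable through ARB_fragment_program. A minimal C++ sketch, assuming the extension entry point has already been loaded via the platform's GetProcAddress mechanism; the figures in the comment are the commonly reported ballpark numbers, not guarantees.)

// Query the fragment program limits Carmack is talking about.
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstdio>

void printFragmentProgramLimits(PFNGLGETPROGRAMIVARBPROC getProgramivARB)
{
    GLint maxTotal = 0, maxAlu = 0, maxTex = 0;
    getProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                    GL_MAX_PROGRAM_INSTRUCTIONS_ARB, &maxTotal);
    getProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                    GL_MAX_PROGRAM_ALU_INSTRUCTIONS_ARB, &maxAlu);
    getProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                    GL_MAX_PROGRAM_TEX_INSTRUCTIONS_ARB, &maxTex);
    // R300-class parts report on the order of 64 ALU + 32 TEX instructions;
    // NV30 reports roughly 1024, which is the "huge maximum" in the quote.
    std::printf("max instructions: %d (ALU %d, TEX %d)\n",
                maxTotal, maxAlu, maxTex);
}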

He is not talking about the limits of the R300's speed, or texture lookups, or anything like that (which is what that Humus mod does). He says quite specifically what he's talking about.

Also, that helps explain how you are wrong with this statement, BFG:
Great things to produce even shinier pipes at even bigger slideshows. It's actually quite ludicrous to suggest the R300 was limiting Carmack but the NV30 was giving him freedom, but then that's the sort of pro-nv propaganda you typically spread so I'm not really that surprised.

Carmack is certainly not saying he is speed-limited with the R300, just that the NV30 offered the specific features he needed (which is part of the reason the NV30 cards were so poor in DX - they were specifically tailored to what Carmack told ATI and NV he would be needing for his engine). ATI obviously decided to go more towards DX, and is therefore a much better performer in games like Far Cry.

All these cards have strengths and weaknesses, and each developer has their own opinion on what is important. With the relatively new programmable shaders, each company went down a different path to implement the features, and was therefore stronger in different areas. As they release their next-generation cards, they improve in those weaker areas, and the cards from both makers will start performing more similarly across the board. This is exactly what NV did with the NV40 (really improved in the shader area that was so slow on the NV30). Hopefully ATI will get their next card to run Doom-style engines better than their R3XX/R4XX series do. I fully expect them to improve in that area, since they still seem to have better shader performance than NV (at least with AF on).


-D'oh!
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Rollo
I posted benches of 6800NUs before anyone had them. I posted benches of Doom3 on 3 cards a week before most had the game. I benched many games on the 6800NU and 6800GT and posted results. Twice, at ATer request, I underclocked the GT and posted requested comparison benches. Another ATer wanted to see some high res video frames dropped, I spent a fair amount of time doing that. I posted SM3 benches at Far Cry long before the patch was made public. Etc etc etc.


A real citizen and for totally altruistic reasons!
:laugh:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Being from New Zealand implies no racial slur that I'm aware of. I was implying that Carmack's comments were so straightforward that you must speak a different form of English if you think they need "interpretation".
This is one of the most imbecilic rationales I've ever seen.

You should be ashamed of yourself for hissing "racism" when you know full well what I meant. You do a disservice to anyone who's ever been a victim of racism by equating my jovial nationalism with something that is actually a crime here and pretty much universally reviled.
Then STFU about nationality and start talking about video cards. If you can't do that then STFU completely instead of carrying on with this juvenile charade.

BFG, no one else in the world cares why I traded my 9800P for a 5800U
But you do, that's why you did it three times.

BTW, I'm still waiting for your "I upgraded from a 6800GT to a 5800 and it kicks ass and both cards are equal because nobody needs to go above 1024x768 and also nobody needs AA or AF".

When will that thread be coming, Mr collector?

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Also, that helps explain how you are wrong with this statement, BFG:
The issue here has nothing to do with Carmack's comments but rather with Rollo's infantile interpretation of them.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
I interpreted Carmack's comments as being something like this: "The OpenGL spec has a lot of stupid, inefficient claptrap from the old days in it, and that sucks. ATI and Nvidia have different ways of speeding up the coding process, but they aren't interoperable. I think Nvidia's are more user-friendly, so I use those first. Also, ATI has consistently written weak OpenGL drivers compared to Nvidia."

Reading conspiracy into these comments smacks of Fanchimpism.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: jiffylube1024
Originally posted by: Rollo

Hmmm. Why don't you point me at your big contributions Jiffy?

Sure thing Rollo. Then we can whip 'em out and see whose is longer.

It isn't a spitting contest as you seem to view it ("your big contributions"); it's just your general attitude towards everything (particularly regarding ATI, which seems to be quite the hot button with you).

The point is that you say I "haven't contributed anything meaningful to the boards in quite a long time," while I think all the things I mentioned are decent contributions.
I asked what you have contributed lately because you seem to think a person's "worthiness" to post hinges on their "meaningful contributions".

I have nothing against ATI, and never have.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Being from New Zealand implies no racial slur that I'm aware of. I was implying that Carmack's comments were so straightforward that you must speak a different form of English if you think they need "interpretation".
This is one of the most imbecilic rationales I've ever seen.

BFG, I think we have a cultural difference going on here. In America, when you say "I don't know where you're from, but where I'm from, when a man says 'A cat is a mammal' we assume he means 'A cat is a mammal'," you're implying that there's no other interpretation possible. It's an old cliché; I'm sorry I modified it a bit with the "New Zealand-ese". Like AnnoyedGrunt was pointing out above, what he said seemed self-evident to me.

You should be ashamed of yourself for hissing "racism" when you know full well what I meant. You do a disservice to anyone who's ever been a victim of racism by equating my jovial nationalism with something that is actually a crime here and pretty much universally reviled.
Then STFU about nationality and start talking about video cards. If you can't do that then STFU completely instead of carrying on with this juvenile charade.
I agree. I shouldn't have brought your nationality into it. However, like I said, being from New Zealand has nothing to do with "race" so my comment wasn't what we call "racist" here.


BFG, no one else in the world cares why I traded my 9800P for a 5800U
But you do, that's why you did it three times.

BTW, I'm still waiting for your "I upgraded from a 6800GT to a 5800 and it kicks ass and both cards are equal because nobody needs to go above 1024x768 and also nobody needs AA or AF".

When will that thread be coming, Mr collector?



It boggles my mind why you think me buying three 5800s means something and what you think it means.
I bought 3 sets of V2 SLI over a couple of years too- do you think I have a 3dfx secret mission as well?

I sold my GF1 and bought a slower ATI Rage MAXX- do you think I'm biased for ATI now too?

I sold my faster GF2 Pro and bought a slower Radeon 32DDR- is that ATI bias?

If you go back through my history, I've gone from faster to slower cards many times, irrespective of brand. Yet you inexplicably seem to think there's some great conspiracy in me buying a few 5800s.


 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
BTW- now we're calling names like kids on a playground?


It's such a good thing you're above such pettiness, isn't it Rollo?

Originally posted by: Rollo


Amoppin':

Just because you don't really have anything to contribute about your experiences with modern video cards*, is that any reason to flame those of us who do?



Of course, that won't matter to you, will it Amoppin'? As the R500 won't cost $200 for years, you won't really have any idea what they're like, will you?



My "nappy"?! LOL, Why BFG, I always thought you were a kid in some suburb pounding away your hate speech. I see you're in New Zealand in your profile. No wonder you don't try more cards- they probably cost a month's pay where you are. It must kind of burn your ass that here in the states we can get this stuff really cheap and trade cards as we please for little money. (seeing as how dear you hold this stuff?)
Tough luck there ol buddy



LOL- Oh no! Newbie member points out some benchmarks I've seen many times wherein the 5800U doesn't do as well as you'd expect!

Please.....please...stop....Lord Tyranus.....you're ...killing .....me.....



So wait a minute- you're saying nVidia used to be on top, then ATI, now they're evenly matched? You must be a professor or something Creig- I couldn't have figured that out! LOL



Where did you get that? A fortune cookie? LOL



Why don't you cut those less BRILLIANT than you are some slack buddy?



Time will tell, eh, wise one? LOL



Ronnn said I'm a troll. I can never come back here, as Ronnn is the voice of the board.

Please oh please Ronnn. I won't troll anymore! I'll pimp the Humpus tweaks and their pathetic 2fps increases! I'll say "DX9b is good enough!" I'll say "Who needs that UltraShadow or video encoding!" I'll say "who needs linux?"

For the love of God Ronnn, I have a family. Don't say I'm a troll.



LOL- why does it not surprise me that you somehow equate your slow-as-frozen-molasses 9800XT to be on par with cards that totally own it, like 6800GTs?

Earth to Apoppin:
Your card is like something a homeless guy would have compared to a 6800GT.

No 20% vs 30% to argue about here, Poppinfresh- the 6800GT is TWICE as fast as your caveman 9800.

Too bad, so sad. The only guys who consider the 9800XT "fast" these days are sleeping on heat exhaust grates and dining in soup kitchens.



There's a big difference between coming up with $133 and $250 for some people, and people who live with their dads sometimes forget that.
*cough*General Grievous*cough*



My four year old said to me just tonight, "Dad, why doesn't Apoppin' have nearly as good a video card as I do? Should I take it easy on him if I meet up with him online?" So I guess he likes it fine.



The better fps at one of the twelve settings doesn't make the 9800Pro a better buy, General. Most people want the card that's better 90% of the time, not 10%.

I'll stop rambling about the XT when you stop telling people how they have to spend $400 and up on a video card, when you live at home with your parents for free and don't really know what it's like to have to pay rent/mortgage, and buy your own Frosted Flakes.



Sheesh, get over it. What do you want me to do? Hang a red ATI flag in my rec room and start goose-stepping around it with my arm raised high, shouting "Sieg Heil! Sieg Heil! Doom 3 is teh suxorz, I'm waiting for HL2!"??????


 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: AnnoyedGrunt
Drayvn, you need to read that speech again and try to understand what Carmack is saying (this goes for you too, BFG).

When he says he hits the limits of the R300 Architecture, he is specifically talking about the length of fragment programs.

quote:

--------------------------------------------------------------------------------
For developers doing forward looking work, there is a different tradeoff --
the NV30 runs fragment programs much slower, but it has a huge maximum
instruction count. I have bumped into program limits on the R300 already.
--------------------------------------------------------------------------------

The reason this is good for development work (even though it runs poorly in a game environment) is that hardware advancements outpace software development by a significant margin. This means that if one company offers features that will be prevalent in a couple of years (but much faster than when they first become available), then developers can use that card while working on their game, knowing that by the time the game comes out, most of the high-end cards will support those new features at better framerates.

He is not talking about the limits of the R300's speed, or texture lookups, or anything like that (which is what that Humus mod does). He says quite specifically what he's talking about.

Also, that helps explain how you are wrong with this statement, BFG:
Great things to produce even shinier pipes at even bigger slideshows. It's actually quite ludicrous to suggest the R300 was limiting Carmack but the NV30 was giving him freedom, but then that's the sort of pro-nv propaganda you typically spread so I'm not really that surprised.

Carmack is certainly not saying he is speed-limited with the R300, just that the NV30 offered the specific features he needed (which is part of the reason the NV30 cards were so poor in DX - they were specifically tailored to what Carmack told ATI and NV he would be needing for his engine). ATI obviously decided to go more towards DX, and is therefore a much better performer in games like Far Cry.

All these cards have strengths and weaknesses, and each developer has their own opinion on what is important. With the relatively new programmable shaders, each company went down a different path to implement the features, and was therefore stronger in different areas. As they release their next-generation cards, they improve in those weaker areas, and the cards from both makers will start performing more similarly across the board. This is exactly what NV did with the NV40 (really improved in the shader area that was so slow on the NV30). Hopefully ATI will get their next card to run Doom-style engines better than their R3XX/R4XX series do. I fully expect them to improve in that area, since they still seem to have better shader performance than NV (at least with AF on).


-D'oh!

Thanks for the comment, a well-rounded and polite answer, definitely different from the personal battles above us.

I must have read it wrong then; I thought he was talking about the technical and hardware limits of the R300 and stuff like that.

But I still stand by my comment on why Carmack couldn't have made a branching code path specific to ATi, if he's supposed to be so good at coding, given that ATi has a superior math engine (as seen when you put the Humus tweak on the nV cards: you get a decrease in performance). But back to the thing I was talking about: why couldn't Carmack have implemented this? From what I read on the forums, Carmack even shrugged it off, saying it brings bad quality to the game, even though by the end of the discussion the community had found a way to give great performance without any visual quality loss. I'm still quite perplexed as to why he would look the other way!...

Again, thanks for your very thought-out reply. I still love both cards just the same, and I can see ATi picking up the pieces, as I've heard rumours that they are nearing completion of the R500 for the Xbox 2, and they are also nearing completion of a new GPU for the Nintendo next-gen console too!

Which means we will be seeing the new R520 very soon, I hope. If the rumours are true and the R500 is completed by the end of the year, I don't see many complications in putting it in PC form...

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: BenSkywalker
He isn't doing his best work (no one is) when he says he wants OTHER people to make HIS work easier!

Wanting standards is being lazy now? Carmack has used nV mainly because ATi's drivers suck, particularly their piss-poor OpenGL implementation. Another reason is that he tells everyone what he wants to see in hardware and nVidia gives it to him. This isn't secret documentation; look at his .plan files from late '99/early 2000 and he gives the blueprint for the NV3x. If I'm a game developer and a company gives me everything I ask for, I would tend to use them. This isn't anything like some other developers who come and publicly support an IHV after they are given millions of dollars to do so- Carmack told everyone what his priorities were years ago (his driver complaints have been numerous over the years- for some time ATi's parts couldn't render the D3 console properly) and nV filled them. Sweeney has recently made clear what his needs are for the U3 engine- whoever fills those requirements best is going to be his lead development platform. If ATi fits those needs best they will be in. If they ignore it and go their own route then they likely won't be- same with nV. I don't see why people have an issue with someone saying years in advance what they want and then sticking with it.


But the thing is, software shouldn't be dictating how hardware is made; it creates too much of a gap in between. You could have someone say "I want this" while another person wants that, and then the two GC companies would be so far apart at the extremes that they couldn't compete in everyday games; there would be tons of work that had to be done. GPUs should HAVE to be the standard, and devs should work towards that standard when it comes, or upon getting confirmation that it's going to be in there; or, if the devs can't work with that particular code, since they know this two years in advance, make alternatives.

If what you're saying is true, Carmack had tons of time to get an alternative code path set up for the ATi architecture. But instead, he went with nV's because that's what suited him best?

Also, the devs have more freedom, because they can update or change their code to suit whatever they want, but once the GC companies release their card, it's so much harder to change it, other than through drivers, and that is very limited...
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
But the thing is, software shouldn't be dictating how hardware is made

Of course software dictates how hardware is made, always. Hardware is made, purchased and used to run software, and how well it runs the software determines how good the hardware is considered to be. It has always been that way.

You could have someone say "I want this" while another person wants that, and then the two GC companies would be so far apart at the extremes that they couldn't compete in everyday games

The market sorts those things out on its own.

GPUs should HAVE to be the standard, and devs should work towards that standard when it comes, or upon getting confirmation that it's going to be in there; or, if the devs can't work with that particular code, since they know this two years in advance, make alternatives.

GPUs should have to be which standard? Take the last gen of vid cards: with MS lacking finalized specifications for DX9 (or even being close, for that matter) when the IHVs were finishing up the design phase of their GPUs, nV went with the existing IEEE standard by using FP32 while ATi went with an odd format that no one had any standard for at the time, FP24. Who was 'right' in their choice? Hindsight being 20/20, MS's eventual decision to utilize FP24 for DX9 made ATi look pretty good, but when they were in the design phase they were pushing something that didn't exist in industry standards anywhere. GPU makers don't know much of anything two years in advance. DX10's specs still haven't been finalized although the major R500/NV50 design choices have been completed for some time. With the exception of Sweeney we haven't heard much out of the major engine developers in terms of what they want for their next engine- so which standards should the IHVs be using? What they do is take their best estimate of where software is going to be and try to build the hardware that suits it.
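(For reference on the precision gap in question: FP32 is the IEEE single format with 1 sign / 8 exponent / 23 mantissa bits, while ATi's FP24 is commonly described as 1 sign / 7 exponent / 16 mantissa. A quick C++ sketch of what that means for relative precision:)

// Relative precision of a binary float format is about 2^-mantissa_bits.
#include <cmath>
#include <cstdio>

int main()
{
    double epsFP32 = std::ldexp(1.0, -23); // ~1.2e-7, IEEE single
    double epsFP24 = std::ldexp(1.0, -16); // ~1.5e-5, R300-style FP24
    std::printf("FP32 relative step: %g\nFP24 relative step: %g\n",
                epsFP32, epsFP24);
    return 0;
}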

If what you're saying is true, Carmack had tons of time to get an alternative code path set up for the ATi architecture. But instead, he went with nV's because that's what suited him best?

He dropped the NV30 code path, so what do you mean he went with nV? He based his core development around nVidia because they had drivers capable of using the API he has always used. ATi's products for some time had drivers so poor they couldn't even render the console properly in D3. How is a dev supposed to work with hardware that has such incredibly poor drivers? That being said, when ATi started to make progress with significantly improved drivers and launched the R300 core, Carmack used their parts for a while as his main dev platform.

If you are talking about the Humus hack that speeds up R420 parts, it reduces accuracy. The NV30 path did the same thing, and the ATi loyalists at the time were throwing a fit that 'it wasn't fair'. Carmack wanted to simplify the render engine, so he removed it and had all the parts in roughly the same class use the same path.
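(To make the accuracy/speed tradeoff concrete: Doom 3's interaction path reads its specular falloff out of a small lookup texture, and the Humus hack computes the falloff directly instead. A CPU-side C++ sketch of the two approaches; the exponent of 16 is an assumed stand-in for whatever curve the real lookup texture encodes.)

// Specular falloff via a precomputed table vs. direct math. The table
// quantizes the curve (cheap on hardware with fast dependent reads);
// the direct pow() is exact but costs ALU work per sample.
#include <algorithm>
#include <cmath>
#include <vector>

struct SpecularTable {
    std::vector<float> lut;
    explicit SpecularTable(int size = 256, float exponent = 16.0f)
        : lut(size)
    {
        for (int i = 0; i < size; ++i)
            lut[i] = std::pow(float(i) / float(size - 1), exponent);
    }
    // Table path: nearest-entry lookup, precision limited by table size.
    float lookup(float nDotH) const
    {
        float t = std::min(std::max(nDotH, 0.0f), 1.0f);
        return lut[int(t * float(lut.size() - 1) + 0.5f)];
    }
};

// Math path: computed directly, no quantization error.
float specularDirect(float nDotH, float exponent = 16.0f)
{
    return std::pow(std::max(nDotH, 0.0f), exponent);
}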

Also, the devs have more freedom, because they can update or change their code to suit whatever they want, but once the GC companies release their card, it's so much harder to change it, other than through drivers, and that is very limited...

Software companies should do whatever they want in terms of making their game; the IHVs need to do their best to take an educated guess at where that is going to be. Hardware exists to run software.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Creig:
You spent all that time sifting through my posts and found a couple of instances where I changed one user's name a little?
That I believe I apologized for?


And several examples of sarcasm?

I see what you mean- I am obviously calling people names all the time......


Sure- you just can't find any examples of it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
BFG, I think we have a cultural difference going on here.
I don't, I think you're simply extending your juvenile family and collector arguments to the next level. You've been tooled so many times in the logic department that all you really have left is irrelevant rubbish like this.

I sold my GF1 and bought a slower ATI Rage MAXX- do you think I'm biased for ATI now too?
I don't know, did you post a thread "my MAXX kicks ass, at 320x240x16 it's equal to the GF"?
Did you post a "speed doesn't matter, all we need is AFR!" thread?
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Ben, how can you say software dictates what hardware does? There is no 64-bit operating system for us desktop users, is there, but we have 64-bit CPUs.

Hyper-Threading came out first, then programs were built for it.

I don't know how you can say that. I even read from one of the guys who works on Get In the Game for ATi (their equivalent of nVidia's The Way It's Meant To Be Played) that they have a small team for Europe who go out to games companies and sit down with them to optimise the game code for their cards; it isn't the other way round...

OK, your idea could be valid, and I could be totally wrong, but what I think is that if software can dictate what we have in hardware, what's the point in pushing the boundaries? You might as well make hardware to suit just those types of software and not do anything beyond that.

PS2.0 comes out on GPUs and devs are only starting to use it... same with PS3.0, except devs are catching up now.

How about 3Dc? OK, it might not become the new compression standard, but how could a company ASK a graphics company for that? One dev uses it. How useful!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I don't, I think you're simply extending your juvenile family and collector arguments to the next level. You've been tooled so many times in the logic department that all you really have left is irrelevant rubbish like this.

I think Jiffylube said it best:
"Meh. Not worth my time."