
Tim Sweeney, about GPGPU!

MODEL3

Senior member
Below are some things about GPGPU that Tim Sweeney said in his keynote at the High Performance Graphics 2009 conference:

http://www.xbitlabs.com/news/v..._Set_to_Disappear.html

It is dramatically more expensive to develop software that relies on general-purpose computing on graphics processing units (GPGPU) than to create a program that utilizes central processing units!

Although the market for video games has been growing rather rapidly in recent years, game budgets have not been increasing that rapidly...
although performance may increase 20 times, game budgets will increase less than two times!


If the cost (time, money, pain) to develop an efficient single-threaded algorithm for a central processing unit is X, then it will cost two times more to develop a multithreaded version, three times more to develop a Cell/PlayStation 3 version, and ten times more to create a current GPGPU version. Meanwhile, considering game budgets, over two times higher expenses are uneconomical for the majority of software companies!
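For what it's worth, the quote's arithmetic is easy to lay out explicitly. The sketch below is purely illustrative: the cost multipliers and the "less than two times" budget headroom are the figures from the article, while the helper names (`is_economical`, `sweeney_table`) are made up for the example.

```cpp
#include <string>
#include <utility>
#include <vector>

// Cost multiples quoted in the keynote: single-threaded CPU = 1x,
// multithreaded = 2x, Cell/PS3 = 3x, current GPGPU = 10x.
// Budgets, per the article, grow by less than 2x per generation.
inline bool is_economical(double cost_multiple, double budget_headroom) {
    return cost_multiple <= budget_headroom;
}

// Evaluate each platform against the budget headroom.
inline std::vector<std::pair<std::string, bool>> sweeney_table() {
    const double headroom = 2.0;  // "less than two times" budget growth
    const std::vector<std::pair<std::string, double>> ports = {
        {"single-threaded CPU", 1.0},
        {"multithreaded CPU",   2.0},
        {"Cell/PS3",            3.0},
        {"current GPGPU",      10.0},
    };
    std::vector<std::pair<std::string, bool>> out;
    for (const auto& p : ports)
        out.push_back({p.first, is_economical(p.second, headroom)});
    return out;
}
```

On these numbers only the CPU versions fit inside the budget growth, which is exactly the point the quote is making about Cell and GPGPU ports.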

With such high development costs, it is not surprising that the potentially very advanced heterogeneous multi-core Cell processor (jointly developed by IBM, Sony and Toshiba) is generally not very popular, and neither are GPGPU technologies. According to the head of Epic, current GPGPU models are limited in general!

"In the next generation we'll write 100% of our rendering code in a real programming language, not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."

the days of DX GPUs are numbered...

Do you think he is just trying to influence the future?

Or do you think that his statements are valid?


 
He is correct to a degree. 20x higher cost? I don't believe that. But ya, until everyone has good compilers for threading, he is correct. Let's see who solves the compiler problem first.
 

If the cost (time, money, pain) to develop an efficient single-threaded algorithm for a central processing unit is X, then it will cost two times more to develop a multithreaded version, three times more to develop a Cell/PlayStation 3 version, and ten times more to create a current GPGPU version. Meanwhile, considering game budgets, over two times higher expenses are uneconomical for the majority of software companies!

Sounds like a load of bullshit. If it cost millions of dollars to add PhysX to a game, no game would have PhysX. Of course it's not free to add this stuff, but come on.
 
Originally posted by: ShawnD1
Sounds like a load of bullshit. If it cost millions of dollars to add PhysX to a game, no game would have PhysX. Of course it's not free to add this stuff, but come on.

He said an algorithm!

He didn't say the cost to build a game!

He didn't even say the cost to build an engine!

After all, he supported a PhysX implementation in some of his own games (with the right $$$ from NV!) :laugh:



 
Well, right now your options for using GPGPU are CUDA, OpenCL, and DirectX Compute. DirectX Compute is probably very limited, and OpenCL and CUDA resemble low-level C and require a lot of effort to get an efficient parallel solution out of.

Once Nvidia (or someone else) develops friendlier languages for GPUs, dev cost shouldn't be too much higher. Heck, it's partway there; Nvidia has a Fortran compiler for their GPUs.
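To illustrate what "resemble low-level C" means in practice, here is a rough sketch, in plain C++ rather than actual CUDA or OpenCL, of how even a trivial loop gets restructured into the kernel-plus-launch shape those APIs expect. The function names here are hypothetical; the point is only the change of shape, not any real API.

```cpp
#include <cstddef>
#include <vector>

// Serial version: one obvious loop computing y = a*x + y.
inline void saxpy_serial(float a, const std::vector<float>& x,
                         std::vector<float>& y) {
    for (std::size_t i = 0; i < x.size(); ++i)
        y[i] = a * x[i] + y[i];
}

// GPGPU-style version: the loop body becomes a "kernel" invoked once per
// index, and the caller takes over scheduling. Real CUDA/OpenCL adds on
// top of this: device memory allocation, host-device copies, block/grid
// sizing, and synchronization, which is where the extra effort goes.
inline void saxpy_kernel(std::size_t i, float a, const float* x, float* y) {
    y[i] = a * x[i] + y[i];
}

inline void launch(std::size_t n, float a, const float* x, float* y) {
    // Stand-in for a kernel launch; on a GPU these iterations run in parallel.
    for (std::size_t i = 0; i < n; ++i)
        saxpy_kernel(i, a, x, y);
}
```

Both forms compute the same result; the second just shows the extra scaffolding a data-parallel target imposes before any tuning has even started.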
 
Originally posted by: Fox5
Well, right now your options for using GPGPU are CUDA, OpenCL, and DirectX Compute. DirectX Compute is probably very limited, and OpenCL and CUDA resemble low-level C and require a lot of effort to get an efficient parallel solution out of.

Once Nvidia (or someone else) develops friendlier languages for GPUs, dev cost shouldn't be too much higher. Heck, it's partway there; Nvidia has a Fortran compiler for their GPUs.

"In the next generation we'll write 100% of our rendering code in a real programming language, not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."

Look, ARM uses C++. And they and NV are in bed together. NV uses CUDA. It's a marriage, but not a good one. ATI, I'm not sure what they're using, but I look for them to change to C++. Larrabee x86? Imagination uses C++.

I believe all the above (not ATI) are C++ compute with VLIW backends. Not sure about Larrabee/ARM on the backend. ARM is actually the big one here, as it's C++ compute. But they're also the odd man out because they go against Atom. And I'm not real sure the marriage with NV will work. The wild card is Imagination: if the 4-core performs as advertised, the 8-core will be a real killer on SoCs.

 
He is right about cost for the most part. Costs are going up while budgets are going down.
Multi-threading is the major pain in the ass for the x86 computing industry right now. Even the best compilers do not make full use of all cores. It is really hard to split some tasks up to use multiple cores. I think an architecture change will have to occur before we see the applications Sweeney suggests. I can't see the current form of x86 working.
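As a minimal sketch of the manual work that paragraph is describing, here is what the programmer, not the compiler, has to do to spread even a simple reduction across cores with standard C++ threads: partition the data, give each thread its own partial result (to avoid a shared-counter race), and merge at the end. `parallel_sum` is a hypothetical helper written for this example.

```cpp
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Sum a vector using nthreads workers. No compiler does this split for
// you automatically; every decision below is the programmer's burden.
inline long long parallel_sum(const std::vector<int>& v, unsigned nthreads) {
    if (nthreads == 0) nthreads = 1;
    std::vector<long long> partial(nthreads, 0);  // one slot per thread: no race
    std::vector<std::thread> workers;
    const std::size_t chunk = (v.size() + nthreads - 1) / nthreads;
    for (unsigned t = 0; t < nthreads; ++t) {
        workers.emplace_back([&, t] {
            const std::size_t begin = std::min(v.size(), t * chunk);
            const std::size_t end   = std::min(v.size(), begin + chunk);
            for (std::size_t i = begin; i < end; ++i)
                partial[t] += v[i];   // each thread writes only its own slot
        });
    }
    for (auto& w : workers) w.join();  // wait before merging
    return std::accumulate(partial.begin(), partial.end(), 0LL);
}
```

All of that replaces a three-line serial loop, which is roughly what the "2x cost for a multithreaded version" figure is pricing in.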

I would really like to see x86 embrace what ARM has done: leave the CPU for processing generic instructions and let custom chips do the things they are purpose-built for. Similar to how we have the GPU for graphics, add a DSP, an FPU, and so on, all operating independently from the CPU. Very similar to how computers like the Amiga were designed.
 
Ya, ARM is good there. NV has a good GPU for them, and CUDA. I see this as a good possibility. But I keep looking over at Imagination, wondering if it's as good as people say. It looks good also, I believe; not sure, but it's a GPGPU.

As for x86 on Larrabee, what to make of this? Native Larrabee/x86, plus whatever backend goes with the vector units. So right now I'm not real sure what Larrabee x86 is or how it's implemented. Looks like a hybrid.
 
Originally posted by: MODEL3
Originally posted by: ShawnD1
Sounds like a load of bullshit. If it cost millions of dollars to add PhysX to a game, no game would have PhysX. Of course it's not free to add this stuff, but come on.

He said an algorithm!

He didn't say the cost to build a game!

He didn't even say the cost to build an engine!

After all, he supported a PhysX implementation in some of his own games (with the right $$$ from NV!) :laugh:

Sophistry. This is almost like the people who used to push RAM that was two to three times as expensive as standard RAM because in a memory only benchmark the expensive RAM would vastly outperform standard RAM. In reality, RAM is only one factor in overall system performance and the expensive RAM would only increase overall performance in some tasks by maybe 2% or so.

This is the same type of argument. It costs twenty times more to develop a proper algorithm to take advantage of GPGPU! I mean, this might increase the budget by $500k (hell let's say 1 million) but when your budget is already 5 million or higher, it doesn't sound as outrageous. If you're someone like Epic Games who licenses engines to other companies, then this doesn't seem outrageous at all. In fact, I find it highly surprising this is coming out of the CEO of Epic Games.

If you're another development house who keeps all their custom tools and stuff in-house, it's not like the time and effort spent on utilizing GPGPU can't be recycled for other games down the line. It would be highly strange to recreate the wheel every time you make a game. I'm sure that even though someone says that a game is running on a completely new engine, that engine is not 100% new. There's always re-used bits of code here and there. Even if the old code was not used and new algorithms and code was developed, usually you go back and see what worked and what didn't in the old code to create a more efficient, higher performing, engine.
 
Originally posted by: akugami
Originally posted by: MODEL3
Originally posted by: ShawnD1
Sounds like a load of bullshit. If it cost millions of dollars to add PhysX to a game, no game would have PhysX. Of course it's not free to add this stuff, but come on.

He said an algorithm!

He didn't say the cost to build a game!

He didn't even say the cost to build an engine!

After all, he supported a PhysX implementation in some of his own games (with the right $$$ from NV!) :laugh:

Sophistry. This is almost like the people who used to push RAM that was two to three times as expensive as standard RAM because in a memory only benchmark the expensive RAM would vastly outperform standard RAM. In reality, RAM is only one factor in overall system performance and the expensive RAM would only increase overall performance in some tasks by maybe 2% or so.

This is the same type of argument. It costs twenty times more to develop a proper algorithm to take advantage of GPGPU! I mean, this might increase the budget by $500k (hell let's say 1 million) but when your budget is already 5 million or higher, it doesn't sound as outrageous. If you're someone like Epic Games who licenses engines to other companies, then this doesn't seem outrageous at all. In fact, I find it highly surprising this is coming out of the CEO of Epic Games.

If you're another development house who keeps all their custom tools and stuff in-house, it's not like the time and effort spent on utilizing GPGPU can't be recycled for other games down the line. It would be highly strange to recreate the wheel every time you make a game. I'm sure that even though someone says that a game is running on a completely new engine, that engine is not 100% new. There's always re-used bits of code here and there. Even if the old code was not used and new algorithms and code was developed, usually you go back and see what worked and what didn't in the old code to create a more efficient, higher performing, engine.

Well, I wasn't absolutely sure about the term sophistry in English (it comes from the Greek word Sophistia) and guess what, it has nearly the same meaning in English as it has in Greek, according to this dictionary:

http://www.merriam-webster.com/dictionary/sophistry

subtly deceptive reasoning or argumentation

Can't you understand that when you say:

that a person uses subtly deceptive argumentation,

it is indicative of that person's character?

So what did you not understand of the following:

http://forums.anandtech.com/me...=2236227&enterthread=y

Zero personal attacks (no matter the degree).

No baiting other members into an altercation

-------------------------------------------------------------------------------------------

Now, for all the points you made, why are you answering me?
I think you didn't understand what I wrote!
Let me make it a little more analytical:

ShawnD1, said:

1.What Tim Sweeney said, sounds like a load of bullshit!

2.Tim Sweeney meant that it cost millions of dollars to add PhysX to a game!

3.If it cost millions of dollars to add PhysX to a game, no game would have PhysX!

4.Of course it's not free to add this stuff, but come on!


And with my answer, I just meant the following:

1.No, Tim Sweeney didn't say that! (it cost millions of dollars to add PhysX to a game!)

2.Millions of dollars is the cost of a game!

Originally posted by: MODEL3
He (just) said "an algorithm"!

3.How is it possible to say that? Did you misunderstand Tim Sweeney? And why?

Originally posted by: MODEL3
He didn't say "the cost to build a game"!

Originally posted by: MODEL3
(Heck), He didn't even say "the cost to build an engine"!


And I suppose you can understand that I was joking with my comment:

Originally posted by: MODEL3
After all, he supported a PhysX implementation in some of his own games (with the right $$$ from NV!) :laugh:

-------------------------------------------------------------------------------

Now, like I said, your arguments essentially have nothing to do with what I said, so I am not going to comment on them (I feel that you are baiting for an altercation!)

I will leave only a hint for you:

Draw a parallel between what Intel said:

the "performance improvement per TDP increase" ratio is extremely important!

And what Tim Sweeney meant:

what budget-increase ratio a part of an engine must have in order to be an economically viable move, if you do the scaling long-term!

I hope I didn't confuse you more with the above!
 
Originally posted by: MODEL3

ShawnD1, said:

1.What Tim Sweeney said, sounds like a load of bullshit!

2.Tim Sweeney meant that it cost millions of dollars to add PhysX to a game!

3.If it cost millions of dollars to add PhysX to a game, no game would have PhysX!

4.Of course it's not free to add this stuff, but come on!


And with my answer, I just meant the following:

1.No, Tim Sweeney didn't say that! (it cost millions of dollars to add PhysX to a game!)

2.Millions of dollars is the cost of a game!

I remember watching a speech John Carmack gave a few years ago where he was talking about the development of Rage and he mentioned something about the game costing upwards of 20 or 30 million dollars to create (that includes building the engine from the ground up). A few million is not the cost of the entire game; that would only cover the game's engine. If you're telling me that GPU code is going to cost 5x as much as multithreaded CPU code, that really does amount to potentially millions of dollars difference.

The other weird part of the article is that it seems to contradict itself here:
"In the next generation we'll write 100% of our rendering code in a real programming language, not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."
He's complaining about the cost of development, but he doesn't want to use DirectX and OpenGL/OpenCL which are specifically designed to make things easier to write. How does that make sense? If he doesn't want to use high level programming languages that are inefficient but easy to write, why doesn't he start writing his games in assembly language? ASM is much more efficient than programming in C, is it not? Of course nobody does that because programming in ASM is extremely expensive. Using a video card is no different; he's bitching about low level GPU programming being very difficult then he dismisses DirectX because it's too slow or he doesn't like the name or some other asinine reason.
 
Originally posted by: Idontcare
The full-length original presentation is well worth the time to read.

http://graphics.cs.williams.ed...HPG2009/TimHPG2009.pdf

The guy knows what he is talking about, if you disagree with his comments then I respectfully suggest that perhaps you do not fully comprehend what it is that Sweeney is attempting to communicate in his presentation.


He gave a good presentation there. I do have a small issue: the new game engine thing starting in 2009-2014. That may be true for all the others, but not Intel. Project Offset is way ahead of all the others.

 
Originally posted by: MODEL3
*SNIP*

that a person uses subtly deceptive argumentation,

it is indicative of that person's character?

So what did you not understand of the following:

http://forums.anandtech.com/me...=2236227&enterthread=y

Zero personal attacks (no matter the degree).

No baiting other members into an altercation

*SNIP*

I think you're mistaking what I wrote as a personal attack towards you. It was more in the line of commenting on what Sweeney said. What I'm saying is that Sweeney is exaggerating the cost of implementing GPGPU to further his point. If claiming that it costs 10x as much to implement algorithms to take advantage of GPGPU, when it's merely a small part of the overall cost, isn't an exaggeration, then I don't know what is. That is why I consider what Sweeney wrote deceptive. As such, sophistry. I stand by what I said.

****EDIT****
I don't believe saying someone's argument was subtly deceptive (as per the definition of sophistry) is a personal attack at all. In fact, I think it is more in the line of an observation of that person's line of argument during the conversation and not at all about the person.

And I really don't get how you can say I'm baiting even if I was saying your argument was flimsy. While again, you were mistaken about where I was aiming my comments, let's just say I was saying that your arguments amounted to sophistry. That is an observation about your argument, not a personal attack, nor was it baiting. I really think I would have gotten less of a response had I just said straight up that your argument was misleading.
 
Originally posted by: ShawnD1
I remember watching a speech John Carmack gave a few years ago where he was talking about the development of Rage and he mentioned something about the game costing upwards of 20 or 30 million dollars to create (that includes building the engine from the ground up). A few million is not the cost of the entire game; that would only cover the game's engine. If you're telling me that GPU code is going to cost 5x as much as multithreaded CPU code, that really does amount to potentially millions of dollars difference.

Up until that point, we were talking only about what Tim Sweeney said!
In your reply you didn't mention Carmack!
Also, you said exactly the following:

Originally posted by: ShawnD1
If it cost millions of dollars to add PhysX to a game, no game would have PhysX.

You didn't use the word few!
So yes, it costs millions of dollars to make a game (5, 10, 15, 20...)

Anyway, with your last post you cleared up what you meant, and on many things I agree!

I mean, I also remember Carmack citing the "20 million dollars" figure; of course he meant AAA multiplatform games!
Not all multiplatform games cost that much (and of course there are many good (and bad) games costing way, way less!)

If the "general purpose computing on GPUs" method is adopted by the industry, it will have a negative economic effect on all games! (not just multiplatform AAA!)

And I think you are trying to suggest that:

since the cost of "GPU code" is not a huge part of the development budget of a game,
why is Tim Sweeney making such a big deal out of it?


Well, first of all, this is true only for the AAA multiplatform games!
But this is not the actual reason I think Tim Sweeney is making such a big deal out of it!

Like I said in my previous post, I think that he researched:

what budget-increase ratio a part of an engine must have in order to be an economically viable move for the industry to follow,
if you do the scaling long-term! (like 10 years from now!)


And he came to the conclusion that 10X can lead to undesirable results long-term, if you follow such economic-model policies in your design philosophy!

Not only did I not do any research on it, but I also don't have his knowledge and resources to do the research, so I won't even try to challenge his conclusions!

If you have the qualifications, you can try!

Just kidding! (I know you didn't mean it that way!)


Originally posted by: ShawnD1
The other weird part of the article is that it seems to contradict itself here:
"In the next generation we'll write 100% of our rendering code in a real programming language, not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."
He's complaining about the cost of development, but he doesn't want to use DirectX and OpenGL/OpenCL which are specifically designed to make things easier to write. How does that make sense? If he doesn't want to use high level programming languages that are inefficient but easy to write, why doesn't he start writing his games in assembly language? ASM is much more efficient than programming in C, is it not? Of course nobody does that because programming in ASM is extremely expensive. Using a video card is no different; he's bitching about low level GPU programming being very difficult then he dismisses DirectX because it's too slow or he doesn't like the name or some other asinine reason.

He is not talking about assembly; he just wants to write in a C-based language (C/C++/CUDA)!

His issue with DX/OpenGL is that "all games look kinda similar," so gamers gradually lose their interest in gaming, or that the visuals are not advancing well enough to entice new customers, so the market could potentially run into steep problems!

 
Originally posted by: akugami
Originally posted by: MODEL3
*SNIP*

that a person uses subtly deceptive argumentation,

it is indicative of that person's character?

So what did you not understand of the following:

http://forums.anandtech.com/me...=2236227&enterthread=y

Zero personal attacks (no matter the degree).

No baiting other members into an altercation

*SNIP*

I think you're mistaking what I wrote as a personal attack towards you. It was more in the line of commenting on what Sweeney said. What I'm saying is that Sweeney is exaggerating the cost of implementing GPGPU to further his point. If claiming that it costs 10x as much to implement algorithms to take advantage of GPGPU, when it's merely a small part of the overall cost, isn't an exaggeration, then I don't know what is. That is why I consider what Sweeney wrote deceptive. As such, sophistry. I stand by what I said.

****EDIT****
I don't believe saying someone's argument was subtly deceptive (as per the definition of sophistry) is a personal attack at all. In fact, I think it is more in the line of an observation of that person's line of argument during the conversation and not at all about the person.

And I really don't get how you can say I'm baiting even if I was saying your argument was flimsy. While again, you were mistaken about where I was aiming my comments, let's just say I was saying that your arguments amounted to sophistry. That is an observation about your argument, not a personal attack, nor was it baiting. I really think I would have gotten less of a response had I just said straight up that your argument was misleading.

akugami I think what you are speaking to is captured in this AnandTech Forum Guidelines section:

Originally posted by: DerekWilson
Member and Posting Guidelines

1) No trolling, flaming or personally attacking members. Deftly attacking ideas and backing up arguments with facts is acceptable and encouraged. Attacking other members personally and purposefully causing trouble with no motive other than to upset the crowd is not allowed.

http://forums.anandtech.com/me...=2070583&enterthread=y

Emphasis here is on the sentence regarding "deftly attacking ideas...is acceptable and encouraged". I interpret your sophistry comment as pertaining to this aspect of arguing against a previously stated position, and not as a personal attack.

Model3, I think maybe you misinterpreted akugami's post, or the spirit of his post.
 
Originally posted by: akugami


I think you're mistaking what I wrote as a personal attack towards you. It was more in the line of commenting on what Sweeney said. What I'm saying is that Sweeney is exaggerating the cost of implementing GPGPU to further his point. If claiming that it costs 10x as much to implement algorithms to take advantage of GPGPU, when it's merely a small part of the overall cost, isn't an exaggeration, then I don't know what is. That is why I consider what Sweeney wrote deceptive. As such, sophistry. I stand by what I said.

****EDIT****
I don't believe saying someone's argument was subtly deceptive (as per the definition of sophistry) is a personal attack at all. In fact, I think it is more in the line of an observation of that person's line of argument during the conversation and not at all about the person.

And I really don't get how you can say I'm baiting even if I was saying your argument was flimsy. While again, you were mistaken about where I was aiming my comments, let's just say I was saying that your arguments amounted to sophistry. That is an observation about your argument, not a personal attack, nor was it baiting. I really think I would have gotten less of a response had I just said straight up that your argument was misleading.
Look, if you meant sophistry about Tim Sweeney, then of course it is not a personal attack on me!

But the thing is that you quoted my answer to ShawnD1;
you didn't quote what Tim Sweeney said from the XbitLabs article!

So it was not obvious what you meant!
I guess now you have cleared the whole thing up!

---------------------------------------------------------------------------------------------
Now about your EDIT!

I disagree!

If you say that a person is using subtly deceptive argumentation,
then you mean one of the following two things:

1.His intent was to use subtly deceptive argumentation!

Showing this intent, it is quite clear that he is not exactly a man of integrity!

So in this case it is a personal attack! (unless you can prove it beyond doubt!)


2.He didn't mean to use subtly deceptive argumentation, but it just happened!
(For example, sometimes we express ourselves wrongly, so our argumentation can have the effect of deceiving someone into believing something other than the truth!)

But in this case, at least in the Greek language, the word Sophistia (the Greek word for sophistry) is used when someone has the intent!

I don't know how you use the word in the USA, so I may have misunderstood!

And also, if you mean that the guy does not have this intention, shouldn't you at least mention it? (so that the other forum members don't misunderstand?)

I mean, if you do not somehow state it
(that you meant the guy does not have this intention),

doesn't that leave the field open for a forum member to misunderstand what you meant?

Why leave that field open?

----------------------------------------------------------------------------------------

Anyway, like you said, the whole issue is a big misunderstanding!
So, let's forget the whole thing and move on!


Take care,

modEL3

 
After selling out to consoles after UT2K4, I lost respect for this guy.

There's nothing wrong with GPGPU. It'll just be a while until it becomes more widespread, so costs now are still high. But the time will come when GPGPU and parallelism will have a huge impact on computing. It has already begun.
 
Sweeney is all over the place on this stuff, so much so that it is very hard to know what he is saying from the summations posted.

So developing for GPGPU is 10x more expensive, and developers can't afford more than 2x the cost, yet our next engine is going to be GPGPU based? Huh? Anyone home, Tim?

His interest in software rendering doesn't surprise me; to his credit, he did create the last commercially successful software 3D engine to hit the market, but he and Abrash seem to be operating on a different mindset entirely than the rest of the industry.

So development costs are skyrocketing, and your solution is to go lower level than the APIs we already have. Development costs are too high, so your solution is to develop for GPGPU and increase them by an order of magnitude.

Furthermore, based on the link IDC posted, he knows it will be slower than dedicated hardware. He knows that going rasterizer will be faster, using a hybrid mode for additional effects, and based on what he stated that will also be cheaper.

Perhaps he still has conflicting views on the topic, but really he is just all over the place in his comments.

After selling out to consoles after UT2K4, I lost respect for this guy.

Firing your loyal employees is more respectable? Driving your company into the ground to take some sort of platform-bigot stance is admirable? Epic is not an idealist non-profit organization; they are a business. Their engine is raking in tons of money for them right now precisely because it is cross-platform capable. As a business decision it was a very smart move for Epic, without a doubt.
 
Originally posted by: MODEL3
*SNIP*
Look, if you meant sophistry about Tim Sweeney, then of course it is not a personal attack on me!

But the thing is that you quoted my answer to ShawnD1;
you didn't quote what Tim Sweeney said from the XbitLabs article!

So it was not obvious what you meant!
I guess now you have cleared the whole thing up!

Yeah, since you guys were discussing the Tim Sweeney quote I was just chiming in. Sorry if it was a tad unclear but it was more in the line of commenting on what Sweeney was saying.

I'm not going to pretend that 100% of what I do or say is without malice but generally I try to keep any personal attacks out of it.

Now about your EDIT!

I disagree!

If you say that a person is using subtly deceptive argumentation,
then you mean one of the following two things:

1.His intent was to use subtly deceptive argumentation!

Showing this intent, it is quite clear that he is not exactly a man of integrity!

So in this case it is a personal attack! (unless you can prove it beyond doubt!)

Well, when two people with opposing viewpoints discuss something and they have different opinions it's obvious that they will try to sway the audience and the one they are arguing with to their side. And by argument I don't mean we hate each other or anything, we're just trying to win the discussion.

Anyways, many times when these discussions are going on you'll notice that someone will bring up minor points and blow them out of proportion to try to sway the opposing side. As in my previous post, the high-performance (and high-cost) RAM vs regular RAM. While RAM is cheap now, I remember people arguing for the more costly RAM because it performs X percent better than the regular RAM. Sometimes this high-performance RAM could cost twice as much as the regular RAM or more. However, when such an argument is used, the person using such reasoning is not wrong. The RAM does perform X percent better. The problem is that when overall computer system performance is considered, it usually gives you a very minor boost in performance.


2.He didn't mean to use subtly deceptive argumentation, but it just happened!
(For example, sometimes we express ourselves wrongly, so our argumentation can have the effect of deceiving someone into believing something other than the truth!)

But in this case, at least in the Greek language, the word Sophistia (the Greek word for sophistry) is used when someone has the intent!

I don't know how you use the word in the USA, so I may have misunderstood!

And also, if you mean that the guy does not have this intention, shouldn't you at least mention it? (so that the other forum members don't misunderstand?)

I mean, if you do not somehow state it
(that you meant the guy does not have this intention),

doesn't that leave the field open for a forum member to misunderstand what you meant?

Why leave that field open?

----------------------------------------------------------------------------------------

Anyway, like you said, the whole issue is a big misunderstanding!
So, let's forget the whole thing and move on!


Take care,

modEL3

I probably could have couched my words better but my only defense can be that this is a forum and sometimes we just jot things down as we think them.

Anyways, I think Sweeney does have an intent to use a deceptive argument to prove his point. That doesn't necessarily mean that he's doing it with malice. When you are presenting a point, you do what you have to short of lying to win the argument. And let's face it, he wasn't lying, he was just giving an educated guess at the costs and performance benefits of GPGPU. I just feel he is being overly harsh in judging GPGPU and that he was using a somewhat deceptive argument to prove his point.

I guess for me, if you can present a good argument and use your wording well, I don't really find it a personal fault. I also separate malicious intent from just trying to win an argument by presenting facts in your favor. If you can present a good argument and present it well with facts to back it up, even if the wording is tricky, then all the more power to you. So long as you're not trying to cheat someone, I don't find it an attack on that person to point out that his argument is deceptive.

I guess we'll just have to agree to disagree if you still feel the way you do.
 
Originally posted by: akugami
Yeah, since you guys were discussing the Tim Sweeney quote I was just chiming in. Sorry if it was a tad unclear but it was more in the line of commenting on what Sweeney was saying.

I'm not going to pretend that 100% of what I do or say is without malice but generally I try to keep any personal attacks out of it.

You cleared up the situation with your 2nd reply, so everything is fine!

Originally posted by: akugami
Well, when two people with opposing viewpoints discuss something and they have different opinions it's obvious that they will try to sway the audience and the one they are arguing with to their side. And by argument I don't mean we hate each other or anything, we're just trying to win the discussion.

Anyways, many times when these discussions are going on you'll notice that someone will bring up minor points and blow them out of proportion to try to sway the opposing side.

Yes, I agree with you on the above!

Originally posted by: akugami
I probably could have couched my words better but my only defense can be that this is a forum and sometimes we just jot things down as we think them.

Yes, I know what you mean!

Originally posted by: akugami
Anyways, I think Sweeney does have an intent to use a deceptive argument to prove his point. That doesn't necessarily mean that he's doing it with malice....
And let's face it, he wasn't lying, he was just giving an educated guess at the costs and performance benefits of GPGPU. I just feel he is being overly harsh in judging GPGPU and that he was using a somewhat deceptive argument to prove his point.

Me personally, I wasn't sure whether Sweeney had the intent!

That's why I asked:

Originally posted by: MODEL3
Do you think he is just trying to influence the future?

Or do you think that his statements are valid?

So on this, too, we don't disagree!


Originally posted by: akugami
When you are presenting a point, you do what you have to short of lying to win the argument. .......
I guess for me, if you can present a good argument and use your wording well, I don't really find it a personal fault. I also separate malicious intent from just trying to win an argument by presenting facts in your favor. If you can present a good argument and present it well with facts to back it up, even if the wording is tricky, then all the more power to you. So long as you're not trying to cheat someone, I don't find it an attack on that person to point out that his argument is deceptive.

I guess we'll just have to agree to disagree if you still feel the way you do.

I will disagree with you on the text in bold!
This is the very definition of "The Ends Justify The Means"!
Which I completely despise!

Where to start? Hasn't history taught us enough lessons already?
Means must always be morally bound!
That's my philosophy!

If your philosophy is different, I guess for this point, we'll just have to agree to disagree!
 
Originally posted by: Idontcare
The full-length original presentation is well worth the time to read.

http://graphics.cs.williams.ed...HPG2009/TimHPG2009.pdf

The guy knows what he is talking about, if you disagree with his comments then I respectfully suggest that perhaps you do not fully comprehend what it is that Sweeney is attempting to communicate in his presentation.
Much of his presentation is a bit wishy-washy. He's a smart guy and he knows his stuff, but some of his conclusions are.... out there.

page 25:
All games look similar
Derive little benefit from Moore's Law
Crysis on high-end NVIDIA SLI solution only looks at most marginally better than top Xbox 360 games
I think his computer monitor might be broken. His own game Gears of War looks incredible on a GeForce 8 at 1920x1080 with full details. It far surpasses anything I've seen on the Xbox 360. The other benefit of Moore's Law in gaming applies to draw distance and the amount of stuff shown rather than the quality of the stuff. In a game like Fallout 3 on PC, the draw distance goes on for miles, and the game will draw objects as far as you can see if you happen to have the best video card money can buy. If you're limited to PS3 graphics, the draw distance is much shorter and objects can only be seen when they're close by. Moore's Law is not hard to see in modern games with modern hardware. Tim knows this.

The rest of the presentation goes into great detail on how to solve a problem that isn't really a problem. He seems to be implying that hardware rendering fails on a software level because DirectX and OpenGL don't have enough features (hence, in his opinion, all games look the same). This is partially true, but it's mostly wrong. The reason DirectX and OpenGL don't have a good enough set of features is that we don't have hardware capable of running those features. He's talking about ray tracing and super awesome 100% realistic details, but most of us can't even max out Batman: Arkham Asylum or ArmA 2. Tim's presentation lists things like needing 4 TB/s of bandwidth to do some kind of calculation he wants to do, but our best hardware is nowhere near that fast. Even if DirectX and OpenGL had every single feature Tim ever wanted and it was easy to program, it would still be completely useless, because the average gamer doesn't have 20 GTX 295 cards hooked up in SLI.

He's right about GPGPU in general being a bit of a mess right now, but for all the wrong reasons. Using GPGPU inside of games for physics calculations (PhysX) has so far been a complete failure. It looked nice in Fear and Mirror's Edge and Batman, but overall it cripples performance so badly that it's not even worth doing. That would be bad enough if it were written by a game developer, but much of that PhysX stuff was written by Nvidia, and it's written specifically for their own hardware. If the people who made my video card can't even get this crap to run without destroying the game's frame rate, it's best that guys like Tim don't have access to this stuff just yet. Of course, this goes back to the previous paragraph, where Tim is expecting hardware to be more powerful than Commander Data; it should do pixel shading and do all this GPGPU stuff that no video card on the planet can handle.
Outside of games, he's absolutely 100% right. When the card isn't being used for drawing stuff and pixel shading, it would be nice to run programs on it.

It's nice that Epic Games has ambitious people who want to go above and beyond what our current hardware can do, but this is just a bit too ambitious.
 
Originally posted by: BenSkywalker
Sweeney is all over the place on this stuff, so much so that it is very hard to know what he is saying from the summaries posted.

So developing for GPGPU is 10x more expensive, developers can't afford more than 2x the cost, and yet your next engine is going to be GPGPU-based? Huh? Anyone home, Tim?

His interest in software rendering doesn't surprise me; to his credit, he did create the last commercially successful software 3D engine to hit the market, but he and Abrash seem to be operating on an entirely different mindset than the rest of the industry.

So development costs are skyrocketing, and your solution is to go lower-level than the APIs we already have? Development costs are too high, so your solution is to develop for GPGPU and increase them by an order of magnitude?

Furthermore, based on the link IDC posted, he knows it will be slower than dedicated hardware. He knows that going with a rasterizer, using a hybrid mode for additional effects, will be faster, and based on what he stated it will also be cheaper.

Perhaps he still has conflicting views on the topic, but really he is just all over the place in his comments.

Unfortunately, going by the presentation alone, we don't really know how he spoke to the presentation materials. (At least I don't; I wasn't there.)

Maybe he filled in the presentation gaps in a way that painted a cohesive story? Or perhaps it really was more of just a collage of his thoughts.

One thing I kept thinking when I read the presentation, though, was: "OK, so what's this guy's angle here? He wants to make money at the end of all this... so is he defending his easy gross margins by vilifying the idea of anyone else doing what he plans on doing?"

So, as a man of this industry, what do you think Sweeney is up to? Think he's telling people to dodge right because he wants a clear, competition-free path for himself (and his company) to dodge left?
 