"Inevitable Bleak Outcome for nVidia's Cuda + Physx Strategy"

Page 17 - AnandTech Forums

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: munky
Basic physics will run on my cell phone for all I care, but the kind of physics NV fans are parading for their gpu will not run on x86. So when you say PhysX also runs on x86, it doesn't "run" the way it "runs" on the gpu, and it offers absolutely no advantage over alternative solutions when "running" on the cpu. Get it?

It offers the advantage that it can accelerate physics on systems with a GPU or PPU, while not having any disadvantages when running only on a CPU. Get it?
With PhysX you can have your cake and eat it too. That's why we currently see games using PhysX which can run fine on a CPU, and allow people with a PPU or nVidia GPU to enable extra effects. That's the advantage.
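
To illustrate that model, here is a minimal C sketch (all names are my own invention, not actual PhysX API calls) of how a game can keep gameplay physics identical everywhere and treat hardware acceleration purely as a bonus for extra effects:

```c
#include <assert.h>

/* Hypothetical sketch: gameplay physics always runs on the CPU;
 * a PPU or GPU only unlocks extra, cosmetic effects. */
typedef enum { ACCEL_NONE, ACCEL_PPU, ACCEL_GPU } accel_kind;

typedef struct {
    int gameplay_physics; /* always enabled; the CPU path is enough */
    int extra_effects;    /* cloth, debris, fluids: only when accelerated */
} physics_config;

static physics_config configure_physics(accel_kind hw) {
    physics_config c;
    c.gameplay_physics = 1;               /* identical on every system */
    c.extra_effects = (hw != ACCEL_NONE); /* a bonus, never a requirement */
    return c;
}
```

The point of the sketch is that the CPU-only configuration is never worse off: the same game logic runs everywhere, and acceleration is strictly additive.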

You can't fault PhysX for not being able to run the NV effects on CPU. x86 processors simply aren't capable of them. Not with Havok or any other API either. They're just too computationally expensive. Get it?

It's the same as complaining that you can't run the latest games with all the DX10 eyecandy on if you only have a simple non-accelerated videocard. Why can't it "run" on the CPU? Get it?

Originally posted by: munky
Then why doesn't NV certify its cards DX 10.1 compliant?

Because they aren't.
They support the multisample readback operations which allow DX10.1 to accelerate AA. Which seems to be the most redeeming value of DX10.1 anyway.
They don't support ALL DX10.1 features, so they can't be DX10.1-compliant.

Originally posted by: munky
Do other games like Stalker Clear Sky also use Nvidia's proprietary extensions for DX 10.1 effects?

I don't know the details of every game out there. I just know that Far Cry 2 is one of the games using nVidia's extension for efficient AA. It's likely that there are more 'DX10.1' titles which also use these extensions, especially if they are under the TWIMTBP-label.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: akugami
Maybe if you actually understood what I was trying to say instead of a somewhat snide opening reply it would foster better discussion. Try looking up math coprocessors, Altivec/VMX, and SSE 1/2/3/4.

These were technologies developed to enhance the CPU. Older CPUs could do math, but math coprocessors were more powerful at crunching numbers. The math coprocessors could handle instructions offloaded from the main CPU, freeing it up to crunch other instructions.

Altivec and SSE and similar instruction set extensions are sets of very special instructions that handle certain floating point and integer operations to speed up certain functions, such as video encoding. Sure, SSE4 only added a few extra instructions over SSE3, which added only a few extra instructions over SSE2. However, early benchmarks of SSE2 vs SSE4 are showing a 40% increase in performance for DivX encoding.

Ramblings indeed.
You were arguing that "I don't believe nVidia has really designed a GPU for PhysX yet."
For all these ramblings you posted... All these extensions are STILL general purpose. Yes, SSE4 allows developers to improve performance in DivX encoding. That doesn't mean that this is the thing that SSE4 was specifically designed for, or that it is the only thing they're good for. SSE is also great for accelerating linear algebra or geometry-related math, to name but a few things.
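
As a concrete illustration of that general-purpose use, here is a minimal C sketch (function names invented for the example; assumes an x86 machine, since SSE intrinsics are x86-only) of the same four-float addition written once as a scalar loop and once with packed SSE intrinsics, which process all four lanes in a single instruction:

```c
#include <assert.h>
#include <xmmintrin.h> /* SSE intrinsics; x86 only */

/* Scalar reference: add two arrays of 4 floats, one element at a time. */
static void add4_scalar(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; i++) out[i] = a[i] + b[i];
}

/* SSE version: one packed instruction adds all four lanes at once. */
static void add4_sse(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a); /* load 4 packed floats */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}
```

Nothing about `_mm_add_ps` is specific to video encoding; the same packed operations apply equally to linear algebra or geometry math, which is the sense in which these extensions are general purpose.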

Perhaps you meant to say "I think nVidia will extend its architecture to improve GPGPU performance in tasks like PhysX". That is quite different from what you said. What you said sounded more like nVidia would just insert a PPU core into the GPU. That most certainly is not going to happen. There is a clear trend away from fixed-function units, and towards more programmability and flexibility. Larrabee being the most obvious step in that direction.

Originally posted by: akugami
The fact that the G80 GPU cores and up were well designed with GPGPU in mind and in line with how Ageia used hardware to accelerate PhysX doesn't mean there isn't room for enhancements. If Ageia did not have tech that nVidia coveted (PhysX, both software AND hardware) then nVidia would not have shelled out good money.

I see no reason why nVidia would be interested in the hardware, to be honest.
Their PhysX performance is already VERY good. Much better than the PPU ever was.
The software was obviously important to nVidia, since PhysX (formerly NovodeX) was already a widely used API, especially on consoles. So they had a working solution, which developers were already familiar with. An ideal tool to leverage their GPU acceleration.

Originally posted by: akugami
I'm not a hardware engineer. I'm not even a programmer. I simply refuse to believe that there isn't hardware and software that nVidia obtained when they bought Ageia that can be integrated into their GPUs to further accelerate PhysX.

You also have to realize that technology is outdated very quickly. The PPU was already aging when nVidia bought Ageia. Just like nVidia bought out 3DFX, but never did much with their technology either (although both nVidia and 3DFX use a technology named SLI, the meaning of the acronym is different, as is the general technology). The technology just became irrelevant as GPU designs continued to evolve.

Originally posted by: akugami
I agree integrated GPUs will not be competitive with discrete video cards any time soon. However, with the pace technology can move, who can say what is possible in two to three years' time. While discrete GPUs are larger than today's CPUs, AMD has shown that a smaller GPU core can be competitive with a larger one.

It's just a case of conflict of interest. CPUs and GPUs want different types of memory for optimal performance. I don't see any solution for that emerging anytime soon. With consoles they get away with using memory aimed at graphics, because CPU-performance is of secondary interest. Besides, GPU performance isn't as high as a high-end discrete card anyway, so the difference between CPU and GPU memory is smaller. But it remains a compromise between the two.

Bottom line is: there's no point in making super-powerful integrated GPUs when they're going to be bandwidth-starved anyway.

Originally posted by: akugami
I don't believe the decreased GPU power of integrated CPU/GPUs vs discrete GPUs will hurt Intel or AMD as much as you seem to think. Most general consumers simply won't care. Furthermore, OEMs will definitely like having one less part to stock. Don't discount the fact that Intel ships the most GPU chipsets even though their current GPUs are, comparatively speaking, crap for gaming.

Except this discussion was never about that type of system.
It's about Cuda and PhysX, which is of no interest to the average office user or casual home user.
It's only interesting for gaming and high-performance computing. Especially with high-performance computing, the name alone should be enough indication that people won't accept a slower integrated GPU.

Originally posted by: akugami
As for the memory issue. Hyper Transport and Intel Quickpath Interconnect or a similar bus technology can be made to deliver a high bandwidth bus along with plugin memory modules on a specially designed daughtercard port. Maybe it's some other solution.

Seems rather pointless. If you're going to use a separate module anyway, it might as well be a PCI-e card with the GPU and memory close together with a direct connection.
HT and QuickPath are still no substitute for an on-die controller and a direct connection to the memory, which is what both GPUs and CPUs have. A bus always adds extra overhead. Overhead you aren't interested in when your memory is dedicated anyway.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: akugami
From an end-user standpoint, it doesn't matter a whole lot so long as in the end it delivers. CUDA is not bad in any way. At least not for end users, nor for nVidia. From a developer's standpoint, whoever has the dominant standard is the one they'll eventually gravitate towards, so if CUDA wins out (or some other format) then that's what they'll use.

Cuda isn't meant to run on anything but nVidia's GPUs.
In fact, part of what Cuda is, *is* nVidia's GPU architecture.
Just like AMD's x86 CPUs may use the same instruction set, but NOT the same architecture as Intel's (e.g. Conroe, Penryn, Nehalem).
At best, AMD could support C for Cuda (or actually the bytecode generated by its compiler) and implement their own driver API. But OpenCL is the equivalent of that anyway. So AMD will just go on the assumption that all current Cuda software will be ported to OpenCL, and then it will automatically work on both GPUs.
It's a safe assumption in general, except for products made by nVidia. We don't know yet what nVidia's plans are with PhysX, as said many times before. They MAY add OpenCL support.

That all depends on how committed nVidia is to PhysX. It seems pretty obvious that when a hardware-agnostic physics library arrives, most developers will choose it over a vendor-specific one. The question is: is nVidia bothered by that?
If they are, they can just port PhysX to OpenCL, and continue to own a relevant physics API. In fact, they may have already ported PhysX to OpenCL. It's not that much work, coming from Cuda.
Then again, they may just think "Physics APIs aren't our core-business, it was fun while it lasted, but our cards can run Havok anyway, so who cares? We're not going to invest money in developing and maintaining PhysX anymore".
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Scholzpdx
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

I'm not an nVidia supporter, I think the concept of fanboyism is retarded, and I didn't contradict myself anywhere. I have backed up everything I said in great detail.
If you want to throw such accusations around, you should back them up as well, which you failed to do. Hence I accuse you of character assassination.

And I see nothing wrong with discussing on this forum. I happened to register here years ago (I don't recall why), but never used the forum before.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali

You forget that ALL physics are done by PhysX in engines like UE3. They don't use Havok or anything else. PhysX is a full physics solution for PCs and consoles. That is what PhysX does.
It also enables acceleration which you could use for either first-order or second-order physics effects. But that is not all it does. Get it?

Wrong.
I suggest you spend some time actually using UE3. PhysX is not a requirement, nor is it the only physics engine in UE3.

 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Modelworks
Wrong.
I suggest you spend some time actually using UE3. PhysX is not a requirement, nor is it the only physics engine in UE3.

Can you substantiate that claim?
What physics engine does it use, other than PhysX?
I can't find any references to anything other than PhysX on www.unrealtechnology.com
I'm quite sure that I'm right and you're wrong.
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Hold your horses. I didn't toss out the F word there.

Reading through most of the thread and skimming parts of it: I'm not a programmer of any kind, but I understand many of the facts being brought up.

The reason why I say this is that you made many general statements that show nVidia is a step ahead of Intel/AMD and no matter what they will be alright, which is completely subjective and can accurately be labeled as pro-nVidia. By stating that if nVidia can't get PhysX off to the mainstream or developers won't all hop on the boat, they can (or already have) easily port it over to the OpenCL API. Not mentioning that going this route would have made the entire PhysX acquisition a gaping, useless money pit. In a nutshell nVidia has screwed themselves, as developers see that if they hold out on buying licenses for the PhysX API, they can get it for free later on. PhysX has already been proven to be nothing more than a graphical software layer, as it can be run on ATI hardware as well as G80+.

nVidia has a feature they can't sell. With OpenCL and DX11 coming, PhysX will most likely be forgotten within a year or so.

OTOH, Cuda is very interesting and I would hope they find some way to reach an agreement for it to run on all modern GPU hardware.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
Wrong.
I suggest you spend some time actually using UE3. PhysX is not a requirement, nor is it the only physics engine in UE3.

Can you substantiate that claim?
What physics engine does it use, other than PhysX?
I can't find any references to anything other than PhysX on www.unrealtechnology.com
I'm quite sure that I'm right and you're wrong.


It uses any physics engine you want to use.
When you purchase a game engine license you are not forced to use whatever the engine developer used. Instead you have the option to replace components of the engine with whatever you like. That is what makes things like UE3 useful to developers.
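
A hypothetical sketch of what that modularity can look like in C: the engine calls physics only through a small table of function pointers, so the backend (PhysX, Havok, or an in-house solver) can be swapped without touching engine code. All names here are invented for illustration, not UE3's actual interfaces:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical backend interface: the engine calls physics only
 * through this table, so any library that can fill it in (PhysX,
 * Havok, an in-house solver) is interchangeable. */
typedef struct {
    const char *name;
    void (*step)(float dt, float *pos, float *vel, size_t n);
} physics_backend;

/* Toy "in-house" backend: simple Euler integration under gravity. */
static void euler_step(float dt, float *pos, float *vel, size_t n) {
    for (size_t i = 0; i < n; i++) {
        vel[i] += -9.81f * dt; /* gravity on the vertical axis */
        pos[i] += vel[i] * dt;
    }
}

static const physics_backend inhouse = { "inhouse-euler", euler_step };

/* The engine never names a concrete library; it just uses the table. */
static void engine_tick(const physics_backend *pb, float dt,
                        float *pos, float *vel, size_t n) {
    pb->step(dt, pos, vel, n);
}
```

Whether any particular licensee actually bothers to replace the default backend is a separate question; the sketch only shows why replacement is mechanically possible in a modular engine.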

 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Scholzpdx
The reason why I say this is that you made many general statements that show nVidia is a step ahead of Intel/AMD and no matter what they will be alright, which is completely subjective and can accurately be labeled as pro-nVidia.

Nope, it is not subjective at all.
If I said Core i7 is ahead of Phenom II in nearly all applications, is that subjective? Does that make me pro-Intel?
No, it's just a fact. A fact that people may accept more easily because it is common knowledge now.

Originally posted by: Scholzpdx
By stating that if nVidia can't get PhysX off to the mainstream or developers won't all hop on the boat, they can (or already have) easily port it over to the OpenCL API.

It is a fact that porting from C for Cuda to OpenCL is not a very difficult task. Even the Bullet Physics developers have stated this in the presentation I linked to earlier. And they are independent developers of an open-source, multiplatform solution.
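
As a rough illustration of why the port is considered easy: a simple SAXPY kernel differs between C for Cuda and OpenCL C mainly in qualifiers and the thread-index builtin. The sketch below (plain C, names invented) shows the CUDA version in a comment, the OpenCL port as the runtime-compiled source string OpenCL expects, and a scalar reference with the same body:

```c
#include <string.h>

/* The CUDA version of the same kernel, for comparison:
 *
 *   __global__ void saxpy(int n, float a, float *x, float *y) {
 *       int i = blockIdx.x * blockDim.x + threadIdx.x;
 *       if (i < n) y[i] = a * x[i] + y[i];
 *   }
 *
 * The OpenCL port below changes only the qualifiers and the
 * thread-index builtin; the body is untouched. OpenCL kernels are
 * handed to the driver as source strings at runtime
 * (via clCreateProgramWithSource), hence the string constant.
 */
static const char *saxpy_cl =
    "__kernel void saxpy(int n, float a,          \n"
    "                    __global float *x,       \n"
    "                    __global float *y) {     \n"
    "    int i = get_global_id(0);                \n"
    "    if (i < n) y[i] = a * x[i] + y[i];       \n"
    "}                                            \n";

/* Scalar reference with the same body, to show the math is identical. */
static void saxpy_ref(int n, float a, const float *x, float *y) {
    for (int i = 0; i < n; i++) y[i] = a * x[i] + y[i];
}
```

The computational body never changes between the three versions; only the launch plumbing around it does, which is the sense in which a Cuda-to-OpenCL port is mostly mechanical.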

Originally posted by: Scholzpdx
Not mentioning that going this route would have made the entire PhysX acquisition a gaping, useless money pit.

This makes no sense at all to me.
It's still PhysX, even if it runs on OpenCL.

Originally posted by: Scholzpdx
PhysX has already been proven to be nothing more than a graphical software layer, as it can be run on ATI hardware as well as G80+.

Nobody can run PhysX on their ATi card, and ATi surely isn't going to make it happen either.
I have an ATi 4870 here. If you can prove that it can run PhysX... Tell me what I need to do to make it run Mirror's Edge...?

Originally posted by: Scholzpdx
OTOH, Cuda is very interesting and I would hope they find some way to reach an agreement for it to run on all modern GPU hardware.

That 'agreement' is OpenCL, which is fully backed by nVidia.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Modelworks
It uses any physics engine you want to use.
When you purchase a game engine license you are not forced to use whatever the engine developer used. Instead you have the option to replace components of the engine with whatever you like. That is what makes things like UE3 useful to developers.

Lol, that's the most pathetic argument I've heard yet.
"Sure, it uses PhysX, but you could remove it and put something else in". Oh yea, right.
It's just that I don't know of anyone who bothered to do that.
Unreal Tournament uses PhysX, BioShock uses PhysX, Mirror's Edge uses PhysX...
Please, stop wasting my time. You said it USED other engines:
"PhysX is not a requirement , nor is it the only physics engine in UE3."
Not that you could modify it to remove PhysX, and put something else in.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Scholzpdx
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

Your post in its entirety can be considered a personal character attack on Scali. Without reason or provocation. Sorry dude, but this is getting forwarded to the modship. Entirely an attack on his credibility and character.
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Originally posted by: Scali
Originally posted by: Modelworks
It uses any physics engine you want to use.
When you purchase a game engine license you are not forced to use whatever the engine developer used. Instead you have the option to replace components of the engine with whatever you like. That is what makes things like UE3 useful to developers.

Lol, that's the most pathetic argument I've heard yet.
"Sure, it uses PhysX, but you could remove it and put something else in". Oh yea, right.
It's just that I don't know of anyone who bothered to do that.
Unreal Tournament uses PhysX, BioShock uses PhysX, Mirror's Edge uses PhysX...
Please, stop wasting my time. You said it USED other engines:
"PhysX is not a requirement , nor is it the only physics engine in UE3."
Not that you could modify it to remove PhysX, and put something else in.

I kind of laughed at this exchange.

In reality, the Unreal Engine 3 uses PhysX, as well as many other graphical interfaces, to create its powerful, efficient engine. Yes, it uses PhysX, but not exclusively.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Scholzpdx
In reality, the Unreal Engine 3 uses PhysX, as well as many other graphical interfaces, to create its powerful, efficient engine. Yes, it uses PhysX, but not exclusively.

Well obviously it uses more than just PhysX. It's the UnrealEngine, not the PhysXEngine.
Of course it will also need something like Direct3D for graphics, then you'll want some character animation, audio, AI, and many other things that make up a game engine.
But it uses PhysX for the physics calculations in the engine.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
It uses any physics engine you want to use.
When you purchase a game engine license you are not forced to use whatever the engine developer used. Instead you have the option to replace components of the engine with whatever you like. That is what makes things like UE3 useful to developers.

Lol, that's the most pathetic argument I've heard yet.
"Sure, it uses PhysX, but you could remove it and put something else in". Oh yea, right.
It's just that I don't know of anyone who bothered to do that.
Unreal Tournament uses PhysX, BioShock uses PhysX, Mirror's Edge uses PhysX...
Please, stop wasting my time. You said it USED other engines:
"PhysX is not a requirement , nor is it the only physics engine in UE3."
Not that you could modify it to remove PhysX, and put something else in.


I'm not arguing anything. It is fact. When a developer uses a game engine, Unreal, Quake, or whatever, the most important thing is that it be modular. PhysX is not a requirement. I have the option to use any other physics engine, and there are many, that would suit my purposes best. The fact that you are making statements like "It's just that I don't know of anyone who bothered to do that." shows you have no idea what you are talking about.

This is the reason that developers hate to talk with gamers about things like this. They play games and suddenly they know all about how to develop them. Go spend some time developing with game engines, then come back. I'm betting your views will change.

Right now you are like a home owner trying to tell the contractor building a home across the street how to build a house.

 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Modelworks
The fact that you are making statements like "It's just that I don't know of anyone who bothered to do that." shows you have no idea what you are talking about.

Don't insult me. I'm an engine developer myself (which is why I never used UnrealEngine, I write my own).
Heck, I've written 3d engines just for fun in the past, for the demoscene and such.
Like this for example:
http://www.pouet.net/prod.php?which=10808

So don't come here insulting me. I'm probably a more experienced engine developer than you are.

You sound like SunnyD to me. You think you know a few things about programming, then try to bluff your way through. That may work on most people. It's just that I'm the real deal, an actual 3d engine developer.

If you need insults to try and prove a point, you've already lost.
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Originally posted by: Keysplayr
Originally posted by: Scholzpdx
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

Your post in its entirety can be considered a personal character attack on Scali. Without reason or provocation. Sorry dude, but this is getting forwarded to the modship. Entirely an attack on his credibility and character.

Or maybe this:

Look at the situation. A person comes way out of the woodwork and posts on one topic, and one topic only. Then consider the text itself. You can in no way say it doesn't speak "pro-nVidia" to your eyes. I'm not saying that any of the text was incorrect (I didn't accuse him of spreading FUD whatsoever), but the opinions do have a bias.

If you think this is wrong then fine, I don't care. It was a truthful post. For these said statements, why I said what I did is reason enough.

A more respectable thing would be to say this in a PM rather than in a public post.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali


Don't insult me. I'm an engine developer myself (which is why I never used UnrealEngine, I write my own).

Ranting, snipped.

Unless you have worked in the field and used licensed engines, you don't have the experience to comment on how the whole thing works.

My experience comes from working on six projects, five of which shipped to retail. All used licensed engines as well as customizations. Just because you can program does not mean you know everything about game development. Until you have worked with teams of people and understand what is involved, anything you comment on regarding the subject has to be taken with that in mind.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Scholzpdx
Or maybe this:

Look at the situation. A person comes way out of the woodwork and posts on one topic, and one topic only. Then consider the text itself. You can in no way say it doesn't speak "pro-nVidia" to your eyes. I'm not saying that any of the text was incorrect (I didn't accuse him of spreading FUD whatsoever), but the opinions do have a bias.

If you think this is wrong then fine, I don't care. It was a truthful post. For these said statements, why I said what I did is reason enough.

A more respectable thing would be to say this in a PM rather than in a public post.

You're just drawing the wrong conclusions.
I'm not pro-nVidia, I'm pro GPGPU/hardware-accelerated physics.
nVidia just *happens* to currently have the best solution on the market (the ONLY solution in terms of physics).
That is why I *currently* prefer nVidia's technology. Not because it's nVidia's, but because it's good technology.
If ATi or Intel comes up with something better, I'll be the first to switch. I've switched brands many times in the past for the same reason.
Heck, I've been programming 3d long before 3d accelerators even existed, let alone that brands like 3DFX or nVidia existed. And if you bother to check the Croissant 9 demo I linked above... it doesn't even use 3d acceleration AT ALL, so it doesn't depend on nVidia or ATi or anything. It's as hardware-agnostic and platform-agnostic as you can get.
FYI, if I recall correctly, I used a Matrox videocard at the time I developed most of that code.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
If you were so concerned about respect, you wouldn't have publicly attacked Scali's character. You should give as much respect as you would expect for yourself, No?
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Modelworks
Unless you have worked in the field and used licensed engines, you don't have the experience to comment on how the whole thing works.

Lol, I work in the field alright, I *write* engines, I don't need to use licensed ones.
Besides, what I do for a living has little to do with whether or not the things I say in this thread are correct. Why don't you stick to the subject, rather than going for character assassination?
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Originally posted by: Scali
Originally posted by: Scholzpdx
Or maybe this:

Look at the situation. A person comes way out of the woodwork and posts on one topic, and one topic only. Then consider the text itself. You can in no way say it doesn't speak "pro-nVidia" to your eyes. I'm not saying that any of the text was incorrect (I didn't accuse him of spreading FUD whatsoever), but the opinions do have a bias.

If you think this is wrong then fine, I don't care. It was a truthful post. For these said statements, why I said what I did is reason enough.

A more respectable thing would be to say this in a PM rather than in a public post.

You're just drawing the wrong conclusions.
I'm not pro-nVidia, I'm pro GPGPU/hardware-accelerated physics.
nVidia just *happens* to currently have the best solution on the market (the ONLY solution in terms of physics).
That is why I *currently* prefer nVidia's technology. Not because it's nVidia's, but because it's good technology.
If ATi or Intel comes up with something better, I'll be the first to switch. I've switched brands many times in the past for the same reason.
Heck, I've been programming 3d long before 3d accelerators even existed, let alone that brands like 3DFX or nVidia existed. And if you bother to check the Croissant 9 demo I linked above... it doesn't even use 3d acceleration AT ALL, so it doesn't depend on nVidia or ATi or anything. It's as hardware-agnostic and platform-agnostic as you can get.
FYI, if I recall correctly, I used a Matrox videocard at the time I developed most of that code.

If that was stated prior, then I would have understood your position. Most of the text speaks otherwise, and so I (honestly) came to said conclusion.

Originally posted by: Keysplayr
If you were so concerned about respect, you wouldn't have publicly attacked Scali's character. You should give as much respect as you would expect for yourself, No?

I really did not want to say this, but two wrongs don't make a right.

As a MOD you should have already known that. If you call me out in disrespect (same thing I allegedly did by the words of your last post) and I get reprimanded for it, shouldn't you as well?

Tossing out threats (or taking action) in a public forum is downright silly. Keep that inside a PM.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Scholzpdx
Originally posted by: Keysplayr
Originally posted by: Scholzpdx
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

Your post in its entirety can be considered a personal character attack on Scali. Without reason or provocation. Sorry dude, but this is getting forwarded to the modship. Entirely an attack on his credibility and character.

Or maybe this:

Look at the situation. A person comes way out of the woodwork and posts on one topic, and one topic only. Then consider the text itself. You can in no way say it doesn't speak "pro-nVidia" to your eyes. I'm not saying that any of the text was incorrect (I didn't accuse him of spreading FUD whatsoever), but the opinions do have a bias.

If you think this is wrong then fine, I don't care. It was a truthful post. For these said statements, why I said what I did is reason enough.

A more respectable thing would be to say this in a PM rather than in a public post.

Oh please. Stop attacking someone just for being on a side of the discussion that opposes your own, and for being new to the forum. Perhaps he was just reading the thread and decided to join the forum so he could take part in the discussion? All of Scali's arguments are well thought out and backed up. He seems to be more knowledgeable about the subject at hand than many other posters in this thread. If you disagree with him, then say so; that's fine. I expect you to put forth well-thought-out and backed-up arguments like he has.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
Unless you have worked in the field and used licensed engines, you don't have the experience to comment on how the whole thing works.

Lol, I work in the field alright, I *write* engines, I don't need to use licensed ones.
Besides, what I do for a living has little to do with whether or not the things I say in this thread are correct. Why don't you stick to the subject, rather than going for character assassination?


I'm not trying to attack your character. Instead I'm pointing out that you are making comments that are not in line with the industry. You like PhysX, I think most people who have read the thread understand that.

What you do for a living does have an impact on whether what you say is correct or not. You are making claims that things like PhysX are the only physics used in things like UE3, when that is not true. Game engines that are not modular never succeed. Epic knows this. That is why UE has all along been very modular. I can customize it to fit my needs.
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Originally posted by: dguy6789
Originally posted by: Scholzpdx
Originally posted by: Keysplayr
Originally posted by: Scholzpdx
Just a little thought for everyone in this thread: who is this "Scali" person, and why now (after 4 1/2 years) do they immediately pop up and only talk about nVidia's Cuda/PhysX? And their position is obvious yet full of rhetorical contradictions; an nVidia supporter. Backed information from this person should be considered truth and you should listen to it, but take any opinion with a grain of salt.

It's like a second coming of nRollo, yet less childish.

Your post in its entirety can be considered a personal character attack on Scali. Without reason or provocation. Sorry dude, but this is getting forwarded to the modship. Entirely an attack on his credibility and character.

Or maybe this:

Look at the situation. A person comes way out of the woodwork and posts on one topic, and one topic only. Then consider the text itself. You can in no way say it doesn't speak "pro-nVidia" to your eyes. I'm not saying that any of the text was incorrect (I didn't accuse him of spreading FUD whatsoever), but the opinions do have a bias.

If you think this is wrong then fine, I don't care. It was a truthful post. For these said statements, why I said what I did is reason enough.

A more respectable thing would be to say this in a PM rather than in a public post.

Oh please. Stop attacking someone just for being on a side of the discussion that opposes your own, and for being new to the forum. Perhaps he was just reading the thread and decided to join the forum so he could take part in the discussion? All of Scali's arguments are well thought out and backed up. He seems to be more knowledgeable about the subject at hand than many other posters in this thread. If you disagree with him, then say so; that's fine. I expect you to put forth well-thought-out and backed-up arguments like he has.

Oh dear..

When you get down to it, I don't disagree with a lot of what he's saying. I stated there was a pro-nVidia undertone. It was explained in his last post and it was for good reason. Thus I never called him a fanboy or accused him of spreading misinformation. I'm not questioning any knowledge. I'm stating that PhysX is easily replaceable, (somewhat) unimportant, and useless compared to Cuda (obviously they are different, but in terms of innovation).