NVIDIA and Havok Demonstrate World's First GPU-Powered Game Physics Solution at Game Developer's Conference


tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Finny
Originally posted by: apoppin
Originally posted by: Gstanfor
Appopin, you don't seriously believe that just because nVIDIA support Havok they are suddenly incapable of supporting any other physics API do you? Ever heard of Direct3D and OpenGL? Two different API's that work just fine on the same hardware...

where did you get that?
:Q

not from what i typed :p

otoh, just 'cause nVidia is "first" doesn't mean it is the best or only solution . . . M$, Ageia and ATi's engineers are clearly just as capable as nVidia's.

and i am DONE here . . . it's pure speculation - at this point
:roll:

and are you EVER gonna get my 'nick' spelled right? :p
["apoppin" as in hell's a'poppin] :D

Good lord, do you ever give credit where it's due? I'm not trying to sound like some nVidia fanboy, but from what I've seen, you can never give them praise for a single damn thing they do! The way I see it, it's a good idea, and all you have to say is "Just 'cause they're first doesn't mean they're best!" At least they're saying that they're doing something, regardless of what ATi has done (Or yet to have done). Hell, this should help push ATi to put something out sooner so we can hopefully eliminate the need for those stupid PPU cards in the future!

I'm all for whatever nVidia & Havok are doing, and hope that ATi will follow suit somehow (Whether with Ageia or whoever). As far as I'm concerned, it's a step in the right direction regardless.

GOD :! ATI already did something like this back in Dec 2005. They informed developers and gave out information on how to do it, and now it's up to developers to take advantage of it :!
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: tuteja1986
GOD :! ATI already did something like this back in Dec 2005. They informed developers and gave out information on how to do it, and now it's up to developers to take advantage of it :!

They demonstrated nothing. It was all on paper (no surprise).
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Wreckage
Originally posted by: tuteja1986
GOD :! ATI already did something like this back in Dec 2005. They informed developers and gave out information on how to do it, and now it's up to developers to take advantage of it :!

They demonstrated nothing. It was all on paper (no surprise).

Wait for Oblivion then :! I think developers did use the feature :!
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: Regs
Originally posted by: Griswold
Isn't ATI going the unified shader route with their next generation? And didn't nvidia already claim there is no need to have unified shaders for DX10 games yet? I wonder how that could affect such physics calculations on the GPU.

At any rate, buying into Ageia more and more seems to be a huge mistake. ATI and Nvidia will eat this cake, not Ageia.

Nvidia did not claim that. Nvidia claims there is a substantial trade off when going to a unified shader platform.

That's exactly the same. They think it's of no use. We'll see how useless it is when they want to pull off a stunt like this with physics calculated on the GPU.

 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Wreckage
Originally posted by: tuteja1986
GOD :! ATI already did something like this back in Dec 2005. They informed developers and gave out information on how to do it, and now it's up to developers to take advantage of it :!

They demonstrated nothing. It was all on paper (no surprise).

you could essentially say the same about this ;) It's not as if AT has tried it out and reviewed it, is it? You can't actually BUY it right now... heck, by your standards that's a paper launch, isn't it?

And no one has answered the point i raised early on, which is that if you are GPU limited (and MOST people are at their preferred res/eyecandy levels in modern games), how is the GPU going to perform physics calculations within the shader pipes WITHOUT negatively impacting FPS? Are we talking a blanket XX% fps drop (with that extra buffer used for physics calculations being adaptively changed depending on 'traditional' GPU function requirements... which raises yet more questions)? 20%? Only really of any use in SLI?
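
For a rough sense of the trade-off being asked about, the arithmetic is just frame-time budgeting. A minimal sketch in Python, with purely illustrative numbers (nothing here is measured from real hardware):

    # Illustrative frame-time budgeting, not measured data.
    # Assume a GPU-limited game running at 60 fps, and assume (hypothetically)
    # that a physics pass steals a fixed slice of GPU time every frame.
    frame_time_ms = 1000.0 / 60.0   # ~16.7 ms of GPU work per frame
    physics_ms = 3.0                # hypothetical GPU time spent on physics

    new_frame_time_ms = frame_time_ms + physics_ms
    new_fps = 1000.0 / new_frame_time_ms
    share = 100.0 * physics_ms / new_frame_time_ms

    print(f"60.0 fps -> {new_fps:.1f} fps ({share:.0f}% of each frame on physics)")
    # With these made-up numbers the hit is ~15%, which is why idle shader
    # units or a second card (SLI) are the obvious places to hide the cost.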
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: dug777
Originally posted by: Gstanfor
Hey, it's only a press release after all. I'm sure Anandtech and other sites will do a writeup shortly and the document isn't that long...

I don't know the intricate details myself, but I imagine the software would utilize the vertex shaders more than the pixel shaders simply because the vertex shaders virtually sit idle compared to pixel shaders in most of today's games - developers focus on the "shiny" visible effects that the pixel shader offers, vertex shader advantages are not as immediately obvious so they get neglected.


I'd imagine you could also make use of the pixel shaders when a game engine is doing tasks such as stencil shadow rendering or heavy texturing and the pixel shaders are sitting idle.

interesting... that would mean that the x1900 cards would be extremely handy at that, with acres of vertex shader pipes essentially sitting twiddling their thumbs in even modern games... Kris was saying both nvidia & ati were well in on this havok stuff (although he thought nvidia would beat ati to the initial punch, & it looks like he was right...) and it would essentially kill the physx card idea...


dug i know they got 48 shaders, but i think they still got the same number of vertex shaders as nvidia don't they? 8 i think, and no advantage in clock speed.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dug777

And no one has answered the point i raised early on, which is that if you are GPU limited (and MOST people are at their preferred res/eyecandy levels in modern games), how is the GPU going to perform physics calculations within the shader pipes WITHOUT negatively impacting FPS? Are we talking a blanket XX% fps drop (with that extra buffer used for physics calculations being adaptively changed depending on 'traditional' GPU function requirements... which raises yet more questions)? 20%? Only really of any use in SLI?

That's probably the point. SLI.

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
It's funny that people say nVidia has more vertex shader power than ATi's cards without any fundamentals; it's just lame. Since the Radeon X8x0 series, ATi has always had a lead over the GeForce 6800 series in vertex shaders: they run at a higher clock and can do more instructions per clock, regardless of whether the pipeline is doing dependent texture reads, in both the pixel shader and the vertex shader. That was an advantage of the R420 over the NV40, and since the G7x is just a rehash of the NV40, it still has the same issues. The X1900XT and the 7900GTX run at par because of their pretty much identical core clock speeds, and we all know that the vertex and pixel pipelines (texturing, not shading) are very much bound to their theoretical fillrates (core clock speed). You can see it, for example, in the Radeon X800XL having the same vertex output as the 6800 Ultra: both run at 400MHz and have 6 vertex shaders, despite different architectures.
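
For what it's worth, the back-of-the-envelope math behind that last example looks like this (a sketch only; the "4 cycles per transformed vertex" divisor is an assumption chosen to match the peak figures vendors quoted at the time, not something from a spec sheet):

    # Peak vertex throughput scales with vertex-unit count and core clock.
    def peak_vertex_rate(units, clock_mhz, cycles_per_vertex=4):
        return units * clock_mhz * 1e6 / cycles_per_vertex

    cards = {
        "Radeon X800 XL    (6 VS @ 400 MHz)": (6, 400),
        "GeForce 6800 Ultra (6 VS @ 400 MHz)": (6, 400),
    }
    for name, (units, clock) in cards.items():
        print(f"{name}: ~{peak_vertex_rate(units, clock) / 1e6:.0f} Mverts/s")
    # Both come out at ~600 Mverts/s: same unit count, same clock, same
    # theoretical peak, regardless of architecture.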
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: otispunkmeyer
Originally posted by: dug777
Originally posted by: Gstanfor
Hey, it's only a press release after all. I'm sure Anandtech and other sites will do a writeup shortly and the document isn't that long...

I don't know the intricate details myself, but I imagine the software would utilize the vertex shaders more than the pixel shaders simply because the vertex shaders virtually sit idle compared to pixel shaders in most of today's games - developers focus on the "shiny" visible effects that the pixel shader offers, vertex shader advantages are not as immediately obvious so they get neglected.


I'd imagine you could also make use of the pixel shaders when a game engine is doing tasks such as stencil shadow rendering or heavy texturing and the pixel shaders are sitting idle.

interesting... that would mean that the x1900 cards would be extremely handy at that, with acres of vertex shader pipes essentially sitting twiddling their thumbs in even modern games... Kris was saying both nvidia & ati were well in on this havok stuff (although he thought nvidia would beat ati to the initial punch, & it looks like he was right...) and it would essentially kill the physx card idea...


dug i know they got 48 shaders, but i think they still got the same number of vertex shaders as nvidia don't they? 8 i think, and no advantage in clock speed.

Yea, they both have 8 vertex pipes but the 7900GTX seems to be able to do ~50% more with them than the X1900XTX can.

http://www.gpureview.com/show_cards.php?card1=383&card2=378
 
Sep 6, 2005
135
0
0
Originally posted by: tuteja1986
GOD :! ATI already did something like this back in Dec 2005. They informed developers and gave out information on how to do it, and now it's up to developers to take advantage of it :!

Now that you mention it, I do recall something about that, but haven't heard much since. In that case, I guess it's just a question of who's following who here for the fanboys to flame about. Like I said before: I couldn't care less about who's "got it first" or "who's gonna do it better"; I'm just glad they're doing it.

As far as GPU limitations are concerned, I believe these technologies are supposed to make use of vertex shaders... Finally, they'll be put to some use! :D

At least, that's something that I've read about. As others have stated, it's really speculation at this point, but I doubt that either nVidia or ATi would pull something that would kill frames on their video cards.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: DaveBaumann
http://www.havok.com/content/view/187/77/

Havok are targeting SM3.0 implementations because of the cross-platform benefits - PC NVIDIA/ATI, PS3 and XBOX 360. There is greater math capability in all pixel pipelines, so it's actually more likely to use pixel shaders than vertex shaders, with the geometry being deformed via render-to-vertex-buffer-like operations. It'll work both as a rendering pass in a frame, allowing for single-graphics-card operation, or it can work by shuttling the ops off to a second board with the first doing the graphics rendering.

Edit: As explained by the Havok FAQ, this is not an "automatic" process accelerating current Havok implementations. This is a separate module for Havok, that is an additional cost over the rest of the Havok Middleware.



Listen to this guy, he knows a hell of a lot more than GStanfor ever could.
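
To make the render-to-vertex-buffer idea concrete, here is a rough CPU-side sketch (Python/NumPy) of the scheme Baumann outlines: particle state lives in float "textures", one data-parallel pass does the integration a pixel shader would do, and the resulting position texture is then read back as a vertex buffer. The layout, time step and ground-plane hack are all illustrative assumptions; this is not Havok FX or any vendor API.

    import numpy as np

    W = H = 64                                   # 64x64 texture = 4096 particles
    rng = np.random.default_rng(0)

    pos = rng.uniform(-1.0, 1.0, size=(H, W, 3)).astype(np.float32)  # RGB = xyz
    vel = np.zeros_like(pos)
    gravity = np.array([0.0, -9.8, 0.0], dtype=np.float32)
    dt = np.float32(1.0 / 60.0)

    def physics_pass(pos, vel):
        """One full-screen 'pixel shader' pass: identical math on every texel."""
        vel = vel + gravity * dt
        pos = pos + vel * dt
        below = pos[..., 1] < -1.0               # crude ground plane at y = -1
        pos[..., 1] = np.where(below, -1.0, pos[..., 1])
        vel[..., 1] = np.where(below, -0.5 * vel[..., 1], vel[..., 1])
        return pos, vel

    for _ in range(120):                         # two simulated seconds
        pos, vel = physics_pass(pos, vel)

    # "Render to vertex buffer": reinterpret the position texture as vertices.
    vertex_buffer = pos.reshape(-1, 3)
    print(vertex_buffer.shape, float(vertex_buffer[:, 1].min()))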
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: dug777
It is all grand statements and little detailed information (and you certainly didn't need to copy and paste the whole bloody document :p)...

Game designers can include advanced physics effects without burdening the CPU and slowing game-play, since the effects are simulated and rendered on the GPU.

I was chatting to Kris about this a few months ago; i can't see how it could work without compromising your FPS (as 6 & 7 series cards don't have any specific extra hardware for this - from what i gather it will run in the existing shader pipes). Most people suffer from far more of a GPU than CPU FPS limitation at the res/eyecandy levels they play at in modern games, and unless they can magically get the GPU to perform these calculations in real time without impacting the amount of 'normal' GPU throughput, it will merely exacerbate the situation...

I would be interested to read some more on it, looking forward to an article that is rather more 'informative' shall we say ;)

I mostly agree with you, though I would like to bring up the point that if we were to have such realistic physics in our games they would quickly become CPU-limited again. The software demos for Ageia's NovodeX API are still slow with a high-end CPU when there are a decent number of objects interacting. Offloading the work onto the GPU might then make more sense, as otherwise the GPU will still have unused resources while it waits for your CPU to calculate all the physics for your game.

I think Havok is interesting, but I am hoping Ageia's solution pans out so we have some competition and further development on the technology. Ageia seems to be pushing their software API much more than the actual PhysX card recently, so maybe they are trying to go the same route as Havok and use the PPU card as an upsell for those who want maximum performance while the rest of us can limp along in software.
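
A tiny sketch of the scaling problem described above; the object counts and radius are arbitrary, and the timings will vary by machine, but the quadratic growth is the point:

    import random
    import time

    def count_close_pairs(positions, radius=1.0):
        """Naive O(n^2) broad-phase: test every pair of objects."""
        r2 = radius * radius
        hits = 0
        n = len(positions)
        for i in range(n):
            xi, yi, zi = positions[i]
            for j in range(i + 1, n):
                xj, yj, zj = positions[j]
                dx, dy, dz = xi - xj, yi - yj, zi - zj
                if dx * dx + dy * dy + dz * dz < r2:
                    hits += 1
        return hits

    random.seed(0)
    for n in (250, 500, 1000, 2000):
        pts = [(random.uniform(0, 50), random.uniform(0, 50), random.uniform(0, 50))
               for _ in range(n)]
        t0 = time.perf_counter()
        count_close_pairs(pts)
        print(f"{n:5d} objects: {(time.perf_counter() - t0) * 1000:7.1f} ms")
    # Doubling the object count roughly quadruples the time -- exactly the kind
    # of embarrassingly parallel work a wide shader array (or a PPU) can absorb
    # while the CPU handles game logic.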
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: apoppin
We'll SEE . . . don't count out M$, Ageia or ATi . . . their engineers are at least as competent as nVidia's
Really? Is that why Crossfire requires dongles, master cards... and is pretty much the inferior multi-GPU implementation compared to SLI?
And what REALLY matters is that the GF7 series is WILDLY popular compared to the oddly named ATI cards.

There is no equality in engineering. Sorry Mop.

Originally posted by: 5150Joker
Listen to this guy, he knows a hell of a lot more than GStanfor ever could.

LOL.. ATI damage control much?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Crusader
Originally posted by: apoppin
We'll SEE . . . don't count out M$, Ageia or ATi . . . their engineers are at least as competent as nVidia's
Really? Is that why Crossfire requires dongles, master cards... and is pretty much the inferior multi-GPU implementation compared to SLI?

Unless you want decent performance doing SLI AA, in which case the way ATI did it (using an external compositing engine) is actually better.

Of course, it only really matters for the sub-.5% of the enthusiast market that actually has an SLI or Crossfire system.

And what REALLY matters is that the GF7 series is WILDLY popular compared to the oddly named ATI cards.

..."oddly named"? :confused:

ATI's hardware is just fine; the GF7s have more market share largely because they had a 6-month head start.

Originally posted by: 5150Joker
Listen to this guy, he knows a hell of a lot more than GStanfor ever could.

LOL.. ATI damage control much?

Dave runs Beyond3D and probably knows a hell of a lot more than almost anybody outside of ATI/NVIDIA about graphics hardware.

GStanfor copy-and-pasted a press release, which hardly qualifies him as an expert in the area of using pixel/vertex shaders to do physics.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Matthias99
Originally posted by: Crusader
And what REALLY matters is that the GF7 series is WILDLY popular compared to the oddly named ATI cards.

..."oddly named"? :confused:
Yeah.. X800, X800XTX, XT.. X1800XTX, X1900XT XTX ZXNHZH
When does it get silly with all the X's?
I like the sound of GEFORCE 7. Just is a quality product, and has a great naming scheme.

ATI's hardware is just fine; the GF7s have more market share largely because they had a 6-month head start.
No I'd say the GF7 series has massively more popularity than the X1900s. We'll see in a year in the Valve survey though, won't we?
ATI just can't seem to hit one out of the park like NV has been doing consistently.

Originally posted by: 5150Joker
Listen to this guy, he knows a hell of a lot more than GStanfor ever could.

LOL.. ATI damage control much?

Dave runs Beyond3D and probably knows a hell of a lot more than almost anybody outside of ATI/NVIDIA about graphics hardware.

GStanfor copy-and-pasted a press release, which hardly qualifies him as an expert in the area of using pixel/vertex shaders to do physics.

Sure, but writing off GStanfor because he's not saying helpful things about ATI looks like the ATI Task Force is mobilizing, rather than "seeking truth".
Baumann and his site are pretty much known to be pro-ATI... I don't trust him any more than anyone else.
Everyone knows this forum is ATI territory. Those of us who feel NV is superior to ATI are in the minority. It's no surprise GStanfor gets attacked like this.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Crusader
Originally posted by: Matthias99
Originally posted by: Crusader
And what REALLY matters is that the GF7 series is WILDLY popular compared to the oddly named ATI cards.

..."oddly named"? :confused:
Yeah.. X800, X800XTX, XT.. X1800XTX, X1900XT XTX ZXNHZH
When does it get silly with all the X's?
I like the sound of GEFORCE 7. Just is a quality product, and has a great naming scheme.

ATI's hardware is just fine; the GF7s have more market share largely because they had a 6-month head start.
No I'd say the GF7 series has massively more popularity than the X1900s. We'll see in a year in the Valve survey though, won't we?
ATI just can't seem to hit one out of the park like NV has been doing consistently.

Originally posted by: 5150Joker
Listen to this guy, he knows a hell of a lot more than GStanfor ever could.

LOL.. ATI damage control much?

Dave runs Beyond3D and probably knows a hell of a lot more than almost anybody outside of ATI/NVIDIA about graphics hardware.

GStanfor copy-and-pasted a press release, which hardly qualifies him as an expert in the area of using pixel/vertex shaders to do physics.

Sure, but writing off GStanfor because he's not saying helpful things about ATI looks like the ATI Task Force is mobilizing, rather than "seeking truth".
Baumann and his site are pretty much known to be pro-ATI... I don't trust him any more than anyone else.
Everyone knows this forum is ATI territory. Those of us who feel NV is superior to ATI are in the minority. It's no surprise GStanfor gets attacked like this.


GStanfor incorrectly speculated about ATi's vertex shader capability and claimed the Havok system wouldn't use pixel shaders. Why would we take him seriously? Dave corrected him, and I pointed that out since all the nVidiots like you glossed over it. B3D has 3 AEG members in its forums, including many coders who are pro-nV; it's hardly "ATi territory" as you put it. :roll:
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: 5150Joker
GStanfor incorrectly speculated about ATi's vertex shader capability and claimed the Havok system wouldn't use pixel shaders. Why would we take him seriously? Dave corrected him, and I pointed that out since all the nVidiots like you glossed over it. B3D has 3 AEG members in its forums, including many coders who are pro-nV; it's hardly "ATi territory" as you put it. :roll:

HMMM I wonder who is ATI's AEG agent that's infested our forums? :disgust:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Crusader
HMMM I wonder who is ATI's AEG agent that's infested our forums? :disgust:

ATI hasn't contracted with AEG, Nvidia has. Do you have a link showing which company does the same for ATI?
 

Bull Dog

Golden Member
Aug 29, 2005
1,985
1
81
Originally posted by: Crusader
Yeah.. X800, X800XTX, XT.. X1800XTX, X1900XT XTX ZXNHZH
Where does it get all those silly X's from? (edited sentence to make sense)
I like the sound of GEFORCE 7. Just is a quality product, and has a great naming scheme.
That's an opinion, a personal preference, NOTHING more. Further, you are mixing product-line names with product names.

GeForce is to Radeon as 7900GTX is to X1900XTX
or if you want to be a stickler about it
GeForce7 is to Radeon X1 as 7900GTX is to X1900XTX

I could just as easily and with as much merit (gratuitous statements can be equally gratuitously denied) say that GeForce sounds crappy and that Radeon sounds like a quality product and that the X1 series cards have a "great naming scheme"

Man.....:disgust:....this thread has really degenerated into a massive flame fest.

Red vs Green

GO!

P.S. You can't accuse me of being a FanATItic (as you seem to like doing to anyone who states the FACTS) because I haven't said anything pro one company or another. I've stuck to "just the facts, Ma'am" (from Dragnet). :p
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
First: you suck at teh quoting.

Originally posted by: Crusader
Yeah.. X800, X800XTX, XT.. X1800XTX, X1900XT XTX ZXNHZH
When does it get silly with all the X's?

...again, nitpicking the naming scheme is kind of goofy at best. Both companies have some pretty wacky naming conventions, and don't even get me started on some of the OEMs.

I like the sound of GEFORCE 7. Just is a quality product, and has a great naming scheme.

Have a cookie. I like cookies. :cookie:

No I'd say the GF7 series has massively more popularity than the X1900s.

Based on... your opinion? The 7800GT had a six-month head start and until recently was the most affordable card of this generation.

And when did 'popularity' start to determine which card was actually better?

We'll see in a year in the Valve survey though, won't we?

I'm sure we will.

ATI just can't seem to hit one out of the park like NV has been doing consistently.

The 7800GTX 512MB wasn't exactly a "home run", so to speak. And rumors of supply issues with the 7900s aren't great news either.

Had the X1800s not been delayed by 4-6 months, the market situation could have been very different (not that the GF7s aren't great, but they didn't have any competition for six months!)

Sure, but writing off GStanfor because hes not saying helpful things to ATI looks like the ATI Task Force is mobilizing. Rather than "seeking truth".

This:

LOL.. ATI damage control much?

isn't a whole lot better. Unless something more concrete than a press release is going to be produced, there is not much to see here.

Baumann and his site is pretty much known to be pro-ATI.. I dont trust him anymore than anyone else.

Their reviews seem pretty even-handed to me. I don't read their forums, so I can't really comment on them.

Everyone knows this forum is ATI territory. Those of us who feel NV is superior to ATI are in the minority.

Well, except for all those vocal pro-NVIDIA people... :confused:

Why the confrontational attitude, the insistence that we have to 'take sides' or 'stake out territory'?

It's no surprise GStanfor gets attacked like this.

His posting history may have something to do with it as well. This thread from last week comes to mind.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I think you'll find, if you do your homework, that nVIDIA's vertex shaders remain more capable/programmable than ATi's, and through a much broader range of architectures too... ATi's vertex shaders only started looking reasonable with R520.

As for Baumann's comments, note this part:
so its actually more likely to use pixel shader than vertex shaders
In other words he is speculating too, he doesn't know for sure and he has been wrong in the past (just like I have).

My reason for speculating vertex shaders over pixel shaders is three-fold:

1) pixel shaders already get loaded up pretty heavily by developers, and using them for physics as well would certainly lead to slowdowns in modern games, making SLI almost essential (and even then there would probably still be a slowdown).

2) Physics affects in-game objects' position and shape, things that get dealt with in the GPU by the vertex processors. It makes no sense to calculate this sort of info in pixel shaders and then have to transfer it somehow to the vertex shaders so it can actually be used.

3) When you are dealing with physics in the gameworld you will want good precision in your maths, otherwise you will end up with errors in object/vertex placement. Vertex shader math is more precise than pixel shader math because it has to be (it uses FP32 exclusively).
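
For anyone who wants to see point 3 in numbers, here is a quick CPU-side illustration (an assumption-laden sketch: plain float16/float32 arithmetic standing in for partial-precision pixel shader math versus FP32 vertex shader math; the step size and count are arbitrary):

    import numpy as np

    steps = 10_000
    dx = 0.01                       # small per-step position update

    pos16 = np.float16(0.0)         # stand-in for FP16 "partial precision"
    pos32 = np.float32(0.0)         # stand-in for FP32 vertex shader math
    for _ in range(steps):
        pos16 = np.float16(pos16 + np.float16(dx))
        pos32 = np.float32(pos32 + np.float32(dx))

    exact = steps * dx
    print(f"exact: {exact:.2f}")
    print(f"fp32 : {float(pos32):.4f} (error {abs(float(pos32) - exact):.4f})")
    print(f"fp16 : {float(pos16):.4f} (error {abs(float(pos16) - exact):.4f})")
    # FP16 has only ~3 decimal digits of precision, so once the running total
    # grows, adding 0.01 stops registering and the position stalls/drifts --
    # the sort of placement error you don't want in a physics simulation.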
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
http://forums.anandtech.com/messageview...&STARTPAGE=4&FTVAR_FORUMVIEWTMP=Linear

Crusader, read that thread and read my post in there. I did a little searching and it turns out ATI sells nearly as many discrete GPUs as nVidia, meaning that the popularity of ATI's Radeon video cards is about on par with nVidia's GeForce video cards. Basically it shows that even with the superior performance of nVidia's GeForce 7 cards over what ATI had out at the time, ATI was still doing quite well. If you've got numbers showing differently then please post them by all means.

As much as nVidia fans are giving nVidia a big pat on the back, I think ATI has similar stuff in the works or will soon. I didn't read the press release - my eyes always glaze over reading those things - but I'd assume that unless Havok signs an exclusivity deal with nVidia, there's nothing to say they can't partner with ATI on the same thing. Even if Havok doesn't, there's nothing to prevent ATI from doing it either in-house or through another middleware company. But either way, it's all speculation at this point.

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: tuteja1986
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin
Originally posted by: Gstanfor
Not wanting to rain on Ati's parade too much, but I think nVIDIA's vastly more capable vertex shaders will give a big advantage over ATi here. ATi has yet to learn that skimping on vertex shader capabilities is a bad thing.
you are right about ONE thing:
Originally posted by: Gstanfor
i don't know the intricate details myself
no kidding :p

ATi has stated - about a year ago - that r520 is capable of doing exactly what nVidia and Havok are doing [now] . . . i expect r580 and r600 to be even more adept. ;)

Be that as it may, nVIDIA and Havok are first to actually present a solution, and all I'm hearing from the fanATics is sour grapes (predictably).

One wonders why we didn't see this vaunted ATi physics engine a year ago if it is no great problem... Actions speak louder than words to me.

what sour grapes? . . .

i believe M$, Ageia and ATi are all working on it . . . nVidia just partnered with Havok to reach the market first . . .

the first to market does not necessarily have the best solution. ;)

and we need GAMES to support this . . . it's gonna take quite a while.


And most of us know that you're a Nvidia fanboy :! Look at your sig and your posts and I swear people will say you have something to do with Nvidia.

This coming from the guy who has "I hate Nvidia with hate" in his sig. :roll:
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: akugami
http://forums.anandtech.com/messageview...&STARTPAGE=4&FTVAR_FORUMVIEWTMP=Linear

Crusader, read that thread and read my post in there. I did a little searching and it turns out ATI sells nearly as many discrete GPUs as nVidia, meaning that the popularity of ATI's Radeon video cards is about on par with nVidia's GeForce video cards. Basically it shows that even with the superior performance of nVidia's GeForce 7 cards over what ATI had out at the time, ATI was still doing quite well. If you've got numbers showing differently then please post them by all means.

Dead wrong.
I was talking about current-gen video card sales. Not integrated, not yesteryear's. It's X1900 vs GF7 that I was referring to. Sorry. It has nothing to do with what I said... therefore you are posting ATI fanboy garbage if you are directing your info towards me.

Originally posted by: Crusader
No I'd say the GF7 series has massively more popularity than the X1900s.
Try again next time.
You knew better than to try to post that crap as a rebuttal to my statement. See ya.