Nvidia makes physics card

TroubleM

Member
Nov 28, 2005
97
0
0
GPUs aren't far from the point where they'll be able to deliver fully photorealistic imagery at high resolutions and high framerates. As such, they're going to need new markets for their parallel processing technology. Scientific and financial modelling calculations are obvious markets to aim for, alongside game physics.
The revelation comes days after it emerged that ATI will likewise be pitching its graphics chip technology as a co-processor for compute-intensive scientific and engineering applications, not just game physics.
The future looks promising for PCs.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Didn't ATI ask some motherboard manufacturers to produce boards with a third PCIe x16 slot about 2~3 months ago?

I wonder how well both Crossfire and SLI will incorporate physics processing with 3D rendering in one interface.
 

TroubleM

Member
Nov 28, 2005
97
0
0
Yeah, you're right, ATI did come up first with the concept of two video cards working in Crossfire with a third accelerating physics. And they'll probably be first to market with a working model when the RD600 chipset appears, if it ever does.
Most likely Nvidia is just trying to show that they're not far behind.
Accelerated physics will probably go mainstream once AMD and Intel take the whole thing seriously and/or Ageia gets absorbed by one of the big players.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
This might work out okay for AGEIA. Sounds like, rather than using old graphics cards for physics as ATI and NV first suggested, they've now chosen to go down the dedicated physics card route.
If they all run a common API that's built into DX10.1 (or whatever), then we'd have three players competing in the physics arena.
All AGEIA needs to do is start producing x16 cards and say they're an alternative to the ATI/NV parts.

Interesting...
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Actually, ATi's idea was to have Crossfire plus a physics card.

NV's was SLI, except one card does the rendering and the other does physics.
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: Cookie Monster
Actually, ATi's idea was to have Crossfire plus a physics card.

NV's was SLI, except one card does the rendering and the other does physics.
Alright, ATI are teh pwnzorz. Happy? ;)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Sable
Originally posted by: Cookie Monster
Actually, ATi's idea was to have Crossfire plus a physics card.

NV's was SLI, except one card does the rendering and the other does physics.
Alright, ATI are teh pwnzorz. Happy? ;)

:laugh:

But I guess NV decided to do what ATi is going to do. Who can blame them, though? Look at ATi with the internal bridge Crossfire connector.

AGEIA is not doing very well. They're decelerating performance instead of accelerating it by adding a couple of "physics" effects. Not sure what's wrong, but in most reviews it doesn't do much. Plus AGEIA can't match NV/ATi in terms of resources, but it's nice to see some other company besides the big two.

Wish some of those other companies were still alive for more competition (that would make things so much more interesting and give us more options). XGI is pretty much dead, along with Matrox/3Dlabs, and SGI only barely survived bankruptcy. 3dfx in the meantime has already been digested in NV's stomach and probably sh*t out somewhere. Then we have SiS, who don't do all that well. Intel is IGP to the core.

Before I go to sleep: Intel is apparently hiring engineers for a GPU + memory controller + CPU project. I'm not sure how CPU plus GPU is going to turn out, but this is bad news for Nvidia, as they don't have the resources to compete in the x86 market.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: Cookie Monster
Before I go to sleep: Intel is apparently hiring engineers for a GPU + memory controller + CPU project. I'm not sure how CPU plus GPU is going to turn out, but this is bad news for Nvidia, as they don't have the resources to compete in the x86 market.

Do I smell a major ATI fanboy? :laugh:

nVidia isn't going anywhere, my friend.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Whatever....

I just hope Nvidia can provide a driver that doesn't deliver negative results like Ageia did the first go-around.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
GPUs aren't far from the point where they'll be able to deliver fully photorealistic imagery at high resolutions and high framerates. As such, they're going to need new markets for their parallel processing technology. Scientific and financial modelling calculations are obvious markets to aim for, alongside game physics.

BS. Sure, GPUs can be used for applications such as number crunching, but if the past 10 years or so are any indication, as soon as a new super-duper GPU comes out, a new game follows shortly that puts the GPU right back in its place. And the first sentence of that paragraph could not be farther from the truth - the most near-photorealistic game I've seen so far is Oblivion, and the best single GPUs right now struggle with it at ideal settings. Maybe dual or even triple GPU configurations will handle the graphics and physics, but I don't buy for one second that a single GPU can handle a future 3D game and physics simultaneously.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
I remember when people had to buy math co-processors because the CPU was horrible at floating-point math. Of course, that was over a decade ago.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Regs
Whatever....

I just hope Nvidia can provide a driver that doesn't deliver negative results like Ageia did the first go-around.

I thought that City of Villains (or whatever that second game with PPU support was) got a decent speedup? I think it was obvious from most reviews that GRAW had a very poor physics implementation, and that was mostly to blame for the slowdown when enabling the PPU.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
AGEIA is not doing very well. They're decelerating performance instead of accelerating it by adding a couple of "physics" effects. Not sure what's wrong, but in most reviews it doesn't do much. Plus AGEIA can't match NV/ATi in terms of resources, but it's nice to see some other company besides the big two.

Anand re-reviewed the Ageia physics card in City of Villains with equal settings for CPU physics and PPU physics. You will note that:

A. ATI has a horrible time with physics on their cards
B. The PPU actually accelerates the game now.

http://www.anandtech.com/video/showdoc.aspx?i=2828
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Originally posted by: aka1nas
Originally posted by: Regs
Whatever....

I just hope Nvidia can provide a driver that doesn't deliver negative results like Ageia did the first go-around.

I thought that City of Villains (or whatever that second game with PPU support was) got a decent speedup? I think it was obvious from most reviews that GRAW had a very poor physics implementation, and that was mostly to blame for the slowdown when enabling the PPU.

Yes, new technology brings bugs. Though patience runs thin in my blood.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Pabster
Originally posted by: Cookie Monster
Before I go to sleep: Intel is apparently hiring engineers for a GPU + memory controller + CPU project. I'm not sure how CPU plus GPU is going to turn out, but this is bad news for Nvidia, as they don't have the resources to compete in the x86 market.

Do I smell a major ATI fanboy? :laugh:

nVidia isn't going anywhere, my friend.

How am I an ATi fanboy?
Stop your trolling, dude.

The reality is that nVIDIA compared to Intel is like David vs. Goliath. Intel owns numerous fabs of their own. They make more money. They have the resources and experience in x86 architecture. NV, however, don't, although they have more experience in the GPU arena. But we all know the CPU is the heart of a PC.

If the industry goes down the CPU+GPU route, which is quite possible, how will nVIDIA be able to compete with their limited resources? They don't own a single fab, plus a CPU+GPU package is more favourable than a single discrete GPU solution (assuming similar performance). Not to mention AMD is taking this route as well with "Torrenza" (a possible reason why AMD bought out ATi).

But I'm talking maybe 5~10 years' time. :) Or even more.

To Genx87 - Well, didn't realise they'd improved. Nice stuff. I wonder if some of the deceleration is caused just by a poor implementation of the AGEIA physics in games like GRAW, or if it was simply tacked on in the final stages of the game's development.