Poll: What would the price of a PhysX card have to be


JAH

Member
Mar 4, 2005
165
0
0
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?
 
JamesDax

Jan 3, 2005
136
0
0
I've said it before and it's worth saying again: every one of the misinformed, and even the ones who are just plain ignorant, will either have one of these cards or will be seriously wanting one by this time next year, if not sooner.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

 
Jan 3, 2005
136
0
0
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

That's pretty close to being the stupidest thing said in this thread yet.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

You think you can cram a PPU into a GPU without any noticeable effects? Like adding layers to the card, as well as needing to share memory controllers, bandwidth, and memory space? It'd also increase power consumption and heat, and probably run slower than a discrete solution. Not to mention, if it's on a slower cycle than the graphics card, you'll be paying for the same thing over and over, like people who buy All-in-Wonders rather than an add-in capture card.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: JamesDax
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

That's pretty close to being the stupidest thing said in this thread yet.

If you say so. That just means you didn't understand it.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

You think you can cram a PPU into a GPU without any noticeable effects? Like adding layers to the card, as well as needing to share memory controllers, bandwidth, and memory space? It'd also increase power consumption and heat, and probably run slower than a discrete solution. Not to mention, if it's on a slower cycle than the graphics card, you'll be paying for the same thing over and over, like people who buy All-in-Wonders rather than an add-in capture card.

Ummm, Acanthus? What makes you think a discrete solution will not use bandwidth and memory space, or increase power consumption and heat? Are you guessing at all of this?
You keep saying you wouldn't want to buy the same thing over and over, yet you have bought numerous graphics cards that each contain pipelines, did you not? I mean, by your logic, you are paying for pipelines over, and over, and over again. You are not drawing any discernible distinction between discrete and integrated. I know, I know... this is totally different.

 
JamesDax

Jan 3, 2005
136
0
0
Originally posted by: keysplayr2003
Originally posted by: JamesDax
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

That's pretty close to being the stupidest thing said in this thread yet.

If you say so. That just means you didn't understand it.

You are the one who doesn't understand. In fact, you don't have a clue as to what the ****** you are talking about.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

You think you can cram a PPU into a GPU without any noticeable effects? Like adding layers to the card, as well as needing to share memory controllers, bandwidth, and memory space? It'd also increase power consumption and heat, and probably run slower than a discrete solution. Not to mention, if it's on a slower cycle than the graphics card, you'll be paying for the same thing over and over, like people who buy All-in-Wonders rather than an add-in capture card.

Ummm, Acanthus? What makes you think a discrete solution will not use bandwidth and memory space, or increase power consumption and heat? Are you guessing at all of this?
You keep saying you wouldn't want to buy the same thing over and over, yet you have bought numerous graphics cards that each contain pipelines, did you not? I mean, by your logic, you are paying for pipelines over, and over, and over again. You are not drawing any discernible distinction between discrete and integrated. I know, I know... this is totally different.

It is. You won't be seeing exponential increases in physics on a 6-month cycle.

This isn't a hard concept.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Acanthus

It is. You won't be seeing exponential increases in physics on a 6-month cycle.

This isn't a hard concept.

And you know this because...?
So we are all supposed to get your concept, but mine is completely ridiculous? Is that what your open mind is thinking? If you can't have a two-way conversation, don't bother having one at all. I told you, I am not saying you are wrong. It just seems that nobody has any foresight here. Well, I'll tell you what: since I grow tired of talking about this with someone who is not interested, I'll revisit it when the market becomes more competitive and other solutions come around.

 

Philippine Mango

Diamond Member
Oct 29, 2004
5,594
0
0
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

An integrated solution would only be good on a system that is considered low end. I don't know about you, but I'd like my video card working ONLY on video.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Philippine Mango
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

An integrated solution would only be good on a system that is considered low end. I don't know about you, but I'd like my video card working ONLY on video.

Then you need to go out and buy a RivaTNT or a Rage128. Your current GPUs don't do "just" video and haven't for quite a while now. If you're asking "what other things," go back a few posts/pages and read them again.

 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: keysplayr2003
Originally posted by: Philippine Mango
Originally posted by: keysplayr2003
Originally posted by: Acanthus
Originally posted by: JAH
"Jules, if you give that ******' nimrod fifteen hundred dollars, I'm gonna shoot him on general principles." - Vincent Vega, Pulp Fiction.

Heh, I'm not buying a PPU on general principles. It should be integrated into the graphics board.

Why? So you have to rebuy the same chip every time you change cards?

But that's like saying you have to buy the same transistors every time you change cards.
If it's integrated, it can become MUCH cheaper than a stand-alone card, to the point of being an almost unnoticeable cost. Eventually.

An integrated solution would only be good on a system that is considered low end. I don't know about you, but I'd like my video card working ONLY on video.

Then you need to go out and buy a RivaTNT or a Rage128. Your current GPUs don't do "just" video and haven't for quite a while now. If you're asking "what other things," go back a few posts/pages and read them again.

Maybe if we got them back to doing just video they could render a scene fast enough again, instead of 35 fps in the newest games... I'd hate to see what asking them to calculate physics would do to the framerate.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: keysplayr2003
These cards are probably going to be priced as high as the corporation thinks the enthusiast market will bear, due to the enthusiast niche they are intended for.

But remember, guys, GPUs weren't always GPUs. The original GeForce was "I think" the first Geometry Processing Unit, designed to take "some" duties from the CPU, or allow for a slower CPU without sacrificing performance.

Dunno if you read some of my other posts in different threads on physics processors, but my thoughts went along the same lines. Basically, I think that for now there is a lack of a killer app to push PPUs like Ageia's PhysX processor. Second, I think that both nVidia and ATI are working on physics processing additions to their GPUs.

The PhysX was announced a very long time ago. There's been plenty of time for both nVidia and ATI to begin implementing physics routines in their GPUs since that announcement. It's very likely that we will see the beginnings of this in the R600 and G80 cards. This may even account for the lack of new features in the 7900 series, as nVidia's focus would mostly be on the G80 and how they can further improve it.

Sadly, I think that while Ageia had the right idea and is a pioneering company, I don't feel their product will succeed. I think the second coming of the video card will be due to physics processing enabled on the GPU cores. By the time the PhysX cards have a large enough presence to be seriously considered by most gamers for inclusion in builds, I think both ATI and nVidia will have physics-enabled GPUs out. It might not be as good as the PhysX stand-alone cards, but it'll be good enough. Since developers won't have to worry about whether your machine has a physics processor, because it'll be integrated with modern GPUs, it'll get the most support. Probably with a physics engine from Havok, or maybe even a set of DirectX APIs dealing specifically with physics.
 
JamesDax

Jan 3, 2005
136
0
0
Originally posted by: akugami
Originally posted by: keysplayr2003
These cards are probably going to be priced as high as the corporation thinks the enthusiast market will bear, due to the enthusiast niche they are intended for.

But remember, guys, GPUs weren't always GPUs. The original GeForce was "I think" the first Geometry Processing Unit, designed to take "some" duties from the CPU, or allow for a slower CPU without sacrificing performance.

Dunno if you read some of my other posts in different threads on physics processors, but my thoughts went along the same lines. Basically, I think that for now there is a lack of a killer app to push PPUs like Ageia's PhysX processor. Second, I think that both nVidia and ATI are working on physics processing additions to their GPUs.

The PhysX was announced a very long time ago. There's been plenty of time for both nVidia and ATI to begin implementing physics routines in their GPUs since that announcement. It's very likely that we will see the beginnings of this in the R600 and G80 cards. This may even account for the lack of new features in the 7900 series, as nVidia's focus would mostly be on the G80 and how they can further improve it.

Sadly, I think that while Ageia had the right idea and is a pioneering company, I don't feel their product will succeed. I think the second coming of the video card will be due to physics processing enabled on the GPU cores. By the time the PhysX cards have a large enough presence to be seriously considered by most gamers for inclusion in builds, I think both ATI and nVidia will have physics-enabled GPUs out. It might not be as good as the PhysX stand-alone cards, but it'll be good enough. Since developers won't have to worry about whether your machine has a physics processor, because it'll be integrated with modern GPUs, it'll get the most support. Probably with a physics engine from Havok, or maybe even a set of DirectX APIs dealing specifically with physics.

Another without a clue.

Integrated into 5 game engines, including Gamebryo Elements and Unreal Engine 3.

Supported by 12 publishers, including Microsoft, SEGA, and Sony.

Announced to be used by/in 54 developers/games, including GRIN, Epic, Obsidian, Mythic, and SEGA, and titles such as City of Villains, GRAW, UT2007, Sacred II, Rise of Nations: Rise of Legends, and Gothic 3.

And that's just the beginning. You can bet they will be making major announcements next month at E3.

Yet people like you have already called the device a failure despite all the evidence to the contrary.

Yep, clueless.
 
JamesDax

Jan 3, 2005
136
0
0
Here are some quick facts for the clueless among you.

- PhysX Physics Processing Unit (PhysX Accelerator) was officially announced March 8th, 2005
- Dedicated chip designed to offload physics calculations from the CPU/GPU to the PPU
- Aimed for video/computer games and other applications
- PCI add-in card (PCI-E version expected further in the future)
- Manufactured by Taiwan Semiconductor Manufacturing Company (TSMC)
- Chip: 125 million transistors, 182 sq mm die size, 130nm process, multi-core system (multiple independent processing elements)
- Power consumption: chip = 20 watts (peak), entire PhysX card = 28 watts
- Memory interface: 128-bit GDDR3
- Internal read/write memory bandwidth: ~2 Tbit/s (terabits per second)
- 200 times the physics power of a modern CPU (32000 rigid bodies, 40000 to 50000 particles)
- Chip is optimized for 32-bit floating-point math
- SDK/API: AGEIA PhysX SDK (formerly called NovodeX), multi-threaded (multi-processors/multi-core CPUs), PhysX native, PC & console support
- PhysX API/SDK supports both software and hardware modes and does not necessarily require a PPU

- PhysX SDK v2.3 was the first public version that supported the PPU and also worked as a software PPU emulation tool
- May be integrated in graphics cards or motherboards in the future
- Only 1 model at launch
- Retail add-in board manufacturers: ASUSTeK Computer & BFG Technologies
- Add-in boards expected to become available in May 2006 (waiting for content)
- Price range: 199-299USD (MSRP 299USD)
- PhysX PPU enabled OEM systems launched in March 22nd, 2006
- OEM launch partners: Alienware, Dell and Falcon Northwest
- Latest free public AGEIA PhysX SDK version: 2.3.2
- Latest AGEIA PhysX driver version: 2.40

Note the listed items in bold.

Integration on a video card still seems unlikely with the turnaround rate (6 months) of those cards. But integration on a motherboard might just work. At any rate, the AGEIA PPU is here and will be with us for quite some time.
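
To put the hardware/software-mode point above in more concrete terms, here is a rough C++ sketch of the per-step rigid-body integration work a game has to do somewhere, whether on the CPU (software mode) or offloaded to the PPU. This is NOT the AGEIA SDK's actual API; every type and function name here is made up for illustration, and real engines spend most of their time on collision detection and contact resolution rather than this integration loop.

#include <cstdio>
#include <vector>

// Minimal rigid-body state; a real engine also tracks orientation,
// angular velocity, inertia tensors, and collision shapes.
struct RigidBody {
    float pos[3];
    float vel[3];
    float invMass;   // 0 => static geometry
};

// A simple semi-implicit Euler step under gravity. This is the "easy" part;
// broad/narrow-phase collision and contact resolution are what actually eat
// the CPU (or the PPU).
void integrate(std::vector<RigidBody>& bodies, float dt) {
    const float g = -9.81f;
    for (RigidBody& b : bodies) {
        if (b.invMass == 0.0f) continue;       // skip static geometry
        b.vel[1] += g * dt;                    // apply gravity
        for (int i = 0; i < 3; ++i)
            b.pos[i] += b.vel[i] * dt;         // advance position
    }
}

int main() {
    // The marketing figure above claims ~32,000 rigid bodies on the PPU.
    std::vector<RigidBody> bodies(32000, RigidBody{{0.0f, 10.0f, 0.0f}, {0.0f, 0.0f, 0.0f}, 1.0f});
    for (int frame = 0; frame < 60; ++frame)   // one second at 60 Hz
        integrate(bodies, 1.0f / 60.0f);
    std::printf("height of body 0 after 1s of free fall: %.2f m\n", bodies[0].pos[1]);
    return 0;
}

The point of a PPU is simply that loops like this over tens of thousands of bodies, plus the far more expensive collision work, stop competing with the game's own CPU time.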
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: JamesDax
Originally posted by: akugami
Originally posted by: keysplayr2003
These cards are probably going to be priced as high as the corporation thinks the enthusiast market will bear, due to the enthusiast niche they are intended for.

But remember, guys, GPUs weren't always GPUs. The original GeForce was "I think" the first Geometry Processing Unit, designed to take "some" duties from the CPU, or allow for a slower CPU without sacrificing performance.

Dunno if you read some of my other posts in different threads on physics processors, but my thoughts went along the same lines. Basically, I think that for now there is a lack of a killer app to push PPUs like Ageia's PhysX processor. Second, I think that both nVidia and ATI are working on physics processing additions to their GPUs.

The PhysX was announced a very long time ago. There's been plenty of time for both nVidia and ATI to begin implementing physics routines in their GPUs since that announcement. It's very likely that we will see the beginnings of this in the R600 and G80 cards. This may even account for the lack of new features in the 7900 series, as nVidia's focus would mostly be on the G80 and how they can further improve it.

Sadly, I think that while Ageia had the right idea and is a pioneering company, I don't feel their product will succeed. I think the second coming of the video card will be due to physics processing enabled on the GPU cores. By the time the PhysX cards have a large enough presence to be seriously considered by most gamers for inclusion in builds, I think both ATI and nVidia will have physics-enabled GPUs out. It might not be as good as the PhysX stand-alone cards, but it'll be good enough. Since developers won't have to worry about whether your machine has a physics processor, because it'll be integrated with modern GPUs, it'll get the most support. Probably with a physics engine from Havok, or maybe even a set of DirectX APIs dealing specifically with physics.

Another without a clue.

Integrated into 5 game engines, including Gamebryo Elements and Unreal Engine 3.

Supported by 12 publishers, including Microsoft, SEGA, and Sony.

Announced to be used by/in 54 developers/games, including GRIN, Epic, Obsidian, Mythic, and SEGA, and titles such as City of Villains, GRAW, UT2007, Sacred II, Rise of Nations: Rise of Legends, and Gothic 3.

And that's just the beginning. You can bet they will be making major announcements next month at E3.

Yet people like you have already called the device a failure despite all the evidence to the contrary.

Yep, clueless.

A note to the clueless: what I was talking about was the PhysX cards having a large enough installed user base to warrant being a "must have" item in a gaming rig. Reading over my post, I think I wasn't quite as clear as I could have been, and that's my fault. My only defense is that forum discussions are like this and people don't do much proofreading before firing off their thoughts. Too bad that instead of being nice and posting in a positive manner, people have to post in attack mode. But you have a nice day anyway.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: keysplayr2003
Originally posted by: Acanthus

It is. You won't be seeing exponential increases in physics on a 6-month cycle.

This isn't a hard concept.

And you know this because...?
So we are all supposed to get your concept, but mine is completely ridiculous? Is that what your open mind is thinking? If you can't have a two-way conversation, don't bother having one at all. I told you, I am not saying you are wrong. It just seems that nobody has any foresight here. Well, I'll tell you what: since I grow tired of talking about this with someone who is not interested, I'll revisit it when the market becomes more competitive and other solutions come around.

I'm going by what the people who designed the chip, built the API, and wrote the drivers are telling us. You're just being pessimistic.

Edit: I could see this being integrated onto mobos as long as the memory goes with it. Running it off of DDR2-800 would be disastrous for performance, not to mention shared memory just doesn't sit well with me for either graphics or physics (TurboCache/HyperMemory).

Video cards can have the memory and bandwidth available to share, but the GPU and PPU might have to fight over the bandwidth/memory space used by both (unless they were kept entirely separate). Either way, integration would take an R&D partnership, which I have heard nothing about from either ATI or NV. And again, you run into the issue of rebuying the same thing over and over because of the longer cycle of PPUs vs. GPUs. I see an integrated motherboard solution as the most likely candidate for any integration, since Asus is one of the companies spearheading PPU card production. They can get their hands on the signaling and memory layouts to actually begin work on building it onto a motherboard.

It would be weird seeing 64-128MB of memory built onto the mobo, though :p Haven't seen that since the old days of the flopped 3dfx integrated mobos with dedicated graphics memory.
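
To put some rough numbers behind the shared-memory worry above, here is a back-of-envelope C++ sketch. The 32,000-body and ~2 Tbit/s figures come from the spec list earlier in the thread; the ~64 bytes of state per body, the 60 physics steps per second, and the ~6.4 GB/s single-channel DDR2-800 peak are my own assumptions, so treat the output as an order-of-magnitude illustration, not a benchmark.

#include <cstdio>

int main() {
    // Figures quoted in this thread (marketing numbers, not measurements):
    const double ppu_internal_bw = 2e12 / 8.0;   // ~2 Tbit/s internal -> bytes/s
    const double ddr2_800_bw     = 6.4e9;        // single-channel DDR2-800 peak, bytes/s

    // Assumptions for illustration only:
    const double bodies          = 32000;        // rigid bodies (from the spec list)
    const double bytes_per_body  = 64;           // pos/vel/orientation/etc. (a guess)
    const double frame_rate      = 60;           // physics steps per second

    const double state_bytes        = bodies * bytes_per_body;
    const double stream_bw          = state_bytes * frame_rate;  // write state once per step
    const double internal_per_frame = ppu_internal_bw / frame_rate;

    std::printf("state set:            %.1f MB\n", state_bytes / 1e6);
    std::printf("one-pass streaming:   %.2f GB/s (trivial next to DDR2-800's %.1f GB/s)\n",
                stream_bw / 1e9, ddr2_800_bw / 1e9);
    std::printf("PPU internal traffic: %.1f GB per frame at %g Hz\n",
                internal_per_frame / 1e9, frame_rate);
    std::printf("=> each byte of state touched ~%.0f times per step,\n",
                internal_per_frame / state_bytes);
    std::printf("   and the quoted internal bandwidth is ~%.0fx that of DDR2-800\n",
                ppu_internal_bw / ddr2_800_bw);
    return 0;
}

The takeaway: merely streaming the body states is trivial, but if the PPU really uses anything like its quoted internal bandwidth, the working set is being touched thousands of times per step, which a shared DDR2 bus (also feeding the CPU) couldn't come close to sustaining.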
 

framerateuk

Senior member
Apr 16, 2002
224
0
0
I think this is going to be a massive step forward for gaming, and I'll gladly pay the £217 that OCUK have it listed for.

I'm not planning on getting it until some games actually support it, though (GRAW doesn't really appeal to me), and hopefully when some do, maybe it'll come bundled with a game or two :)
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Bump, before the inevitable repost.

Also, the CellFactor demo, which shows off just the kind of complexity this card is capable of, is in my sig, on a blazing fast streaming host.