Any consideration to the PhysX factor?


kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003
If you actually do have an 8800GT

Nice Keys, you just effectively called me a liar, over something so insignificant as what video card I use.

Shall I dig up my posts here from well before this conversation where I mention my card? Or how about the ones from over on EVGA's forum back when I got it? Or other forums? Perhaps you'd like a picture of the card in my case with this forum thread next to it? Or would you brush that all off as one big conspiracy to defame your benefactors?


Originally posted by: keysplayr2003
So the other 1400 dollars I spent this year on a rig means nothing to me I guess?

Obviously not, as you still couldn't even manage to acknowledge my point.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: nRollo
BFG-

1. That was a LOT of years ago, lol, how is it relevant to a PhysX conversation?

2. It was a lot easier not to be impressed by the pipes that were a little shinier. PhysX not only makes a difference in gameplay, but you can't miss the huge differences in image quality.

Thing is... When the Radeons were pushing DX9, the FX series was getting slaughtered running anything in DX9 (HL2 anyone? FC?). Reasons like "looks almost as good, hardly any difference" were used against buying the Radeons by nVidia fans. Pure and utter BS - aka marketing talk. You had quality games that used this tech! Later, the 6-series hit and SM3.0 was praised to heaven and back by the same people. Even though the X800/X850 cards were faster (well, the XL model was kinda sucky :p) and you could find a handful of games (if any worthwhile...) that could use this feature (Chronicles of Riddick comes to mind - running SM3.0 killed FPS - great game btw!). Sure, it became the standard later on, but that doesn't mean the 6-series were a better buy then - you wouldn't run the SM3.0 games on those cards anyway! Perhaps one or two.

And here we are... 2008/2009... PhysX praised to heaven and back again... I find PhysX in its current implementation not a valid reason to get a GeForce over a Radeon. Unless you like what is shown in those PhysX "demos", just get whatever is faster+cheaper in your budget. Will it be the future? Hardware accelerated physics is awesome on paper, and it will see heavy use in games when every "current gen" card supports it. Now it's just fluff.

Once next gen goes live and we get games that have it implemented to a great degree - sure. Right now it's as useless as SM3.0 was for the 6-series.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003
If you actually do have an 8800GT

Nice Keys, you just effectively called me a liar, over something so insignificant as what video card I use.

Shall I dig up my posts here from well before this conversation where I mention my card? Or how about the ones from over on EVGA's forum back when I got it? Or other forums? Perhaps you'd like a picture of the card in my case with this forum thread next to it? Or would you brush that all off as one big conspiracy to defame your benefactors?


Originally posted by: keysplayr2003
So the other 1400 dollars I spent this year on a rig means nothing to me I guess?

Obviously not, as you still couldn't even manage to acknowledge my point.

Maybe you are, and maybe you aren't, but that's not what I meant and probably a different conversation altogether. I should have been more clear. What I meant was, you don't have any rig in your sig. I thought you may have another card by now. But if you actually did have the 8800GT, you would be able to run the Cryostasis demo. If you had another card that didn't support PhysX, you wouldn't. That's all. I will do my best to be very, very clear the next time. ;)

Now, can this childishness cease? I'd appreciate it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Of course you get the error message: it's because the mini-GL driver is not compatible with non-3dfx cards, which is quite obvious since it's 3dfx's driver. Delete the library and it'll run on any OpenGL board.

I know this because I've used that version of GLQuake on non-3dfx cards and had I left the library there, I would've gotten the exact same error message.

So all you had to do was modify the game file structure to get it to run on a board that didn't exist when the game came out... of course that shows that Carmack wasn't coding that version explicitly for the 3Dfx boards.

Like I said before, this is an OpenGL application and 3dfx simply provided a mini-GL driver for their boards.

Name me one board that would run GLQuake when it came out that wasn't 3Dfx. That's all I ask, just a singular example of any non 3Dfx part that could run GLQuake.

The difference is that the R300 was designed for the DX9 standard and there was never any doubt that it would get released. This is really no different to the G80 architecture, which didn't actually get DX10 until Vista shipped. Did anyone think DX10 wasn't coming?

But there were questions about what the final specs for DX9 would entail; MS has pulled late-game switches many times before.

I can't comment there as I didn't have any issues. The point is that you admit at least OpenGL games worked, and that's all I need, thereby backing my claim that 32-bit color could be forced into legacy titles. I know this worked because I frequently did it without issue.

A small portion of legacy titles. In most of the games where you 'forced' 32-bit color, it would do nothing.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003
I should have been more clear. What I meant was, you don't have any rig in your sig. I thought you may have another card by now. But if you actually did have the 8800GT, you would be able to run the Cryostasis demo.
I've said my current card is an 8800gt in this thread more than a few times, and I also mentioned here that I just ordered a gtx260 last Friday. I don't have any sig at all, and I can't imagine ever wanting to advertise any of my possessions in one either. I also have no interest in downloading a tech demo, though I do hope there will be a gameplay demo of Cryostasis sometime, as it seems like my type of game.

Regardless, what cards I own is completely immaterial to my point, which I have reiterated multiple times, yet which you have continually failed to even acknowledge, let alone dispute. Granted, I don't expect there is anything to dispute, but your rambling on about what card I have and what it may or may not do instead of addressing my point is most certainly childish.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BenSkywalker
Name me one board that would run GLQuake when it came out that wasn't 3Dfx. That's all I ask, just a singular example of any non 3Dfx part that could run GLQuake.
The point is that it was built on an open and accepted standard, other cards would come in time to run it well, and no one had to pay anyone else for the right to make them do so.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003
I should have been more clear. What I meant was, you don't have any rig in your sig. I thought you may have another card by now. But if you actually did have the 8800GT, you would be able to run the Cryostasis demo.
I've said my current card is an 8800gt in this thread more than a few times, and I also mentioned here that I just ordered a gtx260 last Friday. I don't have any sig at all, and I can't imagine ever wanting to advertise any of my possessions in one either. I also have no interest in downloading a tech demo, though I do hope there will be a gameplay demo of Cryostasis sometime, as it seems like my type of game.

Regardless, what cards I own is completely immaterial to my point, which I have reiterated multiple times, yet which you have continually failed to even acknowledge, let alone dispute. Granted, I don't expect there is anything to dispute, but your rambling on about what card I have and what it may or may not do instead of addressing my point is most certainly childish.

Riiiiight Kyle. Good luck with that. Whatever "that" is. I have no idea what your point was. So I guess you can just keep me guessing. I'm on the edge of my seat.

Happy New Year.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
You are keeping yourself guessing; I've just grown tired of watching you do so.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Ok, so I tried the Cryostasis demo on the 9800GTX+ at higher settings than I feel are acceptable for this level of card (1920x1080, DX10, highest settings possible), and it managed an acceptable average framerate of 27.5. There were sudden dips down to 14, and whenever this happened, there was some obvious hard drive swapping. Maximum framerate was 61. And this is on a 2.2GHz Phenom. The 512MB of GDDR3 on this card looks like it is inhibiting the card in this particular demo. I'll try turning down the res first and see what I get; if there is still hitching, I'll turn down each feature a peg and see what happens. I have some time to kill before our New Year's party.
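(As an aside on how numbers like these behave: a short burst of slow frames, such as a hard drive swap, sets the reported minimum while barely moving the average. Below is a minimal sketch with invented frame times; none of these values come from the actual demo run.)

# Illustrative only: invented frame times in milliseconds, not demo data.
# Shows how a brief hitch drags the minimum FPS down while the average
# stays essentially untouched.

frame_times_ms = [33.0] * 300 + [70.0] * 5 + [33.0] * 295

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)
max_fps = max(fps_per_frame)

print(f"min {min_fps:.1f} / avg {avg_fps:.1f} / max {max_fps:.1f}")
# -> min 14.3 / avg 30.0 / max 30.3: five slow frames out of 600
#    set the minimum, yet the average barely registers them.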
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003
Ok Snowman, I just ran Cryostasis at the highest possible settings I can muster with this LCDTV. 1920x1080, all settings maxxed, DX10, sound on.

Minimum 14.8 (this seems to be due to hard drive hitching; I have a feeling 512MB may be a factor).
Average 25
Maximum 61

I ran this on the Phenom system in my sig, except the Phenom is no longer overclocked (couldn't get it stable) and is at the stock 2.2GHz, with a single 9800GTX+.

Originally posted by: keysplayr2003
Ok, so I tried the Cryostasis demo on the 9800GTX+ at higher settings than I feel are acceptable for this level of card (1920x1080, DX10, highest settings possible), and it managed an acceptable average framerate of 27.5. There were sudden dips down to 14, and whenever this happened, there was some obvious hard drive swapping. Maximum framerate was 61. And this is on a 2.2GHz Phenom. The 512MB of GDDR3 on this card looks like it is inhibiting the card in this particular demo.

Why are you repeating yourself? And why are you responding as if I ever requested any such benchmarks anyway?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The point is that it was built on an open and accepted standard

Accepted? Number of consumer boards that ran OpenGL when GLQuake came out: 0. Calling OpenGL an accepted standard would be akin to saying that Irix was an accepted standard in the same time frame. While technically accurate, it is misleading in discussions about the consumer market.

and no one had to pay anyone else for the right to make them do so.

Generally, hardware vendors that are creating binaries to ship with their hardware, or software developers that write an OpenGL driver, are the only developers that need to have a license.

So yes, GLQuake did end up working the exact same way that a PhysX implementation would work today. The big difference is that instead of SGI controlling OpenGL, it is nVidia controlling PhysX.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003
Ok Snowman, I just ran Cryostasis at the highest possible settings I can muster with this LCDTV. 1920x1080, all settings maxxed, DX10, sound on.

Minimum 14.8 (this seems to be due to hard drive hitching; I have a feeling 512MB may be a factor).
Average 25
Maximum 61

I ran this on the Phenom system in my sig, except the Phenom is no longer overclocked (couldn't get it stable) and is at the stock 2.2GHz, with a single 9800GTX+.

Originally posted by: keysplayr2003
Ok, so I tried the Cryostasis demo on the 9800GTX+ at higher settings than I feel are acceptable for this level of card (1920x1080, DX10, highest settings possible), and it managed an acceptable average framerate of 27.5. There were sudden dips down to 14, and whenever this happened, there was some obvious hard drive swapping. Maximum framerate was 61. And this is on a 2.2GHz Phenom. The 512MB of GDDR3 on this card looks like it is inhibiting the card in this particular demo.

Why are you repeating yourself? And why are you responding as if I ever requested any such benchmarks anyway?

Aren't you going to wish me a happy new year?
Let go of the hate dude. No reason for it. Just let it gooooo..... It's evident in your every post to me. Can't be healthy. Let it gooooo..... :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: nRollo

Much as I hate to admit it, it appears none of us are entirely correct about the great GLQuake debate:
How so? I said it was an OpenGL application and I was right despite several claims to the contrary. It was not a 3dfx/Glide application. That 3dfx made a mini-GL driver is inconsequential to this fact.

Ironically, this proves the point I was trying to make: devs do code for non-standard solutions. Rendition was proprietary to their Verite chipset.
But Carmack himself said it was a big mistake targeting a proprietary standard, a standard that PhysX currently is.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: BenSkywalker

So all you had to do was modify the game file structure to get it to run on a board that didn't exist when the game came out... of course that shows that Carmack wasn't coding that version explicitly for the 3Dfx boards.
He wasn't coding anything for 3dfx cards. He made an OpenGL application, and 3dfx made a mini-GL driver for their boards which was bundled with his application. Delete the driver 3dfx made, and any GPU with an OpenGL ICD could run it.

I mean, why would you expect 3dfx's mini-GL to function on non-3dfx cards?

Using your argument, since atiogl.dll doesn't work on nVidia cards, can I conclude that all OpenGL games have been made for ATi because I have to remove that file through add/remove programs when I try to run OpenGL applications on nVidia hardware?
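(To make the mechanism concrete: GLQuake loads whatever opengl32.dll sits in its own directory before falling back to the system-wide OpenGL library, which dispatches to the installed vendor's ICD. Below is a rough sketch of the workaround being described; the C:\Quake path and the backup name are assumptions for illustration, not canonical.)

import os
import shutil

# Assumed install location; adjust to wherever GLQuake actually lives.
GLQUAKE_DIR = r"C:\Quake"
# The 3dfx mini-GL that shipped alongside the game executable.
MINI_GL = os.path.join(GLQUAKE_DIR, "opengl32.dll")

if os.path.exists(MINI_GL):
    # Rename rather than delete, so the 3dfx setup can be restored later.
    shutil.move(MINI_GL, MINI_GL + ".3dfx.bak")
    print("Mini-GL sidelined; GLQuake will fall back to the system OpenGL ICD.")
else:
    print("No bundled mini-GL found; the system OpenGL driver is already in use.")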

Name me one board that would run GLQuake when it came out that wasn't 3Dfx. That's all I ask, just a singular example of any non 3Dfx part that could run GLQuake.
Intergraph Realizm. Other workstation cards with OpenGL drivers could run it too; performance might not have been exceptional, but the point is they could run it.

But there were questions on what the final specs for DX9 would entail, MS has pulled late game switches many times before.
But there was never any question that it was coming and that shaders would be improved over those of DX8.1, improvements the R300 had.

A small portion of legacy titles.
Which is more than the zero legacy titles that can have PhysX forced into them. Also, I'd like to add that I'm pretty sure 3dfx had some options to force 32-bit mode into legacy Glide games on the VSA-100 chips.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I, for one, welcome any new technology that makes games better. I guess some people are still happy with old-school stuff like Minesweeper, and that's fine. I would be happy if AMD/ATI would put some effort into new gaming technology, but they either don't have the resources or the interest in doing so.

I look forward to implementations of physics using Havok and DX11.

However right now PhysX is available and in use. It's also free to anyone who made the correct card purchase. What's not to like?
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Wreckage
I, for one, welcome any new technology that makes games better. I guess some people are still happy with old-school stuff like Minesweeper, and that's fine. I would be happy if AMD/ATI would put some effort into new gaming technology, but they either don't have the resources or the interest in doing so.

I look forward to implementations of physics using Havok and DX11.

However right now PhysX is available and in use. It's also free to anyone who made the correct card purchase. What's not to like?
You're missing the point. The OP was looking at buying a 9800GTX+, which, judging from the Cryostasis demo, looks too weak to even run PhysX. Now if that's only the second new GPU PhysX title, what can the OP expect out of other new titles?

All I'm saying is current buyers don't need to consider features that don't have enough titles out; this goes for both DX10.1 and PhysX. You can copy and paste that useless list if you like, though.

@Everyone - Happy New Year
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SSChevy2001

You're missing the point. The OP was looking at buying a 9800GTX+, which, judging from the Cryostasis demo, looks too weak to even run PhysX. Now if that's only the second new GPU PhysX title, what can the OP expect out of other new titles?

All I'm saying is current buyers don't need to consider features that don't have enough titles out; this goes for both DX10.1 and PhysX. You can copy and paste that useless list if you like, though.

@Everyone - Happy New Year

So he should base his whole decision on one demo? Now that's pretty low standards, even for you. Also, Keys showed that it does not run badly at all, so I think your point is moot.

http://www.techreport.com/articles.x/15261
8800GT running PhysX just fine.



 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Wreckage
So he should base his whole decision on one demo? Now that's pretty low standards, even for you. Also, Keys showed that it does not run badly at all, so I think your point is moot.

http://www.techreport.com/articles.x/15261
8800GT running PhysX just fine.
Sure, why not? Can you or Keys promise the OP that he'll be able to handle all the GPU PhysX titles that come out this year? At least give the OP a year's worth of GPU PhysX enjoyment.

UT3 is by no means a very demanding title. In that benchmark the GPU PhysX map reduces the avg FPS by 40%, which is a good chunk of change.

This year's titles more than likely aren't going to avg 60 FPS on a 9800GTX+. For the hell of it, let's just say they avg 40 FPS on a non-GPU-PhysX map; all of a sudden your avg FPS with GPU PhysX becomes ~25 FPS and your mins become 16 FPS, just like Keys showed.
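(For what it's worth, here is the arithmetic behind that projection, assuming the ~40% hit from the UT3 PhysX maps scales linearly to other titles; the baseline numbers below are invented round figures, not benchmarks.)

# Back-of-the-envelope projection: apply the ~40% average-FPS cost
# measured on UT3's PhysX maps to some hypothetical baselines.
# All baseline numbers are made up for illustration.

PHYSX_HIT = 0.40  # fraction of avg FPS lost with GPU PhysX enabled

def with_physx(avg_fps: float) -> float:
    """Project the average FPS once the GPU PhysX load is added."""
    return avg_fps * (1.0 - PHYSX_HIT)

for baseline in (60.0, 40.0, 30.0):
    print(f"{baseline:.0f} FPS without PhysX -> ~{with_physx(baseline):.0f} with it")
# -> 60 -> ~36, 40 -> ~24, 30 -> ~18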
 

WelshBloke

Lifer
Jan 12, 2005
32,654
10,831
136
Given that the OP has yet to return to this thread and that it's not going to stop being a massive bitch fest, I think it needs locking.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: SSChevy2001
Originally posted by: Wreckage
So he should base his whole decision on one demo? Now that's pretty low standards, even for you. Also, Keys showed that it does not run badly at all, so I think your point is moot.

http://www.techreport.com/articles.x/15261
8800GT running PhysX just fine.
Sure, why not? Can you or Keys promise the OP that he'll be able to handle all the GPU PhysX titles that come out this year? At least give the OP a year's worth of GPU PhysX enjoyment.

UT3 is by no means a very demanding title. In that benchmark the GPU PhysX map reduces the avg FPS by 40%, which is a good chunk of change.

This year's titles more than likely aren't going to avg 60 FPS on a 9800GTX+. For the hell of it, let's just say they avg 40 FPS on a non-GPU-PhysX map; all of a sudden your avg FPS with GPU PhysX becomes ~25 FPS and your mins become 16 FPS, just like Keys showed.

This statement has me a little stymied here. Would you yourself recommend a 9800GTX to someone who wants to be able to play all of this year's games? PhysX or not? There is no way of knowing that, man. You just asked an impossible question to answer, plus asked for a guarantee. Who does that when recommending a mainstream card? If you want any sort of longevity and do not purchase GPUs very often, then you buy a higher-end card. You know this. The numbers I see you suggesting are fictional, set up by you to fail. Your example number could easily have been avg 60fps without PhysX and 45 with.
That 16fps minimum was due to hard drive hitching. Even after repeating myself and showing my results twice, everyone glosses over this little tidbit. Without the hard drive hitching, the fps remained between 27-35 and never dropped lower until the HDD light cranked on for some swappage. That would be the 512MB on the card. Not PhysX. And also don't forget I was running at 1920x1080 with the maximum possible settings in DX10. Not a setting I would recommend to begin with on a 9800GTX and a 2.2 GHz Phenom. A 3GHz C2D and a GTX260 would be a better match for such settings on most games out there. And that's what I recommend.
There are also step-up options from eVGA and BFG; those are the manufacturers I would suggest so the OP has a step-up path if needed or wanted. It's only 90 days, but that should be sufficient to see a few PhysX titles anyway.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SSChevy2001

Sure why not? Can you or Keys promise the OP that he'll be able to handle all the GPU PhysX titles that come out this year?

Can you promise that any card will run all titles that come out this year with full AA/AF at the highest resolutions?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003
Aren't you going to wish me a happy new year?
Let go of the hate dude. No reason for it. Just let it gooooo..... It's evident in your every post to me. Can't be healthy. Let it gooooo..... :)

I don't hate you or anyone. I also don't ascribe any value to arbitrary designations such as that of a new year. However, since it seems such things are important to you: happy new year, Keys!

Originally posted by: keysplayr2003
This statement has me a little stymied here. Would you yourself recommend a 9800GTX to someone who wants to be able to play all of this year's games?

I would, and did, recommend both the 9800gtx+ and the 4850 in the price range he is shopping in.

Do you think there are any games coming out this year which either won't play? Are you suggesting he should either increase his budget or else forget about buying a card altogether?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003
Aren't you going to wish me a happy new year?
Let go of the hate dude. No reason for it. Just let it gooooo..... It's evident in your every post to me. Can't be healthy. Let it gooooo..... :)

I don't hate you or anyone. I also don't ascribe any value to arbitrary designations such as that of a new year. However, since it seems such things are important to you: happy new year, Keys!
Thank you Snowman. Happy New Year to you as well.
Originally posted by: keysplayr2003
This statement has me a little stymied here. Would you yourself recommend a 9800GTX to someone who wants to be able to play all of this year's games?

I would, and did, recommend both the 9800gtx+ and the 4850 in the price range he is shopping in.

This was actually directed at SSChevy, but your recommendations were good. Still, we can't ignore the concern the OP had for PhysX; this must factor into your recommendations.

Do you think there are any games coming out this year which either won't play? Are you suggesting he should either increase his budget or else forget about buying a card altogether?

Impossible to say. I know the 4850 won't be able to play PhysX titles with PhysX enabled, and will only play them at all if there is an option to run without it, which I think would be prudent of developers to include. There could be the next killer "Crysis" for 2009 that would make even the most powerful cards crawl (with or without PhysX). Who knows.
I'm saying to buy smart. If he must buy now, get something that will allow him to step up if needed. And since a 4850 and 9800GTX perform comparably and are priced comparably, it could not hurt to choose the 9800GTX over the 4850 for PhysX capabilities.
In fact, it is the more sensible purchase of the two you recommended. IMHO.



 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003
If he must buy now, get something that will allow him to step up if needed.

Step up is a great option, and led me to pick EVGA for my current 8800gt, as well as the gtx260 which should be here tomorrow. I figure I will likely hold out for DX11 cards, but I might swap out for the gtx285 or whatever if it seems worth the cost. However, that is off topic in this thread.

Originally posted by: keysplayr2003
And since a 4850 and 9800GTX perform comparably and are priced comparably, it could not hurt to choose the 9800GTX over the 4850 for PhysX capabilities.

It might not hurt, it might prove useful, or features like DX10.1 or HD audio support might prove more useful. I can't rightly answer which will happen for myself, let alone someone else, and neither can you.

Originally posted by: keysplayr2003
In fact, it is the more sensible purchase of the two you recommended. IMHO.

You say that as if I ever could have expected any different from you. :p
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003
If he must buy now, get something that will allow him to step up if needed.

Step up is a great option, and led me to pick EVGA for my current 8800gt, as well as the gtx260 which should be here tomorrow. I figure I will likely hold out for DX11 cards, but I might swap out for the gtx285 or whatever if it seems worth the cost. However, that is off topic in this thread.

Originally posted by: keysplayr2003
And since a 4850 and 9800GTX perform comparably and are priced comparably, it could not hurt to choose the 9800GTX over the 4850 for PhysX capabilities.

It might not hurt, it might prove useful, or features like DX10.1 or HD audio support might prove more useful. I can't rightly answer which will happen for myself, let alone someone else, and neither can you.

Originally posted by: keysplayr2003
In fact, it is the more sensible purchase of the two you recommended. IMHO.

You say that as if I ever could have expected any different from you. :p

I know. But it's because I really believe it. :beer: