THQ chooses Nvidia's PhysX technology for better gaming


nosfe

Senior member
Aug 8, 2007
424
0
0
So the problem isn't with the GPU per se but with the heat it generates. Wouldn't that make Microsoft responsible, since they didn't handle the heat properly? ATI designed the chip; the cooling of it was left to someone else.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: nosfe
So the problem isn't with the GPU per se but with the heat it generates. Wouldn't that make Microsoft responsible, since they didn't handle the heat properly? ATI designed the chip; the cooling of it was left to someone else.
Certainly Microsoft has to shoulder some of the blame, but the root cause of the problem was still the lil fireball of a GPU ATI designed. I guess if Microsoft had done their homework and seen how ATI parts generally ran hot in order to squeeze every bit of performance out of their chips, they might've put something like a Tuniq Tower on the thing. But that obviously wouldn't work if they hoped to fit it inside a slim profile like the 360's.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@chizow

Cold solder joints are the problem, not the GPU heat. Whoever balled the GPU is the one responsible for the RROD. Then again, the X-clamps and cooling system could have been designed better as well.

If you want to blame ATi, then find me proof that they balled the GPU for MS.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
@chizow

Cold solder joints are the problem, not the GPU heat. Whoever balled the GPU is the one responsible for the RROD. Then again, the X-clamps and cooling system could have been designed better as well.

If you want to blame ATi, then find me proof that they balled the GPU for MS.
And cold solder/low-lead joints don't melt by themselves... this is the same thing that happened to Nvidia, and it's caused by excessive heat coupled with cooling and power cycling. Look at the picture in AT's link. Between the heat melting the joints and warping the board, the GPU was physically being severed from the mobo and shifting around lol. Think of it like... taking a piece of bread, putting some dabs of mayo on it, then pressing it against a window and smushing it around haha. As for proof, I've already provided plenty.
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: geokilla
I'm not going to argue with you guys on how physics works and stuff. However, this news can't be good for ATI and Havok, can it? Maybe it's time for all of us to make the switch to Nvidia GPUs. More and more game publishers seem to choose PhysX over Havok physics.

If that happens I don't think you would be able to afford even that video card that's in your sig. You want another monopoly? Think before you type.

No PhysX game will make me switch to Nvidia... (at least not on that feature alone). Once Intel enters the market and AMD gets serious about it, game devs will need to fully support everybody, otherwise they'll lose sales, guaranteed.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: chizow
Originally posted by: SSChevy2001
@chizow

Cold solder joints are the problem, not the GPU heat. Whoever balled the GPU is the one responsible for the RROD. Then again, the X-clamps and cooling system could have been designed better as well.

If you want to blame ATi, then find me proof that they balled the GPU for MS.
And cold solder/low-lead joints don't melt by themselves... this is the same thing that happened to Nvidia, and it's caused by excessive heat coupled with cooling and power cycling. Look at the picture in AT's link. Between the heat melting the joints and warping the board, the GPU was physically being severed from the mobo and shifting around lol. Think of it like... taking a piece of bread, putting some dabs of mayo on it, then pressing it against a window and smushing it around haha. As for proof, I've already provided plenty.
You accused ATi's GPU of being the cause of the RROD. But ATi didn't design the cooling system, and they didn't attach the GPU to the MB. You don't even have proof they balled the GPU. You seriously need more proof before you continue with this conversation.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
The reason it's not ATI's fault IF the RROD was caused by heat is that when handing over a GPU design you also hand over the thermal envelope. Microsoft knew how much heat the GPU would produce. Microsoft then commissioned a cooling solution for the 360 that was cheap and inadequate. It's highly likely that Microsoft's at fault here, simply because they never said whose fault it was. If I were Ballmer I'd sue the ground beneath ATI if it were ATI's fault (like if ATI misled Microsoft about the thermal envelope), wouldn't you? I'd say the RROD happened mainly because Microsoft was in a hurry to release the 360 and so didn't properly test it and notice the high number of "kinks" it had.

Also, Nvidia's problem wasn't caused by the heat; the heat merely hastened the inevitable. There's a small difference between a) the motherboard melting because the GPU has a crappy cooler and b) the chip peeling off because the bumps are bad. Of course it could also be attributed to heat, because Nvidia said the chips could sustain X amount of heat but in reality they couldn't.

Anyway, let's get back to the previous off-topic discussion :p
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: nosfe
Anyway, let's get back to the previous off-topic discussion :p
The point is Nvidia is throwing around the word PhysX, and people think it means every title will have GPU or PPU PhysX. That's just not the case. Currently no console supports GPU or PPU PhysX effects.

The end result is Nvidia sold their physics engine to be used, nothing else.

Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?
LOL so true.

http://www.youtube.com/watch?v=WOVjZqC1AE4
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
You accused ATi's GPU of being the cause of the RROD. But ATi didn't design the cooling system, and they didn't attach the GPU to the MB. You don't even have proof they balled the GPU. You seriously need more proof before you continue with this conversation.
Yes, I've provided evidence from multiple reliable sources and publications that have come to the same conclusion: the excessive heat from the ATI-designed GPU was the primary cause of the RROD. If you're going to list off a bunch of arbitrary reasons for future console design decisions, I would certainly expect the single biggest debacle in consumer electronics history to be included as a deciding factor, wouldn't you?

Originally posted by: nosfe
Also, Nvidia's problem wasn't caused by the heat
Yes it was, and they're taking the fall for it.

Originally posted by: Creig
The FX5800 dustbuster?
You're comparing the engineering and acoustics of the FX5800 to an endemic system failure resulting in at least $1 billion in repair expenditures and an estimated 30% failure rate? Not surprising. :)

But ya, the Dustbuster was actually quite innovative for its time; it was just poorly executed. Dual-slot blower design, heat pipes, heat expelled outside the case: those are all standard features now on the best GPU coolers.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
@chizow

If it was ATi fault then why weren't they sued?
Because MS bought the design from ATI, and by doing so assumed responsibility for it. But that doesn't mean heat, and their previous designer, won't be primary considerations when they commission the next design. Again, this is widely documented, so if you're truly interested in the RROD you can read up at your leisure.
 

akugami

Diamond Member
Feb 14, 2005
5,995
2,328
136
Originally posted by: MarcVenice
All of you should take a deep breath, and rethink this.

No one here is against PhysX; some of us are just skeptical about it. What is so hard to understand about most if not all games being developed with consoles in mind, and consoles not having 8x00-series video cards or a PPU? How are you going to explain PhysX in console titles, which will ultimately be ported to the PC, when those same consoles simply can't run PhysX because they lack the power to do so? One prime example is Mirror's Edge: developed for the consoles, ported to the PC, with extra PhysX added to it. Why didn't the consoles have those PhysX effects if they can run them just the same? They can't, because they lack the horsepower. PhysX requires massive amounts of parallel processing; even the Cell processor doesn't come close to all the stream processors even midrange Nvidia cards have.

So, if you took a deep breath, you know that neither I nor anyone else is against PhysX; we just doubt how it's going to find its way to the PC when games are clearly being developed for platforms that do not have the parallel processing power of a PC. Sure, they can add PhysX during the port, like with Mirror's Edge, which is great (if it's free, why not, huh), but I'm still going to wait and see what happens, because adding PhysX is in no way free for any developer. And if it's added during a port, and not when the game is being developed from the ground up, it's going to be eye-candy stuff, and not gameplay-altering PhysX (which is my gripe, but that's a whole other discussion).

/end_thread

Seriously, I have not seen one title that uses PhysX to any large degree for something that wasn't possible before. When I imagine physics acceleration, I want it to be in a state where it adds a new level to gameplay. Pretty flying debris and realistic clothes flapping is not adding a new level of gameplay.

I agree 100% with what MarcVenice said, and anyone who is the least bit objective should have zero problem with it, even those who favor AMD/ATI or nVidia. It is the blind followers of one company or another, the ones foaming at the mouth if you have the slightest doubt in the glory of PhysX or think PhysX is 100% pure fecal matter, who have a real problem with PhysX and whether it succeeds or fails.

I am intrigued by physics acceleration, but when ATI, who owns a very sizable chunk of the market, decides (for whatever reasons) not to support PhysX, one has to wonder about the long-term viability of PhysX and whether or not it might be supplanted by a solution from a 3rd party such as Microsoft. This solution might not be as good as a proprietary solution from either ATI or nVidia, but at least it'll work everywhere, much like DirectX does now.

Originally posted by: dadach
one thing you must understand... for most of these guys, all they see is Rollo's and Keys' signatures where it says FREE GFX cards for something they write on forums... can you really blame them for trying so hard to get in that group?... i don't, so i don't take them too seriously

as far as PhysX goes, bring it on already... and not in slideshow version... just the fact that something can be done doesn't always mean it can be done well or efficiently... the gaming world is still waiting

My personal view on Rollo is as it has always been from day one. Keys, however, is a very objective person. I have no problem with his views, and I can say from past experience debating with him that he is not a blind follower of anyone. Just because a person favors one company over another does not mean they can't be objective. I also know some who favor ATI that are objective.

It is the true fanatics (of any monolithic company that contributes jack & sh!t to them personally) that I find annoying.


Originally posted by: chizow
Also you seem to be forgetting ATI GPUs in the Xbox 360 largely contributed to the most catastrophic and widespread system failure in the history of console/PC hardware with the RROD debacle. I know you're going to bring up NV mobile GPUs, so I'll preemptively add: $1 billion+ > $200 million.

I think this is more of a case of MS screwing up. Later revisions of the Xbox 360 hardware with beefier cooling had fewer problems with the RROD. I think the original Xbox 360 had engineering problems that weren't really caused by anything ATI did. MS tried to cut corners and it bit them in the rear.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
Yep, same for me. Physics? Why not, it IS the next step toward nicer graphics (how big a step is debatable). However, I'm not yet ready to jump on the PhysX bandwagon; it's too early. I'll wait and see how it fares against DX11 and OpenCL. It'll take more than 10 games with "real" GPU PhysX support for me to consider it when I'm purchasing (or recommending) a graphics card.
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
Originally posted by: SSChevy2001
Originally posted by: nosfe
Anyway, let's get back to the previous off-topic discussion :p
The point is Nvidia is throwing around the word PhysX, and people think it means every title will have GPU or PPU PhysX. That's just not the case. Currently no console supports GPU or PPU PhysX effects.

The end result is Nvidia sold their physics engine to be used, nothing else.

Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?
LOL so true.

http://www.youtube.com/watch?v=WOVjZqC1AE4

That's hilarious!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: akugami
I am intrigued by physics acceleration, but when ATI, who owns a very sizable chunk of the market, decides (for whatever reasons) not to support PhysX, one has to wonder about the long-term viability of PhysX and whether or not it might be supplanted by a solution from a 3rd party such as Microsoft. This solution might not be as good as a proprietary solution from either ATI or nVidia, but at least it'll work everywhere, much like DirectX does now.
See, this is the part that most people don't seem to understand. PhysX is just the software toolkit, and it will be compatible with any API. These are the layers of FUD that need to be constantly peeled away from those trying to detract from the significance or future of PhysX. PhysX and CUDA will be compatible with OpenCL and DirectX 11. That means Nvidia hardware will be compatible with OpenCL and DirectX 11, and that also means Nvidia hardware will be compatible with any other physics SDK. This will most likely be true of ATI hardware as well, as long as it's compatible with OpenCL and DirectX 11.
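To make that concrete, here's roughly what game code against the PhysX SDK looks like. This is a minimal rigid-body setup sketched from memory of the 2.x-era Ageia docs, so treat the exact names (NxCreatePhysicsSDK, fetchResults, etc.) as approximate rather than gospel:

#include <NxPhysics.h>  // PhysX 2.x-era header

void runPhysXSketch() {
    // create the SDK and a scene with gravity
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return;  // no PhysX runtime installed
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    NxScene* scene = sdk->createScene(sceneDesc);

    // a 1m dynamic crate: box shape + rigid body + density
    NxBoxShapeDesc boxDesc;
    boxDesc.dimensions = NxVec3(0.5f, 0.5f, 0.5f);
    NxBodyDesc bodyDesc;
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&boxDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    scene->createActor(actorDesc);

    // per frame: kick off the solver, then collect the results
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

    NxReleasePhysicsSDK(sdk);  // tear down when done
}

Notice nothing in there says CPU, PPU, or GPU; where the solver actually runs is the runtime's problem, which is exactly why the same SDK can ship on consoles and PCs alike.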

Simple questions for you here: do you have a Sound Blaster card? Do you have a DX10 video card? Do they offer tangible benefits over lower standards? How do these standards co-exist and offer benefits when not all hardware supports them? PhysX should be viewed no differently, and when you compare the potential market share and install base, you'll see all indicators favor PhysX's chances for survival.
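And in code terms, the Sound Blaster analogy is literally just a capability check: detect the hardware and scale the effects up, with everyone else falling back to the software solver. Again sketched from the 2.x-era SDK as I remember it (getHWVersion and the NX_SIMULATION_* flags may not be letter-perfect), and spawnDebris is a made-up game-side helper, not an SDK call:

// pick hardware or software simulation based on what's installed
bool hwPresent = (sdk->getHWVersion() != NX_HW_VERSION_NONE);
NxSceneDesc sceneDesc;
sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
sceneDesc.simType = hwPresent ? NX_SIMULATION_HW : NX_SIMULATION_SW;
NxScene* scene = sdk->createScene(sceneDesc);

// scale the eye candy to the platform; same code path either way
int debrisCount = hwPresent ? 5000 : 200;
spawnDebris(scene, debrisCount);  // hypothetical helper, not part of PhysX

That's how a DX10 card coexists with DX9 cards today, and there's no reason hardware physics can't work the same way.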

I think this is more of a case of MS screwing up. Later revisions of the Xbox 360 hardware with beefier cooling had fewer problems with the RROD. I think the original Xbox 360 had engineering problems that weren't really caused by anything ATI did. MS tried to cut corners and it bit them in the rear.
Well this is getting OT but:

Xenon = 90nm CPU, 90nm GPU. Reported failure rates of at least 30%.
Falcon = 65nm CPU, 80nm GPU, better GPU heatsink and glue. Reported failure rates still as high as 20%.
Jasper = 65nm CPU, 65nm GPU. Results? Most seem to think much lower failure rates.

MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?

 

nosfe

Senior member
Aug 8, 2007
424
0
0
Yep, PhysX also works on the CPU only. I've even played some games where PhysX is implemented this way, and I didn't notice there was any physics in those games at all; that's one of the main reasons I'm skeptical about PhysX. So far I've seen either CPU-only effects that don't look like anything that couldn't be done without it, or GPU-only effects that look nice but also eat your frames alive: 36 fps average for the Cryostasis tech demo at 1680x1050 with 1xAA/16xAF on a GTX 260 Core 216? What's this, Crysis? And the worst part of that demo is that most of the time it's inside small, cramped spaces without any enemies to shoot at. At this pace you'll be watching slideshows if two monsters go under a shower of water. If there were large open spaces in the game (that's the new trend in shooters, right?) I highly doubt even a GTX 280 would be able to handle it.

PhysX is nice; it's just ahead of its time.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?

:confused:

I thought this thread was about PhysX, not six-year-old video cards.

The PhysX revolution is gaining momentum, Creig, and there's only one place to get it. The EA, THQ, 2K Games, etc. announcements of today are the launching point of games with higher levels of immersion for those of us with NVIDIA graphics cards tomorrow.

With the launch of every game, and the addition of every developer, PhysX is becoming more difficult for card buyers to ignore.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Originally posted by: chizow
Originally posted by: MarcVenice
All of you should take a deep breath, and rethink this.

No one here is against PhysX; some of us are just skeptical about it.
Actually, from your comments here and in previous threads, that's highly questionable. Looks like you're pushing AMD's agenda: against PhysX but not physics... So now that your earlier claims have been corrected and it's been shown that PhysX is widely adopted on both the PC and consoles, you've shifted your focus to accelerated vs. CPU PhysX? Hardware physics is here for good, folks; it doesn't much matter what brand. Highly parallel processors (GPUs) make possible effects that were never possible before.

Originally posted by: MarcVenice
I don't think you get it. I've read all of those, I've played all of those games, I've talked to developers, and more. I review games, remember; I dare say I know twice as much about games as you do. Studios using PhysX doesn't necessarily mean GPU-accelerated PhysX. Which is the problem: we've seen PhysX and Havok in plenty of games, but not on the scale we see PhysX in those few special maps UT3 has, or the tech demos Nvidia gives us.

So, take another deep breath, and try to wrap your mind around this. Games are developed for consoles; consoles can't run GPU- or PPU-accelerated PhysX; thus games made for the consoles won't have PhysX that takes advantage of Nvidia's GPUs through CUDA. Since most games are developed for the consoles, PC games won't see many PhysX effects that Nvidia GPU owners can take advantage of.
How can you claim this knowing Mirror's Edge falls directly in this vein? It's a console port, with additional GPU-accelerated PhysX effects which were directly cited as the reason it was delayed. So yes, there will need to be additional PhysX effects added for the PC because consoles will not be able to run them; however, the PhysX SDK offers seamless integration of additional effects since it is the same toolset.

I've never doubted PhysX's adoption; I've been installing games for years that needed PhysX installed as a software solution. I've been eyeballing that tech demo (not sure what it's called anymore), the free game meant to showcase what the PPU could do. I was excited, till I found out no one had a PPU, and no other games supported the same kind of PhysX the showcase game had.

What I doubt is the adoption of non-software PhysX that actually means something, like we will soon be seeing in Mirror's Edge, because so many games, like Mirror's Edge, are developed for the consoles and thus need extra development time if they are to have extra PhysX effects that can take advantage of Nvidia GPUs. Which is exactly what I have been saying. I know what PhysX can do, and how it can be used, and that it's almost seamless. Adding those things doesn't happen for free though.

So really, saying I'm pushing AMD's agenda is laughable. When one does not agree with you, it doesn't instantly mean they are a fanboy of the opposite side. If AMD could run PhysX, through CUDA or whatever else, and the consoles could too, I bet a dozen developers would jump on it and use it to innovate their games. But IMO it's not going to happen right now. Maybe with next-gen consoles and a standardized physics API, who knows.

So, why should I be excited again about this deal? Oh, woopdiedoo, THQ is going to implement software-emulated PhysX, like tiny ragdoll effects. Wait, I've had those for years. /end sarcasm
 

akugami

Diamond Member
Feb 14, 2005
5,995
2,328
136
Originally posted by: chizow
Originally posted by: akugami
I am intrigued by physics acceleration, but when ATI, who owns a very sizable chunk of the market, decides (for whatever reasons) not to support PhysX, one has to wonder about the long-term viability of PhysX and whether or not it might be supplanted by a solution from a 3rd party such as Microsoft. This solution might not be as good as a proprietary solution from either ATI or nVidia, but at least it'll work everywhere, much like DirectX does now.
See, this is the part that most people don't seem to understand. PhysX is just the software toolkit, and it will be compatible with any API. These are the layers of FUD that need to be constantly peeled away from those trying to detract from the significance or future of PhysX. PhysX and CUDA will be compatible with OpenCL and DirectX 11. That means Nvidia hardware will be compatible with OpenCL and DirectX 11, and that also means Nvidia hardware will be compatible with any other physics SDK. This will most likely be true of ATI hardware as well, as long as it's compatible with OpenCL and DirectX 11.

Simple questions for you here: do you have a Sound Blaster card? Do you have a DX10 video card? Do they offer tangible benefits over lower standards? How do these standards co-exist and offer benefits when not all hardware supports them? PhysX should be viewed no differently, and when you compare the potential market share and install base, you'll see all indicators favor PhysX's chances for survival.

I think this is more of a case of MS screwing up. Later revisions of the Xbox 360 hardware with beefier cooling had fewer problems with the RROD. I think the original Xbox 360 had engineering problems that weren't really caused by anything ATI did. MS tried to cut corners and it bit them in the rear.
Well this is getting OT but:

Xenon = 90nm CPU, 90nm GPU. Reported failure rates of at least 30%.
Falcon = 65nm CPU, 80nm GPU, better GPU heatsink and glue. Reported failure rates still as high as 20%.
Jasper = 65nm CPU, 65nm GPU. Results? Most seem to think much lower failure rates.

MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?

Well, the issue remains IMHO regarding PhysX. I think GPU PhysX will be stifled without ATI support, and ATI has no reason to support it. CPU PhysX will help a bit with some extra debris or some clothes flapping, but aside from a little extra visual appeal, what does it do to change the way we play games? Even Mirror's Edge, which will make use of PhysX, seems to be of the particle-effect/clothes-flapping variety.

Again, this is not a rant against PhysX. I also don't believe the ATI-only solution (I forget what it's called, which tells you how closely I'm paying attention to it) will fare any better. There are hundreds of dead standards, even ones that were once promising, simply because they were not widely adopted. ATI has reason not to adopt PhysX, and like it or not, that will hinder its widespread adoption. Just because it is the most robust and complete solution today does not mean it will not be overtaken by a competitor down the line, especially considering physics acceleration is still in its infancy. Remember, IBM used to be synonymous with the personal computer. Look at where it is now.

As a developer, do you heavily code for PhysX and potentially change the way your game is played for only a chunk of the market? At this point I'd say no, unless you get some of your development costs subsidized. If it's merely software PhysX, I think the changes are not going to be innovative, merely slightly more eye candy. Considering the state of PhysX GPU acceleration now, we need to buy a second video card for it, and most users will simply not do that. That chokes off not just ATI card owners but a good chunk of nVidia owners as well, since most gamers still only use a single video card in their system. The enthusiasts found on websites such as AnandTech are simply the exception, not the norm.

Perhaps I'm seeing PhysX wrong, but why code for a different physics SDK that will then make calls to the PhysX or ATI APIs when MS can dictate a set of standardized physics calls as part of (or a subset of) DX11? ATI and nVidia would provide drivers for their respective video cards, much like how DirectX works now with normal 3D graphics acceleration.
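Just to illustrate what I mean by standardized calls: the game would code against one interface and each vendor would supply the backend, the way Direct3D drivers work today. Everything below is made up for illustration; none of these names are a real API:

#include <memory>

// hypothetical vendor-neutral physics interface, D3D-style
class IPhysicsDevice {
public:
    virtual ~IPhysicsDevice() {}
    virtual void addRigidBody(float mass, float x, float y, float z) = 0;
    virtual void step(float dt) = 0;  // the vendor's driver decides CPU vs. GPU
};

// each IHV ships its own backend behind the same interface
class NvPhysicsDevice : public IPhysicsDevice {
public:
    void addRigidBody(float, float, float, float) { /* would dispatch to CUDA */ }
    void step(float) { /* GPU solver */ }
};

// stand-in for "load whichever vendor driver is installed"
IPhysicsDevice* createPhysicsDevice() { return new NvPhysicsDevice(); }

int main() {
    std::unique_ptr<IPhysicsDevice> physics(createPhysicsDevice());
    physics->addRigidBody(10.0f, 0.0f, 5.0f, 0.0f);  // game code never names a vendor
    physics->step(1.0f / 60.0f);
    return 0;
}

The game never knows or cares whose silicon is underneath, which is the whole appeal.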

I do not see physics acceleration truly taking off until we see the launch of $100-150 video cards, the kind found in Dells and HPs, that provide not only good 3D-accelerated graphics but good physics acceleration as well. Right now I feel we are on the cusp of physics acceleration, much like when 3dfx first came out with 3D-accelerated graphics cards. Again, I don't see any ATI-only or nVidia-only solution succeeding. I truly believe it will be someone such as MS or even Intel that steps in and takes charge of physics acceleration.

MS makes sense due to their OS domination as well as DirectX. Intel makes sense because they are entering the enthusiast GPU market, and Intel is, even if not on the high end, a major player in the video chipset market.

Off topic, but I still have to say the blame lies squarely on MS for the RROD debacle as it relates to ATI GPUs in the Xbox 360. There is no way in hell ATI did not give them the proper specs for that GPU. It was MS's fault for improper heat dissipation, which resulted in overheated solder joints that caused the RROD. Also, I think the Falcon revision of the Xbox 360 has a failure rate in the low teens if not under 10%, not 20%. A surface scan of the HSF registered a max of 150°F on earlier Xboxes, while on the Falcon revision it was 110°F, meaning the new HSF was doing a much better job of keeping things cool.
 

akugami

Diamond Member
Feb 14, 2005
5,995
2,328
136
Oh, and if it hasn't been mentioned: there is no financial or competitive reason for ATI to adopt PhysX in any way, shape, or form as it currently stands. ATI would actually be at the mercy of their main competitor, nVidia, as nVidia could hold off on new revisions of PhysX and not release them to ATI until the last minute. Much like how MS kept some APIs proprietary, or at least held in secret some API calls that it used for its own software but did not release to others.

If PhysX were made open source, then it would make sense for ATI to implement it. There would be transparency in what gets included and what direction PhysX takes, so ATI isn't blindsided by some new feature that nVidia can take advantage of for at least six months and likely a year. On the flip side, nVidia has zero financial or competitive reason to make PhysX open source and benefit ATI. PhysX cost nVidia too much money for them to just give it away for free, and I'm not talking about free licenses to the APIs and SDKs. Not to mention nVidia believes (rightly so) that it gives them a competitive advantage over ATI.

Do not in any way believe nVidia licensing PhysX to ATI for "free" would be in any way, shape, or form for altruistic reasons.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,770
54
91
Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?

:confused:

I thought this thread was about PhysX, not six-year-old video cards.

The PhysX revolution is gaining momentum, Creig, and there's only one place to get it. The EA, THQ, 2K Games, etc. announcements of today are the launching point of games with higher levels of immersion for those of us with NVIDIA graphics cards tomorrow.

With the launch of every game, and the addition of every developer, PhysX is becoming more difficult for card buyers to ignore.
^^ABOVE IS A PAID ADVERTISEMENT.^^

PhysX did NOT affect my last buying decision, nor will it affect my buying decision in the coming months. What really affects people's buying decisions, besides framerates and the ability to perform at high res/high settings, is drivers. The 8800GT was my first shot at Nvidia since 2004ish, and I've gotten a ratio of about 5:1 beta/leaked drivers to official drivers. (Oh, and fewer nv4_disp BSODs would be nice.)

Again, PhysX will not affect my next buying decision. Right now all I see with this is some nice explosions that I can live without.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
Originally posted by: cmdrdredd
The gameplay on Wii is for kids... really. I'd never be caught dead playing that horrible piece of crap.

Look at game sales; they're pitiful. The system sells, people buy one game, realize they wasted their money, and then they buy a 360 and buy 3 games.

Your opinions, all of them, are now worthless. Although I never really put that much worth in them to begin with.