Certainly Microsoft has to shoulder some of the blame, but the root cause of the problem was still the lil fireball of a GPU ATI designed. I guess if Microsoft had done their homework and seen how ATI parts generally ran hot in order to squeeze every bit of performance out of their chips, they might've put something like a Tuniq Tower on the thing. But that obviously wouldn't work if they hoped to fit it inside a slim profile like the 360's.

Originally posted by: nosfe
So the problem isn't with the GPU per se but with the heat it generates. Wouldn't that make Microsoft responsible, since they didn't handle the heat properly? ATI designed the chip; the cooling was left to someone else.
And cold solder/low-lead joints don't melt by themselves..... this is the same thing that happened to Nvidia, and it's caused by excessive heat coupled with inadequate cooling and power cycling. Look at the picture in AT's link. Between the heat melting the joints and warping the board, the GPU was physically being severed from the mobo and shifting around lol. Think of it like....taking a piece of bread, putting some dabs of mayo on it, then pressing it against a window and smushing it around haha. As for proof, I've already provided plenty.

Originally posted by: SSChevy2001
@chizow
Cold solder joints are the problem, not the GPU heat. Whoever balled the GPU is the one responsible for the RROD. Then again, the X-Clamps and cooling system could have been designed better as well.
If you want to blame ATi, then find me proof that they balled the GPU for MS.
Originally posted by: geokilla
I'm not going to argue with you guys about how physics works and stuff. However, this news can't be good for ATI and Havok, can it? Maybe it's time for all of us to make the switch to NVIDIA GPUs. More and more game publishers and developers seem to choose PhysX over Havok.
You accused ATi's GPU of being the cause of the RROD. But ATi didn't design the cooling system, and they didn't attach the GPU to the MB. You don't even have proof they balled the GPU. You seriously need more proof before you continue with this conversation.

Originally posted by: chizow
And cold solder/low-lead joints don't melt by themselves..... this is the same thing that happened to Nvidia, and it's caused by excessive heat coupled with inadequate cooling and power cycling. Look at the picture in AT's link. Between the heat melting the joints and warping the board, the GPU was physically being severed from the mobo and shifting around lol. Think of it like....taking a piece of bread, putting some dabs of mayo on it, then pressing it against a window and smushing it around haha. As for proof, I've already provided plenty.

Originally posted by: SSChevy2001
@chizow
Cold solder joints are the problem, not the GPU heat. Whoever balled the GPU is the one responsible for the RROD. Then again, the X-Clamps and cooling system could have been designed better as well.
If you want to blame ATi, then find me proof that they balled the GPU for MS.
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The point is Nvidia is throwing around the word PhysX, and people think it means every title will have GPU or PPU PhysX. That's just not the case. Currently no console supports GPU or PPU PhysX effects.

Originally posted by: nosfe
Anyway lets get back to the previous off topic discussion
LOL so true.Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The FX5800 dustbuster?
Yes, I've provided evidence from multiple reliable sources and publications that have come to the same conclusion: the excessive heat from the ATI-designed GPU was the primary cause of the RROD. If you're going to list off a bunch of arbitrary reasons for future console design decisions, I would certainly expect the single biggest debacle in consumer electronics history to be included as a deciding factor, wouldn't you?

Originally posted by: SSChevy2001
You accused ATi GPU as the cause of RROD. But ATi didn't design the cooling system and they didn't attach the GPU to the MB. You don't even have proof they balled the GPU. You seriously need more proof before you continue with this conversation.
Yes it was, and they're taking the fall for it.

Originally posted by: nosfe
Also, nvidia's problem wasn't caused by the heat
You're comparing the engineering and acoustics of the FX5800 to an endemic system failure resulting in at least $1 billion in repair expenditures and an estimated 30% failure rate? Not surprising.

Originally posted by: Creig
The FX5800 dustbuster?
Because MS bought the design from ATI, and by doing so assumed responsibility. But that doesn't mean heat and their previous designer won't be their primary considerations when they commission the next design. Again, this is widely documented, so if you're truly interested in the RROD you can read up at your leisure.

Originally posted by: SSChevy2001
@chizow
If it was ATi fault then why weren't they sued?
Originally posted by: MarcVenice
All of you should take a deep breath, and rethink this.
No one here is against PhysX; some of us are just skeptical about it. What is so hard to understand about most if not all games being developed with consoles in mind, and consoles not having 8x00-series video cards or a PPU? How are you going to explain PhysX in console titles that will ultimately be ported to the PC, when those same consoles simply can't run PhysX because they lack the power to do so? One prime example is Mirror's Edge: developed for the consoles, ported to the PC, with extra PhysX effects added. Why didn't the consoles have those effects if they can run them just the same? They can't, because they lack the horsepower. PhysX requires massive amounts of parallel processing; even the Cell processor doesn't come close to the number of stream processors that even midrange Nvidia cards have.
So, if you take a deep breath, you'll see that neither I nor anyone else here is against PhysX; we just doubt how it's going to find its way to the PC when games are clearly being developed for platforms that do not have the parallel processing power of a PC. Sure, they can add PhysX during the port, like with Mirror's Edge, which is great (if it's free, why not?), but I'm still going to wait and see what happens, because adding PhysX is in no way free for any developer. And if it's added during a port, and not while the game is being developed from the ground up, it's going to be eye-candy stuff, and not gameplay-altering physics (which is my gripe, but that's a whole other discussion).
Originally posted by: dadach
one thing you must understand...all most of these guys see is Rollo's and Keys' signatures where it says FREE GFX cards for something they write on forums...can you really blame them for trying so hard to get into that group? I don't, so I don't take them too seriously
as far as PhysX goes, bring it on already...and not in slideshow form...just the fact that something can be done doesn't always mean it can be done well or efficiently...the gaming world is still waiting
Originally posted by: chizow
Also, you seem to be forgetting that ATI GPUs in the Xbox 360 largely contributed to the most catastrophic and widespread system failures in the history of console/PC hardware with the RROD debacle. I know you're going to bring up NV mobile GPUs, so I'll preemptively add: $1 billion+ > $200 million.
Originally posted by: SSChevy2001
The point is Nvidia is throwing around the word PhysX, and people think it means every title will have GPU or PPU PhysX. That's just not the case. Currently no console supports GPU or PPU PhysX effects.

Originally posted by: nosfe
Anyway lets get back to the previous off topic discussion
The end result is that Nvidia sold their physics engine to be used, nothing else.
LOL so true.Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The FX5800 dustbuster?
http://www.youtube.com/watch?v=WOVjZqC1AE4
See, this is the part that most people don't seem to understand. PhysX is just a software toolkit that will be compatible with any API. These are the layers of FUD that need to be constantly peeled away from those trying to detract from the significance or future of PhysX. PhysX and CUDA will be compatible with OpenCL and DirectX 11. That means Nvidia hardware will be compatible with OpenCL and DirectX 11. That also means Nvidia hardware will be compatible with any other physics SDK. The same will most likely be true of ATI hardware, as long as it is compatible with OpenCL and DirectX 11.

Originally posted by: akugami
I am intrigued by physics acceleration, but when ATI, which owns a very sizable chunk of the market, decides (for whatever reasons) not to support PhysX, one has to wonder about the long-term viability of PhysX and whether it might be supplanted by a solution from a third party such as Microsoft. That solution might not be as good as a proprietary one from either ATI or nVidia, but at least it'll work everywhere, much like DirectX does now.
Well, this is getting OT, but I think this is more a case of MS screwing up. Later revisions of the Xbox 360 hardware with beefier cooling had fewer problems with the RROD. I think the original Xbox 360 had engineering problems unrelated to (and not caused by) anything ATI did. MS tried to cut corners and it bit them in the rear.
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The FX5800 dustbuster?
Originally posted by: chizow
Actually, from your comments here and in previous threads, that's highly questionable. It looks like you're pushing AMD's agenda: against PhysX but not physics.... So now that your earlier claims have been corrected and it's been shown that PhysX is widely adopted on both the PC and consoles, you've shifted your focus to accelerated vs. CPU PhysX? Hardware physics is here for good, folks; it doesn't much matter what brand, since highly parallel processors (GPUs) make effects never possible before, possible.

Originally posted by: MarcVenice
All of you should take a deep breath, and rethink this.
No-one here is against physx, some of us are just skeptical about it.
How can you claim this, knowing Mirror's Edge falls directly in this vein? It's a console port with additional GPU-accelerated PhysX effects, which were directly cited as the reason it was delayed. So yes, additional PhysX effects will need to be added for the PC because consoles will not be able to run them; however, the PhysX SDK offers seamless integration of additional effects since it is the same toolset.

Originally posted by: MarcVenice
I don't think you get it. I've read all of those, I've played all of those games, I've talked to developers, and more. I review games, remember; I dare say I know twice as much about games as you do. Studios using PhysX doesn't necessarily mean GPU-accelerated PhysX. Which is the problem: we've seen PhysX and Havok in plenty of games, but not on the scale we see PhysX in those few special maps UT3 has, or in the tech demos Nvidia gives us.
So, take another deep breath, and try to wrap your mind around this. Games are developed for consoles; consoles can't run GPU- or PPU-accelerated PhysX; thus games made for the consoles won't have PhysX effects that take advantage of Nvidia's GPUs through CUDA. Since most games are developed for the consoles, PC games won't see many PhysX effects that Nvidia GPU owners can take advantage of.
Originally posted by: chizow
See, this is the part that most people don't seem to understand. PhysX is just a software toolkit that will be compatible with any API. These are the layers of FUD that need to be constantly peeled away from those trying to detract from the significance or future of PhysX. PhysX and CUDA will be compatible with OpenCL and DirectX 11. That means Nvidia hardware will be compatible with OpenCL and DirectX 11. That also means Nvidia hardware will be compatible with any other physics SDK. The same will most likely be true of ATI hardware, as long as it is compatible with OpenCL and DirectX 11.

Originally posted by: akugami
I am intrigued by physics acceleration, but when ATI, which owns a very sizable chunk of the market, decides (for whatever reasons) not to support PhysX, one has to wonder about the long-term viability of PhysX and whether it might be supplanted by a solution from a third party such as Microsoft. That solution might not be as good as a proprietary one from either ATI or nVidia, but at least it'll work everywhere, much like DirectX does now.
Simple questions for you here: do you have a Sound Blaster card? Do you have a DX10 video card? Do they offer a tangible benefit over lower standards? How do these standards co-exist and offer benefits when not all hardware supports them? PhysX should be viewed no differently, and when you compare the potential market share and install base, you'll see all indicators favor PhysX's chances of survival.
Well, this is getting OT, but I think this is more a case of MS screwing up. Later revisions of the Xbox 360 hardware with beefier cooling had fewer problems with the RROD. I think the original Xbox 360 had engineering problems unrelated to (and not caused by) anything ATI did. MS tried to cut corners and it bit them in the rear.
Xenon = 90nm CPU, 90nm GPU. Reported failure rates of at least 30%.
Falcon = 65nm CPU, 80nm GPU, better GPU heatsink and glue. Reported failure rates still as high as 20%.
Jasper = 65nm CPU, 65nm GPU. Results? Most seem to think much lower failure rates.
MS is certainly at fault, but do you think they're doing cartwheels over ATI's lil fireball of a GPU design?
^^ABOVE IS A PAID ADVERTISEMENT.^^

Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The FX5800 dustbuster?
I thought this thread was about PhysX, not six year old video cards.
The PhysX revolution is gaining momentum, Creig, and there's only one place to get it. The EA, THQ, 2K Games, etc. announcements of today are the launching point for games with higher levels of immersion for those of us with NVIDIA graphics cards tomorrow.
With the launch of every game, and the addition of every developer, PhysX is becoming more difficult for card buyers to ignore.
Originally posted by: cmdrdredd
The gameplay on the Wii is for kids...really. I'd never be caught dead playing that horrible piece of crap.
Look at game sales; they're pitiful. The system sells, people buy one game, realize they wasted their money, and then they buy a 360 and buy 3 games.