THQ chooses Nvidia's PhysX technology for better gaming


thilanliyan

Lifer
Jun 21, 2005
Originally posted by: chizow
MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?

You keep saying this but do you have any actual numbers of how much power the Xenos GPU consumes?
 

Qbah

Diamond Member
Oct 18, 2005
I'd say until both ATi and nVidia support something, it won't be widespread (it's the minimum requirement). Because a developer won't create a game that will run on half of the cards on the market only. Now, PhysX adding stuff (like in Mirror's Edge) is nice, but I don't think such a thing will be available in many games anyway. As long as a game is nice and enjoyable without PhysX, why would a developer spend extra time and money on adding anything extra that can be run on selected cards only? The game is fun anyway without it.

Now, I *do* like the things that can be seen on that Mirror's Edge for example. But that's not enough of a reason to change a card, if my old one runs the game as fast but doesn't get those few extra things in a game or two only. Now, if I was buying a new card, sure, that'd be a thing to consider, cause both companies have similarly performing cards and they cost the same (in the US :p), so why not buy the one that has more things? Though you also consider past experience with a company, noise and warranty probably.

Bottom line is, unless something gets picked up by both sides, it just won't fly. People will scream their lungs out saying that this is the future and whatnot, but unless most people can use it, it just won't last.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: chizow

Yes I've provided evidence from multiple reliable sources and publications that have come to the same conclusion, that the excessive heat from ATI designed GPUs was the primary cause of RROD.
Actually it appears Microsoft are to blame. They used ATi's core design but they added extras and oversaw the manufacturing process instead of ATi:

http://www.reghardware.co.uk/2008/06/11/ms_xbox_gp/

The Xbox 360's infamous Red Ring of Death problem was the price Microsoft paid for attempting to save money by designing its own graphics chip for the console rather than buy one off a specialist supplier, it has been claimed.

When the 360 was finally released and able to be taken apart, pundits found both the console's GPU and CPU stamped with Microsoft logos. Essentially, Microsoft took ATI's design, added some extra bits and sent it out to manufacturing.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: LOUISSSSS
Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.

The FX5800 dustbuster?

:confused:

I thought this thread was about PhysX, not six year old video cards.

The PhysX revolution is gaining momentum Creig and there's only one place to get it. The EA, THQ, 2KGames, etc. announcements of today are the launching point of games with higher levels of immersion for those of us with NVIDIA graphics cards tomorrow.

With the launch of every game, and the addition of every developer, PhysX is becoming more difficult for card buyers to ignore.
^^ABOVE IS A PAID ADVERTISEMENT.^^

physx did NOT affect my buying decision in my last buying decision, nor will it affect my buying decision in the coming months. what really affects people's buying decisions, besides framerates and the ability to perform at high res/high settings, is drivers. the 8800gt was my first shot at nvidia since 2004ish and i've gotten a ratio of about 5:1 beta/leaked drivers to official drivers. (oh and fewer nv4_disp BSODs would be nice.)

again, physx will not affect my next buying decision. right now all i see with this is some nice explosions that i can live without.

There are literally dozens of games in development now that will use PhysX for more realistic, less static environments.

Many will be released this year, at least two within the next 60 days.

Given how close the performance is between ATi and NVIDIA cards, just why would people choose to not have the option to use PhysX?



 

Mem

Lifer
Apr 23, 2000
^^ABOVE IS A PAID ADVERTISEMENT.^^

physx did NOT affect my buying decision in my last buying decision, nor will it affect my buying decision in the coming months.

I have to agree. When I buy a game it has nothing to do with whether it has PhysX or not; besides, you could argue gamers are more concerned about DRM on their games/software than PhysX.

I can say if I like a game then I buy it, end of story; PhysX has nothing to do with my decision.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^

physx did NOT affect my buying decision in my last buying decision, nor will it affect my buying decision in the coming months.

I have to agree ,when I buy a game it has nothing to do with if it has PhysX or not,besides you could argue gamers are more concerned about DRM on their game/software then PhysX.

I can say if I like a game then I buy it end of story,PhysX has nothing to do with my decision.

Why would PhysX have anything to do with your buying decisions when you don't own a card that can run PhysX, according to your sig?

Also- no one is saying you should buy games to see PhysX effects, at least I'm not. Heh- I'd never play a game just to see some better effects, unless it was free. (and then I wouldn't spend much time)

The point is that the games you would buy anyway are better with PhysX than without.

For example, I would buy Cryostasis no matter what- I like survival horror games like Doom3 and Dead Space.

Same with UT3. I love the UT games, just playing the bots. So for me it's better to have the option to play those PhysX levels with a bunch of eye candy added than not.

So while Nurien has PhysX, I won't buy it for that, not my thing.

 

Mem

Lifer
Apr 23, 2000
Originally posted by: nRollo
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^

physx did NOT affect my buying decision in my last buying decision, nor will it affect my buying decision in the coming months.

I have to agree ,when I buy a game it has nothing to do with if it has PhysX or not,besides you could argue gamers are more concerned about DRM on their game/software then PhysX.

I can say if I like a game then I buy it end of story,PhysX has nothing to do with my decision.

Why would PhysX have anything to do with your buying decisions when you don't own a card that can run PhysX, according to your sig?

Also- no one is saying you should buy games to see PhysX effects, at least I'm not. Heh- I'd never play a game just to see some better effects, unless it was free. (and then I wouldn't spend much time)

The point is that the games you would buy anyway are better with PhysX than without.

For example, I would buy Cryostasis no matter what- I like survival horror games like Doom3 and Dead Space.

Same with UT3. I love the UT games, just playing the bots. So for me it's better to have the option to play those PhysX levels with a bunch of eye candy added than not.

So while Nurien has PhysX, I won't buy it for that, not my thing.

Actually I own a few Nvidia cards and more than one PC, but that's beside the point. What I'm saying is PhysX may or may not make games better; either way it's not a factor when I buy a game.

THQ and other companies can choose PhysX or anything else; again, it has no bearing on buying games for most gamers at this time. When and if PhysX becomes the standard for ALL, then that's a different story.
We buy games to enjoy, not because there's a big Nvidia logo or PhysX writing on the box.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: 450R
Has anyone seen the Mirror's Edge PhysX trailer?

There's nothing meaningful in that video. Minor visual differences and absolutely zero impact on gameplay. And this is supposed to showcase PhysX?

If you watch the side by side, it's a little more obvious what the differences are

Flags, drapes, glass breaking, the water fountain, destructible environments: this is all stuff that makes the PhysX version better.

We've been playing with static, unrealistic environments long enough IMO.

In the world I live in, almost nothing is impervious. In the worlds I play in, almost everything is.

That's not "virtual reality" and anything that brings us closer to it is by definition good.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: Mem
Originally posted by: nRollo
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^

physx did NOT affect my buying decision in my last buying decision, nor will it affect my buying decision in the coming months.

I have to agree ,when I buy a game it has nothing to do with if it has PhysX or not,besides you could argue gamers are more concerned about DRM on their game/software then PhysX.

I can say if I like a game then I buy it end of story,PhysX has nothing to do with my decision.

Why would PhysX have anything to do with your buying decisions when you don't own a card that can run PhysX, according to your sig?

Also- no one is saying you should buy games to see PhysX effects, at least I'm not. Heh- I'd never play a game just to see some better effects, unless it was free. (and then I wouldn't spend much time)

The point is that the games you would buy anyway are better with PhysX than without.

For example, I would buy Cryostasis no matter what- I like survival horror games like Doom3 and Dead Space.

Same with UT3. I love the UT games, just playing the bots. So for me it's better to have the option to play those PhysX levels with a bunch of eye candy added than not.

So while Nurien has PhysX, I won't buy it for that, not my thing.

Actually I own a few Nvidia cards and more then one PC but thats besides the point,what I'm saying is PhysX may or may not be better with games,eitherway its not a factor when I buy a game.
Isn't that exactly what I said? You buy the games you like, but those games are better with PhysX?

Originally posted by: Mem
THQ and other companies can choose PhysX or anything else,again it has no bearing on buying games for most gamers at this time,when and if PhysX becomes the standard for ALL then thats a different story.

What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.

For example, remember the early days of Glide? Not many games at first, but if you were a gamer, you bought a 3dfx card because that was the only way you could run Glide.

Or SLi before Crossfire existed/worked. People bought it to run higher resolutions and levels of AA/AF, and it didn't matter at all ATi didn't have it.

People will buy what gives them the best gaming experience, and they don't really care who sells it to them. (at least they shouldn't)

Originally posted by: Mem
We buy games to enjoy not because it has a big nvidia logo or PhysX wrighting on the box.

Again, this is exactly what I said. You should buy the games you like, but you're better off if those games have PhysX effects in them than not.

 

Mem

Lifer
Apr 23, 2000
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.


Most gamers I know hate to choose sides and like to use ATi as well as Nvidia cards; I know I do. As a gamer I would hate to be forced onto one side. A standard for ALL is an important factor for a lot of gamers; obviously all those die-hard and paid Nvidia users don't see that.

Gamers like choices without restrictions; a standard for all is important.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: Mem
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.


Most gamers I know hate to choose sides and like to use ATi as well as Nvidia cards,I know I do,as a gamer I would hate to be forced to go one side,standard for ALL is an important factor for a lot of gamers,obviously all those die hard and paid Nvidia users don't see that.

Gamers like choices without restrictions,standard for all is important.

There are no "sides", only products with features to choose.

What exactly does a person buying a card that supports PhysX "lose"? If they don't lose anything, what does it matter if they only have one choice that offers a feature?

Many of the big development firms are signing up for PhysX, and there are many titles coming.

If Ageia had put together this kind of support for PhysX, there would be an Ageia add on card in my box.

What would you say about PhysX if neither NVIDIA nor ATi offered it? If it was still a stand-alone card from Ageia?

If you still think "It's bad, the only good features are ones everyone has so everyone has an equal shot at your dollars" we're going to have to agree to disagree.

Here's a good analogy:

If GM invents a truck suspension that makes the truck ride 300% smoother, I'm not going to post on the truck forums "This is crap! I won't support this until Ford and Chrysler have it too, I don't want to be limited to only GM trucks!".

Market differentiation is what business is about, and as end users, the gaming experience should be all that matters to us.

 

LOUISSSSS

Diamond Member
Dec 5, 2005
Originally posted by: cusideabelincoln
Originally posted by: cmdrdredd
The gameplay on Wii is for kids...really. I'd never be caught dead playing that horrible piece of crap.

Look at games sales, they're pitiful. The system sells, people buy one game and realize they wasted their money and then they buy a 360 and buy 3 games.

Your opinions, all of them, are now worthless. Although I never really put that much worth in them to begin with.

why r they worthless? i agree with him. i'm not sure of game sales, but i know i'd never play the wii. its so childish. but i love to play my ps3 on a daily basis
 

LOUISSSSS

Diamond Member
Dec 5, 2005
Originally posted by: nRollo
Originally posted by: Mem
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.


Most gamers I know hate to choose sides and like to use ATi as well as Nvidia cards,I know I do,as a gamer I would hate to be forced to go one side,standard for ALL is an important factor for a lot of gamers,obviously all those die hard and paid Nvidia users don't see that.

Gamers like choices without restrictions,standard for all is important.

There are no "sides", only products with features to choose.

What exactly does a person buying a card that supports PhysX "lose"? If they don't lose anything, what does it matter if they only have one choice that offers a feature?

Many of the big development firms are signing up for PhysX, and there are many titles coming.

If Ageia had put together this kind of support for PhysX, there would be an Ageia add on card in my box.

What would you say about PhysX if neither NVIDIA nor ATi offered it? If it was still a stand-alone card from Ageia?


If you still think "It's bad, the only good features are ones everyone has so everyone has an equal shot at your dollars" we're going to have to agree to disagree.

Here's a good analogy:

If GM invents a truck suspension that makes the truck ride 300% smoother, I'm not going to post on the truck forums "This is crap! I won't support this until Ford and Chrysler have it too, I don't want to be limited to only GM trucks!".

Market differentiation is what business is about, and as end users, the gaming experience should be all that matters to us.


i still wouldn't have it. i didn't buy it when physx WAS a standalone card, and i won't buy it now. there actually is a cost to it: nvidia R&D will now have to devote even less time to their crappy drivers and put some more into developing that crap.
BTW, a standalone NIC provides better gameplay too, why dont u have one? if i were you i'd get one soon...

 

Qbah

Diamond Member
Oct 18, 2005
Buying a card with extra features that may or may not be used, because they're not a universal standard, is not the problem. Recommending a card because it has PhysX is the problem: recommending it because of something that is only supported by nVidia and is not widely used. See the clear distinction? You can buy Radeons cheaper where I live. They are as fast and as reliable (more so, imo) as nVidia cards. And cheaper. But we're being told to buy nVidia cards because they can do PhysX, something hardly used right now. And something hardly game-defining, judging from the previews we've got so far.
 

450R

Senior member
Feb 22, 2005
Originally posted by: nRollo
If you watch the side by side, it's a little more obvious what the differences are

Flags, drapes, glass breaking, the water fountain, destructible environments: this is all stuff that makes the PhysX version better.

We've been playing with static, unrealistic environments long enough IMO.

In the world I live in, almost nothing is impervious. In the worlds I play in, almost everything is.

That's not "virtual reality" and anything that brings us closer to it is by definition good.

I've seen the side-by-side. There isn't that much of a difference, and none of it applies to gameplay! It's just eye-candy fluff and hardly makes any progress towards a 'realistic environment'. I don't think I would even notice some of the stuff while I'm playing.

I don't know what games you've been playing but there are plenty of non-static game environments out there that don't use PhysX.

This thread is going down the same drain that the majority of video threads do: e-penis wars over useless shit.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: LOUISSSSS

i still wouldn't have it. i didn't buy it when physx WAS a standalone card, and i won't buy it now. there actually is a cost to it: nvidia R&D will now have to devote even less time to their crappy drivers and put some more into developing that crap.

There wasn't much reason to buy a stand alone PhysX card back then- no dev support.

This is the second time in this thread you've tried to derail it with assertions about NVIDIA drivers, so let's discuss that topic here: http://forums.anandtech.com/me...=2259746&enterthread=y and not violate TOS by posting off topic!

Originally posted by: LOUISSSSS
BTW, a NIC provides better gameplay from a standalone card, why dont u have one? if i were you i'd get one soon...

I don't know what NICs have to do with PhysX either, but if I was an online gamer I'd own one if what you say is true.

In any case, so we've gotten down to "I won't support or buy PhysX until ATi has it" for you, and that's fine.

That's where we differ though, because if ATi made a stand alone PhysX card, it would be in my box.


 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: thilan29
Originally posted by: chizow
MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?

You keep saying this but do you have any actual numbers of how much power the Xenos GPU consumes?
Again from the AT article:
The first Xbox 360 (Xenon) needs a 203W power supply. Falcon needs a 175W power supply but can also work with the 203W unit (it just doesn't need to draw that much power so the 203W unit is overkill, but it'll work). Jasper needs 150W but can work with a 203W and a 175W.

Xbox 360 Revision   System Off   Idle     Halo 3   Rock Band 2   Gears of War 2   BioShock Demo
Xenon               2.3W         155.7W   177.8W   167.7W        177.1W           172W
Falcon              2.8W         101.4W   121.2W   112.8W        121.5W           115.5W
Jasper              2.0W         93.7W    105.9W   101.0W        105.9W           98.1W
Feel free to parse that yourself if you're really interested. Power draw was cut 30-50% between revisions and almost 20% under load just from shrinking the GPU.
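If you do want to parse it, the reductions implied by those numbers can be checked directly; a quick illustrative Python snippet (the wattage figures are copied from the AnandTech table quoted above, everything else here is just for the calculation):

```python
# Wattage figures as quoted from the AnandTech Xbox 360 power-draw table.
readings = {
    "Xenon":  {"Idle": 155.7, "Halo 3": 177.8, "Rock Band 2": 167.7,
               "Gears of War 2": 177.1, "BioShock Demo": 172.0},
    "Falcon": {"Idle": 101.4, "Halo 3": 121.2, "Rock Band 2": 112.8,
               "Gears of War 2": 121.5, "BioShock Demo": 115.5},
    "Jasper": {"Idle": 93.7,  "Halo 3": 105.9, "Rock Band 2": 101.0,
               "Gears of War 2": 105.9, "BioShock Demo": 98.1},
}

def pct_drop(old_w, new_w):
    """Percentage reduction in power draw going from old_w to new_w."""
    return 100.0 * (old_w - new_w) / old_w

for workload in readings["Xenon"]:
    x_to_f = pct_drop(readings["Xenon"][workload], readings["Falcon"][workload])
    f_to_j = pct_drop(readings["Falcon"][workload], readings["Jasper"][workload])
    print(f"{workload}: Xenon->Falcon {x_to_f:.1f}%, Falcon->Jasper {f_to_j:.1f}%")
```

Running this shows roughly a 31-35% drop across workloads from Xenon to Falcon, and a further 7-15% from Falcon to Jasper.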

Originally posted by: BFG10K
Originally posted by: chizow

Yes I've provided evidence from multiple reliable sources and publications that have come to the same conclusion, that the excessive heat from ATI designed GPUs was the primary cause of RROD.
Actually it appears Microsoft are to blame. They used ATi's core design but they added extras and oversaw the manufacturing process instead of ATi:

http://www.reghardware.co.uk/2008/06/11/ms_xbox_gp/

The Xbox 360's infamous Red Ring of Death problem was the price Microsoft paid for attempting to save money by designing its own graphics chip for the console rather than buy one off a specialist supplier, it has been claimed.

When the 360 was finally released and able to be taken apart, pundits found both the console's GPU and CPU stamped with Microsoft logos. Essentially, Microsoft took ATI's design, added some extra bits and sent it out to manufacturing.
The first quote is by an analyst from Gartner and clearly inaccurate. MS commissioned ATI to design the chip; the only bits they added were their logo, because they bought the design, probably in an attempt to avoid fixed licensing and pricing issues like with the original Xbox. You really think MS, with its limited hardware experience, was able to design the world's first unified shader GPU? So yes, Microsoft assumed culpability as they technically owned the chip, but they surely did not design it.

 

chizow

Diamond Member
Jun 26, 2001
Glanced over and jumped around quite a bit....

Originally posted by: akugami
Well, the issue remains IMHO regarding PhysX. I think GPU PhysX will be stifled without ATI support and ATI has no reason to support it. CPU PhysX will help a bit with some extra debris or some clothes flapping but aside from a little extra visual appeal...what does it do for changing the way we play games? Even Mirrors Edge which will make use of PhysX seems to be the particle effect / clothes flapping variety.
Again, do you have a DX10 part + Vista or a Sound Blaster? Is DX10 being stifled despite 70-80% of gaming machines not supporting DX10? Is EAX being stifled despite 10-20% market share of Creative Sound Cards?

As a developer, do you heavily code for PhysX and potentially change the way your game is played for a sizable chunk of the market? At this point I'd say no, unless you get some of your development costs subsidized. If it's merely software PhysX I think the changes are not going to be innovative and merely slightly more eye candy.
And even if it is just eye candy, it's going to be more visually stunning than anything offered on the PC since DX9. And yes, developers have shown time and time again that they are willing to put in the latest features if that will make their game better and differentiate it from the competition.

Perhaps I'm seeing PhysX wrong but why code for a different physics SDK that will then make calls to the PhysX or ATI API's when MS can dictate a set of standardized physics calls that will be part of (or a subset of) DX11. ATI and nVidia will provide drivers for their respective video cards much like how DirectX works now with normal 3D graphics acceleration.
Again, you need to read up more on the cross-portability of all the standards being discussed. Or just read my summary again. All this tin foil hat uncertainty and disinformation isn't doing anyone any good.

I do not see any physics acceleration truly taking off until we see the launch of $100-150 video cards that can be found in the like of Dells and HP's that provide not only good 3D accelerated graphics but provide good physics acceleration as well.
Like 9800GTs or $50 9600GSOs? Even the GTX 260 is falling close to that upper $150 range. The Steam survey shows some 60%+ of users are still on 1280x1024 or lower for their primary display, and in that case a 9800GT should still be plenty for both GPU and PhysX acceleration.

MS makes sense due to their OS domination as well as DirectX. Intel makes sense because of their entering the enthusiast GPU market. Intel is, even if not on the high end, a major player in the video chipset market.
MS will provide the standard, but that will not eliminate the need for front-end SDKs and toolkits. And Intel? They don't even have a GPU; their answer for highly parallel workloads is going to be 20-something Pentium 1 class CPUs in Larrabee... they don't even mention it anymore in their discussions of high-end graphics, and both Nvidia and AMD have completely dismissed them as competition in the discrete GPU market.

A surface scan of the HSF was registered maxing out at 150F for earlier Xbox's while on the Falcon revision it was at 110F, meaning the new HSF was doing a much better job of keeping things cooled.
Have a link to that? Genuinely curious.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: MarcVenice
I've never doubted physx's adoption, I've been installing games for years that needed physx to be installed as a software solution. I've been eyeballing that techdemo, not sure what it's called anymore, the free game, to showcase what the PPU could do. I was excited, till I found out no1 had a ppu, and no other games supported the same kind of physx the showcase game had.
I think all of your comments to date on the subject would dispute that, whether intentional or not. As for comparisons to the PPU, clearly you can make the distinction that PhysX is well beyond that, right? I mean, an expensive $100-200 add-in card supported by 3-5 games that few bought vs. 100 million+ Nvidia GPUs almost overnight.

What I doubt is the adoption of non-software physx that actually mean something, like we will soon be seeing in Mirror's Edge, because so many games, like mirror's edge are developed for the consoles, and thus need extra developing time if they are to have extra physx, that can take advantage of nvidia gpu's. Which is exactly what I have been saying. I know what physx can do, and how it can be used, and that it's almost seamless. Adding those things doesn't happen for free though.
Yes, and that's certainly going to be a concern if titles are PC ports, but it's also going to be less of a concern if devs have access to the SDK from the start of development. I'm quite sure older games with software back-end solvers will not be retrofitted with PhysX; that just won't happen given the nature of the gaming industry. But if the tools are accessible at the start, the difference in cost to implement GPU PhysX will be much more palatable. Still, in the big picture you have to realize the next generation of consoles will most likely integrate hardware physics acceleration, so the PC makes the perfect test platform and devs would benefit from the experience now.

So really, saying I'm pushing AMD's agenda is laughable, when one does not agree with you, it doesn't instantly mean they are a fanboy of the opposite side. If AMD could run physx, through cuda or whatever else, and the consoles could too, I bet a dozen developers would jump on it, and use it to innovate their games. But imo, it's not going to happen right now. Maybe with next-gen consoles, a standardized physics api, who knows.
I didn't mean you were pushing AMD's agenda knowingly, just that your viewpoint is the same as what AMD is pushing: PhysX is evil because 100% of the market doesn't have access to it, but hardware physics will be great once DX11 gives it to everyone for free. My problem is that AMD is saying this for very different reasons than the ones they give in public, and ultimately it will hurt the adoption rate of hardware physics in the short term, even when they know it's something they want going forward.

So, why should I be excited again about this deal? Oh woopdiedoo, THQ is going to implement software emulated physx, like tiny ragdoll effects. Wait, I've had those for years. /end sarcasm
Actually I'd say this means they're looking at hardware PhysX as many of their studios (Relic) have been using software physics for years (Havok). While this announcement on its own isn't all too exciting, taken with all of the other recent announcements (2k/EA), it clearly shows devs and major publishers are interested in the technology.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: 450R
Has anyone seen the Mirror's Edge PhysX trailer?

There's nothing meaningful in that video. Minor visual differences and absolutely zero impact on gameplay. And this is supposed to showcase PhysX?
Yes I'm sure most here are familiar with that video, but here's a better illustration of the differences:
Mirror's Edge Side-by-Side

While you can claim minor visual differences and absolutely zero impact on gameplay, there is no question that the PhysX version of the game will be significantly better than the one without.

There are entire websites, like TweakGuides, that give excellent tips on how to improve IQ, and still, applying all of them for a particular game would pale in comparison to the enhancements provided by PhysX.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: chizow
Xbox 360 Revision   System Off   Idle     Halo 3   Rock Band 2   Gears of War 2   BioShock Demo
Xenon               2.3W         155.7W   177.8W   167.7W        177.1W           172W
Falcon              2.8W         101.4W   121.2W   112.8W        121.5W           115.5W
Jasper              2.0W         93.7W    105.9W   101.0W        105.9W           98.1W

Feel free to parse that yourself if you're really interested. Power draw was cut 30-50% between revisions and almost 20% under load just from shrinking the GPU.

Jasper was a GPU + memory shrink so it wasn't all due to the GPU:
http://gear.ign.com/articles/826/826652p1.html
And you have no idea what percent of the total load is from the GPU so you have no idea how hot it actually runs.

Also, that still doesn't answer who was actually responsible for the inadequate cooling. Obviously MS approved the ATI design. There's no way MS would have used their design if they didn't think they could cool it.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: thilan29
Jasper was a GPU + memory shrink so it wasn't all due to the GPU:
http://gear.ign.com/articles/826/826652p1.html
And you have no idea what percent of the total load is from the GPU so you have no idea how hot it actually runs (ie. you're making "fireball" claims without proof).

Also, that still doesn't answer who was actually responsible for the inadequate cooling. Obviously MS approved the ATI design. There's no way MS would have used their design if they didn't think they could cool it.
Well it's obvious you're not interested enough to do your own research, so this is the last time I'm going to feed it to you, since this is already OT. The move to Jasper was only a GPU shrink out of those three components; the shrinks to the CPU and eDRAM already came with the move from Xenon to Falcon.

AT article
360 Revision   CPU    GPU    eDRAM
Xenon          90nm   90nm   90nm
Falcon         65nm   80nm   80nm
Jasper         65nm   65nm   80nm

While inadequate cooling certainly contributed to the issue, the cause is still undoubtedly the excessively hot GPU, as a cooler-running GPU might not have melted the solder and physically separated the chip from the board. This is all well documented in the links I provided, which also give the failure rates, as techs in the RMA departments described what they were seeing with RROD machines.

Now compare this to the Nvidia problems. Are you assigning blame to the notebook makers, or the GPU maker?
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: cusideabelincoln
Originally posted by: cmdrdredd
The gameplay on Wii is for kids...really. I'd never be caught dead playing that horrible piece of crap.

Look at games sales, they're pitiful. The system sells, people buy one game and realize they wasted their money and then they buy a 360 and buy 3 games.

Your opinions, all of them, are now worthless. Although I never really put that much worth in them to begin with.

I'm glad an opinion makes you so vile and works you up so feverishly. I mean wow...good job.

Seriously, the Wii was NOT marketed to gamers. It's marketed to kids and non-gamers. Those who wouldn't play GTA, Quake Wars, Half-Life, Halo, and just about every other game that has been really popular in the last year or two. There is a very high percentage of gamers, the ones who are building PCs for gaming, who would rather have a 360 or a PS3 than a Wii. I am one of those people. You have to understand, the Wii by my standards is old technology. It doesn't use my TV's native HD resolution, and it doesn't have any of the games I want like Gears of War 2, GTA, Metal Gear Solid, Fallout 3, or Left 4 Dead. Its online play is a joke. So I don't want it, and it is my firm opinion that waving my arms around for some "virtual sports" game is a pathetic way to spend time. If I want to play a game I'll sit down, grab a drink and relax. If I want to be active, I'll go out and play football or something.

From my perspective, games should always move forward graphically. If a game released today could have been done 5 years ago, I wonder what the point is. That's how I am. That's why I think PhysX is a good idea. It offers visual immersion that you don't have with any other API at the current time. Havok, sure, but since PhysX is GPU accelerated it offers the ability to do more than Havok. I don't think PhysX is necessary for a good game, but then I still think The Legend of Zelda is a good game with 2D sprites.
 

Andrew1990

Banned
Mar 8, 2008
I don't know about you guys, but that Mirror's Edge PhysX video looked pretty cool. Since I already bought a PhysX-enabled card, I'm glad some games are supporting the feature. I didn't buy the card for PhysX, but if game developers do throw a few extras in for us, I for one am glad.