Originally posted by: chizow
MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?
You keep saying this but do you have any actual numbers on how much power the Xenos GPU consumes?
Originally posted by: chizow
MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?
Originally posted by: chizow
Yes I've provided evidence from multiple reliable sources and publications that have come to the same conclusion: that the excessive heat from ATI-designed GPUs was the primary cause of RROD.
Actually it appears Microsoft are to blame. They used ATi's core design, but they added extras and oversaw the manufacturing process instead of ATi:
The Xbox 360's infamous Red Ring of Death problem was the price Microsoft paid for attempting to save money by designing its own graphics chip for the console rather than buy one off a specialist supplier, it has been claimed.
When the 360 was finally released and able to be taken apart, pundits found both the console's GPU and CPU stamped with Microsoft logos. Essentially, Microsoft took ATI's design, added some extra bits and sent it out to manufacturing.
Originally posted by: LOUISSSSS
^^ABOVE IS A PAID ADVERTISEMENT.^^
Originally posted by: nRollo
Originally posted by: Creig
Originally posted by: chizow
Yes I'm sure you figured someone would mention it, yet you chose to ignore the single biggest debacle in consumer electronics history.
The FX5800 dustbuster?
😕
I thought this thread was about PhysX, not six year old video cards.
The PhysX revolution is gaining momentum, Creig, and there's only one place to get it. The EA, THQ, 2K Games, etc. announcements of today are the launching point of games with higher levels of immersion for those of us with NVIDIA graphics cards tomorrow.
With the launch of every game, and the addition of every developer, PhysX is becoming more difficult for card buyers to ignore.
physx did NOT affect my last buying decision, nor will it affect my buying decisions in the coming months. what really affects people's buying decisions, besides framerates and the ability to perform at high res/high settings, is drivers. the 8800gt was my first shot at nvidia since 2004ish and i've gotten a ratio of about 5:1 beta/leaked drivers to official drivers. (oh and fewer nv4_disp BSODs would be nice.)
again, physx will not affect my next buying decision. right now all i see with this is some nice explosions that i can live without.
^^ABOVE IS A PAID ADVERTISEMENT.^^
physx did NOT affect my last buying decision, nor will it affect my buying decisions in the coming months.
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^
physx did NOT affect my last buying decision, nor will it affect my buying decisions in the coming months.
I have to agree, when I buy a game it has nothing to do with whether it has PhysX or not; besides, you could argue gamers are more concerned about DRM on their games/software than PhysX.
I can say if I like a game then I buy it, end of story; PhysX has nothing to do with my decision.
Originally posted by: nRollo
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^
physx did NOT affect my last buying decision, nor will it affect my buying decisions in the coming months.
I have to agree, when I buy a game it has nothing to do with whether it has PhysX or not; besides, you could argue gamers are more concerned about DRM on their games/software than PhysX.
I can say if I like a game then I buy it, end of story; PhysX has nothing to do with my decision.
Why would PhysX have anything to do with your buying decisions when you don't own a card that can run PhysX, according to your sig?
Also- no one is saying you should buy games to see PhysX effects, at least I'm not. Heh- I'd never play a game just to see some better effects, unless it was free. (and then I wouldn't spend much time)
The point is that the games you would buy anyway are better with PhysX than without.
For example, I would buy Cryostasis no matter what- I like survival horror games like Doom3 and Dead Space.
Same with UT3. I love the UT games, just playing the bots. So for me it's better to have the option to play those PhysX levels with a bunch of eye candy added than not.
So while Nurien has PhysX, I won't buy it for that, not my thing.
Originally posted by: 450R
Has anyone seen the Mirror's Edge PhysX trailer?
There's nothing meaningful in that video. Minor visual differences and absolutely zero impact on gameplay. And this is supposed to showcase PhysX?
Isn't that exactly what I said? You buy the games you like, but those games are better with PhysX?
Originally posted by: Mem
Originally posted by: nRollo
Originally posted by: Mem
^^ABOVE IS A PAID ADVERTISEMENT.^^
physx did NOT affect my last buying decision, nor will it affect my buying decisions in the coming months.
I have to agree, when I buy a game it has nothing to do with whether it has PhysX or not; besides, you could argue gamers are more concerned about DRM on their games/software than PhysX.
I can say if I like a game then I buy it, end of story; PhysX has nothing to do with my decision.
Why would PhysX have anything to do with your buying decisions when you don't own a card that can run PhysX, according to your sig?
Also- no one is saying you should buy games to see PhysX effects, at least I'm not. Heh- I'd never play a game just to see some better effects, unless it was free. (and then I wouldn't spend much time)
The point is that the games you would buy anyway are better with PhysX than without.
For example, I would buy Cryostasis no matter what- I like survival horror games like Doom3 and Dead Space.
Same with UT3. I love the UT games, just playing the bots. So for me it's better to have the option to play those PhysX levels with a bunch of eye candy added than not.
So while Nurien has PhysX, I won't buy it for that, not my thing.
Actually I own a few Nvidia cards and more than one PC, but that's beside the point. What I'm saying is PhysX may or may not make games better; either way it's not a factor when I buy a game.
Originally posted by: Mem
THQ and other companies can choose PhysX or anything else; again, it has no bearing on buying games for most gamers at this time. When and if PhysX becomes the standard for ALL, then that's a different story.
Originally posted by: Mem
We buy games to enjoy them, not because they have a big Nvidia logo or PhysX writing on the box.
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.
Originally posted by: Mem
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.
Most gamers I know hate to choose sides and like to use ATi as well as Nvidia cards; I know I do. As a gamer I would hate to be forced onto one side. A standard for ALL is an important factor for a lot of gamers; obviously all those die-hard and paid Nvidia users don't see that.
Gamers like choices without restrictions; a standard for all is important.
Originally posted by: cusideabelincoln
Originally posted by: cmdrdredd
The gameplay on Wii is for kids...really. I'd never be caught dead playing that horrible piece of crap.
Look at game sales; they're pitiful. The system sells, people buy one game and realize they wasted their money, and then they buy a 360 and buy 3 games.
Your opinions, all of them, are now worthless. Although I never really put that much worth in them to begin with.
Originally posted by: nRollo
Originally posted by: Mem
What does whether it's a standard "for ALL" have to do with anything? If there are good effects and higher levels of immersion in games people want to play with a specific brand of hardware, people will buy that hardware.
Most gamers I know hate to choose sides and like to use ATi as well as Nvidia cards; I know I do. As a gamer I would hate to be forced onto one side. A standard for ALL is an important factor for a lot of gamers; obviously all those die-hard and paid Nvidia users don't see that.
Gamers like choices without restrictions; a standard for all is important.
There are no "sides", only products with features to choose.
What exactly does a person buying a card that supports PhysX "lose"? If they don't lose anything, what does it matter if they only have one choice that offers a feature?
Many of the big development firms are signing up for PhysX, and there are many titles coming.
If Ageia had put together this kind of support for PhysX, there would be an Ageia add-on card in my box.
What would you say about PhysX if neither NVIDIA nor ATi offered it? If it was still a standalone card from Ageia?
If you still think "It's bad, the only good features are ones everyone has so everyone has an equal shot at your dollars", we're going to have to agree to disagree.
Here's a good analogy:
If GM invents a truck suspension that makes the truck ride 300% smoother, I'm not going to post on the truck forums "This is crap! I won't support this until Ford and Chrysler have it too, I don't want to be limited to only GM trucks!".
Market differentiation is what business is about, and as end users, the gaming experience should be all that matters to us.
Originally posted by: nRollo
If you watch the side by side, it's a little more obvious what the differences are.
Flags, drapes, glass breaking, water fountain, destructible environment: this is all stuff that makes the PhysX version better.
We've been playing with static, unrealistic environments long enough IMO.
In the world I live in, almost nothing is impervious. In the worlds I play in, almost everything is.
That's not "virtual reality" and anything that brings us closer to it is by definition good.
Originally posted by: LOUISSSSS
i still wouldn't have it. i didn't buy it when physx WAS a standalone card, and i won't buy it now. there actually is a cost to it: nvidia R&D will now have to devote even less time to their crappy drivers and put some more into developing that crap.
Originally posted by: LOUISSSSS
BTW, a standalone NIC provides better gameplay too, why don't u have one? if i were you i'd get one soon...
Originally posted by: thilan29
Originally posted by: chizow
MS is certainly at fault but you think they're doing cartwheels over ATI's lil fireball of a GPU design?
You keep saying this but do you have any actual numbers on how much power the Xenos GPU consumes?
Again from the AT article:
The first Xbox 360 (Xenon) needs a 203W power supply. Falcon needs a 175W power supply but can also work with the 203W unit (it just doesn't need to draw that much power, so the 203W unit is overkill, but it'll work). Jasper needs 150W but can work with a 203W and a 175W.
Xbox 360 Revision | System Off | Idle   | Halo 3 | Rock Band 2 | Gears of War 2 | BioShock Demo
Xenon             | 2.3W       | 155.7W | 177.8W | 167.7W      | 177.1W         | 172W
Falcon            | 2.8W       | 101.4W | 121.2W | 112.8W      | 121.5W         | 115.5W
Jasper            | 2.0W       | 93.7W  | 105.9W | 101.0W      | 105.9W         | 98.1W
Feel free to parse that yourself if you're really interested. Power draw was cut 30-50% between revisions and almost 20% under load just from shrinking the GPU.
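As an aside, anyone who wants to sanity-check those percentages can do it with a minimal Python sketch like this one, using only the wattage figures from the table above (this is just illustrative arithmetic, not anything from the article):

```python
# Percent power-draw reduction between Xbox 360 revisions, computed
# from the wattage table above (System Off excluded).
xenon  = {"Idle": 155.7, "Halo 3": 177.8, "Rock Band 2": 167.7,
          "Gears of War 2": 177.1, "BioShock Demo": 172.0}
falcon = {"Idle": 101.4, "Halo 3": 121.2, "Rock Band 2": 112.8,
          "Gears of War 2": 121.5, "BioShock Demo": 115.5}
jasper = {"Idle": 93.7, "Halo 3": 105.9, "Rock Band 2": 101.0,
          "Gears of War 2": 105.9, "BioShock Demo": 98.1}

def pct_drop(old, new):
    """Percent reduction in draw going from one revision to the next."""
    return {k: round(100.0 * (old[k] - new[k]) / old[k], 1) for k in old}

print("Xenon -> Falcon:", pct_drop(xenon, falcon))   # roughly 31-35% lower
print("Falcon -> Jasper:", pct_drop(falcon, jasper)) # roughly 8-15% lower
print("Xenon -> Jasper:", pct_drop(xenon, jasper))   # roughly 40% lower overall
```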
The first quote is by an analyst from Gartner and is clearly inaccurate. MS commissioned ATI to design the chip; the only bit they added to it was their logo, because they bought the design, probably in an attempt to avoid fixed licensing and pricing issues like with the original Xbox. You really think MS, with its limited hardware experience, was able to design the world's first unified shader GPU? So yes, Microsoft assumed culpability as they technically owned the chip, but they surely did not design it.
Originally posted by: BFG10K
Originally posted by: chizow
Yes I've provided evidence from multiple reliable sources and publications that have come to the same conclusion: that the excessive heat from ATI-designed GPUs was the primary cause of RROD.
Actually it appears Microsoft are to blame. They used ATi's core design, but they added extras and oversaw the manufacturing process instead of ATi:
http://www.reghardware.co.uk/2008/06/11/ms_xbox_gp/
The Xbox 360's infamous Red Ring of Death problem was the price Microsoft paid for attempting to save money by designing its own graphics chip for the console rather than buy one off a specialist supplier, it has been claimed.
When the 360 was finally released and able to be taken apart, pundits found both the console's GPU and CPU stamped with Microsoft logos. Essentially, Microsoft took ATI's design, added some extra bits and sent it out to manufacturing.
Again, do you have a DX10 part + Vista or a Sound Blaster? Is DX10 being stifled despite 70-80% of gaming machines not supporting DX10? Is EAX being stifled despite 10-20% market share of Creative Sound Cards?
Originally posted by: akugami
Well, the issue remains IMHO regarding PhysX. I think GPU PhysX will be stifled without ATI support, and ATI has no reason to support it. CPU PhysX will help a bit with some extra debris or some clothes flapping, but aside from a little extra visual appeal... what does it do to change the way we play games? Even Mirror's Edge, which will make use of PhysX, seems to be of the particle effect / clothes flapping variety.
And even if it is just eye-candy, it's going to be more visually stunning than anything offered on the PC since DX9. And yes, developers have shown time and time again that they are willing to put in the latest features if that will make their game better and differentiate it from the competition.
As a developer, do you heavily code for PhysX and potentially change the way your game is played for a sizable chunk of the market? At this point I'd say no, unless you get some of your development costs subsidized. If it's merely software PhysX I think the changes are not going to be innovative and merely slightly more eye candy.
Again, you need to read up more on the cross-portability of all the standards being discussed. Or just read my summary again. All this tin foil hat uncertainty and disinformation isn't doing anyone any good.
Perhaps I'm seeing PhysX wrong, but why code for a different physics SDK that will then make calls to the PhysX or ATI APIs when MS can dictate a set of standardized physics calls that will be part of (or a subset of) DX11? ATI and nVidia will provide drivers for their respective video cards, much like how DirectX works now with normal 3D graphics acceleration.
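To make that driver-model idea concrete, here's a hypothetical Python sketch (all names invented for illustration, not any real API): the game codes against one standardized physics interface and each vendor ships its own back-end, the same way Direct3D sits above vendor display drivers today.

```python
# Hypothetical sketch of a standardized physics layer (invented names,
# not a real API): games call one interface; vendors plug in back-ends.
from abc import ABC, abstractmethod

class PhysicsBackend(ABC):
    """What a vendor driver would implement, analogous to a D3D driver."""
    @abstractmethod
    def step_rigid_bodies(self, bodies, dt):
        """Advance (position, velocity) pairs by one timestep."""

class NvidiaBackend(PhysicsBackend):
    def step_rigid_bodies(self, bodies, dt):
        # In reality this would dispatch to the GPU via CUDA/PhysX.
        return [(x + v * dt, v) for x, v in bodies]

class ATIBackend(PhysicsBackend):
    def step_rigid_bodies(self, bodies, dt):
        # In reality this would dispatch to the GPU via ATI Stream.
        return [(x + v * dt, v) for x, v in bodies]

def game_frame(physics: PhysicsBackend, bodies, dt=1.0 / 60.0):
    # The game only ever sees the standard interface, never the vendor.
    return physics.step_rigid_bodies(bodies, dt)

print(game_frame(NvidiaBackend(), [(0.0, 2.0), (5.0, -1.0)]))
```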
Like 9800GTs or $50 9600GSOs? Even the GTX 260 is falling close to that upper $150 range. The Steam survey shows some 60%+ are still using 1280x1024 or lower for their primary display, and in that case a 9800GT should still be plenty for both GPU and PhysX acceleration.
I do not see any physics acceleration truly taking off until we see the launch of $100-150 video cards, found in the likes of Dells and HPs, that provide not only good 3D accelerated graphics but good physics acceleration as well.
MS will provide the standard but that will not eliminate the need for front-end SDKs and toolkits. And Intel? They don't even have a GPU; their answer for highly parallel is going to be 20-something Pentium 1 class CPUs in Larrabee... they don't even mention it anymore in their discussions of high-end graphics, and both Nvidia and AMD have completely dismissed them as competition in the discrete GPU market.
MS makes sense due to their OS domination as well as DirectX. Intel makes sense because they are entering the enthusiast GPU market. Intel is, even if not on the high end, a major player in the video chipset market.
Have a link to that? Genuinely curious.
A surface scan of the HSF registered a max of 150F on earlier Xboxes, while on the Falcon revision it was 110F, meaning the new HSF was doing a much better job of keeping things cool.
I think all of your comments to date on the subject would dispute that, whether intentional or not. As for comparisons to the PPU, clearly you can make the distinction that PhysX is well beyond that, right? I mean, an expensive $100-200 add-in card for 3-5 games that few bought vs. 100 million+ Nvidia GPUs almost overnight.
Originally posted by: MarcVenice
I've never doubted physx's adoption; I've been installing games for years that needed physx installed as a software solution. I've been eyeballing that tech demo, not sure what it's called anymore, the free game, to showcase what the PPU could do. I was excited, till I found out no one had a PPU, and no other games supported the same kind of physx the showcase game had.
Yes, and that's certainly going to be a concern if titles are PC ports, but it's also going to be less of a concern if devs have access to the SDK from the start of development. I'm quite sure older games with software back-end solvers will not be retrofitted with PhysX; that just won't happen given the nature of the gaming industry. But if the tools are accessible at the start, the difference in cost to implement GPU PhysX will be much more palatable. Still, in the big picture you have to realize the next generation consoles will most likely integrate hardware physics acceleration, so the PC would make the perfect test platform and devs would benefit from the experience now.
What I doubt is the adoption of non-software physx that actually means something, like we will soon be seeing in Mirror's Edge, because so many games, like Mirror's Edge, are developed for the consoles, and thus need extra development time if they are to have extra physx that can take advantage of nvidia gpus. Which is exactly what I have been saying. I know what physx can do, and how it can be used, and that it's almost seamless. Adding those things doesn't happen for free though.
I didn't mean you were pushing AMD's agenda knowingly, just that your viewpoint was the same as what AMD is pushing: PhysX is evil because 100% of the market doesn't have access to it, but hardware physics will be great once DX11 gives it to everyone for free. My problem is that AMD is saying what they're saying for very different reasons than the ones they give in public, and ultimately it will hurt the adoption rate of hardware physics in the short term, even when they know it's something they want going forward.
So really, saying I'm pushing AMD's agenda is laughable; when one does not agree with you, it doesn't instantly mean they are a fanboy of the opposite side. If AMD could run physx, through cuda or whatever else, and the consoles could too, I bet a dozen developers would jump on it and use it to innovate their games. But imo, it's not going to happen right now. Maybe with next-gen consoles, a standardized physics api, who knows.
Actually I'd say this means they're looking at hardware PhysX, as many of their studios (Relic) have been using software physics for years (Havok). While this announcement on its own isn't all too exciting, taken with all of the other recent announcements (2K/EA), it clearly shows devs and major publishers are interested in the technology.
So, why should I be excited again about this deal? Oh woopdiedoo, THQ is going to implement software emulated physx, like tiny ragdoll effects. Wait, I've had those for years. /end sarcasm
Yes I'm sure most here are familiar with that video, but here's a better illustration of the differences:
Originally posted by: 450R
Has anyone seen the Mirror's Edge PhysX trailer?
There's nothing meaningful in that video. Minor visual differences and absolutely zero impact on gameplay. And this is supposed to showcase PhysX?
Originally posted by: chizow
Xbox 360 Revision | System Off | Idle   | Halo 3 | Rock Band 2 | Gears of War 2 | BioShock Demo
Xenon             | 2.3W       | 155.7W | 177.8W | 167.7W      | 177.1W         | 172W
Falcon            | 2.8W       | 101.4W | 121.2W | 112.8W      | 121.5W         | 115.5W
Jasper            | 2.0W       | 93.7W  | 105.9W | 101.0W      | 105.9W         | 98.1W
Feel free to parse that yourself if you're really interested. Power draw was cut 30-50% between revisions and almost 20% under load just from shrinking the GPU.
Well it's obvious you're not interested enough to do your own research, so this is the last time I'm going to feed it to you since this is already OT. Of those 3 components, the move to Jasper was only a GPU shrink; the shrinks to the CPU and eDRAM already came with the move from Xenon to Falcon.
Originally posted by: thilan29
Jasper was a GPU + memory shrink so it wasn't all due to the GPU:
http://gear.ign.com/articles/826/826652p1.html
And you have no idea what percent of the total load is from the GPU, so you have no idea how hot it actually runs (i.e. you're making "fireball" claims without proof).
Also, that still doesn't answer who was actually responsible for the inadequate cooling. Obviously MS approved the ATI design. There's no way MS would have used their design if they didn't think they could cool it.