THQ chooses Nvidia's PhysX technology for better gaming


thilanliyan

Lifer
Jun 21, 2005
Originally posted by: chizow
Originally posted by: thilan29
Jasper was a GPU + memory shrink so it wasn't all due to the GPU:
http://gear.ign.com/articles/826/826652p1.html
And you have no idea what percentage of the total load is from the GPU, so you have no idea how hot it actually runs (i.e. you're making "fireball" claims without proof).

Also, that still doesn't answer who was actually responsible for the inadequate cooling. Obviously MS approved the ATI design. There's no way MS would have used their design if they didn't think they could cool it.
Well it's obvious you're not interested enough to do your own research, so this is the last time I'm going to feed it to you since this is already OT. The move to Jasper was only a GPU shrink of those 3 components; the shrinks to the CPU and eDRAM already came with the move from Xenon to Falcon.

AT article
360 Rev   CPU    GPU    eDRAM
Xenon     90nm   90nm   90nm
Falcon    65nm   80nm   80nm
Jasper    65nm   65nm   80nm

While inadequate cooling certainly contributed to the issue, the cause is still undoubtedly the excessively hot GPU, as a cooler-running GPU might not have resulted in the GPU melting away the solder and physically separating the chip from the board. This is all well-documented in the links I provided that also give the failure rates, as techs in the RMA departments were describing what they were seeing with RROD machines.

Now compare this to the Nvidia problems. Are you assigning blame to the notebook makers, or the GPU maker?

My mistake about the memory, but as I said, you still don't know how much power the GPU was dissipating relative to the cooling capacity. Since MS obviously tested the chip before putting it into the 360, they should have had an idea of what kind of cooling they needed, but they clearly decided to cut some corners, and I'm fairly certain they wouldn't have approved the chip for use if it ran too hot in their testing. Of course a cooler-running GPU might not have resulted in the problems, but you don't know whether that was because of inadequate cooling or because it was dissipating a lot of power.

From what I've read about the RROD, it's caused by warping of the motherboard and subsequent cracking of the solder bumps due to the tension created by the warped motherboard. The solder bumps were also embrittled due to:
1) inadequate cooling
2) GPU running too hot

Actually, both 1 and 2 would be taken care of by adequate cooling, and I'd put the blame on MS (and they seem to think so too, given they've shouldered the full cost of the extended warranty and repairs) since they were responsible for the design of the case and cooling. In the nV case, wasn't it due to the specific type of solder bumps they were using (and wasn't it nV who prescribed those)? Even there, nV have shouldered the full cost, so wouldn't the blame be in their hands?
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
Yes and that's certainly going to be a concern if titles are PC ports, but it's also going to be less of a concern if devs have access to the SDK from the start of development. I'm quite sure older games with software back-end solvers will not be retrofitted with PhysX; that just won't happen given the nature of the gaming industry. But if the tools are accessible at the start, the difference in cost to implement GPU PhysX will be much more palatable. Still, in the big picture you have to realize the next generation of consoles will most likely integrate hardware physics acceleration, so the PC would make the perfect test platform and devs would benefit from the experience now.

I'm not sure if I got this right, and since programming isn't my forte, maybe you can explain. If a developer has access to PhysX, is it possible to add it to a game for a console so that, when people play it on the console with PhysX disabled, they won't see special dust particles or flapping cloth, just regular 'static' stuff, but as soon as the same game, without any changes made to it, gets run on a hardware platform that can run it with PhysX enabled, like a PC with an Nvidia video card, said game will have all those PhysX features, once again without having to make any modifications?

Because if so, developing for the consoles suddenly becomes a moot point: developers can make a game with pretty cool PhysX in it without any added cost? But if not, then I think we stand in agreement, and we won't see full-scale PhysX adoption for a while, until we see next-gen consoles that can run it.

Also you got me all wrong here:

I didn't mean you were pushing AMD's agenda knowingly, just that your viewpoint was the same as what AMD is pushing. PhysX is evil because 100% of the market doesn't have access to it, but hardware physics will be great once DX11 gives it to everyone for free. My problem is that AMD's real reasons are very different from what they're saying in public, and ultimately it will hurt the adoption rate of hardware physics in the short term even though they know it's something they want going forward.

I don't think PhysX is evil, I just don't see it getting adopted (full-scale hardware acceleration) because not everyone can run it. It has nothing to do with AMD or whatever they are pushing; I'm just looking at it from game developers' perspective.

Actually I'd say this means they're looking at hardware PhysX as many of their studios (Relic) have been using software physics for years (Havok). While this announcement on its own isn't all too exciting, taken with all of the other recent announcements (2k/EA), it clearly shows devs and major publishers are interested in the technology.

And more power to them. I just don't see it happening this fast when only Nvidia video cards can support said hardware-accelerated PhysX. But if it's really 'that' easy (see above comment) to add in hardware-accelerated PhysX, they actually might, because it won't cost them much extra time and/or development cost. Then it will be like EAX and DX10, like you've said before.

One last remark though. With only Nvidia supporting said PhysX, it will be eye candy and not gameplay-altering PhysX, because that would mean developing 2 different kinds of games, which is of course unheard of.
 

akugami

Diamond Member
Feb 14, 2005
http://xbox360.qj.net/Ben-Heck...model/pg/49/aid/112801

Jasper, Zephyr Xbox


Chizow, I'm not going into the long quoting game, this conversation is getting too long as it is.

EAX is not exactly on top of anyone's list. I'm curious what the adoption rate of the latest revisions of EAX is in games. If I recall correctly, EAX 2.0 is an open standard with no royalties while EAX 3, 4, and 5 require a license. Creative is doing so well with sound cards and sound technologies that they're a company going down the toilet. There are also DirectSound and DirectSound3D as alternatives to EAX.

IMHO, free eye candy is one thing. I am not in any way shape or form against PhysX. You probably have the mistaken impression I'm railing against PhysX. Your "tin foil hat" comments were unwarranted. I gave my reasons for believing that PhysX can fail. Those are my opinions and I gave my reasons for that. You can disagree with my reasons and poke holes in them. Instead, we get a borderline personal attack.

You seem to be the one that is championing PhysX to the point where you can't envision it failing. I'm being realistic in saying that there is indeed a possibility of it failing. Heck, it was once impossible to envision IBM leaving the PC market, much less losing its market dominance. Look at where it is now. And PhysX is nowhere near dominating its market the way IBM once did before the IBM compatibles came out.

I am not going to restate the same arguments over and over; we'll just be arguing in circles. Suffice to say I have stated my case. You can believe it is valid or not. But from your comments, it seems you are set in your thinking, and anyone who so much as objects to your views on PhysX seems to be spewing "tin foil hat uncertainty."
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: thilan29
From what I've read about the RROD, it's caused by warping of the motherboard and subsequent cracking of the solder bumps due to the tension created by the warped motherboard. The solder bumps were also embrittled due to:
1) inadequate cooling
2) GPU running too hot
But again, it comes down to causality and excessive GPU heat. If the GPU (2) isn't running too hot, then (1) is not an issue. So while more cooling may have solved the problem, it was simply not an option without completely redesigning the profile of the 360. This ultimately points to a GPU design that exceeded the TDP of the initial design. Look at what the R500 360 GPU is: it's essentially a hybrid of R580 and R600, both chips that ran excessively hot and required loud dual-slot coolers for proper cooling. So while MS may have shouldered the blame, they're certainly going to place heat and GPU design as a primary concern when planning for the Xbox 720, as all indicators point to excessive GPU heat as the primary cause of RROD. Excluding this footnote from any list of arbitrary console design considerations is simply irresponsible and revisionist in nature.
 

zebrax2

Senior member
Nov 18, 2007
Originally posted by: chizow
Glanced over and jumped around quite a bit....

Originally posted by: akugami
Well, the issue remains IMHO regarding PhysX. I think GPU PhysX will be stifled without ATI support and ATI has no reason to support it. CPU PhysX will help a bit with some extra debris or some clothes flapping but aside from a little extra visual appeal...what does it do for changing the way we play games? Even Mirrors Edge which will make use of PhysX seems to be the particle effect / clothes flapping variety.
Again, do you have a DX10 part + Vista or a Sound Blaster? Is DX10 being stifled despite 70-80% of gaming machines not supporting DX10? Is EAX being stifled despite 10-20% market share of Creative Sound Cards?

DX10 is being adopted because it is the future for games, since a huge percentage of games are made for Windows.
 

chizow

Diamond Member
Jun 26, 2001
Thanks for the link, although it doesn't dispute anything I said and actually contradicts what you wrote:

Originally posted by: akugami
A surface scan of the HSF registered a max of 150F for earlier Xboxes, while on the Falcon revision it was at 110F, meaning the new HSF was doing a much better job of keeping things cooled.
That's very different from what the article is saying, especially since the CPU cooler changed very little. It's obvious that the reduction in temperature, for the CPU only, was a direct result of the CPU die shrink, and your link confirms it:

But as for peak heat levels, he found that the GPU still shoots up to as high as 140 degrees Fahrenheit, while the CPU is cooler, peaking at 110 degrees Fahrenheit.

Chizow, I'm not going into the long quoting game, this conversation is getting too long as it is.
That's probably in your best interest.

EAX is not exactly on top of anyone's list. I'm curious what the adoption rate of the latest revisions of EAX is in games. If I recall correctly, EAX 2.0 is an open standard with no royalties while EAX 3, 4, and 5 require a license. Creative is doing so well with sound cards and sound technologies that they're a company going down the toilet. There are also DirectSound and DirectSound3D as alternatives to EAX.
EAX is supported in probably 50% of the titles I've purchased over the last 5 years and even older titles sound nearly as good as new games. All this despite a market share that you so objectively describe as "going down the toilet". Sure DirectSound and DirectSound3D are alternatives but they're poor ones compared to EAX.

Your "tin foil hat" comments were unwarranted. I gave my reasons for believing that PhysX can fail. Those are my opinions and I gave my reasons for that. You can disagree with my reasons and poke holes in them. Instead, we get a borderline personal attack.
It was actually more directed at various comments I didn't bother quoting earlier:

Originally posted by: akugami
Oh, and if it hasn't been mentioned. There is no financial or competitive reason for ATI to adopt PhysX in any way shape or form as it currently stands. ATI would actually be at the mercy of their main competitor, nVidia, as nVidia can hold off on new revisions to PhysX and not release it to ATI until the last minute. Much like how MS kept some API's proprietary or at least held in secret some API calls that it used for its own software but did not release to others.
Along with similar comments re: Nvidia forcing PhysX to run slower on ATI hardware and PhysX being incompatible with other standards, I'd say the comments were warranted.

You seem to be the one that is championing PhysX to the point where you can't envision it failing. I'm being realistic in saying that there is indeed a possibility of it failing. Heck, it was once impossible to envision IBM leaving the PC market, much less losing its market dominance. Look at where it is now. And PhysX is nowhere near dominating its market the way IBM once did before the IBM compatibles came out.
Actually, all market indicators show PhysX/hardware physics will not fail, so no, I can't see it failing, especially in the absence of any competing standard. Your IBM example doesn't make sense because the platform they were pushing, the x86 PC, has in fact become the standard.

 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: MarcVenice
I'm not sure if I got this right, and since programming isn't my forte, maybe you can explain. If a developer has access to PhysX, is it possible to add it to a game for a console so that, when people play it on the console with PhysX disabled, they won't see special dust particles or flapping cloth, just regular 'static' stuff, but as soon as the same game, without any changes made to it, gets run on a hardware platform that can run it with PhysX enabled, like a PC with an Nvidia video card, said game will have all those PhysX features, once again without having to make any modifications?

Because if so, developing for the consoles suddenly becomes a moot point: developers can make a game with pretty cool PhysX in it without any added cost? But if not, then I think we stand in agreement, and we won't see full-scale PhysX adoption for a while, until we see next-gen consoles that can run it.
No, there will always be additional cost and effects will still have to be added in, but if they're planned for and designed from the outset, the process will be much less costly. Even in the Mirror's Edge trailers you can essentially see PhysX "placeholders": inanimate objects that serve as attachment points for enhanced PhysX effects.

The PhysX SDK being widely adopted just means it will be much easier to place custom PhysX effects, so instead of placing software PhysX effects a developer might use some of the advanced effects instead. There may be some global commands to essentially scale up all of the simple effects, but most of the effects in Mirror's Edge and especially Cryostasis look to be hand-crafted custom effects.
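To illustrate what that "planned from the outset" approach might look like, here's a minimal sketch. This is hypothetical engine-side code, not the actual PhysX API; the function and asset names are made up for illustration. The idea is that the same attachment point ships in every version of the game, and the build simply picks the placeholder or the enhanced effect depending on whether hardware acceleration is available:

[code]
// Hypothetical sketch -- not Ageia/Nvidia PhysX API calls.
// Same level data, two tiers of the same effect chosen at load time.
#include <iostream>
#include <string>

struct EffectSpec {
    std::string name;      // e.g. "glass_shatter_lobby" (made-up asset name)
    int particleCount;     // richer on GPU-accelerated machines
    bool simulateCloth;    // flapping banners, tarps, etc.
};

// Stand-in for an engine query such as "is a CUDA-capable GPU or PPU present?"
bool hardwarePhysicsAvailable() {
    return true;  // pretend we detected an Nvidia card
}

// The attachment point is authored once; only the detail tier differs.
EffectSpec chooseEffect(const std::string& attachmentPoint) {
    if (hardwarePhysicsAvailable()) {
        return {attachmentPoint, 4000, true};   // hand-crafted enhanced version
    }
    return {attachmentPoint, 150, false};       // console / software placeholder
}

int main() {
    EffectSpec fx = chooseEffect("glass_shatter_lobby");
    std::cout << fx.name << ": " << fx.particleCount << " particles, cloth="
              << std::boolalpha << fx.simulateCloth << "\n";
}
[/code]

The point is only that the fallback and the enhanced effect hang off the same hook, so adding the GPU path later doesn't mean reauthoring the level.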

I don't think PhysX is evil, I just don't see it getting adopted (full-scale hardware acceleration) because not everyone can run it. It has nothing to do with AMD or whatever they are pushing; I'm just looking at it from game developers' perspective.
Again, not everyone has to be able to run it before we see it supported in games. Just like not everyone has to run DX10, or EAX, or High Detail, or Ultra High Textures, or High Shadows etc. Again, it'll just be another scalable option, which has always been one of the most appealing features of the PC for both enthusiasts and developers alike.

And more power to them. I just don't see it happening this fast when only Nvidia video cards can support said hardware-accelerated PhysX. But if it's really 'that' easy (see above comment) to add in hardware-accelerated PhysX, they actually might, because it won't cost them much extra time and/or development cost. Then it will be like EAX and DX10, like you've said before.

One last remark though. With only Nvidia supporting said PhysX, it will be eye candy and not gameplay-altering PhysX, because that would mean developing 2 different kinds of games, which is of course unheard of.
Well again, with 60-65% discrete GPU market share and a quoted 100 million CUDA-capable parts, I think that certainly favors PhysX's chances with or without ATI's support. One key thing to note also: PhysX doesn't rely on DX10 for functionality, so it actually has a larger install base than DX10 for Nvidia parts (and probably overall). Just another thing to consider for the AMD camp waiting around for DX11 as a better alternative.

And yes, I agree that in the short term you won't see dynamic physics gameplay or PhysX-dependent scripting, but PhysX will still most certainly improve any game even if it's limited to just visual effects and eye candy. As an interim solution I could easily see devs substituting scripted or pre-rendered cut scenes, initiated with a generic interact button like "F", for machines that aren't capable of PhysX. Kinda like Dragon's Lair or some earlier games like Resident Evil.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: zebrax2
DX10 is being adopted because it is the future for games, since a huge percentage of games are made for Windows.
Yes, but they'll still be supported by DX9, especially given poor Vista adoption rates (70% XP, 30% Vista according to the latest Steam survey). The point is that PhysX arguably has a larger install base than some of these non-standard standards, so lack of full market share isn't enough reason by itself to discount it.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: chizow
Originally posted by: thilan29
From what I've read about the RROD, it's caused by warping of the motherboard and subsequent cracking of the solder bumps due to the tension created by the warped motherboard. The solder bumps were also embrittled due to:
1) inadequate cooling
2) GPU running too hot
But again, it comes down to causality and excessive GPU heat. If the GPU (2) isn't running too hot, then (1) is not an issue. So while more cooling may have solved the problem, it was simply not an option without completely redesigning the profile of the 360. This ultimately points to a GPU design that exceeded the TDP of the initial design. Look at what the R500 360 GPU is: it's essentially a hybrid of R580 and R600, both chips that ran excessively hot and required loud dual-slot coolers for proper cooling.

If you look at this pic:
http://images.anandtech.com/re...x360/therm_removed.jpg
that small flat heatsink is for the GPU and the tower one is for the CPU. To me that is woefully inadequate cooling for the GPU as compared to the CPU. And you wouldn't have to redesign the case to make a better cooler than what they put on there considering the small, flat size of the heatsink.

Compare that to the PS3 heatsink:
http://images.dailytech.com/ni...000_large_heatsink.jpg
See the difference? Much larger and heatpipes to boot, cooled by a 160mm fan.

After looking at those pics can you honestly say MS did even an adequate job with the cooling for the 360?

EDIT: Here's a pic of the Jasper heatsink (more surface area and a heatpipe as well):
http://www.blogcdn.com/www.eng...a/2008/12/jasper_6.jpg

They could have done this the first time around without needing to redesign the case but they cut corners and paid for it.

Oh, and here's a bit regarding how the nV notebook failures relate to the manufacturing and the 360 failures:
"In 2005, ATI switched from high-lead bumps (90% lead, 10% tin) to eutectic bumps (37% lead, 63% tin). These eutectic bumps can't carry as much current as high-lead bumps, they have a lower melting point but most importantly, they are not as rigid as high-lead bumps. So in those high stress situations caused by many power cycles, they don't crack, and thus you don't get the same GPU failure rates in notebooks as you do with NVIDIA hardware.

What does all of this have to do with the Xbox 360 and its RRoD problems? Although ATI made the switch to eutectic bumps with its GPUs in 2005, Microsoft was in charge of manufacturing the Xenos GPU and it was still built with high-lead bumps, just like the failed NVIDIA GPUs. Granted NVIDIA's GPUs back in 2005 and 2006 didn't have these problems, but the Microsoft Xenos design was a bit ahead of its time. It is possible, although difficult to prove given the lack of publicly available documentation, that a similar problem to what plagued NVIDIA's GPUs also plagued the Xbox 360's GPU. "


So taking into account the inadequate cooling and different manufacturing, Microsoft had a much bigger hand in the 360 failures than ATI did.
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: MarcVenice
Yes and that's certainly going to be a concern if titles are PC ports, but it's also going to be less of a concern if devs have access to the SDK from the start of development. I'm quite sure older games with software back-end solvers will not be retrofitted with PhysX; that just won't happen given the nature of the gaming industry. But if the tools are accessible at the start, the difference in cost to implement GPU PhysX will be much more palatable. Still, in the big picture you have to realize the next generation of consoles will most likely integrate hardware physics acceleration, so the PC would make the perfect test platform and devs would benefit from the experience now.

I'm not sure if I got this right, and since programming isn't my forte, maybe you can explain. If a developer has access to PhysX, is it possible to add it to a game for a console so that, when people play it on the console with PhysX disabled, they won't see special dust particles or flapping cloth, just regular 'static' stuff, but as soon as the same game, without any changes made to it, gets run on a hardware platform that can run it with PhysX enabled, like a PC with an Nvidia video card, said game will have all those PhysX features, once again without having to make any modifications?

Because if so, developing for the consoles suddenly becomes a moot point: developers can make a game with pretty cool PhysX in it without any added cost? But if not, then I think we stand in agreement, and we won't see full-scale PhysX adoption for a while, until we see next-gen consoles that can run it.


It's actually a little more granular than that, as all the consoles already do PhysX. You would probably just have the console versions of the games locked to a specific PhysX LOD that won't stress the console too much, and allow PC gamers access to the slider so they can turn the PhysX detail all the way up if they choose (with the higher settings definitely requiring some form of hardware acceleration). IMO, that was a major failing of the first set of PhysX games, as you either had to go all or nothing for PhysX in most cases. IIRC, one of the last things Ageia added before being acquired was the ability to have PhysX effects auto-scale in terms of detail (and required processing) so developers wouldn't have to manually create several versions of an effect to implement physics LOD.
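For what it's worth, here's a rough sketch of that physics-LOD idea (hypothetical code, not Ageia's actual auto-scaling API): one authored effect is scaled across detail tiers, the console build pins the tier, and the PC build exposes it as a slider with the top tiers assuming hardware acceleration.

[code]
// Hypothetical illustration of "physics LOD" -- not real PhysX/Ageia code.
#include <algorithm>
#include <cstdio>

struct PhysicsLod {
    int   debrisPieces;   // chunks spawned per explosion
    float clothDensity;   // relative cloth mesh resolution
    bool  needsHwAccel;   // higher tiers assume a GPU/PPU solver
};

// One authored effect, auto-scaled per tier instead of hand-building
// several separate versions of it.
PhysicsLod lodForLevel(int level) {               // level: 0 (lowest) .. 3 (highest)
    level = std::min(std::max(level, 0), 3);
    return { 50 * (1 << level),                   // 50, 100, 200, 400 pieces
             0.25f * static_cast<float>(level + 1),
             level >= 2 };                        // tiers 2-3 want hardware PhysX
}

int main() {
    const int consoleTier = 1;                    // console build: locked tier
    std::printf("console (locked): %d debris pieces\n",
                lodForLevel(consoleTier).debrisPieces);

    for (int slider = 0; slider <= 3; ++slider) { // PC build: user-facing slider
        PhysicsLod lod = lodForLevel(slider);
        std::printf("PC slider %d: %d debris, %.2fx cloth, hw accel %s\n",
                    slider, lod.debrisPieces, lod.clothDensity,
                    lod.needsHwAccel ? "required" : "not required");
    }
}
[/code]

Nothing in that is specific to PhysX; it's just the general LOD pattern applied to physics detail.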
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: thilan29
If you look at this pic:
http://images.anandtech.com/re...x360/therm_removed.jpg
that small flat heatsink is for the GPU and the tower one is for the CPU. To me that is woefully inadequate cooling for the GPU as compared to the CPU. And you wouldn't have to redesign the case to make a better cooler than what they put on there considering the small, flat size of the heatsink.

Compare that to the PS3 heatsink:
http://images.dailytech.com/ni...000_large_heatsink.jpg
See the difference? Much larger and heatpipes to boot, cooled by a 160mm fan.

After looking at those pics can you honestly say MS did even an adequate job with the cooling for the 360?
PS3 Heatsink to scale
It's actually mostly a duct/housing and small plates with heatpipes to fin arrays.

3rd pic down
As you can see here, there's a reason the GPU heatsink is flat: it's because the DVD player is mounted on top of it. So sure, if MS knew their ATI-designed GPU would double as a soldering iron, they might have given the 360 an external DVD player and put a larger HS on the GPU.

As for the solder used, again MS doesn't design hardware, ATI does. Just because they bought the design off ATI doesn't mean ATI isn't at fault for a faulty design and excessively hot GPU.
 

fire400

Diamond Member
Nov 21, 2005
Originally posted by: nosfe
butz do yuo liek physx or physics?

that's funny...

'cuz you already got phys. processors onboard the nVidia and ATI boards, why buy separate physX junk?

get it integrated onto the graphics board, then... plug it into your motherboard.

crossfire and SLI physics technology, anyone? ...who cares: get the software done right, first.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: chizow
PS3 Heatsink to scale
Its actually mostly a duct/housing and small plates with heatpipes to fin arrays.

The PS3 consumes more power and hence dissipates more heat and as such required a bigger heatsink (you're claiming the PS3 heatsink in fact has less surface area than the 360 heatsink?). Look at the size of it...it takes up the full front to back width of the PS3. And trust me it runs hot (I have one) but the cooler is doing an adequate job so there aren't any problems.

3rd pic down
As you can see here, there's a reason the GPU heatsink is flat, its because the DVD player is mounted on top of it. So sure, if MS knew their ATI designed GPU would double as a solder iron, they may have given the 360 an external DVD player and put a larger HS on the GPU.

Even just the small revision to the cooler that was made to the Jasper version would have helped with the original but MS cut corners and didn't do proper testing. SERIOUSLY, look at the size and design of the PS3 heatsink compared to the original 360 heatsink...we can see which one is better designed. MS should have designed a better heatsink if the Xenos does in fact dissipate a lot of power but judging by the size of the heatsinks, it's clear the CPU in the 360 needs much better cooling than the GPU (of course if you do an inadequate job in the first place it still won't work). Too bad we don't have power consumption numbers for the GPU and CPU separately for both the PS3 and 360...then we can put this argument to rest.

As for the solder used, again MS doesn't design hardware, ATI does. Just because they bought the design off ATI doesn't mean ATI isn't at fault for a faulty design and excessively hot GPU.

Please read what I quoted from the AT article... you'll see who was responsible for using the high-lead bumps that can lead to this problem (the same problem nV had).
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: thilan29
The PS3 consumes more power and hence dissipates more heat and as such required a bigger heatsink (you're claiming the PS3 heatsink in fact has less surface area than the 360 heatsink?). Look at the size of it...it takes up the full front to back width of the PS3. And trust me it runs hot (I have one) but the cooler is doing an adequate job so there aren't any problems.
Look at the close-up that you linked... it's quite clear the majority of that heatsink is just housing for the 160mm fan.
Top-Down View
PS3 Close-up
On the right side is an intake, on the left side the exhaust with a fin array. The rest is just ducting, essentially sheet metal. There are 2 thin plates that aren't much thicker than the heatpipes, and the area above the CPU plate is completely empty, as that's where the fan goes.

Even just the small revision to the cooler that was made to the Jasper version would have helped with the original but MS cut corners and didn't do proper testing. SERIOUSLY, look at the size and design of the PS3 heatsink compared to the original 360 heatsink...we can see which one is better designed. MS should have designed a better heatsink if the Xenos does in fact dissipate a lot of power but judging by the size of the heatsinks, it's clear the CPU in the 360 needs much better cooling than the GPU (of course if you do an inadequate job in the first place it still won't work). Too bad we don't have power consumption numbers for the GPU and CPU separately for both the PS3 and 360...then we can put this argument to rest.
While the new HS on Falcon helped, it didn't eliminate RROD completely, which again points to excessive GPU heat. Also, there are plenty of power consumption numbers for the 360; I've already provided some in this thread.

Please read what I quoted from the AT article... you'll see who was responsible for using the high-lead bumps that can lead to this problem (the same problem nV had).
I've read the entire article; I guess you didn't read the whole thing or didn't fully understand it:
  • Near the Bottom
    It would also mean that in order to solve the problem Microsoft would have to switch to eutectic bumps, similar to what ATI did back in 2005, which would require fairly major changes to the GPU in order to fix. ATI's eutectic designs actually required an additional metal layer, meaning a new spin of the silicon, something that would have to be reserved for a fairly major GPU change.
Microsoft does not design GPUs. ATI does. ATI designed a GPU for Microsoft that was spec'd for high-lead bumps, so again, it falls back on ATI's faulty fireball of a design. You can say it was Microsoft's fault for not switching to eutectic bumps, but again, ATI designs GPUs, Microsoft does not. Expecting Microsoft to respin a piece of silicon they didn't design in the first place is a bit of a joke.

 

ArchAngel777

Diamond Member
Dec 24, 2000
Thilan,

I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: ArchAngel777
Thilan,

I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.

Is that why the 4870 cards produce such massive heat and need digital PWMs and all that?

I will say that my GTX280 even overclocked as it is, runs cooler and quieter than my 4870 did when not overclocked. Even after messing with fan profiles in riva tuner, the card ran up to high heat output.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: cmdrdredd
Originally posted by: ArchAngel777
Thilan,

I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.

Is that why the 4870 cards produce such massive heat and need digital PWMs and all that?

I will say that my GTX280 even overclocked as it is, runs cooler and quieter than my 4870 did when not overclocked. Even after messing with fan profiles in riva tuner, the card ran up to high heat output.

You provide a good example of not understanding the difference between heat output and heat transfer & heat dissipation. I would suggest reading up on it, as this post is evidence you don't understand it either.

A stock GTX280 dumps more heat into your room than a stock 4870 1GB. It doesn't matter which card is physically 'hotter' to the touch, or which card has hotter air blowing from it; the only thing that tells you is how well the cooling is designed. The heatsink (heat transfer) draws heat away from the GPU, and the fan (heat dissipation) dissipates that heat from the heatsink into the air. The larger the heatsink, the more heat it can absorb, and the more CFM your fan can move (a larger, faster fan), the faster it will cool the heatsink via dissipation.
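To put rough numbers on that distinction, here's the standard steady-state relation with purely illustrative wattages and thermal resistances (assumed values, not measurements of either card):

[code]
% Steady state: essentially all board power P ends up in the room.
% The reported GPU temperature only tells you P times the cooler's
% thermal resistance (theta), not P by itself.
\[
  T_{\text{GPU}} \;\approx\; T_{\text{ambient}} + P \cdot \theta_{\text{cooler}}
\]
% Illustrative example (assumed values, 25 C ambient):
%   Card A: P = 230 W, theta = 0.20 C/W  ->  25 + 46 = 71 C  (more heat into the room, lower temp)
%   Card B: P = 160 W, theta = 0.35 C/W  ->  25 + 56 = 81 C  (less heat into the room, higher temp)
[/code]

In other words, a card can read cooler on its sensor while still dumping more total heat into the room; the temperature reading reflects the cooler as much as the chip.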

For the sake of this discussion, I am going to stop responding in this thread.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: ArchAngel777
Thilan,

I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.

And what thread would that be? But since we have the professor here, maybe you can explain to Thilan that a heatsink dominated by a 160mm fan is not going to provide its full apparent surface area for heat dissipation.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: ArchAngel777
Originally posted by: cmdrdredd
Originally posted by: ArchAngel777
Thilan,

I wouldn't waste your time. I have already seen several of the NV marketers say they would take the opinion of a user over scientific laws. In fact, I think you recall that thread. Clearly, most of these guys don't even understand heat output, heat transfer, and heat dissipation, otherwise they would not babble in ignorance over the subject.

Is that why the 4870 cards produce such massive heat and need digital PWMs and all that?

I will say that my GTX280 even overclocked as it is, runs cooler and quieter than my 4870 did when not overclocked. Even after messing with fan profiles in riva tuner, the card ran up to high heat output.

You provide a good example of not understanding the difference between heat output and heat transfer & heat dissipation. I would suggest reading up on it, as this post is evidence you don't understand it either.

A stock GTX280 dumps more heat into your room than a stock 4870 1GB. It doesn't matter which card is physically 'hotter' to the touch, or which card has hotter air blowing from it; the only thing that tells you is how well the cooling is designed. The heatsink (heat transfer) draws heat away from the GPU, and the fan (heat dissipation) dissipates that heat from the heatsink into the air. The larger the heatsink, the more heat it can absorb, and the more CFM your fan can move (a larger, faster fan), the faster it will cool the heatsink via dissipation.

For the sake of this discussion, I am going to stop responding in this thread.

Doesn't matter, hotter = hotter regardless of where the heat goes :roll:

The fact that the GTX280 doesn't sit at 60C idle without fan mods shows it has a better cooling setup. I'm sorry you think fancy words and technical babble mean anything, but they don't in this case. It's very simple: if your card runs 70C vs another at 60C doing the same thing, the one running 60C is running cooler. That is always better. I don't care that the heat is exhausted out of the case; that's the whole point of the dual-slot HSF. I'd rather have it expelled out than hovering around in the case or just not going anywhere at all and making the GPU run hot.
 

SSChevy2001

Senior member
Jul 9, 2008
Thilan,

Seriously chizow is just wasting your time!

It's clear just by looking at the fan shroud that not much air is cooling the GPU. Also, the CPU temp sensor controls the voltage going to the exhaust fans, which is another design flaw.

Just for the heck of it here's a picture of my original launch xbox360 and the GPU is cool to the touch, even after hours of gaming. One day I'll finish dressing it up. :)

http://i31.photobucket.com/alb...SChevy2001/xbox360.jpg

 

SSChevy2001

Senior member
Jul 9, 2008
Originally posted by: cmdrdredd
Doesn't matter, hotter = hotter regardless of where the heat goes :roll:

The fact that the GTX280 doesn't sit at 60C idle without fan mods shows it has a better cooling setup. I'm sorry you think fancy words and technical babble mean anything, but they don't in this case. It's very simple: if your card runs 70C vs another at 60C doing the same thing, the one running 60C is running cooler. That is always better. I don't care that the heat is exhausted out of the case; that's the whole point of the dual-slot HSF. I'd rather have it expelled out than hovering around in the case or just not going anywhere at all and making the GPU run hot.
Here's mine @ 44C with a 6% fan speed on the stock ref cooler. I guess ATI GPUs just run too hot.

idle - stock cooler
http://img83.imageshack.us/img83/6544/4870pszg7.jpg
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: SSChevy2001
Just for the heck of it here's a picture of my original launch xbox360 and the GPU is cool to the touch, even after hours of gaming. One day I'll finish dressing it up. :)

http://i31.photobucket.com/alb...SChevy2001/xbox360.jpg
LMAO that's great! I'm sure it does run cool and clearly illustrates why a larger heatsink could not be used on the 360. :)

Originally posted by: SSChevy2001
Here's mine @ 44C with a 6% fan speed on the stock ref cooler. I guess ATI GPUs just run too hot.

idle - stock cooler
http://img83.imageshack.us/img83/6544/4870pszg7.jpg
Are you using a similar fan mod to the one on your Xbox? :) Must be a pretty serious fan if it's running 1000 RPM at only 6%.
 

SSChevy2001

Senior member
Jul 9, 2008
Originally posted by: chizow
Originally posted by: SSChevy2001
Just for the heck of it here's a picture of my original launch xbox360 and the GPU is cool to the touch, even after hours of gaming. One day I'll finish dressing it up. :)

http://i31.photobucket.com/alb...SChevy2001/xbox360.jpg
LMAO that's great! I'm sure it does run cool and clearly illustrates why a larger heatsink could not be used on the 360. :)
I was thinking they could have gotten away with a laptop DVD drive instead and then added a better fan design.

Originally posted by: SSChevy2001
Here's mine @ 44C with a 6% fan speed on the stock ref cooler. I guess ATI GPUs just run too hot.

idle - stock cooler
Are you using a similar fan mod to the one on your Xbox? :) Must be a pretty serious fan if it's running 1000 RPM at only 6%.
Sorry, no special tricks here, just some simple changes in the BIOS. With these new idle settings I'm saving about 45W.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: SSChevy2001
Originally posted by: cmdrdredd
Doesn't matter, hotter = hotter regardless of where the heat goes :roll:

The fact that the GTX280 doesn't sit at 60C idle without fan mods shows it has a better cooling setup. I'm sorry you think fancy words and technical babble mean anything, but they don't in this case. It's very simple: if your card runs 70C vs another at 60C doing the same thing, the one running 60C is running cooler. That is always better. I don't care that the heat is exhausted out of the case; that's the whole point of the dual-slot HSF. I'd rather have it expelled out than hovering around in the case or just not going anywhere at all and making the GPU run hot.
Here's mine @ 44C with a 6% fan speed on the stock ref cooler. I guess ATI GPUs just run too hot.

idle - stock cooler
http://img83.imageshack.us/img83/6544/4870pszg7.jpg

A cold boot in a cold room doesn't count. You obviously don't have a stock card if it has a modded BIOS :roll:

And saving 45W? Big deal... if 45W makes or breaks your budget, then maybe you should ask the government for a bailout or just stop buying hardware.
 

zebrax2

Senior member
Nov 18, 2007
Originally posted by: cmdrdredd

And saving 45W? Big deal... if 45W makes or breaks your budget, then maybe you should ask the government for a bailout or just stop buying hardware.

What is the problem with saving 45W? The card is idle; do you want your card to suck up so much power when it clearly does not need it while you are not using it?