32bit vs 64bit - Why should I consider?


Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Dark Cupcake
Originally posted by: Gamingphreek
Sure it isn't a problem but it is horrible design/engineering. That much heat means a ton of power is being used. Given that it is a 4870 and isn't the fastest thing on the planet, it shouldn't be wasting that much power (be it by Transistor Leakage or otherwise).

-Kevin

Temperature is not equal to power consumption. Sticking a massive heatsink on it and limiting it to 30C will not make it consume significantly less power. (In reality there should be a slight difference due to semiconductor properties, if I remember correctly :p)

On the other hand, the original 4870 512 card I had used to idle at 82C and hit 86C under heavy gaming load. I don't see that as a bad thing, since the thermal stress on the chip from temperature swings will be much lower.

Also, while a 4870 should be able to run Furmark, I wouldn't do it for extended periods of time, as the VRMs get insanely hot (my MSI card was stable but hit 120C on them; the Sapphire one doesn't have temperature monitoring for them - it might be a similar design to Palit cards).

Temperature is not linearly related to power consumption, but heat is produced as a side effect of power usage. The Law of Conservation of Energy states that energy can neither be created nor destroyed - this is a prime example of electrical energy being converted to heat energy.

Additionally, slapping a heatsink on it affects the independent variable which is the heat produced given a supply of power - thus you can't use that as a counter example ;)

The VRMs are generally cooled well enough; the core and the memory are the main things that need active cooling.

-Kevin

 

Itchrelief

Golden Member
Dec 20, 2005
1,398
0
71
Originally posted by: Gamingphreek

Additionally, slapping a heatsink on it affects the independent variable which is the heat produced given a supply of power - thus you can't use that as a counter example ;)


-Kevin

Huh? As I understand it, heatsinks/radiators only increase the heat transfer rate to the surroundings at a given temperature, leading the equilibrium temperature to be lower for a given rate of heat input. The person you quoted already stated he is ignoring temperature effects on electrical properties.

And anyways, it's disingenuous to say 4870s are a poor design when they hit 75C on Furmark (from what I hear 90C+ is more the territory a GPU can get into while running it), as that program is known to cause all GPUs to heat up like mad. Heck, my nvidia 9800 hits 75C running Folding. Would that be a bad design?
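
To make the power-versus-temperature point concrete, here is a minimal sketch of a lumped thermal model (the wattages and thermal-resistance numbers below are made up for illustration, not measured 4870 figures): essentially all of the electrical power a card draws leaves it as heat, and at a fixed power draw a better heatsink only lowers the equilibrium temperature.

```c
#include <stdio.h>

/* Lumped thermal model: at steady state, heat out equals heat in, and the
 * die settles at  T = T_ambient + P * R_theta,  where R_theta is the
 * cooler's thermal resistance in C per watt. The power converted to heat
 * is the same regardless of which cooler is fitted. */
static double equilibrium_temp(double power_w, double r_theta, double ambient_c)
{
    return ambient_c + power_w * r_theta;
}

int main(void)
{
    const double power_w = 150.0;  /* hypothetical board power draw      */
    const double ambient = 25.0;   /* assumed case air temperature, C    */
    const double stock_r = 0.40;   /* stock cooler, C/W (made-up figure) */
    const double big_r   = 0.15;   /* oversized cooler, C/W (made-up)    */

    printf("Stock cooler: %.0f W in, %.0f W out as heat, die at %.1f C\n",
           power_w, power_w, equilibrium_temp(power_w, stock_r, ambient));
    printf("Big cooler:   %.0f W in, %.0f W out as heat, die at %.1f C\n",
           power_w, power_w, equilibrium_temp(power_w, big_r, ambient));
    return 0;
}
```

Both cases dump the same 150 W into the case air; only the temperature at which that happens changes, which is why a temperature reading by itself says little about power draw (ignoring the small leakage increase at higher temperatures mentioned above).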
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Itchrelief

Huh? As I understand it, heatsinks/radiators only increase the heat transfer rate to the surroundings at a given temperature, leading the equilibrium temperature to be lower for a given rate of heat input. The person you quoted already stated he is ignoring temperature effects on electrical properties.

And anyways, it's disingenuous to say 4870s are a poor design when they hit 75C on Furmark (from what I hear 90C+ is more the territory a GPU can get into while running it), as that program is known to cause all GPUs to heat up like mad. Heck, my nvidia 9800 hits 75C running Folding. Would that be a bad design?

They do follow the principles of thermodynamics, in that heat always flows from hot to cold; however, if we are measuring heat output, you can't use heat as the independent variable unless both video cards being compared have the exact same heatsink.

At any rate, science aside, I am saying, as a blanket statement, that GPUs can be made more efficient than they are. GPU manufacturers simply have not had the pressure that CPU manufacturers have.

-Kevin
 

Itchrelief

Golden Member
Dec 20, 2005
1,398
0
71
Originally posted by: Gamingphreek
They do follow the principles of thermodynamics, in that heat always flows from hot to cold; however, if we are measuring heat output, you can't use heat as the independent variable unless both video cards being compared have the exact same heatsink.

At any rate, science aside, I am saying, as a blanket statement, that GPUs can be made more efficient than they are. GPU manufacturers simply have not had the pressure that CPU manufacturers have.

-Kevin

The way you have it written makes it seem like changing the heatsink changes the power draw of the chip.

So you are telling me that if I stick a huge heatsink on my GPU, it will use less power? (That is essentially what you are saying when you claim the heat/independent variable is changed by changing the heatsink.)
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Itchrelief
The way you have it written makes it seem like changing the heatsink changes the power draw of the chip.

So you are telling me that if I stick a huge heatsink on my GPU, it will use less power? (That is essentially what you are saying when you claim the heat/independent variable is changed by changing the heatsink.)

No. I am saying that your example was flawed.

The independent variable, heat output, is determined by the dependent variable, power input. If you change the heat output as viewed by us through a thermometer, you have changed the independent variable thereby invalidating the test as you have 2 dependent variables.

Yeah, the 4870 can't handle Furmark (due to the VRMs used in the reference design). Not sure this would have affected you in any "real world" situation, though.

That's interesting. I wonder why it only happens in Furmark. Regardless, OP, you have your answer right there as to what is wrong with your card.

-Kevin
 

Itchrelief

Golden Member
Dec 20, 2005
1,398
0
71
Originally posted by: Gamingphreek

No. I am saying that your example was flawed.

The independent variable, heat output, is determined by the dependent variable, power input. If you change the heat output as viewed by us through a thermometer, you have changed the independent variable thereby invalidating the test as you have 2 dependent variables.

The fly in the ointment is that a thermometer does NOT measure heat output. It only measures the equilibrium temperature reached, which is where heat input equals heat output.
 

StarsFan4Life

Golden Member
May 28, 2008
1,199
0
0
Originally posted by: Gamingphreek
No. I am saying that your example was flawed.

The independent variable, heat output, is determined by the dependent variable, power input. If you change the heat output as viewed by us through a thermometer, you have changed the independent variable thereby invalidating the test as you have 2 dependent variables.

Yeah, the 4870 can't handle Furmark (due to the VRMs used in the reference design). Not sure this would have affected you in any "real world" situation, though.

That's interesting. I wonder why it only happens in Furmark. Regardless, OP, you have your answer right there as to what is wrong with your card.

-Kevin


RMA already in process. Should have my new card by next week...which I think takes too long.
 

StarsFan4Life

Golden Member
May 28, 2008
1,199
0
0
I get the replacement card today. After doing some reading, I suppose Furmark is NOT recommended on ATI cards....especially the 4870 series, correct?
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: StarsFan4Life
I get the replacement card today. After doing some reading, I suppose Furmark is NOT recommended on ATI cards....especially the 4870 series, correct?

Not recommending a piece of software for a particular card sounds to me like a way of covering up poor performance results or a hardware defect.

There should be absolutely no problem running whatever software you have (assuming the card has the necessary hardware to support it). The 4870 is more than capable of running Furmark.

-Kevin
 

StarsFan4Life

Golden Member
May 28, 2008
1,199
0
0
Ran Furmark without any problems.....it must have been a bad card. That's the first video card I've bought in 12 years of building that arrived defective...
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I've said it before and I'll say it again... nobody buying an operating system today should consider a 32-bit OS if their hardware is 64-bit capable. Legacy apps, you say? Virtualize them if you actually have apps that a 64-bit OS can't run.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Sylvanas
Originally posted by: GaryJohnson
64bit doesn't have any performance advantage over 32bit apart from the fact that you can use more RAM which can lead to better performance.

The only other real reason to go 64-bit is future software compatibility. For example, five years from now Vegas Pro 14.0 might be 64-bit only.

So weigh those reasons against whatever "problems" you're currently having with 64-bit. Which... btw, what are the problems you're having with 64-bit? You shouldn't be having problems.

A 64-bit OS is inherently faster than a 32-bit OS in certain tasks, right now - not years in the future. See here. This is Linux, but the principle is the same - encoding, extraction, Flash, and 3D rendering are all faster on a 64-bit OS with the appropriate software. The catch is indeed 'with appropriate software', so you will need to use those things to see the difference, but the difference is there. IMO there is no reason to be buying 32-bit; 'compatibility' was a problem maybe 3 years ago, or whenever XP 64 came out, but that's not an issue today. All devices certified for Windows Vista/7 need to have both a 32-bit and a 64-bit driver, so unless you are running a printer from '98, the likelihood is you will be fine.

Keep in mind it's not the OS's "64-bitness" that makes it faster... it's the extra GPRs (general-purpose registers) available when running in 64-bit mode.
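
As a rough illustration of the register point (the function below is hypothetical; the only hard facts are the register counts): 32-bit x86 exposes 8 general-purpose registers, while x86-64 exposes 16 (r8-r15 were added), so a 64-bit build can keep more live values in registers instead of spilling them to the stack.

```c
#include <stdio.h>

/* With eight partial products live at once, a 32-bit x86 build (8 GPRs:
 * eax, ebx, ecx, edx, esi, edi, ebp, esp) is likely to spill some of them
 * to the stack, while an x86-64 build (16 GPRs, r8-r15 added) can usually
 * keep them all in registers. Same C source, different register pressure;
 * whether spills actually happen is up to the compiler. */
static long dot8(const long *a, const long *b)
{
    long s0 = a[0] * b[0], s1 = a[1] * b[1];
    long s2 = a[2] * b[2], s3 = a[3] * b[3];
    long s4 = a[4] * b[4], s5 = a[5] * b[5];
    long s6 = a[6] * b[6], s7 = a[7] * b[7];
    return (s0 + s1) + (s2 + s3) + (s4 + s5) + (s6 + s7);
}

int main(void)
{
    long a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    long b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    printf("dot8 = %ld\n", dot8(a, b));  /* prints 120 */
    return 0;
}
```

Compiling the same file twice, e.g. with gcc -m32 -O2 -S and gcc -m64 -O2 -S, and comparing the generated assembly is an easy way to see the difference in spill code.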
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: VinDSL
Originally posted by: StarsFan4Life
[...]Last week, I pre-ordered WIndows 7 Prosfessional [sic] [...]

[...]Is there a real need for me to go 64bit?[...]
Congratulations, and...

No! :D

He has 8GB of RAM - 64-bit is absolutely necessary, since a 32-bit OS can't address anywhere near that much memory! -_-

-Kevin
 

ibex333

Diamond Member
Mar 26, 2005
4,094
123
106
I don't mean to hijack the thread, but I want to know if there are any disadvantages with a 64bit Windows as opposed to 32bit? I heard gaming performance suffers but I want to know if there's any truth to this... Are there any other disadvantages? Is the 32 bit version generally MORE stable? Are there any games that take advantage of the 64bit technology? Are there any that are being planned?
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: ibex333
I don't mean to hijack the thread, but I want to know if there are any disadvantages with a 64bit Windows as opposed to 32bit? I heard gaming performance suffers but I want to know if there's any truth to this... Are there any other disadvantages? Is the 32 bit version generally MORE stable? Are there any games that take advantage of the 64bit technology? Are there any that are being planned?

Well, the primary disadvantage to 64-bit is the increase in pointer size in code. For instance, when coding on a 32-bit machine, a pointer to X is 4 bytes (32 bits). On a 64-bit machine, this is doubled to 8 bytes (64 bits). Thus it will use more memory; however, a 64-bit machine generally has more memory to spare. (See the small sketch after this post.)

A 32-bit game shouldn't suffer a noticeable amount. Drivers are mature enough at this point that any difference is negligible - it just won't increase performance, because the game is written using 32-bit data types.

Additionally, when using codecs, the codec's bitness will need to match the application's. For instance, a 32-bit MP3 codec will not be recognized by the 64-bit version of Media Player.

The 64-bit version of Windows, I believe, does not allow you to install unsigned drivers - thus beta testing is a little more limited. Outside of those, I don't see any other disadvantages to using 64-bit.

64-bit or 32-bit, in mature applications (which most are), has no inherent stability advantages or disadvantages.

I know Far Cry had a 64-bit patch; however, I have been out of the loop a little on gaming due to work, college, and other factors. I'm sure there are at least a few others though.

Finally, don't expect huge performance gains, and don't expect to see every application switch to 64-bit. For instance, there would be no point in rewriting MS Word to use 64-bit data types; there would be no advantage.

-Kevin
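
To make the pointer-size point above concrete, here is a minimal C sketch; the struct is made up, but the sizeof behaviour is exactly what changes between a 32-bit and a 64-bit build.

```c
#include <stdio.h>

/* A pointer-heavy node: on a typical 32-bit build each pointer is 4 bytes,
 * on a 64-bit build it is 8 bytes, so the same struct grows even though
 * the payload (the int) is unchanged. */
struct node {
    struct node *next;
    struct node *prev;
    int          value;
};

int main(void)
{
    /* Typical results: 4 and 12 on a 32-bit build, 8 and 24 on a 64-bit
     * build (the extra 4 bytes come from alignment padding). Exact sizes
     * depend on the ABI. */
    printf("sizeof(void *)      = %zu\n", sizeof(void *));
    printf("sizeof(struct node) = %zu\n", sizeof(struct node));
    return 0;
}
```

This is the main reason a program recompiled for 64-bit tends to use somewhat more memory for the same data, as noted above.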
 

StarsFan4Life

Golden Member
May 28, 2008
1,199
0
0
Well.....something has gone terribly wrong:


F!

I recently purchased and built the following PC:

* Cooler Master Elite 330
* Gigabyte GA-MA78GM-S2H 780G motherboard
* AMD Phenom II Quad 940 Black Edition - Stock cooling
* 8GB G-Skill PC2-6400
* XFX Radeon 4870 1GB
* Western Digital Caviar Black 1TB SATA II
* Western Digital Caviar Black 640GB SATAII
* Onboard LAN/HD Audio
* Antec Earthwatts 650wt psu
* Logitech DiNovo Keyboard
* Logitech MX Revolution Mouse
* 2 X Yate Loon 120mm silent case fans
* 2 X Dell 23" S2309 W @ 1900 x 1080
* Microsoft Windows 7 64-bit and all updates
* DirectX 10 installed


I purchased it all on 6/20/2009 from Newegg.com.


Last night, I was enjoying a game of Day of Defeat: Source....like I do almost daily. All of a sudden, my screens went black, I noticed a pungent "electronic melting" smell, and I looked to see my PSU sparking and smoking.

I pulled the power plug....but it was too late.

So far...it seems to have taken the PSU and the motherboard, as I tried a spare Cooler Master PSU I had lying around and it would not boot up at all.

Is it possible it could have taken the CPU, RAM, and video card along with it? None of these components have that smell coming from them.

Needless to say, I got on chat with Newegg...and they would do absolutely NOTHING about it...because I was 9 days outside of the 30-day warranty they offer. I couldn't be more pissed than I am right now. This PSU was highly recommended by a lot of people here, so I took it over a BFG....which seems to have been a costly mistake.

I already submitted an RMA form with Antec and Gigabyte.....but if the problem was with the PSU in the first place....should they reimburse me for all the parts it destroyed (given that I still need to test the RAM, video card, CPU, and hard drives)?

What the hell do I do here besides an RMA?

=================================

I suggest you repost this to the "General Hardware" or "Computer Help" forums. Your post is completely off-topic here and won't get the range of responses you are looking for.

RebateMonger
AnandTech Moderator