Vista 32-bit vs. Vista 64-bit OS Showdown *Done!*


nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
I've (finally) updated my first post with my rig specs.


Right now I'm exhausted!

I spent quite a few hours tearing stuff out of my case and basically re-doing the entire thing. It's now got (5) 140mm fans including dedicated intakes for CPU and GPU, (2) 120mm fans, (1) GeminII CPU cooler with (2) more 120mm fans, my new 8800 GTS 512, a TV card, and a bunch of drives...

You should see the CPU cooler, though -- it dwarfs the video card!

:Q

Temps did not improve all that much. I think the fans are competing for airflow and causing some turbulence, so I'll probably need some ducting -- maybe on the weekend.

Oddly, voltage is better (i.e. closer to ideal volts) with more fans. :confused:


For now... I'm going to play a race or two in NFS... and call it a day...

*yawn*


I'll get to the benchmarks tomorrow.


:moon:
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Thanx for the results apoppin; i know benching is incredibly hard work. (Hence why i'm lazy & rarely do it :p)

It's disappointing to see drivers seem to be better optimized for x86, but it doesn't surprise me, since x64 is still a tiny portion of the market.

Doesn't change my constant recommendation of x64 over x86 though, since i recommend for long-term use, and for an enthusiast, x86 is on its final stretch.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: n7
Thanx for the results apoppin; i know benching is incredibly hard work. (Hence why i'm lazy & rarely do it :p)

It's disappointing to see drivers seem to be better optimized for x86, but it doesn't surprise me, since x64 is still a tiny portion of the market.

Doesn't change my constant recommendation of x64 over x86 though, since i recommend for long-term use, and for an enthusiast, x86 is on its final stretch.

thanks ... and i am probably only just over halfway ... i am actually *playing* games ... i installed Hg:L on my Vista 32-bit partition and updated it. Really beautiful [dark and ugly] graphics. Fully maxed/extreme shaders/DX10 smoke - the 'works' at 16x10 with in-game maxed AA/AF ... and it is VERY playable ... this is one of the games i plan to include in Part Two of the "OS Showdown".

Here is what i have decided to do for part 2 [not a final decision]

besides Hg:L

The Witcher

Gothic3

i think these games - including the King of memory mismanagement, Gothic3 - will give a good representation of RPG-type games that may show a difference between the 2 OSes.
- i thought about NWN2 ... but the Witcher is a better version of the same engine



and H-E-L-P !! - please! ... i am having problems keeping my OC with my Pro ... it has some ungodly low core/memory but *apparently* tests OK at XT speeds ... the problem is that speeds either cannot be "saved" or they do not "stick" after reboot.

i do not know if it has to do with having only *one* bridge interconnect - CCC tells me it is not optimal but offers nothing in diagnostics. It appears that i am running my GPUs at different speeds :p
--And the most promising of the tools - ATT - doesn't have signed drivers for Vista 64 :p ... for once i may have to use different OC'ing utilities. :(


at any rate, Nullpointerus now takes center stage with his testing while i am preparing mine for the weekend. His rig is finally working to his satisfaction and he got rid of his XT for a GT.
:thumbsup:

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
*bump* for initial benchmarking

The results can be found on the first page of this thread, in my first post.


I'll get to the next set of benchmarks after 10:30 p.m. EST. :thumbsup:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
you guys are doing a great job. I'm still holding out on vista, but things like this are inching me closer...
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
My first round of benchmarks is completed:


3DMark:

32-bit -- '03 --> 38079 ( CPU: 2001 ) <-- win! ( + 0.2% )
64-bit -- '03 --> 37989 ( CPU: 2015 )

32-bit -- '06 --> 11607 ( SM 2.0: 5364, SM 3.0: 5484, CPU: 2557 ) <-- win! ( + 3.4% )
64-bit -- '06 --> 11227 ( SM 2.0: 5095, SM 3.0: 5404, CPU: 2468 )


Call of Juarez DX10

32-bit --> 28.5 ( min: 18.2, max: 50.3 ) <-- tie!
64-bit --> 28.5 ( min: 18.2, max: 50.3 ) <-- tie!


Lost Planet

32-bit -- DX09 --> Snow: 77, Cave: 51 <-- tie!
64-bit -- DX09 --> Snow: 77, Cave: 51 <-- tie!

32-bit -- DX10 --> Snow: 71, Cave: 50 <-- tie!
64-bit -- DX10 --> Snow: 71, Cave: 50 <-- tie!

The ties may look a little weird, but those ARE the whole numbers resulting from the averaged runs. My guess is that being CPU bound (at default settings), these tests don't really highlight the driver differences between 32-bit and 64-bit. So later I'll redo these tests at my highest possible resolution (1680x1050) and possibly higher settings.
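To make the rounding explicit, here's a minimal sketch of that averaging - the per-run FPS values are hypothetical, just to show how two slightly different sets of runs can collapse into the same reported whole number:

# Average repeated benchmark runs and report the 32-bit vs. 64-bit delta.
# The per-run FPS values are hypothetical, not measured results.

def average(runs):
    # Mean of several runs, rounded to the whole numbers the benchmark reports.
    return round(sum(runs) / len(runs))

runs_32 = [77.3, 76.8, 77.1]   # hypothetical Snow runs, 32-bit
runs_64 = [76.9, 77.2, 77.4]   # hypothetical Snow runs, 64-bit

avg_32, avg_64 = average(runs_32), average(runs_64)
delta = (avg_32 - avg_64) / avg_64 * 100   # percent 'win' for 32-bit

print(f"32-bit: {avg_32}, 64-bit: {avg_64}, delta: {delta:+.1f}%")
# Both sets average to 77 after rounding, so the result is reported as a tie.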

Anyone want to see max settings at 1680x1050, or should I use default settings for that res.?

Suggestions and requests are welcome!!

:)


Originally posted by: bryanW1995
you guys are doing a great job.
Hey, thanks!!

I'm still holding out on vista, but things like this are inching me closer...
SP1 sounds like a good time! :thumbsup:

Hidden meaning: I would've avoided a lot of issues that way.

:Q

:D
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
What settings are you using for your games ?
- what do you mean by "default settings"?
:confused:

i used maximum everything - in game - plus 4xAA/16xAF [AA if possible, never forced].... except v-sync [actually i DO like v-sync w/triple buffering in Hg:L ... but then i would not be measuring FPS]
in other words, if there was a setting for it, i checked it. Everything is put to "highest" or 'extreme' or 'ultra' ... i used DX10 if there was a choice and maximum HDR or Full Dynamic Lighting. Whatever setting that could be maxed out - in game - is maxed out.... and i always maxed the filtering - 4xAA/16xAF - whenever possible

i would hope your Lost Planet Benches are not at *max everything* including "Dx10 fur" 4xAA/16xAF. How is it you get so little penalty in LP by running the DX10 pathway? i didn't post my DX9 benches, but it is much faster - also with everything fully maxed [i know nvidia is ahead of AMD here; i also am using the retail game's built-in bench]
- if so, i will get a GT and sell both my 2900s :p
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: apoppin
What settings are you using for your games ?
- what do you mean by "default settings"?

:confused:
Whatever the settings were when I first started the benchmark program.

I saved screenshots of the settings in case you need specifics.

i used maximum everything - in game - plus 4xAA/16xAF [AA if possible, never forced].... except v-sync [actually i DO like v-sync w/triple buffering in Hg:L ... but then i would not be measuring FPS]
in other words, if there was a setting for it, i checked it. Everything is put to "highest" or 'extreme' or 'ultra' ... i used DX10 if there was a choice and maximum HDR or Full Dynamic Lighting. Whatever setting that could be maxed out - in game - is maxed out.... and i always maxed the filtering - 4xAA/16xAF - whenever possible
I'm going to do another round of benchmarks at 1680x1050. I just need advice about what settings people would like to see. Perhaps I should use max settings?

i would hope your Lost Planet Benches are not at *max everything* including "Dx10 fur" 4xAA/16xAF. How is it you get so little penalty in LP by running the DX10 pathway? i didn't post my DX9 benches, but it is much faster - also with everything fully maxed [i know nvidia is ahead of AMD here; i also am using the retail game's built-in bench]
- if so, i will get a GT and sell both my 2900s :p
:Q

I guess my GTS is just *that* good. ;)


Seriously, the difference is the default resolution (1280x720) and settings (not maxed).


EDIT: 1680x1050 is not available in Lost Planet DX09 on my rig. I've tried the options menu and hand editing the config file, but the demo keeps going back to 1600x1000. I see plenty of people posting about this online -- no idea what the cause is yet.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nullpointerus
Originally posted by: apoppin
What settings are you using for your games ?
- what do you mean by "default settings"?

:confused:
Whatever the settings were when I first started the benchmark program.

I saved screenshots of the settings in case you need specifics.

i used maximum everything - in game - plus 4xAA/16xAF [AA if possible, never forced].... except v-sync [actually i DO like v-sync w/triple buffering in Hg:L ... but then i would not be measuring FPS]
in other words, if there was a setting for it, i checked it. Everything is put to "highest" or 'extreme' or 'ultra' ... i used DX10 if there was a choice and maximum HDR or Full Dynamic Lighting. Whatever setting that could be maxed out - in game - is maxed out.... and i always maxed the filtering - 4xAA/16xAF - whenever possible
I'm going to do another round of benchmarks at 1680x1050. I just need advice about what settings people would like to see. Perhaps I should use max settings?

i would hope your Lost Planet Benches are not at *max everything* including "Dx10 fur" 4xAA/16xAF. How is it you get so little penalty in LP by running the DX10 pathway? i didn't post my DX9 benches, but it is much faster - also with everything fully maxed [i know nvidia is ahead of AMD here; i also am using the retail game's built-in bench]
- if so, i will get a GT and sell both my 2900s :p
:Q

I guess my GTS is just *that* good. ;)


Seriously, the difference is the default resolution (1280x720) and settings (not maxed).


EDIT: 1680x1050 is not available in Lost Planet DX09 on my rig. I've tried the options menu and hand editing the config file, but the demo keeps going back to 1600x1000. I see plenty of people posting about this online -- no idea what the cause is yet.

Are you using the LP demo or the full retail version?
- i have the full game ... so there may be some difference [anyway] ... from my memory, it appears there are more options than when i tested it last Summer.

as to what setting you should use ... that is up to you ... but *whatever* you do, they must be identical on each of the partitions [that's all]. i use *maximum* because that is what i like on my own rig. And i use 16x10 because [i have it and] it is one of the two most popular [modern gaming] resolutions of today. i would game at 19x12 - except they don't offer it in a 20" monitor [and i obviously can't test at it]. 16x12 is possible for me to test but it is losing popularity and the results are not very different from 16x10. i could also test at 14x9 and from 16x12 down to 4x8 ... but it doesn't seem very necessary or that it will impact this 'showdown' very much.

Yours is the G-92 GTS? it should be faster than my single 2900xt :p
- i am also looking forward to your results .... i hope you DO test at least a few games at 'max' settings

it feels good to have a smoothly running "benchmark computer"
[again]

:)


Seriously, the difference is the default resolution (1280x720) and settings (not maxed).
You don't need to SS but you should post what settings you test at so it can be replicated. And i think you should be pretty consistent - not 'low' on one test and 'hi' on another although DX10 games can have details lowered since many don't play well on most rigs.

I just need advice about what settings people would like to see.

i have given up on getting advice from "people" here and am just going to do it
... :music:My Way:music:
:D

i think we're on our own
:Q
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
I appreciate all the 'stuff' you guys are doing. Sorry I can't jump in. Can't (or won't!) go to 4GB/V64. Best I can offer is XP32/V32 @ 2GB.

Originally posted by: apoppin

Yeah, i only have one bridge interconnect. CCC says it is "not functioning at optimal performance" but has nothing in diagnostics. One bridge came with my VT 2900xt but G-D Sapphire is too cheap to include one ... so i ordered one that will be here tomorrow.

Explain this "optimized" vs. unoptimized thing. Almost everything i play is at 16x10 unless i need to break out my CRT for DX10 games :p

Single card, my 2900xt is around 10500 in 3DMark06 [vista64] and 12500 with the 2900p running in Tandem ... but RT shows my 2900xt running at its stock clock and my 2900pro running at some ungodly low clock.

my problem is that OCs don't "stick" ... i can't save them and i am *hoping* that it has to do with having only one bridge interconnect. i'll let you know tomorrow in the "Vista showdown" thread or if you reply - here.

My (2) Crossfire bridges came with the MSI 790fx mobo. I had a "Monk Moment" when confronted with 2 interconnects on the 2900pros, thinking I only needed one (and fixating on which pair of interconnects on the cards to use - lol). I saw in a thread where someone used both interconnects and that solved my dilemma.

When you get the "not functioning at optimal performance" message in CCC, could it possibly be noting that your second PCIe x16 slot only has 4 lanes??? That may also help explain why your clock settings won't "stick".

The ""optimized"" issue that I ran into was from within the game setup itself - not in CCC. While preparing for the test (FEAR 1.08 ?) and selecting my screen resolution the game would note: ""Not Optimized for Display Resolution""

I do not know if that meant the display resolution was not optimized for Crossfire OR that my Westy HD Monitor was refusing to play well with the setup. I tend to think that Crossfire was not optimized for that resolution but 'who knows'?

Anyway ... the highest resolution in FEAR that I could run 'optimized' (without the nag) was 1280x960 with Crossfire adding 55% to min frame rates, 58% to avg, and 71% to max.
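That scaling arithmetic as a minimal sketch - the FPS values below are placeholders chosen to reproduce the posted percentages, not the actual run data:

# Percent gain Crossfire adds over a single card, per FEAR's min/avg/max stats.
# Single/Crossfire FPS values are placeholders, not the real benchmark output.

def scaling(single, crossfire):
    return {stat: (crossfire[stat] - single[stat]) / single[stat] * 100
            for stat in single}

single    = {"min": 29, "avg": 60, "max": 82}    # placeholder single-card run
crossfire = {"min": 45, "avg": 95, "max": 140}   # placeholder Crossfire run

for stat, gain in scaling(single, crossfire).items():
    print(f"{stat}: +{gain:.0f}%")   # -> min: +55%, avg: +58%, max: +71%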

I'm not a gamer but don't mind doing the benchies. After playing around I may well sell the rig. The reason I picked up FEAR was that it was in the discount rack at Wally World and a young chap (who had obviously skipped school and was checkin' out the free Xbox play) suggested it over CoH - :D

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: heyheybooboo
I appreciate all the 'stuff' you guys are doing. Sorry I can't jump in. Can't (or won't!) go to 4GB/V64. Best I can offer is XP32/V32 @ 2GB.

Originally posted by: apoppin

Yeah, i only have one bridge interconnect. CCC says it is "not functioning at optimal performance" but has nothing in diagnostics. One bridge came with my VT 2900xt but G-D Sapphire is too cheap to include one ... so i ordered one that will be here tomorrow.

Explain this "optimized" vs. unoptimized thing. Almost everything i play is at 16x10 unless i need to break out my CRT for DX10 games :p

Single card, my 2900xt is around 10500 in 3DMark06 [vista64] and 12500 with the 2900p running in Tandem ... but RT shows my 2900xt running at its stock clock and my 2900pro running at some ungodly low clock.

my problem is that OCs don't "stick" ... i can't save them and i am *hoping* that it has to do with having only one bridge interconnect. i'll let you know tomorrow in the "Vista showdown" thread or if you reply - here.

My (2) Crossfire bridges came with the MSI 790fx mobo. I had a "Monk Moment" when confronted with 2 interconnects on the 2900pros, thinking I only needed one (and fixating on which pair of interconnects on the cards to use - lol). I saw in a thread where someone used both interconnects and that solved my dilemma.

When you get the "not functioning at optimal performance" message in CCC, could it possibly be noting that your second PCIe x16 slot only has 4 lanes??? That may also help explain why your clock settings won't "stick".

The ""optimized"" issue that I ran into was from within the game setup itself - not in CCC. While preparing for the test (FEAR 1.08 ?) and selecting my screen resolution the game would note: ""Not Optimized for Display Resolution""

I do not know if that meant the display resolution was not optimized for Crossfire OR that my Westy HD Monitor was refusing to play well with the setup. I tend to think that Crossfire was not optimized for that resolution but 'who knows'?

Anyway ... the highest resolution in FEAR that I could run 'optimized' (without the nag) was 1280x960 with Crossfire adding 55% to min frame rates, 58% to avg, and 71% to max.

I'm not a gamer but don't mind doing the benchies. After playing around I may well sell the rig. The reason I picked up FEAR was that it was in the discount rack at Wally World and a young chap (who had obviously skipped school and was checkin' out the free Xbox play) suggested it over CoH - :D

i am just going to start a new thread about my X-Fire problems ... FEAR didn't scale :p

http://forums.anandtech.com/me...=2151964&enterthread=y

*most* older MBs are not "ideal" ... it is either 16x/4x or 8x/8x PCIe ... only the newest MBs have 16x/16x, so that should not be the reason i am getting a poor performance increase and no OC that will stick.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
2/10/08 Update:

i added Crossfire results with my OC'd 2900pro/2900XT that you might find interesting. Crossfire scales pretty well but still Vista 32 pulls ahead of Vista 64 in pure FPS.

i am taking a little break ... i have not been getting more than 4-6 hours of sleep over the last 2 weeks and it shows - i am exhausted and my rig is running perfectly. nullpointerus also has his results updated and will carry the show for a while. Then we will finalize and compile the results.

From what i can see - so far - booting into Win Vista 64 and then into Vista 32 - every single day ... there is no practical difference in 32-bit games. There are no disadvantages to Vista 32 - it manages its memory very efficiently and there is no extra HD 'thrashing' or slowdowns ... the games that "ate memory" - the Witcher and Hg:L - are "tamed" and i can play on either OS for hours without ANY issues. i haven't gone back to Gothic3 in Vista64, which DOES have a memory leak and crap RAM management [period]

i am not terribly anxious to even do part 2. If Vista 64 is slower in games, why do i want to recommend it over Vista 32? - i don't give a crap if a level loads a second or two faster - or not ... but then i have been called both a "graphics whore" and a "FPS whore" :p

here are the results:

i got my 2900xt/2900pro working well together ... here are Vista 64 results ... a solid increase from 10094 to 13090:

Main test results
3DMark Score 13090 3DMarks
SM 2.0 Score 5824 Marks
SM 3.0 Score 6736 Marks
CPU Score 2697 Marks


Test Results

Graphics Tests
1 - Return to Proxycon 46.72 FPS
2 - Firefly Forest 50.34 FPS
CPU Tests
CPU1 - Red Valley 0.85 FPS
CPU2 - Red Valley 1.36 FPS
HDR Tests
1 - Canyon Flight (SM 3.0) 69.44 FPS
2 - Deep Freeze (SM 3.0) 65.28 FPS

=================


F.E.A.R. built-in Demo

Vista 64 - 16x10, everything maxed, 0xAA/16xAF - SS on

2900xt -- 25 Min / 63 Avg / 113 Max
Pro/XT -- 35 Min / 70 Avg / 115 Max


Vista 32 - 16x10, everything maxed, 0xAA/16xAF - SS on

2900xt -- 30 Min / 59 Avg / 112 Max
Pro/XT -- 35 Min / 76 Avg / 120 Max


With Crossfire Vista 32 is also ahead of 64-bit


===================

Call of Juarez DX10 benchmark

Vista 64 - 16x10 - High Shadows / Shadow Map 2048x2048

2900xt -- 15.9 Min / 20.7 Avg / 49.3 Max
Pro/XT -- 15.0 Min / 38.5 Avg / 82.8 Max

Vista 32 - 16x10 - High Shadows / Shadow Map 2048x2048

2900xt -- 14.7 Min / 24.9 Avg / 52.3 Max
Pro/XT -- 14.4 Min / 38.5 Avg / 85.5 Max

Toss a coin



==========================

HL2 Lost Coast built in benchmark

Vista 64 - Min, Max, Avg
2900xt -- 39, 190, 90.420
Pro/XT -- 43, 214, 101.218


Vista 32 - Min, Max, Avg
2900xt -- 62, 226, 106.322
Pro/XT -- 68, 283, 141.051

Vista32 takes it

++++++++++++++++++++++++


Lost Planet: Extreme Conditions-
- full retail game built-in demo. DX10/everything fully maxed in-game/1680x1050/4xAA-16xAF

Vista 64 XT/Pro - Snow - 30.0 / Cave 29.0

Vista 32 XT/Pro - Snow - 30.8 / Cave 30.2

again in Xfire? 32-bit is ahead

============================
CPU Crysis demo 32bit Vista

2900xt -- Average FPS: 11.59, Min FPS: 4.95, Max FPS: 14.62
XT/Pro - Average FPS: 13.89, Min FPS: 4.46, Max FPS: 21.68

CPU Crysis demo 32bit Vista

2900xt - -Average FPS: 10.60, Min FPS: 0.74, Max FPS: 17.16
XT/Pro - Average FPS: 13.96, Min FPS: 1.98, Max FPS: 17.60


no 64-bit Xfire results ? flashing textures :p
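Pulling the Crossfire (Pro/XT) averages above into one place, a small script - using only the avg FPS numbers posted above - shows where "Vista 32 pulls ahead" comes from:

# Vista 32 vs. Vista 64 average-FPS deltas for the Crossfire (Pro/XT) runs.
# All numbers are copied from the results posted above.

results = {
    # test: (vista32_avg_fps, vista64_avg_fps)
    "F.E.A.R.":         (76.0,    70.0),
    "Call of Juarez":   (38.5,    38.5),
    "HL2 Lost Coast":   (141.051, 101.218),
    "Lost Planet Snow": (30.8,    30.0),
}

for test, (v32, v64) in results.items():
    delta = (v32 - v64) / v64 * 100   # percent advantage for Vista 32
    print(f"{test:<16} 32-bit {v32:7.1f}   64-bit {v64:7.1f}   ({delta:+.1f}%)")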



 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

i have something to report ... i *finally* got Hellgate: London updated on Vista 64 ... i had to set up something faster than 56K dialup because it kept timing out ... but the NEWS is it runs faster and smoother than on Vista 32 .. very noticeable ... i can play with maxed AA/AF on 64-bit, settings i have to lower on Vista 32 to get comparable frame rates.

So you have a 2nd [BIG] win for Vista 64 - if i hadn't *finished* Hg:L, i would consider keeping Vista 64 ... it IS the future and it makes a big difference in the very [very] few games that are coded for it.
--imo it is an excellent alternative to Vista 32 and i would finally 'recommend' it if you have no issues with HW signing or old SW.

... and i don't know about you guys ... but i am really *tired* of this ... i have done a little load/save-time testing with some variable results ... i would not choose Vista 32 or 64-bit based on it ... Maybe Derek Wilson will have more comprehensive results ... i look forward to it and to anyone else that wants to post here.

imho ... toss a coin ... Vista64 is more 'future proof' but you get slightly better FPS in Vista 32; Vista 64 games are much faster and portend the future but there are only a couple. "Try it" for yourself if you want to see if there is a difference ... the DVD ships cheap from MS ... just choose wisely because some choices have no 'going back' after you activate it.

ME? ... i am going to Uninstall Vista64 this week knowing my choice is still 'high end'

....anything else you want to see - speak now or forever hold your peace
:confused:

:D


---for now :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
2/24/08 ... Done ... stick a fork in me :p
:confused:

Part Two was easier than i thought ... i ran into much the same problem that HardOCP has with their Real [fake] world testing - no repeatable scientific testing ... so i skipped it ... and all you get are "impressions"

FPS whores will probably want Vista 32 ... there are only 2 games i tested that run faster in Vista 64 and they are Far Cry and Hellgate: London ... and IF you play those 2 games a lot, you WILL want Vista 64 ... there is significant improvement over Vista 32 in every way.

IF you have 4+GB of RAM there are also some possible advantages. Huge advantages with 64-bit games but only "possible" minor advantages in "loading/saving", in memory-mismanaged games and also if you are an 'extreme multi-tasker'. None of my "stopwatch" tests were conclusive; since i have an average "cluttered" system with everything enabled [including ReadyBoost and System Restore] sometimes Vista 64 would be quicker and sometimes Vista 32. Even the Witcher showed no consistent differences....

So my own personal "D-day" has arrived ... it is Day 30 of my "trial" ... and MS is giving me the ULTIMATUM - activate or lose it ...
--that's easy, imo - for personal use - i'd rather have the very slight FPS increase and Vista32.
[i am done with FC and Hg:L]

But i DO see where some of you might prefer the perceived slightly smoother experience in *some* memory-mismanaged games, and the same goes for extreme maxi-multitaskers; not needed for me.

As i UNINSTALL Vista 64, i have to say it is "THE FUTURE". It is just as stable as Vista 32 and you have to 'nitpick' to find flaws as you trade the irritation of 'signed drivers' for slightly better security.
---So ... pick one ... don't feel bad picking either choice as it is not a "marriage". If you want you can be polygamous ... tri- or even quad-boot your OSes if it makes you feel especially "high end"

... so, if there is interest, i can format my results ... but one fact is pretty clear to me at least ... pick Vista ... forget XP
--i look forward to Derek's professional article comparing Vista in all its flavors to XP, and a much more detailed analysis than i was able to give you.

i got 11 new games to play, a rejuvenated CrossFire rig that handles anything i toss at it, and my dial-up now allows me to play WoW and NightFall-type games online. i think i will give benchmarking a little rest ... i am playing thru both FEAR x-packs right now ... then it's on to Jade Empire, i think ... Jericho just continues to be a pretty tech demo i toy with; it is a resource hog and has a lot of flaws.
 

Engineer

Elite Member
Oct 9, 1999
39,230
701
126
Man, you guys are awesome!!!

Great work and dedication!!!

:thumbsup: and :beer:!!

:D
 

*kjm

Platinum Member
Oct 11, 1999
2,222
6
81
apoppin

I just picked up Stalker tonight and will install it. Question is how do you run the benchmark for it?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

This is really interesting and worth an update ... i copied it from a post in PC Gaming:

http://forums.anandtech.com/me...172187&highlight_key=y

Originally posted by: Markbnj
Originally posted by: GarfieldtheCat
Originally posted by: KIAman
Pshhh... Vanguard takes up to 6GB of my memory.

How does it do that? Unless it's 64-bit software, a 32-bit Windows program will only be able to access 2GB, and 3GB with the PAE extension.

You never know how people are measuring it. The thing is that looking at the "working set" in Task Manager (mem usage column in processes tab) is misleading. Here's a link to a really good explanation:

http://blogs.msdn.com/ntdebugg...memory-shell-game.aspx

In brief: in a 32-bit XP installation there are 4GB of addressable virtual memory. Without the /3GB flag in boot.ini the kernel reserves 2GB of this for itself, and assigns every process that starts up a 2GB address space of its own. If the /3GB flag is present, an application built with a special large-address-aware flag can address an additional 1GB, for a total of 3GB.

Task Manager reports its own idea of the "working set". The important thing is that TaskMan doesn't seem to distinguish between pages that are in RAM and pages that are on disk. They are all committed pages as far as memory usage is concerned.

When you see a program apparently using huge amounts of memory beyond the 2GB (or 3GB) limit, what you are seeing is all the memory that process has in committed pages it owns. Much, if not most, of it will not be in physical RAM. A good indicator would be to open Task Manager, click the processes tab, click View | Select Columns, and check the "Page Faults" column. Chances are you'll see that the program that appears to be using more system RAM than is available is having a huge number of page faults as it is forced to swap data between disk and RAM.

Edit: regardless of how much memory the program is using, it can't access addresses beyond that 2GB (or 3GB) limit, so I am assuming that the working set reported by TaskMan also includes pages of RAM that are committed but not currently mapped into the address space.

This link is very interesting:

http://blogs.msdn.com/ntdebugg...memory-shell-game.aspx
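To see the working-set vs. committed distinction on a live process, here is a minimal sketch assuming the third-party psutil package (the field names are psutil's; num_page_faults is a Windows-only field):

# Working set vs. committed memory for the current process, per the
# Task Manager discussion above. Requires the third-party psutil package.
import psutil

mem = psutil.Process().memory_info()

print(f"working set (pages in RAM):  {mem.rss / 2**20:8.1f} MB")  # TaskMan's 'Mem Usage'
print(f"committed (incl. paged out): {mem.vms / 2**20:8.1f} MB")  # can exceed the working set

if hasattr(mem, "num_page_faults"):               # Windows-only field in psutil
    print(f"page faults: {mem.num_page_faults}")  # huge counts = heavy disk<->RAM swapping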



 

Intelman07

Senior member
Jul 18, 2002
969
0
0
This is a great effort, but I think we need some consolidated results; it is confusing.

The trend seems to be that 32-bit wins, but the results seem close enough to not actually matter. But what about how games feel - are they smoother in x64 or in 32-bit?
 

CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
I do have to ask apoppin, and I'm not sure it even makes a difference.
But why cripple x64 by only throwing 4 gigs at it, when the main benefit - other than truly using the CPU in its x64 mode - is the ability to go with more RAM?

8 gigs, as most non-server boards will only support 8, should have been tested as well.

that would give Vista 64 every benefit in running it.

I personally run an E8400 on a DS3L with 8 gigs of Patriot 8500 with a BFG OC'd 8800GTS 512, and Vista x64 runs and plays like silky smooth butter.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Intelman07
This is a great effort, but I think we need some consolidated results; it is confusing.

The trend seems to be that 32-bit wins, but the results seem close enough to not actually matter. But what about how games feel - are they smoother in x64 or in 32-bit?

thank you .. and you are right .. i concluded and abandoned it rather quickly when i realized i was completely *alone* in testing :p

i am working on a much more ambitious project now and Vista 64 is gone from my HD - i am sticking with Vista 32 for now.

i thought i had summed up my impressions - 32-bit "wins" by enough for me to keep it. But the difference is extraordinarily slight and i kept Vista 32 for other reasons [like signed drivers]. But NO difference in how they "feel", and there is NO advantage to playing 32-bit games on a 64-bit OS that i could find - there was ZERO difference i could find in the Witcher .. results inconsistent enough to call it the "same" as to loading, saving or "smoothness". Gothic3 - the Memory Mismanagement King - ran a little longer on 64-bit but crashes eventually anyway.

However, when you get the Very Rare 64-bit game, you want to throw Rocks at Vista 32 .. the difference is AMAZING .. the game is much smoother and runs about 15% faster for both Hg:L and Far Cry. When more games that i like are ported to 64-bit, i will adopt a 64-bit OS .. but then isn't the next OS from MS supposed to come in '09? ... looks like Vista may be 'interim' .. it certainly is cheap. :p

But why cripple x64 by only throwing 4 gigs at it, when the main benefit - other than truly using the CPU in its x64 mode - is the ability to go with more RAM?
Because Vista 32 cannot use more than 4GB and "sees" only about 3.5GB
- we would be contrasting and not comparing

and it was a *gaming* rig - there is ZERO benefit for any current PC game in running with more than 4GB of system RAM

i'd agree with you if it was OTHER than a "gaming rig"
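For anyone who wants to check what their install actually "sees", here is a minimal Windows-only sketch using ctypes and the Win32 GlobalMemoryStatusEx call; on 32-bit Vista with 4GB installed it typically reports around 3.0-3.5GB, because device address space is carved out of the 4GB limit:

# Windows-only: ask the OS how much physical RAM it can actually see.
import ctypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [("dwLength", ctypes.c_ulong),
                ("dwMemoryLoad", ctypes.c_ulong),
                ("ullTotalPhys", ctypes.c_ulonglong),
                ("ullAvailPhys", ctypes.c_ulonglong),
                ("ullTotalPageFile", ctypes.c_ulonglong),
                ("ullAvailPageFile", ctypes.c_ulonglong),
                ("ullTotalVirtual", ctypes.c_ulonglong),
                ("ullAvailVirtual", ctypes.c_ulonglong),
                ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
print(f"RAM visible to the OS: {status.ullTotalPhys / 2**30:.2f} GB")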
 

CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
Yes you are correct, except that 8 gigs in x64 means no paging file whatsoever, which does increase performance =D
and my system is purely a gaming/HTPC rig serving an extender as well, with 8 gigs. Being that 8 gigs of really good DDR2 still actually costs less than 2 gigs of really good DDR1 gamer RAM, there is no reason not to get it.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I am curious: for the games that got much worse results in Vista 64-bit than 32 ... did you try running the 32-bit executable under the 64-bit OS and see how that performed? It might be that for games where Vista 32 wins you could get identical results by running the 32-bit app on 64-bit Vista, and still enjoy the extra speed for those where the 64-bit app wins.
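That 32-bit-on-64-bit hosting is Windows' WOW64 layer, and a process can check for it directly; a minimal Windows-only sketch via ctypes (run it under a 32-bit Python on x64 Vista to see the WOW64 case):

# Windows-only: is the current process a 32-bit binary hosted on a 64-bit OS?
import ctypes

kernel32 = ctypes.windll.kernel32
is_wow64 = ctypes.c_int(0)  # BOOL out-parameter for IsWow64Process
kernel32.IsWow64Process(kernel32.GetCurrentProcess(), ctypes.byref(is_wow64))

print("32-bit code running on a 64-bit OS (WOW64)" if is_wow64.value
      else "native process (32-bit OS, or a 64-bit build)")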
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
if you really look closely at all the results, they are SO CLOSE as to be almost negligible ... what, +1 FPS for Vista 32? .. maybe 2?

and i tried to compare "stock" Vista 32 vs. "stock" Vista 64 .. as a normal gamer might do without really "tweaking" either OS

Vista 64 is clearly the advantage for all 64bit games and obviously the "futureproof" choice

That said, isn't Vista7 due next year - '09?
-if so, who cares? :p