Second opinion on PhysX


Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Modelworks
Nope. You are never wrong. I think everyone understands that.

I'm not wrong in this case.
You are just reaching. You don't even go into the arguments, links and quotes I provide anymore. Let's face it, it's game over for you.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: apoppin
Originally posted by: Scali
Originally posted by: Keysplayr
Also, Scali. Just for the hell of it, I placed the GTX295 and the 8800GTS512 in my Q6600 rig and that only has 2GB of system memory on Vista32. Still plays the games perfectly well. I thought it would choke. My total graphics memory is higher than total system memory. So what is it I should be looking out for? Cause I don't notice any hitching or anything. So far I've tried CoD W@W, UT3, Crysis. They play the same as the system with 4GB of memory on Vista32. What am I missing, or gladly missing?

You'd have to run a 64-bit version of Windows to be able to tell the difference.
You'd have to also run a 32-bit version to tell there isn't any in gaming :p


That's a rather silly statement to make on an enthusiast forum.
You stick to the old stuff because you don't think you need the new stuff yet?
Shouldn't enthusiasts be at the forefront of the technological wave? Which isn't even a forefront anymore, since we've had 64-bit Windows since 2005, and 64-bit x86 CPUs even longer?
If there's no difference, you'd STILL want to use the newer tech, because of its extra potential, even though it may not yet be realized (according to you that is, I wouldn't be caught dead with a 32-bit OS, and ESPECIALLY not XP).
Heck, I've been running XP x64 since 2005. I just can't imagine people still being on 32-bit. It just... I don't have words for it.

The word that describes it is your lack of *experience* with both operating systems.

Nice try, but obviously I HAVE used both 32-bit and 64-bit versions. I've used XP, Vista and Windows 7 RC in both modes. Now it's your turn.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
Nope. You are never wrong. I think everyone understands that.

I'm not wrong in this case.
You are just reaching. You don't even go into the arguments, links and quotes I provide anymore. Let's face it, it's game over for you.

It is not reaching to say that you know little of professional CG applications or OpenGL. Stick with DirectX and games.



 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Scali
Originally posted by: apoppin
Originally posted by: Scali
Originally posted by: Keysplayr
Also, Scali. Just for the hell of it, I placed the GTX295 and the 8800GTS512 in my Q6600 rig and that only has 2GB of system memory on Vista32. Still plays the games perfectly well. I thought it would choke. My total graphics memory is higher than total system memory. So what is it I should be looking out for? Cause I don't notice any hitching or anything. So far I've tried CoD W@W, UT3, Crysis. They play the same as the system with 4GB of memory on Vista32. What am I missing, or gladly missing?

You'd have to run a 64-bit version of Windows to be able to tell the difference.
You'd have to also run a 32-bit version to tell there isn't any in gaming :p


That's a rather silly statement to make on an enthusiast forum.
You stick to the old stuff because you don't think you need the new stuff yet?
Shouldn't enthusiasts be at the forefront of the technological wave? Which isn't even a forefront anymore, since we've had 64-bit Windows since 2005, and 64-bit x86 CPUs even longer?
If there's no difference, you'd STILL want to use the newer tech, because of its extra potential, even though it may not yet be realized (according to you that is, I wouldn't be caught dead with a 32-bit OS, and ESPECIALLY not XP).
Heck, I've been running XP x64 since 2005. I just can't imagine people still being on 32-bit. It just... I don't have words for it.

The word that describes it is your lack of *experience* with both operating systems.

Nice try, but obviously I HAVE used both 32-bit and 64-bit versions. I've used XP, Vista and Windows 7 RC in both modes. Now it's your turn.

my turn for what?
:confused:

You admit you have been running 64-bit for 4 years. i have been testing them both - Vista32 vs Vista64, side-by-side - for two years. You seem to have a good understanding of XP but don't seem to get Vista's memory management in regards to PC gaming

 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Modelworks
It is not reaching to say that you know little of professional CG applications or OpenGL. Stick with DirectX and games.

Actually it is :)
I've done work on professional applications with OpenGL, and on various platforms.
I've done development for Hydroman for example:
http://www.paro-nl.com/hydromansoftware.html

Now if that is 'knowing little', I'd like to know what makes you the expert :p
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: apoppin
You admit you have been running 64-bit for 4 years. i have been testing them both - Vista32 vs Vista64, side-by-side - for two years. You seem to have a good understanding of XP but don't seem to get Vista's memory management in regards to PC gaming

I haven't been running 64-bit exclusively, obviously. But it has been my preferred platform for years.

As for Vista's memory management... why don't you explain it then?
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The reason CAD uses OGL is because:
a) they first built the renderer in the days before DX worked well.
b) they need to support unix platforms.
c) the render guys are OGL experts as that's obviously what the company required in the past.

However, as unix dies (before anyone pipes up: businesses have shown very little interest in linux as a CLIENT platform), CAD vendors will stop supporting it (it takes a lot more effort to write something that works on both windows and unix). When that happens, DX becomes an option, and since it is simpler to code in DX (especially the newer, more complex stuff), CAD packages will slowly switch to using it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Scali
Originally posted by: apoppin
You admit you have been running 64-bit for 4 years. i have been testing them both - Vista32 vs Vista64, side-by-side - for two years. You seem to have a good understanding of XP but don't seem to get Vista's memory management in regards to PC gaming

I haven't been running 64-bit exclusively, obviously. But it has been my preferred platform for years.

As for Vista's memory management... why don't you explain it then?

why don't you?

You are the one that seems to think that a GTX295 and a 2nd card for PhysX with "2.3GB of total vRAM" will not run well with 4GB of system RAM on Vista 32; when it runs OK with 2GB of System RAM :p

if you look in my sig, i DO run Vista64 bit - but for reasons OTHER than PC gaming
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
It is not reaching to say that you know little of professional CG applications or OpenGL. Stick with DirectX and games.

Actually it is :)
I've done work on professional applications with OpenGL, and on various platforms.
I've done development for Hydroman for example:
http://www.paro-nl.com/hydromansoftware.html

Now if that is 'knowing little', I'd like to know what makes you the expert :p


My last job was doing rigging for Framestore on the Ecomagination commercials for GE. I have had job offers from Blizzard and Epic, but I like to freelance more than work in a corporate environment. I have worked in CG and VFX since 1998, and have 6 shipped game titles, 11 commercials and 7 movies on my resume, along with many small jobs involving everything from textures to animations.

I am in the process of writing a tutorial series for Digital Tutors on character rigging.

I know the field very very well.



 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Dribble
The reason CAD uses OGL is because:
a) they first built the renderer in the days before DX worked well.
b) they need to support unix platforms.
c) the render guys are OGL experts as that's obviously what the company required in the past.

However, as unix dies (before anyone pipes up: businesses have shown very little interest in linux as a CLIENT platform), CAD vendors will stop supporting it (it takes a lot more effort to write something that works on both windows and unix). When that happens, DX becomes an option, and since it is simpler to code in DX (especially the newer, more complex stuff), CAD packages will slowly switch to using it.

That is partly true. The unknown factor is Apple. Right now it is easier to keep OpenGL because you can target every OS. Apple used to be just a 2D and video platform, but over the past two years a large number of applications have gained OSX versions. I was really surprised that they ported Maya to OSX.

Apple is still the minority, but they do continue to add to their market share. Who knows where they will be in 5 years in the CG world.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: apoppin
why don't you?

You are the one that seems to think that a GTX295 and a 2nd card for PhysX with "2.3GB of total vRAM" will not run well with 4GB of system RAM on Vista 32; when it runs OK with 2GB of System RAM :p

I'm just saying that as more vram gets mapped into the 32-bit address space, less physical memory remains available. So even if you have 4 GB of physical memory in your box, you will be able to use much less than that. If you're going to put 4 GB in your system, you want to actually be able to USE it, don't you? You don't want it to be hidden behind some mapped vram and other hardware.

That's the main reason why I asked... Why put 4 GB in a box and then run a 32-bit OS on it, wasting a lot of memory.
Now I know from experience that my 8800GTS320 runs better in 64-bit. Perhaps it's not that much of an issue when your videocard has more memory, and doesn't need to swap textures in and out that often. But for me it definitely matters, because in 64-bit, I can hold all textures in memory, not needing the harddisk to page textures in and out. I'm quite sure I'm not the only one.
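As a back-of-the-envelope illustration of the address-space squeeze described above, here is a minimal C++ sketch. The aperture sizes are illustrative assumptions (how much of each card's memory actually gets mapped varies by driver, BIOS and chipset), not measured values; the point is only that everything mapped below 4 GB comes out of the RAM a 32-bit OS can use.

    #include <cstdio>

    int main() {
        const unsigned long long MiB = 1024ull * 1024ull;

        // Everything a 32-bit OS can address, and the RAM installed in the box.
        unsigned long long addressSpace = 4096 * MiB;
        unsigned long long installedRam = 4096 * MiB;

        // Regions that must live inside that same 4 GB window.
        // These figures are assumptions for illustration only.
        unsigned long long vramApertures = 1792 * MiB + 512 * MiB; // primary card + PhysX card
        unsigned long long otherMmio     = 256 * MiB;              // chipset, PCI, BIOS, etc.

        unsigned long long visibleRam = addressSpace - vramApertures - otherMmio;
        if (visibleRam > installedRam)
            visibleRam = installedRam;

        printf("Installed RAM:             %llu MiB\n", installedRam / MiB);
        printf("RAM the 32-bit OS can use: %llu MiB\n", visibleRam / MiB);
        return 0;
    }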
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Modelworks
My last job was doing rigging for Framestore on the Ecomagination commercials for GE. I have had job offers from Blizzard and Epic, but I like to freelance more than work in a corporate environment. I have worked in CG and VFX since 1998, and have 6 shipped game titles, 11 commercials and 7 movies on my resume, along with many small jobs involving everything from textures to animations.

I am in the process of writing a tutorial series for Digital Tutors on character rigging.

I know the field very very well.

Okay, so you're basically just a modeler/animator, trying to tell a developer that he doesn't understand the very software that he writes, and you merely use? You must be the most arrogant guy on this forum, which is quite an achievement :)
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Scali
Originally posted by: Modelworks
My last job was doing rigging for Framestore on the Ecomagination commercials for GE. I have had job offers from Blizzard and Epic, but I like to freelance more than work in a corporate environment. I have worked in CG and VFX since 1998, and have 6 shipped game titles, 11 commercials and 7 movies on my resume, along with many small jobs involving everything from textures to animations.

I am in the process of writing a tutorial series for Digital Tutors on character rigging.

I know the field very very well.

Okay, so you're basically just a modeler/animator, trying to tell a developer that he doesn't understand the very software that he writes, and you merely use? You must be the most arrogant guy on this forum, which is quite an achievement :)


What developer ?
You ?

You don't understand CG applications. If you did, you would know how much the artists work with the developers of the products. You would know why artists are frequently called in to alpha test products, and you would know that artists and developers spend a huge amount of time together. It isn't a case of some developer sitting in a cubicle writing code while the artist has no clue what is going on with the software.

You are the one who couldn't grasp that Mesa and OpenGL were separate for a reason and it wasn't trademarks. Something a developer in the CG world clearly understands.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Dribble
The reason CAD uses OGL is because:
a) they first built the renderer in the days before DX worked well.
b) they need to support unix platforms.
c) the render guys are OGL experts as that's obviously what the company required in the past.

Yep, that pretty much nails it.
OpenGL was obviously developed by SGI, which used a unix OS. So the first OpenGL machines were unix boxes.
Various other companies, such as HP, also started developing graphics workstations for CAD usage, again powered by unix and OpenGL.
I recall doing my lab courses for Autocad and Pro/Engineer on HP Apollo machines in a special room. These machines ran an HP flavour of unix, used a Motorola 68k processor, and used X for a GUI. I'm not sure if they even had 3d acceleration. Things were mostly wireframe or flatshaded back then.
These days the same university uses the standard lab rooms with standard PCs for Autocad and Pro/Engineer.
I actually did the OpenGL labcourse on a Windows machine.

Originally posted by: Dribble
However, as unix dies (before anyone pipes up: businesses have shown very little interest in linux as a CLIENT platform), CAD vendors will stop supporting it (it takes a lot more effort to write something that works on both windows and unix). When that happens, DX becomes an option, and since it is simpler to code in DX (especially the newer, more complex stuff), CAD packages will slowly switch to using it.

Yup, that seems to be exactly what happened to Autocad. While it had unix versions in the past, they never ported it to linux, but instead dropped the unix product line completely, and went Windows-only.
Makes me wonder about the cause and effect. Is it because there was no demand for a linux version because unlike the server-world, the CAD world moved from unix workstations to Windows machines?
Or is it rather that Autodesk figured that there's no use for both a Windows and a linux version, since linux and Windows run on the same hardware anyway? Their customers could just run Windows, saves Autodesk the trouble of maintaining two versions of their software for the same hardware platform. Perhaps the more mature OpenGL drivers for Windows also played a part in that. Windows already had Autocad support, and manufacturers already had very good Windows drivers with optimizations, where some manufacturers still struggle in linux today.

As for 3dsmax, it started out as a DOS product and was later rewritten for Windows, but I don't think it ever supported any other platforms in the first place. As you say, it had OpenGL support first, because in the early days of Windows NT that was the best solution (in fact, the ONLY solution, at first). They added Direct3D at a later stage, when D3D became a good alternative to OpenGL. And as you say, when you develop Windows-only, there are benefits to using D3D.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Modelworks
What developer ?
You ?

You don't understand CG applications. If you did, you would know how much the artists work with the developers of the products. You would know why artists are frequently called in to alpha test products, and you would know that artists and developers spend a huge amount of time together. It isn't a case of some developer sitting in a cubicle writing code while the artist has no clue what is going on with the software.

You are the one who couldn't grasp that Mesa and OpenGL were separate for a reason and it wasn't trademarks. Something a developer in the CG world clearly understands.

Yea, whatever... geez. Go model something, or whatever it is that you do.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Might have to check out Cryostasis just for PhysX, as the reviews were only average for this particular game.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: Azn
Might have to check out Cryostasis just for PhysX, as the reviews were only average for this particular game.

Well, I would agree that the game was not perfect, but it was definitely enjoyable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Scali
Originally posted by: apoppin
why don't you?

You are the one that seems to think that a GTX295 and a 2nd card for PhysX with "2.3GB of total vRAM" will not run well with 4GB of system RAM on Vista 32; when it runs OK with 2GB of System RAM :p

I'm just saying that as more vram gets mapped into the 32-bit address space, less physical memory remains available. So even if you have 4 GB of physical memory in your box, you will be able to use much less than that. If you're going to put 4 GB in your system, you want to actually be able to USE it, don't you? You don't want it to be hidden behind some mapped vram and other hardware.

That's the main reason why I asked... Why put 4 GB in a box and then run a 32-bit OS on it, wasting a lot of memory.
Now I know from experience that my 8800GTS320 runs better in 64-bit. Perhaps it's not that much of an issue when your videocard has more memory, and doesn't need to swap textures in and out that often. But for me it definitely matters, because in 64-bit, I can hold all textures in memory, not needing the harddisk to page textures in and out. I'm quite sure I'm not the only one.

"wasting a lot of memory" ???

letsee .. i just paid $39 after MiR for 2x2GB OCZ PC8500 for a new Phenom II X2/4 build, and if i only use 80% ... how much am i really "wasting" ? like ten bucks? :p
:confused:

the ONLY reason one would want 64-bit Vista is for 64-bit applications or where more than 4GB of RAM is desirable. Vista 64 has to run 32-bit applications, including games, in an emulation layer of sorts, and it is RAM-hungry. Most 32-bit games - 99.9% of all PC games - actually run a tiny bit faster in Vista 32 than in Vista 64.

The people who argue like you also "feel" there is a difference in gaming - but NO ONE has been able to demonstrate any advantage of 64-bit over 32-bit [for 32-bit games] in a 4GB Vista system RAM-equipped PC; not even with loading times.

 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: apoppin
"wasting a lot of memory"

letsee .. i just paid $39 after MiR for OCZ PC8500 and if i only use 80% ... how much am i really "wasting" ? :p
:confused:

Well, in Keysplayr's case he's put 4 GB into the system, but only 2 GB shows up. That's 50% wasted. Sure, money isn't that big a deal these days, but when you go out and buy 4 GB, isn't that because you actually want to be able to use it? It's like I'd buy two videocards for SLI, except my motherboard doesn't support SLI, so I'd always be able to use only one in games.

Originally posted by: apoppin
the ONLY reason one would want 64-bit Vista is for 64-bit applications or where more than 4GB of RAM is desirable. Vista64 has to run 32-bit applications in an emulation layer of sorts, and it is RAM-hungry. Most 32-bit games - 99.999% of all PC games - actually run a tiny bit faster in Vista32 than in Vista64.

That 'emulation layer of sorts' is that nice v86 mode that your processor is equipped with. In fact, because various parts of the kernel and drivers continue to run in 64-bit mode, there's 32-bit software which actually runs FASTER on a 64-bit OS.
Besides, the argument of "ram-hungry" doesn't really hold when the 32-bit OS can't address the memory in the first place. If you'd put 4 GB in a 32-bit system, at best you'd be able to use 3.2 GB, but with high-end/multiple videocards, that generally ends up much lower. So you're wasting nearly 1 GB to start with. The 64-bit OS can be memory-hungry all it wants; the difference is nowhere near that 1 GB, so you still have more efficient memory usage in a 64-bit OS.
Heck, my system has 6 GB, 3 GB of which never show up in a 32-bit OS in the first place. What do I care about memory usage in 64-bit? There is plenty of memory; I just need the OS to actually USE it. A 64-bit OS can. With a 32-bit OS I'm stuck at a measly 3 GB, no matter how much memory I actually put in my machine. 3 GB just doesn't cut it for me.

Originally posted by: apoppin
The people who argue like you also "feel" there is a difference in gaming - but NO ONE has been able to demonstrate any advantage of 64-bit over 32-bit [for 32-bit games] in a 4GB system RAM-equipped PC; not even with loading times.

Well, you can come over, then we'll go time load times of Half-Life 2, Far Cry 2, Crysis and Crysis:Warhead. The differences in some cases are actually VERY large.
For some reason my laptop also really likes 64-bit, even though it only has 2 GB. If I run the HL2:Lost Coast stress test on it in 32-bit, it swaps constantly, because it runs out of memory. I end up with a score of about 4 fps.
When I run the same test in 64-bit, the swapping is gone, and it can reach a whopping 11 fps.

Trust me, I'm not the "feel" type of guy. I'm a hardcore mathematician, scientist and academic.
 

DataCabbitKSW

Junior Member
Jun 12, 2009
12
0
0
I'm actually rather proud of Nvidia really pushing on the GPGPU front: not only with PhysX and CUDA, but also with OpenCL (for all platforms; they just released drivers for Windows and Linux, and OS X 10.6 Snow Leopard will supposedly have some in its driver stack), and with the DirectX 11 DirectCompute API, which will first show up in Windows 7. Nvidia has been building their chips in recent generations with GPGPU in mind. ATI was the first out of the gate with small apps and Stream processing, but failed to deliver much beyond that. I hope they can get back in the game with OpenCL and DirectX Compute, because more competition is always good (for the consumer).

PhysX does have a software mode, but you just can't get the same performance even out of an OC'd quad-core system. As Keysplayr pointed out, it really flourishes in the massively parallel environment that GPUs provide.

With the advent of OpenCL and DirectX Compute, I think Havok will have the means to make a universal layer to support hardware acceleration of Havok, much like PhysX is already doing with CUDA. I know Nvidia and the PhysX team are already planning on making versions that run under DirectCompute and OpenCL.

With the upcoming standards (both open and proprietary) making adoption of GPGPU easier, it is an exciting time for both gamers and professional data modelers. Science and engineering can and will make use of these to build miniature super-computing setups. The Tesla off-system GPGPU device from Nvidia is a good example of this. I'm still reading some of the whitepapers on the DirectX Compute Shader (like http://tinyurl.com/nqunww ) over from TechNet. I'm honestly excited about it, want to see what gets developed for it, and want to develop utilizing it myself.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: DataCabbitKSW
I'm actually rather proud of Nvidia really pushing on the GPGPU front: not only with PhysX and CUDA, but also with OpenCL (for all platforms; they just released drivers for Windows and Linux, and OS X 10.6 Snow Leopard will supposedly have some in its driver stack), and with the DirectX 11 DirectCompute API, which will first show up in Windows 7.

A really amazing program is 3D Coat. The programmer, Andrew Shpagin, is just phenomenal. I have never come across someone so talented at graphics programming. In a very short period of time he has written an application with features that no other current application has: it uses voxels. Voxels use a ton of processing power, so Andrew implemented CUDA. He didn't stop there: he then wrote a Mac version of the program and has begun implementing CUDA on it as well.

This is one guy; he doesn't have a team of programmers or big offices. If you request a feature on the forums, two days later it's done. Anyone can download the program's free trial. It shows what can be done with CUDA if you have the skill. I can't say enough about the guy; he does what takes other programmers whole teams to do.
http://3dcoat.com/
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: DataCabbitKSW
With the advent of OpenCL and DirectX Compute, I think Havok will have the means to make a universal layer to support hardware acceleration of Havok, much like PhysX is already doing with CUDA. I know Nvidia and the PhysX team are already planning on making versions that run under DirectCompute and OpenCL.

And if that fails, there's also Bullet Physics, which is focusing on getting Cuda and OpenCL acceleration into their physics API:
http://www.bulletphysics.com/wordpress/?p=64

Hardware accelerated physics will get there; the question is which solution will adapt best and eventually push the others out of the market.
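For readers who haven't used it, here is a minimal C++ sketch of what driving a physics API like Bullet looks like from the application side, assuming the standard btDiscreteDynamicsWorld classes from the Bullet SDK. The idea is that calling code like this stays the same regardless of which backend eventually does the solver work.

    #include <btBulletDynamicsCommon.h>
    #include <cstdio>

    int main() {
        // Standard Bullet setup: collision configuration, dispatcher, broadphase, solver.
        btDefaultCollisionConfiguration config;
        btCollisionDispatcher dispatcher(&config);
        btDbvtBroadphase broadphase;
        btSequentialImpulseConstraintSolver solver;
        btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
        world.setGravity(btVector3(0, -9.81f, 0));

        // One falling rigid body: a unit sphere of mass 1 kg, dropped from y = 10.
        btSphereShape sphere(1.0f);
        btVector3 inertia(0, 0, 0);
        sphere.calculateLocalInertia(1.0f, inertia);
        btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
        btRigidBody body(btRigidBody::btRigidBodyConstructionInfo(1.0f, &motion, &sphere, inertia));
        world.addRigidBody(&body);

        // Step the simulation at 60 Hz for one simulated second.
        for (int i = 0; i < 60; ++i)
            world.stepSimulation(1.0f / 60.0f);

        printf("Sphere height after 1 s: %.2f\n", body.getCenterOfMassPosition().getY());

        world.removeRigidBody(&body);
        return 0;
    }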

Originally posted by: DataCabbitKSW
With the upcoming standards (both open and proprietary) making adoption of GPGPU easier, it is an exciting time for both gamers and professional data modelers. Science and engineering can and will make use of these to build miniature super-computing setups. The Tesla off-system GPGPU device from Nvidia is a good example of this. I'm still reading some of the whitepapers on the DirectX Compute Shader (like http://tinyurl.com/nqunww ) over from TechNet. I'm honestly excited about it, want to see what gets developed for it, and want to develop utilizing it myself.

I've actually been playing around with GPGPU a bit recently... I want to create a custom pipeline, where I can do my own tessellation and all that, then feed the triangles down the conventional GPU backend for rasterizing.
Currently I'm using Cuda with D3D10. I've been looking at OpenCL, but it now seems that DirectX Compute will be available to end-users at about the same time as OpenCL. Since I've already converted my engine to D3D11 anyway, I might as well start using DirectX Compute instead of OpenCL.
My first impression was that DX Compute was still a bit 'clunky' though. I was expecting it to integrate with the API more seamlessly than Cuda did, but not really. Well, in a way it does, perhaps... Because you compile your CS more or less the same way as the other types of shaders. But I find the Cuda way nicer. With Cuda you compile your shader along with your program, and you can call it as if it is a regular C function. In DX11 you need to use the same constant buffer and resource view stuff that you use for graphics.
The handling of buffers in DX11 is also not really better than what Cuda already did, imho (you could just map D3D10 resources into Cuda quite freely). You still need to copy data around from time to time.
Maybe nVidia just made Cuda too good :)
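To make that "clunkiness" comparison concrete, here is a minimal C++ sketch of the D3D11 compute path being described: compile the CS like any other shader, create a structured buffer with an unordered access view, bind it, and Dispatch. Error handling and reading results back via a staging buffer are omitted; the shader and buffer contents are placeholders for illustration, not anyone's actual engine code.

    #include <d3d11.h>
    #include <d3dcompiler.h>
    #include <cstring>
    #pragma comment(lib, "d3d11.lib")
    #pragma comment(lib, "d3dcompiler.lib")

    static const char* kShader =
        "RWStructuredBuffer<float> data : register(u0);\n"
        "[numthreads(64, 1, 1)]\n"
        "void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }\n";

    int main() {
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* ctx = nullptr;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx);

        // Compile the compute shader from source, same path as the other shader stages.
        ID3DBlob* blob = nullptr;
        D3DCompile(kShader, strlen(kShader), nullptr, nullptr, nullptr,
                   "main", "cs_5_0", 0, 0, &blob, nullptr);
        ID3D11ComputeShader* cs = nullptr;
        device->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(), nullptr, &cs);

        // A structured buffer with an unordered access view, bound to register u0.
        float initial[64] = { 1.0f };              // input data: first element 1.0, rest zero
        D3D11_BUFFER_DESC bd = {};
        bd.ByteWidth = sizeof(initial);
        bd.Usage = D3D11_USAGE_DEFAULT;
        bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
        bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
        bd.StructureByteStride = sizeof(float);
        D3D11_SUBRESOURCE_DATA init = { initial };
        ID3D11Buffer* buf = nullptr;
        device->CreateBuffer(&bd, &init, &buf);

        D3D11_UNORDERED_ACCESS_VIEW_DESC uavd = {};
        uavd.Format = DXGI_FORMAT_UNKNOWN;         // structured buffers use UNKNOWN
        uavd.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
        uavd.Buffer.FirstElement = 0;
        uavd.Buffer.NumElements = 64;
        ID3D11UnorderedAccessView* uav = nullptr;
        device->CreateUnorderedAccessView(buf, &uavd, &uav);

        // Bind the shader and its resources, then launch one group of 64 threads.
        ctx->CSSetShader(cs, nullptr, 0);
        ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
        ctx->Dispatch(1, 1, 1);
        return 0;
    }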
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Scali
Well, in Keysplayr's case he's put 4 GB into the system, but only 2 GB shows up. That's 50% wasted. Sure, money isn't that big a deal these days, but when you go out and buy 4 GB, isn't that because you actually want to be able to use it? It's like I'd buy two videocards for SLI, except my motherboard doesn't support SLI, so I'd always be able to use only one in games.

Scali, let me clarify.

System:
Q6600
eVGA790i Ultra
2GB DDR3 (2x1GB)
GTX295
8800GTS640
Vista32
winver reveals: Physical memory available to Windows: 2,095,040 KB

The Phenom II system is the 4GB system which is now sitting on the counter. I had some stability issues with it. AND I couldn't get RAID to function properly. I'll have to mess with BIOS as it is the latest for the board.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Keysplayr
Scali, let me clarify.

System:
Q6600
eVGA790i Ultra
2GB DDR3 (2x1GB)
GTX295
8800GTS640
Vista32
winver reveals: Physical memory available to Windows: 2,095,040 KB

The Phenom II system is the 4GB system which is now sitting on the counter. I had some stability issues with it. AND I couldn't get RAID to function properly. I'll have to mess with BIOS as it is the latest for the board.

Well, I'd like to know what Windows reports when you do have a system with 4 GB.
We already know it won't report the full 4 GB, because around 3.2 GB is the best you can get, period. I'm just wondering how much the 'triple' videocard configuration will affect the memory usage in a 32-bit OS.
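For anyone who wants to check this on their own box, a minimal C++ sketch using the Win32 GlobalMemoryStatusEx call, which reports the physical memory the running Windows instance actually sees (the same figure winver shows), plus the user-mode address space of the calling process:

    #include <windows.h>
    #include <cstdio>

    int main() {
        MEMORYSTATUSEX status = {};
        status.dwLength = sizeof(status);
        if (GlobalMemoryStatusEx(&status)) {
            printf("Physical memory visible to Windows: %llu KB\n",
                   status.ullTotalPhys / 1024);
            printf("Physical memory currently free:     %llu KB\n",
                   status.ullAvailPhys / 1024);
            // A 32-bit process tops out at 2 GB here (3 GB with /LARGEADDRESSAWARE
            // plus the /3GB boot switch), regardless of how much RAM is installed.
            printf("User-mode virtual address space:    %llu KB\n",
                   status.ullTotalVirtual / 1024);
        }
        return 0;
    }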
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Scali
Originally posted by: apoppin
"wasting a lot of memory"

letsee .. i just paid $39 after MiR for OCZ PC8500 and if i only use 80% ... how much am i really "wasting" ? :p
:confused:

Well, in Keysplayr's case he's put 4 GB into the system, but only 2 GB shows up. That's 50% wasted. Sure, money isn't that big a deal these days, but when you go out and buy 4 GB, isn't that because you actually want to be able to use it? It's like I'd buy two videocards for SLI, except my motherboard doesn't support SLI, so I'd always be able to use only one in games.

Originally posted by: apoppin
the ONLY reason one would want 64-bit Vista is for 64-bit applications or where more than 4GB of RAM is desirable. Vista64 has to run 32-bit applications in an emulation layer of sorts, and it is RAM-hungry. Most 32-bit games - 99.999% of all PC games - actually run a tiny bit faster in Vista32 than in Vista64.

That 'emulation layer of sorts' is that nice v86 mode that your processor is equipped with. In fact, because various parts of the kernel and drivers continue to run in 64-bit mode, there's 32-bit software which actually runs FASTER on a 64-bit OS.
Besides, the argument of "ram-hungry" doesn't really hold when the 32-bit OS can't address the memory in the first place. If you'd put 4 GB in a 32-bit system, at best you'd be able to use 3.2 GB, but with high-end/multiple videocards, that generally ends up much lower. So you're wasting nearly 1 GB to start with. The 64-bit OS can be memory-hungry all it wants; the difference is nowhere near that 1 GB, so you still have more efficient memory usage in a 64-bit OS.
Heck, my system has 6 GB, 3 GB of which never show up in a 32-bit OS in the first place. What do I care about memory usage in 64-bit? There is plenty of memory; I just need the OS to actually USE it. A 64-bit OS can. With a 32-bit OS I'm stuck at a measly 3 GB, no matter how much memory I actually put in my machine. 3 GB just doesn't cut it for me.

Originally posted by: apoppin
The people who argue like you also "feel" there is a difference in gaming - but NO ONE has been able to demonstrate any advantage of 64-bit over 32-bit [for 32-bit games] in a 4GB system RAM-equipped PC; not even with loading times.

Well, you can come over, then we'll go time load times of Half-Life 2, Far Cry 2, Crysis and Crysis:Warhead. The differences in some cases are actually VERY large.
For some reason my laptop also really likes 64-bit, even though it only has 2 GB. If I run the HL2:Lost Coast stress test on it in 32-bit, it swaps constantly, because it runs out of memory. I end up with a score of about 4 fps.
When I run the same test in 64-bit, the swapping is gone, and it can reach a whopping 11 fps.

Trust me, I'm not the "feel" type of guy. I'm a hardcore mathematician, scientist and academic.

really, Keysplayr is only seeing 2GB of 4GB system RAM?
:Q
Something is wrong. Most of my Vista 32 PCs had 3.3-3.5GB showing up as "used" .. actually the 'extra' 0.7 GB is not really "wasted" - as it is not needed in PC gaming :p

Again, 6GB RAM is *never* needed for playing a 32-bit game; there are *other* applications where it is quite useful. i differentiated between playing games and other uses - my own migration to 64-bit Vista was for other reasons.

where do you live? Maybe i can help you with your Lost Coast swapping issues - my own 32-bit Vista notebook {Athlon X2/GeForce 8200M/2GB RAM} does not have these issues, although it is not "fast" by any means. Far Cry runs better on 64-bit than on 32-bit, as you describe, but it was given a 64-bit pathway.

i am sorry but i do not really "trust" anyone that says "trust me"