Originally posted by: Philippine Mango
But if you add them up it doesn't mathematically make sense. Looking at the total number of open processes with their RAM usage and comparing that to the commit charge, it just doesn't add up. Is there something hidden here? I'm looking at the kernel usage, but that still doesn't make sense since most of it is paged. Drivers? What is it? Does anybody know of a program to reveal all of this?
Memory management in modern operating systems is extremely complicated. To learn about Windows memory management, either read "Inside Windows 2000" by Solomon and Russinovich or search for previous posts on this subject. In a nutshell, what you see in Task Manager is a very brief summary of a very complicated issue. Even the term "RAM usage" is loaded with ambiguity when you start talking about virtual memory, processes, system cache, shared memory, etc.
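For what it's worth, the raw counters Task Manager summarizes are exposed through the Win32 API. Here's a minimal sketch (assuming a Windows build environment, linked against psapi.lib) that dumps commit charge, kernel paged/nonpaged pool, and system cache; it's illustrative, not a full accounting tool:

```cpp
// Minimal sketch: query the same counters Task Manager summarizes.
// Assumes Windows with <psapi.h>; link against psapi.lib.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX ms = { sizeof(ms) };
    GlobalMemoryStatusEx(&ms);            // physical memory and page-file totals

    PERFORMANCE_INFORMATION pi = { sizeof(pi) };
    GetPerformanceInfo(&pi, sizeof(pi));  // commit charge, kernel pools, system cache

    const SIZE_T page = pi.PageSize;      // counters below are in pages
    printf("Physical memory: %llu MB total, %llu MB available\n",
           ms.ullTotalPhys >> 20, ms.ullAvailPhys >> 20);
    printf("Commit charge:   %llu MB of %llu MB limit\n",
           (unsigned long long)(pi.CommitTotal * page) >> 20,
           (unsigned long long)(pi.CommitLimit * page) >> 20);
    printf("Kernel pool:     %llu MB paged, %llu MB nonpaged\n",
           (unsigned long long)(pi.KernelPaged * page) >> 20,
           (unsigned long long)(pi.KernelNonpaged * page) >> 20);
    printf("System cache:    %llu MB\n",
           (unsigned long long)(pi.SystemCache * page) >> 20);
    return 0;
}
```

Even with all of these numbers in hand, they won't sum neatly to what the per-process list shows, because of shared memory, the system cache, and memory that's committed but not resident.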
Yes, we all know systems are getting faster, better, and more useful, but if you notice, Microsoft is playing on that... They know people will have faster systems, so they just take up what's available even if it's not important or efficient.
This is not really true: it's more of a cost/benefit tradeoff. If the computer is really, really fast, then using some less-efficient algorithms might be OK because it frees you up to implement something else too. Sometimes efficiency/performance is weighed against developer productivity. This is true of all software, not just Windows, and certainly not just Microsoft software.
And trust me, Windows experts are VERY aware of performance. There are entire teams that comb through Windows looking for performance improvements.
I have a slight feeling that, since there is so much memory to spare, the MS code isn't as efficient as it used to be (using more commands than needed for the same process). This reminds me of Kazaa, AOL, AIM, and other inefficient programs like that.
This is simplistic, but basically true. Developers today are less concerned about conserving memory than they used to be. A few extra variables in your code might make things more readable and prevent a few bugs and only use up a few more bytes of memory. That's an acceptable tradeoff for most.
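As a purely illustrative sketch (the function and variable names here are made up), compare a dense expression with a version that spends a few extra locals on readability:

```cpp
// Hypothetical example: the extra named locals cost at most a few bytes of
// stack (often nothing once the optimizer is done), but the intent is much
// easier to read and to debug.

// Dense version:
double total_dense(double price, double qty, double tax, double discount) {
    return price * qty * (1.0 + tax) * (1.0 - discount);
}

// Readable version with a few extra variables:
double total_readable(double price, double qty, double tax, double discount) {
    double subtotal       = price * qty;
    double after_tax      = subtotal * (1.0 + tax);
    double after_discount = after_tax * (1.0 - discount);
    return after_discount;
}
```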
The only programs I can imagine being optimized or efficient would be the expensive ones (Premiere, Photoshop, etc.). Yes, they use a lot of memory, but it all makes sense WHY they take so much... image processing! Or Premiere with video encoding, and it doesn't even eat up that much RAM, about 128-256MB at most!
Trust me: there is no correlation between the cost of a program and how efficiently it uses RAM.
And besides, "less RAM usage" is rarely more efficient from a performance standpoint. Often, performance improvement and RAM usage are directly proportional. Some of the fastest data structures commonly use more RAM than their slower, simpler counterparts. From hash tables to b-trees, the extra overhead sacrifices storage efficiency for maximum performance in the algorithmic world.
And that's not even mentioning caches, where higher RAM usage almost always means more fully populated caches, which offer big performance improvements over smaller ones.
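To make the cache point concrete, here's a tiny sketch (illustrative only; the "expensive" computation is made up): memoizing results trades RAM for speed, and the bigger the cache grows, the more often the expensive work gets skipped.

```cpp
#include <cstdint>
#include <unordered_map>

// Hypothetical "expensive" computation we'd rather not repeat.
uint64_t expensive_compute(uint64_t n) {
    uint64_t acc = 0;
    for (uint64_t i = 0; i < 10'000'000; ++i)
        acc += (n * i) % 1'000'003;
    return acc;
}

// The cache grows with every distinct input seen (more RAM),
// but repeated queries become a single hash lookup (more speed).
uint64_t cached_compute(uint64_t n) {
    static std::unordered_map<uint64_t, uint64_t> cache;
    auto it = cache.find(n);
    if (it != cache.end())
        return it->second;      // cache hit: no recomputation
    uint64_t result = expensive_compute(n);
    cache[n] = result;          // spend memory to save future time
    return result;
}
```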