Windows XP Pro

Philippine Mango

Diamond Member
Oct 29, 2004
5,594
0
0
Is it physically possible to run Windows XP on a system with 64MB of RAM, a PII 450MHz processor, and a 7200RPM drive, AND have it be as fast as that exact system running Windows 98? If not, it seems like this is a way to prove that a lot of XP is bloat, since you can run Windows 2000 on that exact system and it will be fast. If so, how would you do it? Obviously you would remove Windows Messenger, disable all the Windows XP crap and maybe shut down a few services, but what else can you do to make this work?
 

hopejr

Senior member
Nov 8, 2004
841
0
0
It may not run as quickly as Win98, but it will run much better (in terms of stability, etc.). Yes, XP is a load of bloat. Longhorn will be worse (although some of those bloated features are sweet as).
 

MrChad

Lifer
Aug 22, 2001
13,507
3
81
What is the point of your exercise? To prove that Windows XP uses more resources than 98? It does. Get over it.

In almost every case, new generations of software are optimized for newer generations of hardware. The system you described was state-of-the-art in 1998. Windows 98 runs optimally on such a system. Windows XP (released in 2001) should run optimally on systems released in 2001.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
It'd run, just be ungodly slow. You could improve its performance by a substantial margin by adding more RAM though. And Win2k would run fairly well on that system, although 128MB would be better.

My place of work likes to put WinXP on low-spec machines too. We've got a bunch of P2 400s with XP Pro installed. It takes a good couple of minutes to actually boot to a usable desktop.
 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
BTW... put some more RAM in that... christ, lol.

64MB in Win2K would be horrible... you might as well stay with 98.
 

ShaqDiesel

Member
Jan 30, 2003
101
0
0
No way XP would run as fast as 98 on that thing. It'd get close if you stuck a 256MB stick of RAM in there, but even then... ick.
 

Satyrist

Senior member
Dec 11, 2000
458
1
81
My guess is that these days 512MB is the comfortable minimum for running XP without compromising performance.

If you're a power user, gaming, or running multiple apps at once, though, 1GB might be better.

 

Philippine Mango

Diamond Member
Oct 29, 2004
5,594
0
0
But what I was asking is: is it possible to run XP quickly on that machine, or is there too much bloat to remove? I mean, I heard Windows 2K3 runs smoothly on a system like that, right? Shouldn't there be a way to make XP use just slightly more RAM than 2000, simply because of the extra drivers available? It's dumb for an operating system to require even more processing power, since it should use the least. I think somewhere between NT 4.0, 95, 98 and 2000 is the sweet spot for memory usage; average them out and I would think XP would, and should, use only slightly more. No registry tweaks or anything? What is so significant in XP that it requires a faster system, even assuming you disable a lot of the garbage? The increase in drive space usage I can most definitely see the need for, but not so much the RAM...

Also, if you observe Task Manager on a clean install with a lot of the garbage removed, you can still see which processes are taking up RAM. You'll essentially see the essential ones like explorer.exe, svchost.exe, etc. But if you add them up, it doesn't mathematically make sense. Looking at the total RAM usage of all the open processes and comparing that to the commit charge, it's just not adding up. Is there something hidden here? I'm looking at the kernel usage, but that still doesn't make sense since most of it is paged. Drivers? What is it? Anybody know of a program to reveal all of this?

Yes, we all know systems are getting faster, better and more useful, but if you notice, Microsoft is banking on that... They know people will have faster systems, so they just take up whatever is available, even if it's not important or efficient. I have a slight feeling that, since there is so much memory to spare, the MS code isn't as efficient as it used to be (using more instructions than needed for the same task). This reminds me of Kazaa, AOL, AIM and other inefficient programs like that. The only programs I can imagine being optimized or efficient are expensive ones (Premiere, Photoshop, etc.). Yes, they use a lot of memory, but it makes sense WHY they take so much: image processing! Or Premiere with video encoding, and it doesn't even eat up that much RAM, about 128-256MB at most!

 

Philippine Mango

Diamond Member
Oct 29, 2004
5,594
0
0
Oh, I just thought of a good example: consoles. At first the games are poorly optimized; since the hardware is so powerful for the time, developers just write whatever they want as long as it works. But as time goes on and they still want to appeal to people, they keep making the code more efficient because they are working within a fixed system. A good example is the GTA series: GTA III had relatively good graphics for its time, but then they made them EVEN better for Vice City on the same system with the same constraints! Same thing with San Andreas, except they had to sacrifice some texture quality, but it was still an improvement! But because the PC is always being upgraded, Microsoft doesn't have to make sure Windows runs speedily on an old system; their motto is essentially "tough luck". It's really a bunch of bull...
 

PingSpike

Lifer
Feb 25, 2004
21,766
615
126
Originally posted by: Philippine Mango

Yes, we all know systems are getting faster, better and more useful, but if you notice, Microsoft is banking on that... They know people will have faster systems, so they just take up whatever is available, even if it's not important or efficient. I have a slight feeling that, since there is so much memory to spare, the MS code isn't as efficient as it used to be (using more instructions than needed for the same task). This reminds me of Kazaa, AOL, AIM and other inefficient programs like that. The only programs I can imagine being optimized or efficient are expensive ones (Premiere, Photoshop, etc.). Yes, they use a lot of memory, but it makes sense WHY they take so much: image processing! Or Premiere with video encoding, and it doesn't even eat up that much RAM, about 128-256MB at most!

This is a problem with pretty much all software. Hardware is cheap. Software has gotten more complex and many developers are getting lazy.
 

PingSpike

Lifer
Feb 25, 2004
21,766
615
126
Originally posted by: Philippine Mango
Oh, I just thought of a good example: consoles. At first the games are poorly optimized; since the hardware is so powerful for the time, developers just write whatever they want as long as it works. But as time goes on and they still want to appeal to people, they keep making the code more efficient because they are working within a fixed system. A good example is the GTA series: GTA III had relatively good graphics for its time, but then they made them EVEN better for Vice City on the same system with the same constraints! Same thing with San Andreas, except they had to sacrifice some texture quality, but it was still an improvement! But because the PC is always being upgraded, Microsoft doesn't have to make sure Windows runs speedily on an old system; their motto is essentially "tough luck". It's really a bunch of bull...

Food for thought: I believe the Xbox has... 64MB of RAM, I think? It's essentially a PC under the hood, running a stripped-down version of Windows.
 

kylef

Golden Member
Jan 25, 2000
1,430
0
0
Originally posted by: Philippine Mango
But if you add them up, it doesn't mathematically make sense. Looking at the total RAM usage of all the open processes and comparing that to the commit charge, it's just not adding up. Is there something hidden here? I'm looking at the kernel usage, but that still doesn't make sense since most of it is paged. Drivers? What is it? Anybody know of a program to reveal all of this?
Memory management in modern operating systems is extremely complicated. To learn about Windows memory management, either read "Inside Windows 2000" by Solomon and Russinovich or search for previous posts on this subject. In a nutshell, what you see in Task Manager is a very brief summary of a very complicated issue. Even the term "RAM usage" is loaded with ambiguity when you start talking about virtual memory, processes, system cache, shared memory, etc.
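To give a rough idea of why those numbers never add up, here's a minimal sketch in Python using the third-party psutil module (my choice of tool purely for illustration, not something Task Manager itself exposes). It sums per-process working sets and private bytes and compares them against the RAM actually in use; shared DLL pages get counted once per process, while the system cache, pool memory and drivers belong to no process at all, so the totals disagree.

```python
# Minimal sketch, assuming the third-party psutil module is installed.
import psutil

total_rss = 0       # sum of per-process working sets (Task Manager's "Mem Usage")
total_private = 0   # sum of per-process private (committed) bytes

for proc in psutil.process_iter(attrs=["name", "memory_info"]):
    mem = proc.info["memory_info"]
    if mem is None:          # access denied for some system processes
        continue
    total_rss += mem.rss                          # includes shared DLL pages,
                                                  # so summing double-counts them
    total_private += getattr(mem, "private", 0)   # Windows-only field

vm = psutil.virtual_memory()
print(f"Sum of working sets:  {total_rss / 2**20:8.1f} MB")
print(f"Sum of private bytes: {total_private / 2**20:8.1f} MB")
print(f"RAM actually in use:  {vm.used / 2**20:8.1f} MB")
# The three figures rarely agree: shared pages are counted once per process,
# while the file cache, paged/nonpaged pool and driver memory are charged to
# no process at all. That is why the Task Manager columns never "add up".
```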

Yes, we all know systems are getting faster, better and more useful, but if you notice, Microsoft is banking on that... They know people will have faster systems, so they just take up whatever is available, even if it's not important or efficient.
This is not really true: it's more of a cost/benefit tradeoff. If the computer is really, really fast, then using some less-efficient algorithms might be OK because it allows you to implement something else too. Sometimes efficiency/performance is weighed against developer productivity. This is true of all software, not just Windows, and certainly not just Microsoft software.

And trust me, Windows experts are VERY aware of performance. There are entire teams that go around Windows looking for performance improvements.

I have a slight feeling that, since there is so much memory to spare, the MS code isn't as efficient as it used to be (using more instructions than needed for the same task). This reminds me of Kazaa, AOL, AIM and other inefficient programs like that.
This is simplistic, but basically true. Developers today are less concerned about conserving memory than they used to be. A few extra variables in your code might make things more readable and prevent a few bugs and only use up a few more bytes of memory. That's an acceptable tradeoff for most.
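As a made-up illustration of that tradeoff (hypothetical code, not from any real product), the two functions below compute the same thing; the second spends a couple of extra local variables, and a few extra bytes, to make the intent obvious:

```python
# Hypothetical example of trading a few bytes for readability.

def free_ram_terse(total, cached, used):
    # Compact, but the intent takes a second read to decode.
    return total - (used - cached)

def free_ram_readable(total, cached, used):
    # Two extra locals cost almost nothing and document the reasoning.
    reclaimable = cached            # cache can be dropped under memory pressure
    truly_used = used - reclaimable
    return total - truly_used

assert free_ram_terse(512, 64, 300) == free_ram_readable(512, 64, 300)
```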

The only programs I can imagine being optimized or efficient are expensive ones (Premiere, Photoshop, etc.). Yes, they use a lot of memory, but it makes sense WHY they take so much: image processing! Or Premiere with video encoding, and it doesn't even eat up that much RAM, about 128-256MB at most!
Trust me: there is no correlation between the cost of a program and how efficiently it uses RAM.

And besides, "less RAM usage" rarely means better performance. Often, performance and RAM usage are directly proportional: some of the fastest data structures use more RAM than their slower, simpler counterparts. From hash tables to B-trees, the extra overhead sacrifices storage efficiency for maximum performance.
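Here's a minimal Python sketch of that tradeoff (illustrative only, no benchmarking claims): a dict, i.e. a hash table, spends extra RAM on sparse buckets to get average O(1) lookups, while a plain sorted list is more compact but has to binary-search.

```python
# Illustrative sketch: memory vs. lookup speed for two containers.
import bisect
import sys
import timeit

keys = list(range(100_000))
as_list = sorted(keys)                # compact: one contiguous array of references
as_dict = {k: None for k in keys}     # hash table: sparse buckets, noticeably more RAM

print("list container bytes:", sys.getsizeof(as_list))
print("dict container bytes:", sys.getsizeof(as_dict))

def list_lookup():
    i = bisect.bisect_left(as_list, 76_543)   # O(log n) binary search
    return as_list[i] == 76_543

def dict_lookup():
    return 76_543 in as_dict                  # O(1) on average

print("list lookup time:", timeit.timeit(list_lookup, number=100_000))
print("dict lookup time:", timeit.timeit(dict_lookup, number=100_000))
```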

And that's not even mentioning caches, where higher RAM usage almost always means more fully populated caches, offering big performance improvements over smaller ones.
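A tiny, purely illustrative sketch of that point: spend some RAM remembering results and repeated work becomes nearly free.

```python
# Illustrative sketch: a memoization cache trades RAM for speed.
from functools import lru_cache
import time

@lru_cache(maxsize=None)       # unbounded cache: more RAM, fewer recomputations
def slow_square(n):
    time.sleep(0.01)           # stand-in for expensive work (disk, math, network)
    return n * n

start = time.perf_counter()
results = [slow_square(n % 10) for n in range(100)]         # only 10 distinct inputs
print("elapsed with cache:", time.perf_counter() - start)   # ~0.1 s: 10 misses, 90 hits
```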