I am just wondering: back in the old days (1989/1990) we could prove that sometimes it HURT performance to put more RAM in your system. The reason is that DRAM has to be refreshed periodically or it loses its contents, and every refresh cycle steals bus time from the CPU. If you put 16 MB in your machine back then, there was no software to make use of it, but you still paid the cost of refreshing it. You could actually watch Norton SI report a lower speed rating for the machine after adding more RAM.

We even used to reprogram the countdown timer that paced DRAM refresh (channel 1 of the 8253/8254 timer chip, which triggered refresh via DMA channel 0 on those machines) so RAM was not refreshed as often, and SI would report the machine as faster after we increased the count and stretched out the refresh interval.

So, wouldn't this still hold true? You are spending more time refreshing extra RAM that possibly isn't buying you any performance, or even being used, if you are running a 9x product. I am behind the times, so I am open to bashing if I am way off base. Let me know. Thanks!
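For anyone curious, here's roughly what the trick looked like. A minimal sketch for a DOS-era compiler (assuming Borland/Turbo C's outportb from dos.h): write a control word to the PIT at port 0x43 selecting channel 1, then load a new count at port 0x41. The count of 36 below is illustrative; the BIOS default was 18 (about 15 microseconds at 1.19 MHz), so doubling it halves how often refresh steals the bus. Push the count too high and DRAM cells decay before they are rewritten, so this was always a gamble.

```c
/* Sketch: stretch the DRAM refresh interval on a PC/AT-class
 * machine by reprogramming PIT channel 1, which paces refresh.
 * Assumes a DOS-era Borland/Turbo C environment (outportb). */
#include <dos.h>

void set_refresh_count(unsigned char count)
{
    /* Control word 0x54: channel 1, low-byte-only access,
     * mode 2 (rate generator), binary counting. */
    outportb(0x43, 0x54);
    outportb(0x41, count);  /* new countdown value for channel 1 */
}

int main(void)
{
    /* BIOS default is 18; 36 roughly doubles the time between
     * refresh requests, freeing bus cycles for the CPU at the
     * risk of losing RAM contents if the DRAM can't hold on. */
    set_refresh_count(36);
    return 0;
}
```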