Originally posted by: spyordie007
And of course it would be even better to just buy more RAM so that you didn't need to make frequent trips to the paging file... :roll:

Originally posted by: Nothinman
Norton's defrag is cool. It puts the swapfile at the head of the disk as one contiguous file so it doesn't get fragmented, and other files don't fragment around it.
That's counter-productive. The biggest performance killer with regard to disks is seek time. If you move the pagefile to the front of the disk while all of the files are in the middle, you cause extra seeks every time the pagefile must be accessed. Ideally you want the pagefile in the middle of your filesystem, because you want to keep the read heads in the same general area so that seek time is minimized.
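To put rough numbers on the seek argument, here's a toy model in Python, assuming seek cost is proportional to head travel and that file accesses cluster around the middle of the disk. All the numbers are invented for illustration; real drives are more complicated than this.

```python
# Toy model: seek cost ~ distance between the data the head was just
# reading and the pagefile. Positions are fractions of the disk
# (0.0 = front/outer edge, 1.0 = end). Assumptions, not measurements.
import random

random.seed(42)
# File accesses clustered mid-disk (clamped to the platter).
file_positions = [min(max(random.gauss(0.5, 0.1), 0.0), 1.0)
                  for _ in range(10_000)]

def avg_seek(pagefile_pos):
    """Average head travel between a random file access and the pagefile."""
    return sum(abs(p - pagefile_pos) for p in file_positions) / len(file_positions)

print(f"pagefile at front:  avg seek distance {avg_seek(0.0):.3f}")
print(f"pagefile mid-disk:  avg seek distance {avg_seek(0.5):.3f}")
```

Under these made-up assumptions the front-of-disk pagefile costs roughly five times the head travel of a mid-disk one, which is the gist of the point above.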
Defragging shouldn't affect stability at all; anyone who states this is just spreading FUD.

I'm sorry, but *stability* improvements from defragging?
Pardon me for asking, but that is a pretty hefty claim! Do you have anything to back it up? Performance problems stemming from a heavily fragmented pagefile on a memory-starved system -- I can buy that, but *stability* problems?

This is more of an application issue than an OS issue. Besides, if you're getting a highly fragmented page file, it means you should be starting with it larger so it isn't constantly expanding into additional drive area (or, better yet, adding RAM).

I have seen this. Older (Dell 6350? 4U server) running 2k3 Server and Exchange 2k3. The mail store dismounted and would not remount. My co-worker knows Exchange very well, and bet me lunch it was a fragmented PF. 1200-ish fragments on the page file. A quick defrag and a trip to the Chinese place for lunch, and it was back online.
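For what it's worth, the "start with it larger" advice boils down to setting the pagefile's initial size equal to its maximum, so it never has to grow (and fragment) on the fly. On NT-family Windows that configuration lives in the registry; a minimal read-only sketch in Python, assuming you're running it on the box in question:

```python
# Read the current pagefile configuration (Windows only).
# Each entry is "path initial_MB maximum_MB"; setting initial == maximum
# gives a fixed-size pagefile that never has to expand mid-flight.
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management",
)
paging_files, _ = winreg.QueryValueEx(key, "PagingFiles")
for entry in paging_files:  # REG_MULTI_SZ comes back as a list of strings
    print(entry)            # e.g. "C:\pagefile.sys 2048 2048"
winreg.CloseKey(key)
```

Changing the value (or doing the same thing through System Properties) takes effect after a reboot, which is also when a freshly re-added pagefile gets laid down contiguously.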
Originally posted by: Nothinman
Linux doesn't really get fragmented, so yes, there is not really a point in defragmenting it
Sure it does, it's just that Linux is smart and the fragmentation has very little effect on performance in day-to-day use.

Well, very little actually. As far as I understand it, NTFS shoves any new program, etc. right at the front of the drive, and if there is not enough space there for the whole thing, it just takes the next available space after that, splitting -- fragmenting -- the file. Whereas with Linux, the filesystem puts the file wherever the entire thing can fit, and if there isn't a gap that big, it takes the next biggest one. So if you have an 80 GB (or whatever) HDD and 60 GB is taken, not very much is likely to be fragmented. Or if you have a relatively new one, you won't be likely to have to defragment within the first year and a half, I would guess.
I could be wrong, but that's how I understand it. That's why Windows needs regular defrag.
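A crude way to see why the whole-extent strategy described above fragments less is a toy free-list model: first-fit-and-split versus find-a-gap-that-fits. This is a caricature for illustration, not how either filesystem literally allocates:

```python
# Toy allocator: compare "grab the first free run and split the file"
# against "find a free run big enough to hold the whole file".

def first_fit_split(gaps, size):
    """NTFS caricature: fill free runs front-to-back, splitting the file."""
    fragments = 0
    for i, gap in enumerate(gaps):
        if size <= 0:
            break
        used = min(gap, size)
        gaps[i] -= used
        size -= used
        fragments += 1
    return fragments

def whole_extent(gaps, size):
    """Linux caricature: prefer a free run that holds the entire file."""
    for i, gap in enumerate(gaps):
        if gap >= size:
            gaps[i] -= size
            return 1                       # one contiguous extent
    return first_fit_split(gaps, size)     # fall back when nothing fits

gaps_a = [40, 10, 200, 30, 500]            # free runs, in clusters
gaps_b = list(gaps_a)
print("first-fit fragments:   ", first_fit_split(gaps_a, 120))  # -> 3
print("whole-extent fragments:", whole_extent(gaps_b, 120))     # -> 1
```

Same 120-cluster file, same free space; the allocation policy alone decides whether it lands in three pieces or one.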
Originally posted by: OptyMyStix
No doubt regular defrag is recommended, but you shouldn't neglect the importance of these Tips & Tricks of Computer Maintenance.
OptyMyStix
5) Delete cache files to increase download speed
Regularly check Internet Explorer's history and cache files. Delete the cache files and clear the history. Deleting the cache files helps speed up downloads, as there is more space available to store the temporary files.
7) Delete old .zip files
After unzipping files, users often leave the zipped file on their computer. If you don't need it, delete it. You can save it on a CD-R or CD-RW disc.
Originally posted by: BikeDude
fragmenting -- the file. Whereas with Linux, the filesystem puts that program wherever you can fit the entire thing, and if there isn't enough space, it takes the next
That sounds very impressive, especially considering that the OS has no knowledge of just how big that file is going to get. I.e., if your app starts writing a stream of data, how can the OS guess whether you're going to write 100 KB or 1 GB? A bad OS might cache the entire write operation until the application stops writing, but only a really bad OS would do that. (What if there's a power failure? What if the user has less than several GBs' worth of memory?)
Another approach would be to always start writing in the biggest free block available. But that too leads to fragmentation. It could defrag on the fly, but then it would need to defrag even temporary files... Sounds like a magic bullet to me.
NTFS does come with some overhead, though. There's the journaling thing, but all filesystems these days have journaling, so it isn't a unique feature. (I trust the Linux filesystems you guys are using support journaling too.) Then there's Windows' 8.3 mechanism (which can be disabled, btw). By default every filename has an associated 8.3 short filename. That can hamper performance quite a bit in directories containing lots of long filenames.
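If you want to check the 8.3 short-name setting mentioned above, `fsutil` exposes it. A quick sketch, assuming an XP/2003-era box with fsutil on the PATH; the "set" variant needs admin rights and only affects newly created files:

```python
# Query Windows 8.3 short-name generation: 0 = enabled, 1 = disabled.
# Disabling it is "fsutil behavior set disable8dot3 1" (requires admin;
# existing short names stick around until the files are recreated).
import subprocess

result = subprocess.run(
    ["fsutil", "behavior", "query", "disable8dot3"],
    capture_output=True, text=True,
)
print(result.stdout.strip())
```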
Originally posted by: OptyMyStix
My Dear n0cmonkey,
The cache is also known as the Temporary Internet Files. The Temporary Internet Files (or cache) folder contains Web page content that is stored on your hard disk for quick viewing. This cache permits Internet Explorer or MSN Explorer to download only the content that has changed since you last viewed a Web page, instead of downloading all the content for a page every time it is displayed.
The Temporary Internet Files (or cache) can use more disk space than specified, hence you must delete the cache files regularly to speed up the download. If you don't believe me, my dear, you may refer to the following sites for more information that justifies my statement.
1) http://support.microsoft.com/default.aspx?scid=kb;en-us;260897
AND
2) http://support.microsoft.com/kb/301057/EN-US/
N.B.: Both of the above links are to Microsoft's official website.
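For anyone curious how big that folder has actually grown versus what the IE settings claim, here is a rough sketch that tallies it, assuming the Windows XP profile layout (the path differs on other Windows versions):

```python
# Tally the on-disk size of the IE cache for the current user.
# Path assumes the Windows XP profile layout; hidden index.dat files
# and per-user subfolders are included in the walk.
import os

cache = os.path.expandvars(
    r"%USERPROFILE%\Local Settings\Temporary Internet Files"
)
total = 0
for root, _dirs, files in os.walk(cache):
    for name in files:
        try:
            total += os.path.getsize(os.path.join(root, name))
        except OSError:
            pass  # file vanished or access denied; skip it

print(f"Cache is using {total / 2**20:.1f} MiB on disk")
```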
Now, regarding deleting .zip files: as you said, "Hard drive space is cheap." But it's not as cheap as you think, my dear, because many people still use old PCs with hardly 2 GB or 4 GB of disk space. And remember, nothing is cheap in this world; everything has its own value. You must have the ability to evaluate it.
Keep writing....
Thank you.
OptyMyStix.
Originally posted by: spyordie007
This article is not stating that clearing your cache will speed up your downloads, nor do any of the others you have posted. All they indicate is that, generally, pages will load faster out of cache than by downloading them again.

Originally posted by: OptyMyStix
My Dear n0cmonkey,
If you are still not enlightened, please refer to yet another of Microsoft's official weblinks to clear your doubt. I'm ready to tour the entire Microsoft website with you to satiate your doubt.
http://support.microsoft.com/kb/263070/EN-US/
Regards,
OptyMyStix
I'll chime in here and give my experience: I have seen a simple defrag *prevent* errors, so it increased stability. This was due to a specific cause; I'm not saying that running defrag will increase stability in general.

Originally posted by: BikeDude
Originally posted by: Harvey
But over 90% of the world's desktops are running Windows, and the performance and stability improvements aren't just cosmetic.
I'm sorry, but *stability* improvements from defragging?
Pardon me for asking, but that is a pretty hefty claim! Do you have anything to back it up? Performance problems stemming from a heavily fragmented pagefile on a memory-starved system -- I can buy that, but *stability* problems?
Originally posted by: corkyg
Originally posted by: Harvey
I defrag my drive daily. More often if I make serious changes such as installing a new program.
Defragging often can cut the time it takes from hours to minutes, and it definitely cuts the chances of system errors.
Same here - it just takes a few minutes each night before shutting down, and it keeps things in good shape. I have done this for the past 19 years, and it has become a routine. Maybe that's why I only have to do a clean install when I get a new system or mobo?
The best I have found to do the job quickly, with a good, understandable GUI/metaphor, is PerfectDisk by Raxco (Canadian).
Originally posted by: Phoenix86
The systems I was supporting a few years back (NT4) had horribly low memory, and thus used the page file quite a bit. Often, over time, the PF would become very fragmented and the systems would generate errors. Defragging the HDD would stop these errors. Eventually we worked out that removing the PF and re-adding it on a clean partition would solve it as well.
Originally posted by: TechnoPro
I discovered that, spread across the 4 user accounts, there were more than half a million (500,000+) Temporary Internet Files. They had simply never been removed since the OS was reloaded, whenever that was.