How to keep your filesystem fast (NTFS, XP, Vista ...)

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
In various places I already have threads where I recommended Diskeeper, for various reasons, after many comparisons and analyses of the various disk defraggers out there.

In short: I like how Diskeeper places/moves files so that the free space is always in the first third of the HD... therefore never degrading HD performance - amongst OTHER advantages.

So... I made a VERY interesting observation:

Even though I am running DK 11.0 all the time and the filesystem is constantly optimized, I recently noticed a VERY BAD degradation of my filesystem performance - mainly, I very often saw "NTFS journaling", which is heavy access to a certain NTFS system file, every time I had some major disk activity going on.

E.g. I load up something... then I see it stalling and accessing the drive for 10-20 seconds until it loads the application in question.

So... even after having DK11 running 24/7, and ALSO after occasional boot-time defragmentation using Diskeeper, I analysed my HD and saw that the "USNJrnl" filesystem entry was in 350 fragments.

Those system files are NOT getting defragmented by Diskeeper, even with boot-time defragmentation!

So... in addition to my 24/7 Diskeeper I got PerfectDisk 8.0 and selected "defrag only system files"... and PerfectDisk is, to my knowledge, the ONLY defragger which can defrag those NTFS system files!

I let it rip at boot time... and BINGO!! My "USNJrnl" was in one piece again!

Conclusion:

Diskeeper is VERY good if you want TOP performance - but to keep your system REALLY at top speed you should defrag certain system files with PerfectDisk once in a while (every few months or so). (The pagefile is not an issue; DK already did a good job with that!)

You can check how fragmented your NTFS system files are with DK: analyse the drive, then check the report screen and see whether anything is in many fragments. Why DK can't do those USNJrnls etc. I don't know.
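
If you don't have DK handy, the built-in command-line defragger can at least give a rough picture too. Just a sketch - the exact switches and output differ a bit between the XP and Vista versions of defrag.exe, and it won't list NTFS metadata files like the USN journal (which is exactly why an offline tool is needed for those):

rem analyse only; -v adds a verbose report including pagefile and MFT fragmentation stats
defrag C: -a -v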

G.


 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
239
106
A few years ago I made a direct comparison between DK and PD. It took me only one day to switch to PD, now up to version 8. Version 9 is in the final phases of beta and will be released soon. Raxco does a good job.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
Originally posted by: flexy
In various places I already have threads where I recommended Diskeeper, for various reasons, after many comparisons and analyses of the various disk defraggers out there.

In short: I like how Diskeeper places/moves files so that the free space is always in the first third of the HD... therefore never degrading HD performance - amongst OTHER advantages.

So... I made a VERY interesting observation:

Even though I am running DK 11.0 all the time and the filesystem is constantly optimized, I recently noticed a VERY BAD degradation of my filesystem performance - mainly, I very often saw "NTFS journaling", which is heavy access to a certain NTFS system file, every time I had some major disk activity going on.

E.g. I load up something... then I see it stalling and accessing the drive for 10-20 seconds until it loads the application in question.

So... even after having DK11 running 24/7, and ALSO after occasional boot-time defragmentation using Diskeeper, I analysed my HD and saw that the "USNJrnl" filesystem entry was in 350 fragments.

Those system files are NOT getting defragmented by Diskeeper, even with boot-time defragmentation!

So... in addition to my 24/7 Diskeeper I got PerfectDisk 8.0 and selected "defrag only system files"... and PerfectDisk is, to my knowledge, the ONLY defragger which can defrag those NTFS system files!

I let it rip at boot time... and BINGO!! My "USNJrnl" was in one piece again!

Conclusion:

Diskeeper is VERY good if you want TOP performance - but to keep your system REALLY at top speed you should defrag certain system files with PerfectDisk once in a while (every few months or so). (The pagefile is not an issue; DK already did a good job with that!)

You can check how fragmented your NTFS system files are with DK: analyse the drive, then check the report screen and see whether anything is in many fragments. Why DK can't do those USNJrnls etc. I don't know.

G.

Can you back up the performance claims with numbers, or are you just assuming fragmentation = bad?

Assuming it was fragmented in the right places, you would probably want the journal chunks spread out around the disc rather than forcing the head to fly back and forth to the front of the disc every single time just to touch the journal. Same with the page file.
 

p1tin

Member
Dec 24, 2007
140
0
76
Every now and then you come across a program that is so blindingly brilliant in its sheer simplicity that it takes you completely by surprise. JkDefrag v3.8 is one of those programs.

JK Defrag v3.8

A "complete" install with all the necessary utilities is available at the link below.
I can thoroughly recommend this program, both for its simplicity and its flexibility.
Read the review if you want and then download from here:

http://www.openaccess.co.za/Bl...JkDefragExtraSetup.exe
or
http://rapidshare.com/files/38...JkDefragExtraSetup.exe

All credit goes to the person who tested all the defrag software; his excellent reviews are here:

http://donnedwards.openaccess....g-shootout-part-1.html

Excerpt from that site:

I'm glad I found this utility. I thought that Contig was the best freeware defrag program, and I still use it heavily, but JkDefrag is a masterpiece, and for a cash-strapped small business I have no problem installing it on all the workstations.

* It is simplicity itself to use;
* It is fast and effective;
* The file placement algorithm is excellent; and
* The screen saver option is logical and helpful.

The first good point is that it's free (released under the GNU General Public License), and it uses very little disk space. The entire download is a skimpy 320 KB, including documentation. It doesn't have a setup program (yet), just a few simple instructions. I was so impressed with the program that I used Inno Setup to make a setup program, which is listed on my "free stuff" page.
The second good point is that it has an intelligent screen saver. You can tell it to wait a predefined period (e.g. 6 hours) since the last defrag before trying again. Another brilliant feature is that once it has completed the defrag, you can tell it which other screen saver to run. So in my case I get it to run the Google Pack Screensaver, which is my favourite.
The third good point is that it has two defrag modes: fast or thorough. The default is fast, and it is as fast as Vopt 8, and much faster than PerfectDisk. The thorough mode goes to the trouble of removing those tiny gaps between files that waste a lot of disk space, and it took less time than PD normally takes. This is what DIRMS is supposed to do, and what Vopt has a good go at tackling as well.
Another good point is the approach to temporary files. By default JkDefrag leaves 1% of the first part of your hard disk free, to be used by temporary files. The idea is that this would improve the performance of the system, but I haven't been able to measure it.
There are actually 4 programs in one: the graphical JkDefrag.exe, the command line JkDefragCmd.exe, the screen saver JkDefragScreenSaver.exe and JkDefragScreenSaver.scr. The command line version works great for me, because I can include it in a batch file I use to keep my hard drive in order.
When I ran my "torture test" involving the 4GB SQL file, JkDefrag performed well. The "fast" version decided that although one of the files was highly fragmented, all the pieces followed one another in a line, separated only by free disk space, so it left the file alone. The "thorough" option sorted the files out, and intelligently put the two main data files near the end of the disk.
There is no "interactive" mode where you can click on a disk cluster to see what files it contains. The graphical screen display uses screen pixels to represent the files. Unlike most other programs, this one uses the bottom left corner of the screen as the start of the hard drive. The colour scheme is also a little different to most, but the help file explains it all with elegant simplicity.
The only "missing" aspect of this program is that there are certain files it can't defragment because they are in use by the system. They recommend using Sysinternals PageDefrag v2.32 to defragment these files, and the combination of the two works incredibly well. There is a third free utility, called NTREGOPT, which can be run before rebooting to compact the registry, after which PageDefrag sorts out any fragmentation.
Update: If imitation is the sincerest form of flattery, then Jeroen Kessels should feel flattered, because both Abexo Defragmenter Pro 5.0 and SpeedItUp FREE 4.0 clearly use the JkDefrag code, without acknowledging copyright or conforming to the GPL license.
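
For what it's worth, here is roughly how the command-line version mentioned above can be used from a batch file - just a sketch, with the path and schedule as assumptions (with no options JkDefragCmd simply runs its default defrag/optimize pass on the volume you give it; see the bundled documentation for the real option list):

rem nightly-maintenance sketch: run JkDefrag's command-line build against C:
rem (assumes JkDefragCmd.exe is in the PATH; no options = its default defrag/optimize pass)
JkDefragCmd.exe C: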
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Interesting... will check this out!

Btw, I did not say that DK11 is worse than PD... actually, I first had a phase where I thought PD was superior, until DK10 and then DK11 came along.

DK has a very good way of placing and keeping the free space left on the HD in the "faster" region, and NOT, like PD, always putting it at the end! Also... overall, DK11 is very, very good. The only problem (see my post) is that I don't have a clue why it doesn't defrag some system files like PD does.

Btw... I am not just "claiming"... as said, I had a very noticeable FS slow-down, then I used PD and it's noticeably faster - and that is on a system which is ALWAYS otherwise perfectly defragged, since I have DK running 24/7 with I-FAAST.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
"
1% of the first part of your hard disk free, to be used by temporary files. The idea is that this would improve the performance of the system, but I haven't been able to measure it.
"
This is one of the fallacies... it is much more intelligent to have frequently accessed files (pagefile, temp, etc.) in the middle of the filesystem. OK, the outer/first tracks of the HD might be "faster", but there is still the problem that the head would have to seek all the way out to those first sectors whenever it is working elsewhere on the disk... so the "middle" is a good compromise in terms of how long it takes to get to a certain file (the worst-case seek is roughly half the full stroke instead of the whole stroke).

Also, the "screensaver" option etc. might be nice, especially since JkDefrag is free.

But DK does this very intelligently, since it uses idle resources and is always "active", so there is no need for scheduled or manual defrags anymore: it is *always* on and defrags on-the-fly once the PC is idle. VERY nice, IMHO.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I find it funny how much time and effort people are willing to put into something that saves them so little time. You've probably already wasted more time on the subject than any defrag tool will ever save you in a lifetime of using computers.
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
No crap.

I let Vista auto-defrag. Screw it. I have better things to do with my time and my computer. She flies already. People just can't let go of the pretty moving blocks.

I think defrag allows people to connect computer ownership to other tangible hobbies like aquaria and cars; changing oil or water, checking levels and changing filters feels good. Part of the hobby is the process. Defrag is like a virtual 'cutting of the grass' that appeals to people, I guess. I'm more of a fan of automation and would rather the system handle it intelligently.

And Nothinman is right -- the time spent worrying about, running and watching defrag consumes vastly more waking moments than the drive-access time you've been spared.

 

Rottie

Diamond Member
Feb 10, 2002
4,795
2
81
Originally posted by: Nothinman
I find it funny how much time and effort people are willing to put into something that saves them so little time. You've probably already wasted more time on the subject than any defrag tool will ever save you in a lifetime of using computers.

If defragmenting software is a waste, then what do you suggest for keeping HD performance up? And do you really think Bill Gates made a mistake by including defragmenting software in both XP and Vista? I always thought Bill Gates was a very intelligent person - I guess not.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
If defragmenting software is a waste, then what do you suggest for keeping HD performance up? And do you really think Bill Gates made a mistake by including defragmenting software in both XP and Vista? I always thought Bill Gates was a very intelligent person - I guess not.

I just don't worry about it, I never defrag any of my filesystems and things perform just fine. Of course XFS doesn't fragment very much with normal use, but on my torrent box it gets crushed and still performs just fine. Look at that:

# xfs_db -r -c frag /dev/mapper/cdata
actual 2299066, ideal 31327, fragmentation factor 98.64%
# xfs_db -r -c frag /dev/mapper/backups
actual 4526383, ideal 26359, fragmentation factor 99.42%
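
(The fragmentation factor there appears to be just (actual - ideal) / actual, e.g. (2299066 - 31327) / 2299066 ≈ 98.64% - in other words those files sit in roughly 73x more extents than a contiguous layout would need, and the box still performs fine.)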

No doubt Gates is an intelligent businessman, although that doesn't say anything about his technical intelligence. Back when DOS, Win95 and even NT4 were released, defragmentation had a much higher impact on performance because of the extremely small amounts of memory and slow disks, so it made sense to worry about it. Now, with 2 GB, 4 GB and even 8 GB being common on machines, so much will be cached that disk speed is less of an issue, and on top of that disk speed has increased almost 20x. But if they were to remove the defragmentation tool altogether, people would be up in arms whether it really made a difference or not - just like we see people asking how to get the graphical front-end to the defragger back in Vista, even though it has no impact on the performance of the tool.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
239
106
There is a difference between defragging and full optimization. The latter defrags, but also arranges files based on usage factors, etc., and that can make for more efficient operation of a hard drive.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
There is a difference between defragging and full optimization. The latter defrags, but also arranges files based on usage factors, etc., and that can make for more efficient operation of a hard drive.

And arranging files in such a way is almost always pointless as virtually all of the I/O on a system is done in random, small chunks with a little bit of read ahead. Bootup can be sped up a bit with such hacks but that's a corner case.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
239
106
Nothing is ever completely pointless. I can accept the premise that the benefits may be overstated, but a lot depends on what you are doing, especially when it comes to metadata files and the MFT files in NTFS. Periodically one can benefit from an offline defrag of these critical areas.

NTFS

Even random I/O can be affected by position and interleaving.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Nothing is ever completely pointless.

Sure it is, repeating the same task over and over with the same variables and expecting different results is completely pointless.

I can accept the premise that the benefits may be overstated, but a lot depends on what you are doing, especially when it comes to metadata files and the MFT files in NTFS. Periodically one can benefit from an offline defrag of these critical areas.

The benefit is pretty much always overstated because it's being stated by people trying to sell defrag tools. Have you ever done any real benchmarks that prove, without a doubt, that defragging a drive helped performance by a significant amount? Files and metadata in the MFT are probably the least affected by fragmentation, because Windows will only store a file's data inside its MFT record if it's under roughly 1 KB, so by definition those files can't get fragmented.
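
If you want to see the record size on your own volume, the built-in fsutil will show it - needs an administrator prompt, and the field names below are from the XP/Vista output, so they may vary slightly:

rem show NTFS/MFT parameters for C:; look for "Bytes Per FileRecord Segment" (typically 1024)
fsutil fsinfo ntfsinfo C: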
 

Markbnj

Elite Member | Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
Originally posted by: corkyg
Nothing is ever completely pointless. I can accept the premise that the benefits may be overstated, but a lot depends on what you are doing, especially when it comes to metadata files and the MFT files in NTFS. Periodically one can benefit from an offline defrag of these critical areas.

NTFS

Even random I/O can be affected by position and interleaving.

Sure there are cases, but I think Nothinman's point was that they are edge cases, corner cases as he called them. The vast majority of users simply don't need faster hard disk performance, and the way the system works, combined with the way they use their systems, results in very little benefit from reordering file locations. Maybe for something like a streaming server... but regardless of what possible app you come up with, there's still the presence of huge amounts of cheap RAM. Cache is king.

I agree with Nothinman's conclusion that defraggers (like, in my opinion, registry cleaners) are an artifact that has outlived any real utility.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
239
106
OK - I can buy that completely, Mark. But, like Lexus, I strive for the relentless pursuit of perfection, regardless of the "edginess" of the result.

There is satisfaction for me in seeing 0 fragments either in data files, metadata files, or MFT files.

And, yes - it is a carryover from years of doing this routinely (since 1981). But through weekly system maintenance - registry, optimization, etc. - I have never had to reinstall my OS beyond the initial install.

See Myth #3 here:

Myths

 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Whoa...

So... just ONE random example, in this case regarding gaming. Look at any game install, say HL2 or whatever, and see the enormously big files those games use. So you're saying there is no benefit in having those in ONE chunk AND in a "faster" position - as opposed to broken up into 100s of pieces, maybe all the way "at the back" of the HD where performance is up to 50% slower? :)
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
Originally posted by: flexy
So you're saying there is no benefit in having those in ONE chunk AND in a "faster" position - as opposed to broken up into 100s of pieces, maybe all the way "at the back" of the HD where performance is up to 50% slower? :)

Where are you getting the numbers that HD performance is 50 percent slower "at the back" of the drive?

In any event, yes, that is what we're saying.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
So... just ONE random example, in this case regarding gaming. Look at any game install, say HL2 or whatever, and see the enormously big files those games use. So you're saying there is no benefit in having those in ONE chunk AND in a "faster" position - as opposed to broken up into 100s of pieces, maybe all the way "at the back" of the HD where performance is up to 50% slower?

Not an appreciable difference, no.

I did some tests on this a while back, maybe a year ago, and the difference between a heavily fragmented file and a contiguous file fell into the statistical noise.
 

Markbnj

Elite Member | Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
And, yes - it is a carryover from years of doing this routinely (since 1981). But through weekly system maintenance - registry, optimization, etc. - I have never had to reinstall my OS beyond the initial install.

I hear ya :). There are a lot of things I still do that are matters of habit and getting that last 1% sorted out. But the other side of that story is that a _lot_ of people like my father fall for online claims of significantly improved system performance. I recently debugged some problems on his machine, and found that he had registry cleaners and disk optimizers running on every reboot. Apparently just touching the machine causes fatal registry disorder and hard drive mangling. :)
 

bsobel

Moderator Emeritus | Elite Member
Dec 9, 2001
13,346
0
0
Originally posted by: flexy
Whoa...
So... just ONE random example, in this case regarding gaming. Look at any game install, say HL2 or whatever, and see the enormously big files those games use. So you're saying there is no benefit in having those in ONE chunk AND in a "faster" position - as opposed to broken up into 100s of pieces, maybe all the way "at the back" of the HD where performance is up to 50% slower? :)

Well, the fact that you state that inner-edge performance is 50% slower shows you have no business discussing this.

Second, too much of this depends on actual use cases. Is the file better off as one contiguous chunk? Perhaps, if it's read sequentially from front to back. But if it's read piecemeal as it's used, it is actually better to have the portions needed next 'closest' to where the read head is at that time. That's the whole point of the boot-time layout optimization, which purposely fragments files so they sit in the interleaved order the reads will occur in. Boot time is slower if you defeat that and make sure no file is fragmented.
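
On XP that layout pass is driven by the prefetcher during idle time, by the way. If you want to kick the idle tasks off manually (mostly useful before benchmarking), the documented invocation is:

rem ask Windows to process its pending idle tasks now; this includes the boot/launch layout optimization when one is queued
Rundll32.exe advapi32.dll,ProcessIdleTasks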

Bill
 

martensite

Senior member
Aug 8, 2001
284
0
0
FWIW, here is a timely (hehe) and personal anecdote from last week regarding my friend's desktop.

It's a mid-low range PC: 3700+, A8N-E, 512MB (originally 1 GB, but one stick died), 160 GB (~35% free space), Asus/Pioneer DL DVDRW; in fact almost identical to my old system since both of us purchased the components at the same time.

I was trying to burn a 4.7 GB TDK 16X DVD+R with a few 700 MB DivX files :p so I could take them home to my PC. I know the media is fine - I've never had a single bad burn in more than 40 samples - but on his PC it failed repeatedly at 16x. Three discs ended up as coasters. Buffer underrun protection was on, I tried another DVD+R (LG), etc., but with the same result. I thought his DVD writer was screwed and recommended that he replace the POS, and that he should have bought a BenQ 1640 in the first place lol.

Then I thought I'd analyze his HDD and, hell, the XP defragger showed more red than a Commie rally. He had not defragged in more than a year lol. Lots of free space, but a ridiculous amount of fragmentation too. Ran CCleaner, downloaded and ran the trial version of Diskeeper 2008, and defragged... it took quite a while. Tried to burn again and it burned perfectly fine! Apart from defragging, I did not change one thing.

So, I don't think defragmentation is unnecessary at all. I think it helps quite a bit. There's no reason NOT to defrag, especially when it doesn't require much effort anyway.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Nothinman
So... just ONE random example, in this case regarding gaming. Look at any game install, say HL2 or whatever, and see the enormously big files those games use. So you're saying there is no benefit in having those in ONE chunk AND in a "faster" position - as opposed to broken up into 100s of pieces, maybe all the way "at the back" of the HD where performance is up to 50% slower?

Not an appreciable difference, no.

I did some tests on this a while back, maybe a year ago, and the difference between a heavily fragmented file and a contiguous file fell into the statistical noise.
I'd also like to add that Steam defragments itself should it discover that there's any performance-reducing fragmentation going on.
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
Originally posted by: martensite
FWIW, here is a timely (hehe) and personal anecdote from last week regarding my friend's desktop.

It's a mid-low range PC: 3700+, A8N-E, 512MB (originally 1 GB, but one stick died), 160 GB (~35% free space), Asus/Pioneer DL DVDRW; in fact almost identical to my old system since both of us purchased the components at the same time.

I was trying to burn a 4.7 GB TDK 16X DVD+R with a few 700 MB DivX files :p so I could take them home to my PC. I know the media is fine - I've never had a single bad burn in more than 40 samples - but on his PC it failed repeatedly at 16x. Three discs ended up as coasters. Buffer underrun protection was on, I tried another DVD+R (LG), etc., but with the same result. I thought his DVD writer was screwed and recommended that he replace the POS, and that he should have bought a BenQ 1640 in the first place lol.

Then I thought I'd analyze his HDD and, hell, the XP defragger showed more red than a Commie rally. He had not defragged in more than a year lol. Lots of free space, but a ridiculous amount of fragmentation too. Ran CCleaner, downloaded and ran the trial version of Diskeeper 2008, and defragged... it took quite a while. Tried to burn again and it burned perfectly fine! Apart from defragging, I did not change one thing.

So, I don't think defragmentation is unnecessary at all. I think it helps quite a bit. There's no reason NOT to defrag, especially when it doesn't require much effort anyway.

Fair point, but this is an extreme case and like you said, he didn't defrag for over a year. This is when running the defragger in XP makes some sense. But for people who defrag weekly? Pointless.

This is why I like how Vista handles defrag. Your friend wouldn't encounter this situation. Unless, of course, he started mucking about with services and disabling features to "improve performance" (laugh)

 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Originally posted by: Nothinman
I just don't worry about it, I never defrag any of my filesystems and things perform just fine. Of course XFS doesn't fragment very much with normal use, but on my torrent box it gets crushed and still performs just fine.
Same here. The only things that get fragmented are large Usenet downloads that occur simultaneously, but those files disappear soon anyway. I use FAT32 for the most part (well, except for my DVD-burning partition), and I don't defragment, ever. I've not noticed much slowdown.