
NTFS Compression: Time to use it?

Jeff7181

Lifer
Just curious if using NTFS Compression might be a good idea now that CPU power seems to be scaling ahead of disk performance. With dual and quad core processors, it seems we should be able to get some performance increases, especially on slower hard drives, such as 5400 RPM drives found in laptops.

Opinions?
 
Never. Ever ever ever ever ever. That's my opinion. Too many issues come from using compression. Not to mention hard drives are so damn cheap now, there is no reason to use it in the first place.
 
Originally posted by: Fullmetal Chocobo
Never. Ever ever ever ever ever. That's my opinion. Too many issues come from using compression. Not to mention hard drives are so damn cheap now, there is no reason to use it in the first place.

+1
 
Just curious if using NTFS Compression might be a good idea now that CPU power seems to be scaling ahead of disk performance. With dual and quad core processors, it seems we should be able to get some performance increases, especially on slower hard drives, such as 5400 RPM drives found in laptops.

It might help a bit; the only way to know for sure is to try it.

Never. Ever ever ever ever ever. That's my opinion. Too many issues come from using compression. Not to mention hard drives are so damn cheap now, there is no reason to use it in the first place.

There are no issues with NTFS compression that I'm aware of.
 
I think what he's getting at is if you could do instantaneous, on-the-fly decompression, you could get tremendous data throughput, because the slowest part of the process, reading from the disk platters, would be artificially sped up thanks to the compression making the data itself smaller.

Now, the question becomes whether you can get this compression and decompression to be virtually instantaneous.
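The break-even here is simple arithmetic. A back-of-envelope sketch in Python (all figures, including the 60 MB/s disk, the 2:1 ratio, and the decompression speeds, are illustrative assumptions, not measurements):

```python
# Back-of-envelope model: reading a compressed file beats reading the raw
# file whenever (compressed read time + decompress time) < raw read time.
# This ignores I/O/CPU overlap, so it understates the win slightly.

def effective_read_mbps(disk_mbps, ratio, decompress_mbps):
    """Effective throughput (MB/s of *uncompressed* data delivered) when a
    file stored at ratio:1 compression is read from a disk sustaining
    disk_mbps and decoded by a CPU sustaining decompress_mbps."""
    read_time = (1.0 / ratio) / disk_mbps  # seconds of disk time per output MB
    cpu_time = 1.0 / decompress_mbps       # seconds of CPU time per output MB
    return 1.0 / (read_time + cpu_time)

# Hypothetical 5400 RPM laptop drive (~60 MB/s), 2:1 ratio, and a core that
# decompresses at 500 MB/s: compression wins.
print(effective_read_mbps(60, 2.0, 500))  # ~97 MB/s, better than 60

# Same drive but a slow 40 MB/s codec: decompression becomes the bottleneck.
print(effective_read_mbps(60, 2.0, 40))   # 30 MB/s, worse than 60
```

So "virtually instantaneous" isn't required; the codec only has to decompress meaningfully faster than the disk reads.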
 
Originally posted by: Itchrelief
I think what he's getting at is if you could do instantaneous, on-the-fly decompression, you could get tremendous data throughput because the slowest part of the process, reading from the disk platters, would be artificially speeded up thanks to the compression causing the data itself to be smaller.

Actually, it was shown years ago that there CAN be a performance boost this way: you read a smaller amount of data from the HDD at the cost of CPU cycles spent on decompression.

You can compress fast or compress highly, or go somewhere in between. I think even a fast compression scheme would magically make stuff faster on today's overpowered quad-core systems.
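The fast-vs-high tradeoff is easy to see with a small Python sketch using zlib as a stand-in codec (NTFS actually uses its own LZNT1 format), comparing level 1 and level 9 on repetitive sample data:

```python
import time
import zlib

# Repetitive text stands in for typical compressible files.
data = b"The quick brown fox jumps over the lazy dog. " * 20000

for level in (1, 9):
    t0 = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - t0
    print(f"level {level}: {len(data) / len(packed):.1f}:1 "
          f"in {elapsed * 1000:.1f} ms")

# Level 1 runs much faster; level 9 spends extra CPU time searching for
# matches to squeeze out a somewhat better ratio.
```

A filesystem that compresses transparently on every read and write sits firmly at the "fast" end of this dial, since latency matters more than the last few percent of ratio.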
 
Originally posted by: Denithor
Originally posted by: Fullmetal Chocobo
Never. Ever ever ever ever ever. That's my opinion. Too many issues come from using compression. Not to mention hard drives are so damn cheap now, there is no reason to use it in the first place.

+1

+2
 
Originally posted by: taltamir
Originally posted by: Denithor
Originally posted by: Fullmetal Chocobo
Never. Ever ever ever ever ever. That's my opinion. Too many issues come from using compression. Not to mention hard drives are so damn cheap now, there is no reason to use it in the first place.

+1

+2

And yet no one has mentioned any of these so called issues yet...
 
Compatibility issues, compression issues, issues with chkdsk, and a lack of support in any software that manipulates the drive externally (i.e., Partition Magic and the like). I think there were more, but I don't remember anymore; it's been a while since I stopped using it.
The benefits are too small, the bugs too big.
 
Compatibility issues, compression issues, issues with chkdsk, and a lack of support in any software that manipulates the drive externally (i.e., Partition Magic and the like). I think there were more, but I don't remember anymore; it's been a while since I stopped using it.
The benefits are too small, the bugs too big.

Sounds like FUD to me. I've got NTFS compression enabled on a good number of directories/files in my XP VM and haven't run into any issues. The only thing I can think of that complained about NTFS compression was SQL Server and there's no way you'd want to put a database in a compressed directory anyway.

Even though Partition Magic is crap, I'm pretty sure it supports NTFS compression just fine. NTFS has had compression since its inception, so any filesystem tool that doesn't support it should be avoided. All of the imaging tools that I know of support NTFS compression just fine too.
 
I used to wonder why hard drive makers didn't have built-in compression. There were consumer hard drive controllers with built-in compression chips years ago, in the days of "Stacker".

Hard drive sizes would instantly "double". And since the compression would be "invisible" to user applications, there shouldn't be compatibility problems. And if the chips were used on a high percentage of drives, the chips would probably be pretty cheap. One problem would be the same as with tape drive compression: you'd have to advertise native capacity and then give a "compression UP TO xx%" caveat.

I don't know what effect, if any, this would have on data recovery when the drives failed. The compression algorithm would have to be in the public domain or licensable, so that software data recovery could be done.
 
I admit it SOUNDS like FUD (Fear, Uncertainty, Doubt... a tactic computer companies use to scare people away from their competitors' products). But the problems were real, and I am not competing with Microsoft.
I didn't stop using it by choice. Nowadays I use a ZFS raidz2 array on opensolaris with data compression on.
 
One problem would be the same as with tape drive compression: you'd have to advertise native capacity and then give a "compression UP TO xx%" caveat.

And the fact that recovering compressed data is more difficult, since you need all of the blocks in the stream for a successful decompression. You can limit the dependency by limiting the amount of data you compress at once, but that limits your compression ratio too, so it's a tradeoff.
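A quick Python sketch of that tradeoff, with zlib as a stand-in codec and an assumed 4 KiB chunk size: compressing chunks independently costs ratio but limits how much data one bad block takes down.

```python
import zlib

# Repetitive data, so redundancy spans chunk boundaries and the whole-stream
# compressor can exploit it while per-chunk compression cannot.
data = b"block of fileserver data with repeated content " * 5000

CHUNK = 4096  # each 4 KiB block compressed independently, recoverable alone
chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
chunked_size = sum(len(zlib.compress(c)) for c in chunks)

stream_size = len(zlib.compress(data))  # one stream: best ratio, all-or-nothing

print(f"whole stream: {stream_size} bytes")
print(f"4 KiB chunks: {chunked_size} bytes")
# Independent chunks pay per-chunk overhead and lose cross-chunk matches,
# but losing one chunk no longer destroys everything after it.
```

For comparison, NTFS makes a similar choice: it compresses in independent 16-cluster compression units (64 KB with the default 4 KB cluster size), so a damaged unit doesn't take the rest of the file with it.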

The compression algorithm would have to be in the public domain or licensable, so that software data recovery could be done.

LZF is a good tradeoff between speed and ratio and it's available under the BSD license which is almost public domain.
 
But the problems were real, and I am not competing with microsoft.

I'm not working for or against MS either but I use NTFS compression without any issues at all every day. 99% of apps shouldn't even notice whether it's on or off. And you can always just disable it on directories used by those affected apps if they do care, like SQL Server.

I didn't stop using it by choice. Nowadays I use a ZFS raidz2 array on opensolaris with data compression on.

Which is the same thing with the only difference being the implementation.
 
99% of apps shouldn't even notice whether it's on or off.
I specified apps that handle data outside of the OS, i.e. bootable drive tools.
Which is the same thing with the only difference being the implementation.
That is the point: the issue is the implementation.
 
I specified apps that handle data outside of the OS, i.e. bootable drive tools.

And the few that I've used have worked fine. Maybe if you're using software from around 2000, before XP was released and became popular, there might be issues, but anything that claims to support NTFS yet doesn't handle compression should be considered crap and avoided. So far the only program you've named is Partition Magic, and it looks like its last release was in 2004, because Symantec bought it and killed it.

That is the point: the issue is the implementation.

So Partition Magic works great with ZFS compression?

The problem isn't the implementation, the problem is stupid apps of which there are thankfully very few in this respect.
 
Which is why it was the last thing I listed; the apps are the least important issue.

NTFS itself is buggy enough without adding to the problem, with undeletable files/directories cropping up here and there.

Partition Magic is inapplicable to ZFS, since ZFS does not work with partitions and has tools to do anything you might want internally.

The tools NTFS relies on, however, include MS chkdsk, Windows settings, etc., and those tend to be borked or subpar.

However, it is good to hear from you that all the Acronis stuff now works with compressed NTFS.

Windows itself seems to... act up when you mix compression and system files. If you do compress, don't compress the OS partition.

Also as I mentioned, the speed gained is small in very specific situations, and is offset by a very large speed loss in other situations.
 
Which is why it was the last thing I listed; the apps are the least important issue.

No, the apps are the most important issue because they're what you use the computer for.

NTFS itself is buggy enough without adding to the problem, with undeletable files/directories cropping up here and there.

NTFS is fine. I've seen a few cases of that but not many and most of the time it's a userland problem with explorer or permissions and not NTFS itself.

Partition Magic is inapplicable to ZFS, since ZFS does not work with partitions and has tools to do anything you might want internally.

Which was my point, you switched to Solaris so not only did you switch OSes and filesystems but you also switched the apps that were running on the OS and filesystem so the comparison is about as invalid as possible.

The tools NTFS relies on, however, include MS chkdsk, Windows settings, etc., and those tend to be borked or subpar.

I've never had an instance where NTFS compression has caused problems for chkdsk or explorer.

Windows itself seems to... act up when you mix compression and system files. If you do compress, don't compress the OS partition.

Some can't be compressed because they're needed before the compression driver is loaded, but the others work fine.

Also as I mentioned, the speed gained is small in very specific situations, and is offset by a very large speed loss in other situations.

I can't say I've ever noticed a speed loss during normal usage, not that I've noticed a speed gain either so I think in that area it is pretty much a wash.
 
No, the apps are the most important issue because they're what you use the computer for.
Are you intentionally misunderstanding me? The apps are the least important ISSUE, i.e. the least SEVERE issue, not the least important thing you use your computer for. Also, I was referring to DATA MANIPULATION apps. Nobody uses a system in order to run data manipulation apps; they run data manipulation apps because they are using a system, usually to change partition sizes, recover lost data, etc.
NTFS is fine. I've seen a few cases of that but not many and most of the time it's a userland problem with explorer or permissions and not NTFS itself.
Nothing short of a reformat solves that. Permission fixers usually work but sometimes do not, and reinstalling Windows on its own partition does not solve the issue (on the OS drive or on a secondary drive).
Which was my point, you switched to Solaris so not only did you switch OSes and filesystems but you also switched the apps that were running on the OS and filesystem so the comparison is about as invalid as possible.
The issue here is data storage, not running applications such as word processors or the like. The Solaris system is a headless fileserver box.
I've never had an instance where NTFS compression has caused problems for chkdsk or explorer.
Then you haven't used it enough. Also, you haven't used Google, because I am not the only one.
 
I'm not misunderstanding you, I'm saying apps are the most important part because they're what you use your computer for. Uses of partition management apps are few and far between.

I've always been able to resolve any NTFS issue without a reformat; sometimes things like Cygwin are needed, but that's rare too.

I don't know what part of "I use it every day and it works fine" you're not getting. So far the only example you've given was crap because it's four-year-old software.
 
I'm not misunderstanding you, I'm saying apps are the most important part because they're what you use your computer for. Uses of partition management apps are few and far between.
This statement proves you are misunderstanding.
 
Well, the Dell Mini 9 with the 8GB SSD came compressed. Today I downloaded a program to see the compression ratio, and I uncompressed the Windows folder, Program Files and Documents (unless the file was in use). It seems a bit faster, and I only lost about 800MB from the SSD. There is still about 3.1 GB free.

 
This statement proves you are misunderstanding.

Then you're not saying what you mean, because so far your only reasons for avoiding NTFS compression have been incompatibility with some apps and that NTFS is buggy. The latter isn't true at all, and the former is almost never an issue because the apps people use every day work fine.
 
Yeah... I'm not thinking about compression in terms of freeing up more space, I'm thinking about it in terms of reading smaller chunks of data from the hard drive. I really don't think there's much, if any, processing overhead anymore. I suspect NTFS compression is tuned more for performance than for space savings.
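That intuition is easy to sanity-check with a Python sketch, again using zlib as a stand-in for NTFS's LZNT1 codec: even a general-purpose codec at its fastest setting typically decompresses far faster than a 5400 RPM drive can read, so the decode step is rarely the bottleneck.

```python
import time
import zlib

# ~2 MB of compressible sample data (stand-in for typical system files).
data = b"typical system file contents, fairly compressible " * 40000
packed = zlib.compress(data, 1)  # fastest setting, like a read-optimized codec

t0 = time.perf_counter()
restored = zlib.decompress(packed)
dt = time.perf_counter() - t0

assert restored == data  # round trip is lossless
mb = len(data) / 1e6
print(f"decompressed {mb:.0f} MB in {dt * 1000:.1f} ms ({mb / dt:.0f} MB/s)")
# On a modern core this runs in the hundreds of MB/s, well above the
# ~60 MB/s an assumed 5400 RPM laptop drive sustains.
```

Whether that translates into a net win still depends on the data compressing at all; already-compressed files (media, archives) gain nothing and just burn CPU.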
 