Originally posted by: silverpig
As for monitoring whether there are symlinks, I thought the same thing as kamper: one system storing a file might not know whether another system has linked to one of its files via NFS or the like.
That's not an issue for my situation. I'm only concerned with local symlinks. I think hardlinks might be better for me.
I'm trying to clean up wasted space caused by duplicate files on our file server. Users' "My Documents" folders are stored there, and I know a lot of people have copies of the same files in their My Documents.
Theoretically, I think it would work like this:
1. Use fdupes to find duplicate files
2. Move those files to a central location
3. Create links (probably hardlinks?) from the original locations to the new central location (rough sketch below)
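Something like this sketch is what I'm picturing (the /srv/homes path is made up, and it assumes no filenames contain newlines). It also skips the separate central store and just hardlinks the duplicates straight to each other, since with hardlinks no name is more "original" than another:

```sh
#!/bin/sh
# Sketch only: /srv/homes stands in for the real share root.
# fdupes -r prints each set of duplicates as a group of filenames
# separated by a blank line. Keep the first file of each set and
# replace the rest with hardlinks to it.
keep=""
fdupes -r /srv/homes | while IFS= read -r file; do
    if [ -z "$file" ]; then
        keep=""                 # blank line = end of a duplicate set
    elif [ -z "$keep" ]; then
        keep=$file              # first entry becomes the copy we keep
    else
        ln -f "$keep" "$file"   # ln without -s makes a hardlink (same inode)
    fi
done
```

One constraint I do know about: hardlinks can't cross filesystems, so this only works if all the home folders live on the same filesystem.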
The biggest problem is this: what happens when a user opens one of those files and edits it? If I use hardlinks, wouldn't the edit change everyone's file?
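A quick test shows the behaviour I'm worried about (throwaway filenames):

```sh
echo hello > a.txt
ln a.txt b.txt           # a.txt and b.txt now share one inode
echo edited >> a.txt     # an in-place edit is visible under both names
cat b.txt                # prints "hello" then "edited"

# Though from what I understand, many applications save by writing a
# temp file and renaming it over the original, which replaces the
# inode and quietly breaks the hardlink:
echo new > a.tmp && mv a.tmp a.txt
cat b.txt                # still the old contents; a.txt is a new inode
```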
The reason I thought I would need to check for symlinks is that I figured I would run a nightly script: once all the symlinks to a file had been deleted, the file itself could be deleted. With hardlinks that shouldn't be an issue (if I'm understanding hardlinks correctly).
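And if I did keep a central store of hardlinked files, the nightly check could presumably just look at link counts instead of chasing symlinks (sketch; /srv/dupstore is a made-up path, and -delete is a GNU find extension):

```sh
# A file's link count (st_nlink) counts every name pointing at its
# inode, so a stored file whose count has dropped to 1 has no links
# left in any user's folder and can be removed.
find /srv/dupstore -type f -links 1 -delete
```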