Old 11-10-2012, 01:51 AM   #1
lambchops511
Senior Member
 
Join Date: Apr 2005
Posts: 659
Defrag Single File Linux ext3

Sorry if wrong section.

I have a relatively large (50 GB) file on ext3 and I need good sequential read bandwidth from it. Is there a way to "defrag" it? Judging by the way it was copied over to the server, I'm pretty sure it ended up badly fragmented. I don't need a perfect defrag, but is there a good way to make the file less fragmented?

Would something simple like

Code:
cp my_file /tmp/garbage
rm my_file
mv /tmp/garbage my_file

do the magic?
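
One variant I'm considering (just a sketch, and it assumes nothing has my_file open while it runs): keep the temporary copy on the same filesystem, since /tmp may be a separate tmpfs here, which would turn the mv back into a second full copy instead of a cheap rename.

Code:
# copy within the same directory/filesystem; the new copy gets freshly
# allocated blocks, which should come out less fragmented if there is
# enough contiguous free space
cp -p my_file my_file.tmp

# rename over the original -- a same-filesystem mv is just a rename
mv my_file.tmp my_file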
lambchops511 is offline   Reply With Quote
Old 11-10-2012, 11:15 AM   #2
Cerb
Elite Member
 
Join Date: Aug 2000
Posts: 15,229

Tried defrag?

There's also shake, which is probably better.

Also, yeah, this should probably go under *nix Software.
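
If shake is packaged for your distro, I believe the basic invocation is just pointing it at the file and letting it rewrite it in place (sketch only -- check the man page for options):

Code:
shake my_file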
__________________
"The computer can't tell you the emotional story. It can give you the exact mathematical design, but what's missing is the eyebrows." - Frank Zappa
Cerb is offline   Reply With Quote
Old 11-10-2012, 01:22 PM   #3
AnonymouseUser
Diamond Member
 
Join Date: May 2003
Location: Dallas TX
Posts: 8,358

To see how fragmented a file is, use

Code:
filefrag -v filename
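
Without -v it just prints a one-line summary with the total extent count, which is enough for a quick check:

Code:
filefrag filename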
AnonymouseUser is offline   Reply With Quote
Old 11-10-2012, 03:47 PM   #4
mfenn
Elite Member
Moderator
General Hardware
 
Join Date: Jan 2010
Posts: 20,725

Quote:
Originally Posted by AnonymouseUser View Post
To see how fragmented a file is, use

Code:
filefrag -v filename
Let's not get crazy unless there is actually a problem. I wouldn't be worried unless the file had thousands of extents.
mfenn is offline   Reply With Quote
Old 11-10-2012, 05:22 PM   #5
lambchops511
Senior Member
 
Join Date: Apr 2005
Posts: 659

Quote:
Originally Posted by AnonymouseUser View Post
To see how fragmented a file is, use

Code:
filefrag -v filename
WOW! THANKS! I did not know this command. Is 86 extents good or bad for a 30 GB file? I'm guessing it's pretty good? That probably means if I want better I/O performance I need to move to SSDs?
lambchops511 is offline   Reply With Quote
Old 11-10-2012, 05:41 PM   #6
Cerb
Elite Member
 
Join Date: Aug 2000
Posts: 15,229

Quote:
Originally Posted by lambchops511 View Post
WOW! THANKS! I did not know this command. Is 86 extents good / bad for a 30 G file? I am guessing its pretty good? That probably means if I want better IO perf I need to hit SSDs?
Yes. Generally, anything above around 50 MB/fragment is "good enough," and that's over 300 MB/fragment. Even with some tiny fragments mixed in there, it's good enough to just not worry about it. Any newish HDD (500 GB/platter or denser) ought to be able to read such a file at 100 MB/s, no sweat.
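
Rough arithmetic, going by the 30 GB figure above:

Code:
# average extent size in MB for a ~30 GB file split into 86 extents
echo $(( 30 * 1024 / 86 ))    # => 357, i.e. roughly 350 MB per extent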
Cerb is offline   Reply With Quote
Old 11-11-2012, 07:20 PM   #7
mfenn
Elite Member
Moderator
General Hardware
 
Join Date: Jan 2010
Posts: 20,725

Quote:
Originally Posted by Cerb View Post
Yes. Generally, anything above around 50 MB/fragment is "good enough," and that's over 300 MB/fragment. Even with some tiny fragments mixed in there, it's good enough to just not worry about it. Any newish HDD (500 GB/platter or denser) ought to be able to read such a file at 100 MB/s, no sweat.
Agree.

OP, the -v output from filefrag lists all the file's extents along with their length (5th column). Double-check to make sure that there aren't a bunch of really tiny extents, but you are most likely OK.
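
Something like this should surface the smallest extents (rough sketch -- it assumes the length really is the 5th whitespace-separated column of filefrag -v output on your version of e2fsprogs, so adjust the column number to match the header you see):

Code:
# sort extent lines numerically by the length column, show the 10 smallest;
# the header and summary lines sort to the top and can be ignored
filefrag -v my_file | sort -n -k5,5 | head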
mfenn is offline   Reply With Quote