When is your Hard Drive worked the hardest?


Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
Originally posted by: MercenaryForHire
Bittorrent. Seriously. The random-download sequence of most files thrashes your drive to hell.

Host a bunch of Linux ISOs or something like that on a gigabit LAN, then have people try to download 5 copies at once and see how they fare. :p

- M4H

BitTorrent can be killer on a HD :)
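
For anyone who wants to see that access pattern without actually running a torrent client, here's a minimal sketch in Python (file name, piece size, and total size are arbitrary placeholders): it preallocates a file and writes fixed-size pieces at random offsets, syncing each one, which roughly mimics the seek-heavy pattern M4H describes.

```python
# Crude torrent-style thrash test: pieces land at random offsets and each one
# is forced to disk before the next. All names and sizes here are made up.
import os
import random

PATH = "thrash_test.bin"           # hypothetical scratch file
PIECE_SIZE = 256 * 1024            # 256 KB "pieces"
FILE_SIZE = 700 * 1024 * 1024      # roughly a CD-sized ISO

piece_count = FILE_SIZE // PIECE_SIZE
order = list(range(piece_count))
random.shuffle(order)              # pieces arrive in random order

with open(PATH, "wb") as f:
    f.truncate(FILE_SIZE)          # preallocate the full file up front
    payload = os.urandom(PIECE_SIZE)
    for piece in order:
        f.seek(piece * PIECE_SIZE)
        f.write(payload)
        f.flush()
        os.fsync(f.fileno())       # force each piece out, like a cautious client
```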
 

JonB

Platinum Member
Oct 10, 1999
2,126
13
81
www.granburychristmaslights.com
Like gsellis, I see maximum hard drive I/O when using Adobe Premiere (or Vegas, Pinnacle, etc.) to edit video files and really scrubbing the timeline back and forth. Usually only a multi-drive RAID 5 setup can actually keep up, but even that has trouble.
 

Hulk

Diamond Member
Oct 9, 1999
5,118
3,664
136
When a program freezes and you press Ctrl-Alt-Delete and then close the program that's not responding, the hard drive goes nuts! I don't know what's going on, but if you could create a script to replicate that action it would be a good test.
 

cruzer

Senior member
Dec 30, 2001
482
0
0
Starting Internet Explorer takes like 5-7 seconds of disc scratching. Drives me crazy.

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Hulk
When a program freezes and you press Ctrl-Alt-Delete and then close the program that's not responding, the hard drive goes nuts! I don't know what's going on, but if you could create a script to replicate that action it would be a good test.

When you close a program, all the memory it had allocated is freed. Your system will then start moving data from your swapfile back into RAM to fill in the empty space.

A number of people have mentioned this phenomenon (saying their hard drive thrashes after 'closing program X', etc.). This is a lousy hard drive benchmark, because it's not easily reproducible or measurable, plus the only reason it happens is that you don't have enough physical RAM to hold everything.

You really want to bench programs that explicitly use the disk as swap space (such as Photoshop when working with very large files), or that use data sets so huge that they can't possibly fit into RAM (such as video editing suites like Pinnacle). Basic tasks like copying files around (either between disks, between partitions, or on the same partition), and doing multiple things at once, are also useful.

And a lot of people have also mentioned BitTorrent. I don't really use it, but if it's thrashing your hard drive that badly, I'm guessing it's not being very smart about how it writes things back to disk (it ought to assemble decently-sized chunks of the files in RAM and then write them back in a sequential burst). This might make it a poor choice for disk benchmarking, although it could be worth consideration just as a 'real-life' bench, even if its behavior is suboptimal.
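
For the 'copying files around' part, a rough timing harness like the one below is enough to compare drives or configurations; the paths and block size are placeholders, and the fsync at the end is there so the measurement includes the data actually reaching the disk rather than just the write cache.

```python
# Minimal copy benchmark sketch: copy src to dst in fixed-size blocks and
# report throughput. src/dst are placeholders for whatever files you test with.
import os
import time

def timed_copy(src, dst, block=1 << 20):
    start = time.time()
    copied = 0
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            buf = fin.read(block)
            if not buf:
                break
            fout.write(buf)
            copied += len(buf)
        fout.flush()
        os.fsync(fout.fileno())    # count the time to actually hit the platters
    elapsed = time.time() - start
    mb = copied / (1 << 20)
    print(f"{mb:.0f} MB in {elapsed:.1f}s ({mb / elapsed:.1f} MB/s)")

# e.g. timed_copy("bigfile.iso", "copy.iso")
```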
 

tynopik

Diamond Member
Aug 10, 2004
5,245
500
126
Originally posted by: Matthias99
plus the only reason it happens is that you don't have enough physical RAM to hold everything.

well yes, here in the real world, many people don't have enough ram so swap performance is very relevant

in fact it's probably the one thing that irritates me the most

and don't tell me to buy more memory

i frequently work with huge file sets that would never fit in ram, and i don't expect them to*

the problem is Windows' caching algorithm tries to cache every bit of this huge file set (that i'm only going to access once) into memory, and in the process it shoves my whole set of working programs out to disk

after i finish working with that one set, it can take 20 seconds or more for the system to become fully responsive as it thrashes the disk to bring programs back in

but then it happens repeatedly: process a set of files, try to get control back, process another set of files, try to get control back, ad nauseam


*no, i'm not going to go buy some 16gb behemoth
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: tynopik
Originally posted by: Matthias99
plus the only reason it happens is that you don't have enough physical RAM to hold everything.

well yes, here in the real world, many people don't have enough ram so swap performance is very relevant

in fact it's probably the one thing that irritates me the most

and don't tell me to buy more memory

i frequently work with huge file sets that would never fit in ram, and i don't expect them to*

the problem is Windows' caching algorithm tries to cache every bit of this huge file set (that i'm only going to access once) into memory, and in the process it shoves my whole set of working programs out to disk

after i finish working with that one set, it can take 20 seconds or more for the system to become fully responsive as it thrashes the disk to bring programs back in

but then it happens repeatedly: process a set of files, try to get control back, process another set of files, try to get control back, ad nauseam


*no, i'm not going to go buy some 16gb behemoth

What, exactly, would you prefer that Windows *do*? Whatever you're running is asking it to load more data than it has room for in your physical RAM. It has no way to know that you're going to throw it away in a second and then request something that got paged out to disk because your other app ate up all the RAM.

It has to load anything you need the CPU to access into RAM. If you have a giant file that is going to get read once, but does not need to be stored, your *application* should be smart enough to not tell Windows to try to load the whole damn thing into memory at once. If you're loading 4GB of stuff, and it DOES all have to be in RAM at once, then there's really not much you can do other than getting more RAM -- either it keeps your other programs in memory and thrashes on the dataset (which is usually not the behavior you want), or it keeps the dataset in memory and then thrashes later on the other programs. You can't really have it both ways unless your programs are smart enough to limit the amount of RAM they use based on how much physical memory you have available.
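
To make the application-side point concrete, here is a toy example of the two approaches (the checksum is just a stand-in for whatever processing the real program does): the naive version hands Windows the entire file at once and invites paging, while the chunked version keeps a constant memory footprint no matter how large the file is.

```python
# Hypothetical "process a huge file" done two ways. The checksum itself is
# meaningless; the point is how much of the file is resident at once.
def checksum_naive(path):
    with open(path, "rb") as f:
        data = f.read()              # the entire file lands in memory at once
    return sum(data) & 0xFFFFFFFF

def checksum_chunked(path, chunk=4 * 1024 * 1024):
    total = 0
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)    # never more than `chunk` bytes resident
            if not block:
                break
            total = (total + sum(block)) & 0xFFFFFFFF
    return total
```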
 

tynopik

Diamond Member
Aug 10, 2004
5,245
500
126
Originally posted by: Matthias99

What, exactly, would you prefer that Windows *do*? Whatever you're running is asking it to load more data than it has room for in your physical RAM. It has no way to know that you're going to throw it away in a second and then request something that got paged out to disk because your other app ate up all the RAM.

only one file is processed at a time and each file will individually fit in free ram quite comfortably

what i'm asking is that it doesn't push out active programs to make room for disk cache. Limit disk cache to the free ram or 20mb, whichever is greater

and this isn't just the behavior of one program, i consistently get the same behavior across a wide variety of applications

unfortunately there's the whole philosophy/myth built up around 'the most recently requested file is the most likely to be requested next'. So there's been 4gb of files accessed since the last time i tried to use any of the other programs, therefore disk cache for all those files gets priority and the programs get booted out. In my case it's almost the exact opposite. 'The most recently requested file is the LEAST likely to be requested next'
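
The cache-pollution effect tynopik is describing is easy to see with a toy model: an LRU cache sized for a small working set gets completely flushed by a single pass over data that will never be touched again. This is only an illustration of the policy, not of Windows' actual cache manager.

```python
# Toy LRU cache: a one-pass scan of never-reused items evicts the entire
# warm working set, which is exactly the 'most recent = most likely' failure.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def access(self, key):
        hit = key in self.store
        if hit:
            self.store.move_to_end(key)         # mark as most recently used
        else:
            self.store[key] = True
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used
        return hit

cache = LRUCache(capacity=8)
working_set = [f"app{i}" for i in range(8)]
for page in working_set:
    cache.access(page)                  # warm the cache with the "programs"

for n in range(1000):
    cache.access(f"scan{n}")            # one-pass scan, never requested again

hits = sum(cache.access(page) for page in working_set)
print(f"working-set hits after the scan: {hits}/8")   # prints 0/8 -- all evicted
```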
 

remagavon

Platinum Member
Jun 16, 2003
2,516
0
0
Unzipping large files (with WinRAR) and scanning for parity using a PAR/PAR2 checker, as well as booting up and loading levels in newer games. Copying files, to a lesser extent.
 

WackyDan

Diamond Member
Jan 26, 2004
4,794
68
91
Right now I'm watching my 200 gig drive back up its data to a 160 gig.... something that happens weekly as a full backup--no incrementals (a limit of the NAS device).... This process currently involves moving 130 gigs of data, which includes large ISO files for CDs and DVDs, as well as directories filled with thousands of files, large and relatively small.... all during the same copy. It takes over 24 hours to complete, though sometimes much longer. I can't think of anything else that can task a drive for that duration.... desktop IDE drives, anyway.
 

tweeve2002

Senior member
Sep 5, 2003
474
0
0
Norton would be the biggest stressor on my RAID 0 setup. It can make me lag in games when it comes on and starts virus-checking the array.

After that would be loading levels in games, then booting up Windows. At times it takes longer for a level to load than for Windows to boot up :Q
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: tynopik
Originally posted by: Matthias99

What, exactly, would you prefer that Windows *do*? Whatever you're running is asking it to load more data than it has room for in your physical RAM. It has no way to know that you're going to throw it away in a second and then request something that got paged out to disk because your other app ate up all the RAM.

only one file is processed at a time and each file will individually fit in free ram quite comfortably

what i'm asking is that it doesn't push out active programs to make room for disk cache. Limit disk cache to the free ram or 20mb, whichever is greater

and this isn't just the behavior of one program, i consistently get the same behavior across a wide variety of applications

Okay, that's not what I thought you were talking about. Your issue is that Windows is pushing out INactive programs in exchange for disk cache. Yes, they should manage the cache better, but it's only pushing out your other apps because you're not using them at the moment and it assumes they're less important than the data you just read.

unfortunately there's the whole philosophy/myth built up around 'the most recently requested file is the most likely to be requested next'. So there's been 4gb of files accessed since the last time i tried to use any of the other programs, therefore disk cache for all those files gets priority and the programs get booted out. In my case it's almost the exact opposite. 'The most recently requested file is the LEAST likely to be requested next'

...which, yes, for you is wrong. I'm not sure if there's a way to avoid this behavior at the application level (it's not something I've ever tried to do in a Windows program). You either need a way for your 'read lots of huge files' application to tell Windows not to cache the files it's reading, or a way to lock your other applications' pages in cache.
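
On Windows, the 'don't cache this' side is FILE_FLAG_NO_BUFFERING (full cache bypass, which requires sector-aligned I/O) or the milder FILE_FLAG_SEQUENTIAL_SCAN hint, which tells the cache manager the file is being read once front-to-back so it can recycle those pages instead of letting them crowd everything else out; the 'lock my other pages in' side would be something like VirtualLock. As a sketch of the milder route, Python's os module exposes O_SEQUENTIAL on Windows, which (as far as I know) maps to the CRT's _O_SEQUENTIAL and hence to the sequential-scan hint:

```python
# Stream a large file with the sequential-scan hint so it doesn't crowd the
# cache. On non-Windows systems the extra flags simply fall back to 0.
import os

def stream_file(path, chunk=1 << 20):
    flags = (os.O_RDONLY
             | getattr(os, "O_BINARY", 0)        # Windows-only constants;
             | getattr(os, "O_SEQUENTIAL", 0))   # harmless no-ops elsewhere
    fd = os.open(path, flags)
    total = 0
    try:
        while True:
            data = os.read(fd, chunk)
            if not data:
                break
            total += len(data)                   # process the chunk here
    finally:
        os.close(fd)
    return total
```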
 

webmal

Banned
Dec 31, 2003
144
0
0
My 74 gig Raptor works hardest when I'm encrypting/decrypting my whole hard drive (using DriveCrypt Plus Pack). The Raptor will 'lock up' for about 2 hours each time I encrypt/decrypt.

Webmal
 

MustISO

Lifer
Oct 9, 1999
11,927
12
81
Uncompressing multiple .rar archives onto the same drive. Using one drive to do multiple packs really grinds the drive.
 

imported_Flux

Junior Member
Jul 19, 2004
23
0
0
For about 300 public access computers on campus:
Loading Visual Studio .NET w/MSDN (much slower than Studio 6)
Loading Matlab 6.5/7.0

Relevant sys specs:
P4 2.2GHz 400FSB
512MB RAM (PC100 SDRAM)
5400RPM 2MB cache Western Digital 60GB EIDE HDD

My system:
Copying files between my friends system and my own over a gigabit ethernet connection.
An odd situation is, when he's pulling files from me, my drive is the bottleneck.
When I'm pulling from him, his drive is the bottleneck.
(HDD utilization stats in Windows)
Also, the connection is more than capable of handling the load.
(Tested with RAM drives)

Relevant sys specs:
Mine
A64 3200+
1GB RAM (DDR400 PC3200)
7200RPM 8MB cache Maxtor 200GB SATA HDD

Friend
AXP 2600+ @2.3GHz
2GB RAM (DDR400 PC3200)
Via Raid 0: 2 x 7200RPM 8MB cache Maxtor 200GB EIDE HDD
(Asus A7V880 onboard raid controller)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
For me it's when I'm rendering a massive 3D Studio Max scene at 8MP resolution. My hard drive goes berserk for 2-4 hours because it's constantly accessing the swap file. I have 1GB of system memory, so this type of problem may be typical among users who do 3D rendering.
 

obeseotron

Golden Member
Oct 9, 1999
1,910
0
0
Copying a large number of heavily fragmented files to another drive, usually BitTorrent downloads. I agree BitTorrent itself is also one of the worst system killers I've ever experienced.

Hard drives seem terribly bad at sharing the load between two concurrent tasks. Copying 2 large files to a disk concurrently instead of consecutively takes about 5 times as long. Burning DVDs while doing anything else with the hard drive is bad. Opening a big .rar, for instance, will take a very long time and give BURN-Proof a workout on my DVD burner.
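
That concurrent-vs-consecutive claim is easy to measure. A quick sketch (the file names are placeholders, and note the OS will have the sources cached after the first pass, so use files larger than RAM or separate pairs for a fair comparison):

```python
# Copy two large files back to back, then again in two threads, and compare
# wall-clock times. big1.iso/big2.iso are placeholders for real test files.
import shutil
import threading
import time

def run(jobs, concurrent):
    start = time.time()
    if concurrent:
        threads = [threading.Thread(target=shutil.copyfile, args=job)
                   for job in jobs]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    else:
        for src, dst in jobs:
            shutil.copyfile(src, dst)
    return time.time() - start

jobs = [("big1.iso", "copy1.iso"), ("big2.iso", "copy2.iso")]
print("consecutive:", run(jobs, concurrent=False))
print("concurrent: ", run(jobs, concurrent=True))
```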
 

KJI

Member
Sep 21, 2004
79
0
0
My hard drive is usually under the most stress when I'm transferring a load of medium-to-large files (20-200MB) from or to a USB hard drive. Usually I can't do anything on either drive while the transfer is in progress.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,081
136
When I defragged and transferred 140GB of smut. Approx. 120,000 files, mostly JPEGs and a few thousand MPEGs.
Did another defrag on the new hard drive after my files were arranged the way I wanted.
One week later the disk died on me.
4 years and countless memberships to porn sites and I'm only mildly irritated.

Before everybody flames me for the XXX: Judging my activities doesn't justify yours, so ah heck off.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,344
136
www.teamjuchems.com
Multiple connected computers/clients pulling off ISOs... errr... large files... anyway, I host a server on my campus like this and would really like to know if a RAID array would help.

Thanks!