
edit: linux question, now with a kick-ass poll!

Ameesh

Lifer
I have a problem: when a folder gets too many files in it, the Linux file system seems to become exponentially slower. Just doing a simple ls in it, my shell chokes and sometimes dies, and scripts that do this take much longer than they should.

i need a quick way to get the count of the number of files in the folder, as well as the timestamp of the oldest file in the folder, without using ls | wc -l when the folder is full (say, over 10K files)

do any of you know a way?


also, maybe one of you knows how to make the performance any better; i see the same behavior on ext3 and NFS mounts. goddamn sh!tty linux filesystems
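For what it's worth, a minimal sketch of one way to get both numbers without ls (assumes GNU find, sort, and head are available; the demo directory and file names are invented):

```shell
# Demo setup: a throwaway directory with a few files (hypothetical names).
demo=$(mktemp -d)
touch "$demo/a" "$demo/b" "$demo/c"

# Count entries without ls: find prints one line per entry and, unlike ls,
# does not sort the directory, so the cost stays roughly linear.
count=$(find "$demo" -mindepth 1 -maxdepth 1 | wc -l)
echo "count: $count"

# Oldest file: emit "epoch-mtime path" pairs (%T@ is GNU find's epoch
# mtime), numeric-sort the pairs, and keep the first one.
oldest=$(find "$demo" -mindepth 1 -maxdepth 1 -printf '%T@ %p\n' | sort -n | head -n 1)
echo "oldest: $oldest"
```

The sort still touches every entry, but it sorts short text lines in a pipe instead of making ls stat and sort the whole directory up front.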
 
ext3 is known to be slow, especially with that many files. Using find, you should be able to get a list of files older than a particular date or something.

Sounds like you should rethink the problem.

Wrong forum. 😉

From the batcave (batgirl would wave, but she's busy),
n0cmonkey
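To make the find suggestion concrete, a small sketch of selecting files older than a cutoff (assumes GNU touch -t for backdating; the directory and file names are made up for the demo):

```shell
# Demo: one fresh file and one backdated file (hypothetical names).
demo=$(mktemp -d)
touch "$demo/new.txt"
touch -t 202001010000 "$demo/old.txt"   # set mtime to 2020-01-01 00:00

# -mtime +7 matches files last modified more than 7 days ago,
# so only the backdated file should be listed.
stale=$(find "$demo" -maxdepth 1 -type f -mtime +7)
echo "$stale"
```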
 
Originally posted by: n0cmonkey
ext3 is known to be slow, especially with that many files. Using find, you should be able to get a list of files older than a particular date or something.

Sounds like you should rethink the problem.

Wrong forum. 😉

From the batcave (batgirl would wave, but she's busy),
n0cmonkey

i agree, the way the problem is currently solved is ass, but i didn't solve it, and unfortunately i can't change it without a ton of meetings/specs/reviews, etc. I thought of find as well, but it didn't do great either.

any other thoughts?
 
Originally posted by: Ameesh
Originally posted by: n0cmonkey
ext3 is known to be slow, especially with that many files. Using find, you should be able to get a list of files older than a particular date or something.

Sounds like you should rethink the problem.

Wrong forum. 😉

From the batcave (batgirl would wave, but she's busy),
n0cmonkey

i agree, the way the problem is currently solved is ass, but i didn't solve it, and unfortunately i can't change it without a ton of meetings/specs/reviews, etc. I thought of find as well, but it didn't do great either.

any other thoughts?

Not sure if XFS, JFS, or Reiser4 would be better in this situation... If you had posted in OS, drag would probably take a look at it. As it is, I PMed him, but I don't know if he looks in that corner. 😛

Off to google!

Wrong forum. Still. 😉
 
Originally posted by: Ameesh
Originally posted by: Nik
WRONG FORUM. AGAIN.

everyone knows the other forums are dead. do you seriously wonder why people ask questions in OT?

What Nik said. And I had just come back from a quick round of poking through OS, Networking, and Software when I saw this thread. 😉
 
perl -e 'opendir D, "."; print scalar(grep { !/^\.\.?$/ } readdir D), "\n";'

(single quotes so the shell doesn't mangle the script, and the grep skips . and ..)

I have no idea how fast that will be.

edit: and it doesn't check timestamps...
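A shell analogue of the same readdir idea, counting glob matches instead of spawning any external program (a sketch; count_files is a made-up helper name, dotfiles are skipped, and like the perl it ignores timestamps):

```shell
# Count directory entries by expanding a glob into the positional
# parameters; $# is then the count. No ls, no sorting, no extra process.
count_files() {
  set -- "$1"/*              # overwrite $1.. with the glob matches
  if [ -e "$1" ]; then       # an unmatched glob leaves the literal pattern
    echo "$#"
  else
    echo 0
  fi
}

demo=$(mktemp -d)
touch "$demo/x" "$demo/y" "$demo/z"
count_files "$demo"          # prints 3
```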
 