So, we have some networked drives with shit tons of folders and files on them. I have a program that goes through certain folders and copies their contents to others.
The problem I'm having right now is that when it hits files it has already copied, it goes hellishly slow. I mean, 1 file every 3-5 seconds. It'll punch out the first few fast, but after that... it crawls.
I'm not sure why. I'm using the Copy-Item cmdlet.
Below is more or less what I'm doing. The $stream.WriteLine calls are commented out right now because I was testing whether the logging was what was slowing it down. It wasn't.
The $error.Clear() is in there because I was testing whether the $error variable was getting bogged down. Nope... that didn't fix it either.
Code:
try {
    md "$tht\T Drive\" -ea stop
    #$stream.WriteLine("Make directory $tht\T Drive\ was successful")
}
catch {
    #$stream.WriteLine("Folder already exists at $tht\T Drive\")
}

try {
    copy-item $tdrive\* "$tht\T Drive\" -Recurse -ea stop
}
catch {
    if ($error[0].Exception.ToString().EndsWith("already exists.")) {
        #$stream.WriteLine("Some files may already exist in the $tht\T Drive\")
        copy-item $tdrive\* "$tht\T Drive\" -Recurse #-ea 0
        $error.clear()
    }
    elseif ($error[0].Exception.ToString().EndsWith("does not exist.")) {
        #$stream.WriteLine("The folder $tdrive was not found")
    }
    else {
        #$stream.WriteLine($error[0].Exception.toString())
    }
}
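One idea I've been kicking around to dodge the re-copy slowdown (haven't actually tried it yet, so treat it as a rough sketch that reuses the same $tdrive and $tht variables as above) is to walk the source tree myself and only call Copy-Item for files that aren't already at the destination, so re-runs skip everything that's already been copied:
Code:
# Rough sketch -- assumes $tdrive (source) and $tht (destination root) are already set
$dest = Join-Path $tht "T Drive"

# -Force keeps New-Item quiet if the folder already exists
New-Item -ItemType Directory -Path $dest -Force | Out-Null

# Walk the whole source tree and only copy what the destination is missing
Get-ChildItem -Path $tdrive -Recurse | ForEach-Object {
    # Rebuild the same relative path under the destination
    $target = Join-Path $dest $_.FullName.Substring($tdrive.Length).TrimStart('\')

    if ($_.PSIsContainer) {
        # Recreate folders that aren't there yet
        if (-not (Test-Path $target)) { New-Item -ItemType Directory -Path $target | Out-Null }
    }
    elseif (-not (Test-Path $target)) {
        # File hasn't been copied yet; anything that already exists gets skipped entirely
        Copy-Item -Path $_.FullName -Destination $target
    }
}
Or, since both ends are network shares, robocopy might be the saner option anyway; by default it skips files that already exist and haven't changed.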
For context, I'm transferring files from one networked server to another, and there are tens of thousands of them. The first time this program ran it actually went pretty fast, but it missed a lot of files, and I don't have a clue as to why... -Recurse should include every file and folder under the folders I'm starting at, yes? Regardless of file name or folder name? Well, it didn't...
It's pissing me off. I don't have a lot of experience with PowerShell, so I'm baffled.
We could do drag and drop, but we're talking hundreds of folders being copied, so that wouldn't be efficient in the long run. That's why I'm trying to write this script to do it for us. And no, we can't just drag the parent folder of everything: some of the folders are supposed to be copied and some are not. The ones we're copying belong to users who no longer use our system, and we're backing up their stuff. Normally this is done manually, but if we can just paste in a big list of names, this program can automate the task for us really well. (These people have their folders hidden in a lot of different areas. There's a system to it; outside of the code I showed you there's a larger system at work.)
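Just so the list-driven part makes sense, this is roughly what I mean by feeding it a pasted list of names (the paths and names.txt here are made-up placeholders, not our real layout, and the real lookup logic is more involved):
Code:
# Rough sketch -- $namesFile, $sourceRoot, and $backupRoot are made-up placeholders
$namesFile  = "C:\scripts\names.txt"   # one username per line, pasted from the list
$sourceRoot = "\\server1\users"        # stand-in for where the users' folders live
$backupRoot = "\\server2\backups"      # stand-in for the backup share

foreach ($name in Get-Content $namesFile) {
    $src = Join-Path $sourceRoot $name
    $dst = Join-Path $backupRoot "$name\T Drive"

    if (Test-Path $src) {
        # /E copies all subfolders (including empty ones); robocopy also skips files
        # that already exist and haven't changed, so re-running the loop is cheap
        robocopy $src $dst /E /R:1 /W:1 | Out-Null
    }
    else {
        Write-Warning "No folder found for $name at $src"
    }
}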