creating a script to delete TONS of files

Discussion in 'Programming' started by Homerboy, Jan 23, 2013.

  1. Charles Kozierok

    Charles Kozierok Elite Member

    Joined:
    May 14, 2012
    Messages:
    6,762
    Likes Received:
    0
    No real way to judge until we see the list or otherwise know what we're dealing with...
     
  2. Homerboy

    Homerboy Lifer

    Joined:
    Mar 1, 2000
    Messages:
    25,049
    Likes Received:
    528
    We will be keeping way more than deleting.
     
  3. imagoon

    imagoon Diamond Member

    Joined:
    Feb 19, 2003
    Messages:
    5,199
    Likes Received:
    0
    Dump the paths from SQL to a csv:
    PowerShell:

    Code:
    Import-Csv "C:\temp.csv" | ForEach-Object {
        $path = $_.csvcolumnname
        Remove-Item -Path $path
    }
    
    
    "csvcolumnname" is whatever you named the column in row 1 of the CSV

    If you're working at the directory level, you could wildcard the paths as needed. A couple of examples of the data SQL is returning might help me give you a better solution.
     
    #28 imagoon, Jan 24, 2013
    Last edited: Jan 24, 2013
  4. power_hour

    power_hour Senior member

    Joined:
    Oct 16, 2010
    Messages:
    789
    Likes Received:
    1
    Do this all the time using VBScript. We do it based on file timestamps, deleting files older than a specific date.

    In a nutshell, just write the code in VBScript, save it, and run it with a scheduled task. It's ridiculously easy to do. Google for the script; it's out there. And yeah, you can do it a few other ways too, using PowerShell, Windows commands in a batch file, or even Windows Explorer search.
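    The age-based purge described above (delete anything older than a given date) can be sketched in Python for illustration; the `.tif` extension and the dry-run default are assumptions, not anything from the original script.

```python
import time
from pathlib import Path

def purge_older_than(root, days, dry_run=True):
    # Collect .tif files under `root` whose modification time is more
    # than `days` days old; only delete when dry_run is False, so the
    # candidate list can be reviewed before anything is removed.
    cutoff = time.time() - days * 86400
    hits = []
    for path in Path(root).rglob("*.tif"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            hits.append(str(path))
            if not dry_run:
                path.unlink()
    return hits
```

    Running it with `dry_run=True` first and eyeballing the returned list is the scheduled-task equivalent of testing the script before letting it loose.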
     
  5. Homerboy

    Homerboy Lifer

    Joined:
    Mar 1, 2000
    Messages:
    25,049
    Likes Received:
    528
    Initial count of the first batch is 461,479. They tightened the purge parameters some, so the "millions" has been reduced. That said, 461,479 is only the number of scans; each scan could contain anywhere from one to dozens of actual .tif files. I'm using a wildcard on each of the 461,479, so while it's 461,479 rows of "del", it could still end up being millions of actual *.tif files that get nuked.
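    Since each "del" row is a wildcard that may match many .tif files, expanding the patterns first gives the real file count before any delete runs. A minimal sketch in Python (the per-scan pattern shape, e.g. `D:\scans\00123*.tif`, is an assumption):

```python
import glob

def count_wildcard_matches(patterns):
    # Expand each scan's wildcard pattern and return every matching
    # file, so the true number of files to be deleted is known up front.
    matches = []
    for pattern in patterns:
        matches.extend(glob.glob(pattern))
    return matches
```

    Feeding the 461,479 patterns through this and checking `len()` of the result would turn "could be millions" into an exact number before the batch file ever runs.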
     
  6. Charles Kozierok

    Charles Kozierok Elite Member

    Joined:
    May 14, 2012
    Messages:
    6,762
    Likes Received:
    0
    That's no biggie.
     
  7. Homerboy

    Homerboy Lifer

    Joined:
    Mar 1, 2000
    Messages:
    25,049
    Likes Received:
    528
    No, not as horrific as it was initially planned to be. But they adjusted their purging rules, so it would have been millions otherwise. It's still a hefty delete, and I'll be nervous as I double-click the .bat file :)
     
  8. Charles Kozierok

    Charles Kozierok Elite Member

    Joined:
    May 14, 2012
    Messages:
    6,762
    Likes Received:
    0
    Then you don't have adequate backups. :)

    Offer remains if you need help when the time comes. Just drop me a line.
     
  9. Homerboy

    Homerboy Lifer

    Joined:
    Mar 1, 2000
    Messages:
    25,049
    Likes Received:
    528
    I'm not in charge of backups. And I don't have the most faith in those who are.
     
  10. Charles Kozierok

    Charles Kozierok Elite Member

    Joined:
    May 14, 2012
    Messages:
    6,762
    Likes Received:
    0
    Well, you never run a delete script on a set of files without a backup, unless it is absolutely impossible.
     
  11. Homerboy

    Homerboy Lifer

    Joined:
    Mar 1, 2000
    Messages:
    25,049
    Likes Received:
    528
    If I'm told the backups are fine, then I'm in the clear. If there's a screw-up, it's not in my scripts.
     
  12. Kyanzes

    Kyanzes Golden Member

    Joined:
    Aug 26, 2005
    Messages:
    1,082
    Likes Received:
    0
    Make sure you have a backup, because even the slightest error in the code could end in disaster. Any script/program would require serious testing before being allowed to comb the folder structure.

    Perhaps you should make the test program/script copy the files somewhere and only then delete them.

    Later you can always delete the copies.
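    The copy-then-delete idea above amounts to moving files into a quarantine folder instead of removing them outright. A sketch in Python for illustration (the function name and folder layout are assumptions):

```python
import shutil
from pathlib import Path

def quarantine(files, quarantine_dir):
    # Move files into a holding folder instead of deleting them;
    # once the purge is confirmed good, the folder can be removed
    # in one go, and until then any mistake is reversible.
    qdir = Path(quarantine_dir)
    qdir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in files:
        src = Path(f)
        if src.is_file():
            dest = qdir / src.name
            shutil.move(str(src), str(dest))
            moved.append(str(dest))
    return moved
```

    One caveat: this flat layout would collide if files from different folders share a name, so a real run might need to preserve the source directory structure inside the quarantine folder.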