
Something like Data Mining? Need opinions/suggestions, please.

I would like to automate a process that we are currently doing manually, but don't really have a good idea where to start.

I need to, on a monthly basis, download antivirus updates from an HTTP site. That's the part we do manually; once the files are here, the FTP process that distributes them is already automated.

The antivirus updates are unique to the product we are using for systems management and we don't have FTP access to the virus updates.

Any thoughts on this?
 
They're in a relatively predictable format, though of course some type of error handling and logging will be necessary; even just a failure message would do.
 
It sounds like it would be fairly trivial to write a perl script to retrieve the webpage, parse out the names of the updates, and then retrieve the updates.
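Sketched out, that retrieve/parse/retrieve flow would look something like this. It's in Python rather than Perl just for illustration, and the URL, filename pattern, and log file name are placeholders, not the actual update site:

```python
import re
import logging
import urllib.request

logging.basicConfig(filename="av_update.log", level=logging.INFO)

# Hypothetical naming scheme -- adjust the pattern to match the real update files.
UPDATE_PATTERN = re.compile(r'href="(av_update_\d{6}\.zip)"')

def parse_update_names(html):
    """Pull the update filenames out of the index page's HTML."""
    return UPDATE_PATTERN.findall(html)

def fetch_updates(index_url, dest_dir="."):
    """Retrieve the index page, parse out the update names, download each one."""
    try:
        html = urllib.request.urlopen(index_url, timeout=30).read().decode()
    except OSError as err:
        # Even "just a fail message" goes to the log, per the thread.
        logging.error("Failed to fetch index page: %s", err)
        return []
    saved = []
    for name in parse_update_names(html):
        try:
            urllib.request.urlretrieve(index_url.rstrip("/") + "/" + name,
                                       f"{dest_dir}/{name}")
            logging.info("Downloaded %s", name)
            saved.append(name)
        except OSError as err:
            logging.error("Failed to download %s: %s", name, err)
    return saved
```

Run monthly from cron or Task Scheduler, it drops the files where the existing FTP distribution job can pick them up.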
 
Originally posted by: notfred
It sounds like it would be fairly trivial to write a perl script to retrieve the webpage, parse out the names of the updates, and then retrieve the updates.

Would it be trivial for someone who has never written a Perl script before, smarty? 🙂😛

Well, that is actually an option. A guy in a different department has already written a few parsing scripts for me; I'll ask him. First, though, I think I'm gonna talk to the company and see if they'll put the updates on an FTP site and give me a login, as that would be both SMART and a hell of a lot easier.

Thanks for the suggestion, notfred.

Anybody else have anything?

 