
This is why Perl is useful...

A website not under my control contains a list of accounts (emails) I need to directly contact, and the only facility offered by the website is a simple form; not exactly flexible. It took me about three minutes to extract all the emails into my own list for easier processing:

# slurp the whole input into $_ at once
{
    local $/;    # undef $/ turns on slurp mode for the read below
    $_ = <>;
}

# print every mailto: address, one per line
while (/mailto:([^">]+)[">]/g) {
    print "$1\n";
}

And a simple CLI command:

perl members.pl page.html > members.txt

I run across simple problems like this all the time, and Perl is often the quickest answer. Anyway, I just wanted to spread some Perl love.

[edit]The forum parsed part of the regex as an emoticon.[/edit]
 
alright... how can i get the url for every image in a directory, and add a specific set of code that i can determine onto the beginning and end of each url?
 
Originally posted by: evilocity
alright... how can i get the url for every image in a directory, and add a specific set of code that i can determine onto the beginning and end of each url?

I don't grok what you're saying. Do you mean the URL of every image in a virtual directory when directory browsing is enabled, or are you referring to something else?
 
Originally posted by: Descartes
Originally posted by: evilocity
alright... how can i get the url for every image in a directory, and add a specific set of code that i can determine onto the beginning and end of each url?

I don't grok what you're saying. Do you mean the URL of every image in a virtual directory when directory browsing is enabled, or are you referring to something else?

yes. every image's URL in a directory.

say.. the directory was www.bs.com/crapola

then the thing would list out

<<insert crap here>>http://www.bs.com/crapola/image1.jpg<<insert some more crap here>>
<<insert crap here>>http://www.bs.com/crapola/image2.jpg<<insert some more crap here>>

and so on

and drop all that info into a text file.
 
ok, what's the problem exactly? "and the only facility offered by the website is a simple form;" I don't understand what you mean. What does the list with the e-mail addresses look like?
 
Originally posted by: ndee
ok, what's the problem exactly? "and the only facility offered by the website is a simple form;" I don't understand what you mean. What does the list with the e-mail addresses look like?

i think by simple form, he means he's not allowed to do "Mail group"
 
Originally posted by: evilocity
Originally posted by: ndee
ok, what's the problem exactly? "and the only facility offered by the website is a simple form;" I don't understand what you mean. What does the list with the e-mail addresses look like?

i think by simple form, he means he's not allowed to do "Mail group"

😕
 
Originally posted by: ndee
ok, what's the problem exactly? "and the only facility offered by the website is a simple form;" I don't understand what you mean. What does the list with the e-mail addresses look like?

A page contained a list of accounts, including email addresses, and I needed to send a mass email to all. The only way to send emails to all would be to either use the simple web form provided on the site, or parse the HTML as I have done. I needed to do the latter, because I have a program that processes the emails. Make sense?

 
Originally posted by: evilocity
Originally posted by: Descartes
Originally posted by: evilocity
alright... how can i get the url for every image in a directory, and add a specific set of code that i can determine onto the beginning and end of each url?

I don't grok what you're saying. Do you mean the URL of every image in a virtual directory when directory browsing is enabled, or are you referring to something else?

yes. every image's URL in a directory.

say.. the directory was www.bs.com/crapola

then the thing would list out

<<insert crap here>>http://www.bs.com/crapola/image1.jpg<<insert some more crap here>>
<<insert crap here>>http://www.bs.com/crapola/image2.jpg<<insert some more crap here>>

and so on

and drop all that info into a text file.

I'd have to see the page you were referring to, because it's not the same for every web server; however, a slight variation of my earlier example would work fine:

# slurp the whole page into $_ at once
{
    local $/;    # undef $/ turns on slurp mode for the read below
    $_ = <>;
}

# wrap every href value in the text of your choosing
while (/href="([^">]+)"/g) {
    print "<<insert crap here>>$1<<insert crap here>>\n";
}
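Putting that together for the image case: here's a hedged sketch that keeps only links that look like images and bolts the wrapper text on. The base URL, the <<...>> markers, and the saved listing file are all placeholders from this thread; Perl alone won't fetch the page, so you'd save the directory index to a file first (with wget or similar):

```shell
# stand-in for a saved directory-index page
cat > listing.html <<'EOF'
<a href="image1.jpg">image1.jpg</a>
<a href="image2.jpg">image2.jpg</a>
<a href="notes.txt">notes.txt</a>
EOF

# keep only image links, wrap each with the chosen text
perl -ne '
  while (/href="([^">]+)"/g) {
    my $url = $1;                                  # copy before the next match clobbers $1
    next unless $url =~ /\.(?:jpe?g|gif|png)$/i;   # skip non-image links
    print "<<insert crap here>>http://www.bs.com/crapola/$url<<insert some more crap here>>\n";
  }
' listing.html > urls.txt

cat urls.txt
```

On the sample listing this emits two wrapped image URLs and drops notes.txt.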
 
Originally posted by: Descartes
Originally posted by: ndee
ok, what's the problem exactly? "and the only facility offered by the website is a simple form;" I don't understand what you mean. How does it look like the list with the e-mail addresses?

A page contained a list of accounts, including email addresses, and I needed to send a mass email to all. The only way to send emails to all would be to either use the simple web form provided on the site, or parse the HTML as I have done. I needed to do the latter, because I have a program that processes the emails. Make sense?

yeah, now it makes sense 🙂 Would also go pretty quick with PHP I think 🙂
 