Journal Porn Whitelist's Journal: Some PERL for pr0n 5
#!/usr/bin/perl
for ($i = 1; $i < 27; $i++) {
    for ($j = 1; $j < 17; $j++) {
        `wget -r www.sweetamateurbabes.com/teen/$i/$j.jpg`;
    }
}
For those of you new to Perl: those are backquotes around the wget command - they tell Perl to hand the string to the shell for execution, after doing variable substitution.
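The shell has the same capture-and-substitute trick built in, if you want to see it outside Perl. A tiny sketch (the URL pieces are just the ones from the script above; nothing gets fetched, the command is only echoed):

```shell
# Backquotes (or the modern $(...)) in the shell do what Perl's
# backticks do: run the string as a command and capture its output,
# after the variables have been substituted in.
i=3; j=7
url="www.sweetamateurbabes.com/teen/$i/$j.jpg"
captured=$(echo "would run: wget -r $url")
echo "$captured"
```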
(I hate how slashcode takes out the indentations. If I use nested blockquotes, you'll end up with embedded junk and the script won't work. Any workarounds?)
Instructions:
- cut and paste the script using your fav. editor
- save it as get.babes.pl
- chmod u+x get.babes.pl
Total for this script: 35 megs
For those who just want to get something using the command line:
wget -r www.pornpicsrus.com -w 2
Total for this command: 48 megs
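For anyone wondering what those flags buy you (nothing here beyond stock wget; the command is echoed rather than run, since actually fetching is up to you):

```shell
# -r    recursive retrieval: follow links on the page and fetch them too
# -w 2  wait 2 seconds between retrievals, so you don't hammer the server
echo 'wget -r www.pornpicsrus.com -w 2'
```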
Total for both scripts: 83 megs.
Total to date: 3.22 gigs
better and faster (Score:2)
Re:better and faster (Score:1)
TMTOWTDI :-)
and for those who'd rather CURL than PERL... (Score:2)
and so you don't have to watch the status fly by and can do other tasks, add
2>/dev/null &
to that line. curl's onscreen status goes to STDERR, so the '2>...' gets rid of it, and the '&' puts the task in the background right away. Just 'ls' to see if it's doing the job. Not sure if that's the best way to hide STDERR, but it works for me.
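The redirect is doing all the work there; here's a minimal demonstration with a stand-in command, since any real curl URL would be a guess:

```shell
# curl writes its progress meter to stderr, so this shape silences it
# and backgrounds the fetch (placeholder URL, not from the post):
#   curl -O http://www.example.com/1.jpg 2>/dev/null &
#
# The same redirect on any command that writes to both streams --
# the stderr line vanishes, the stdout line survives:
out=$( ( echo "payload on stdout"; echo "progress noise on stderr" >&2 ) 2>/dev/null )
echo "$out"
```

(curl also has its own -s flag for silencing the progress meter, if you'd rather skip the redirect.)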
not knocking Perl at all--in fact, it was neat to see two ways to do something rather similar.
Re:and for those who'd rather CURL than PERL... (Score:1)
Tonight's is a good post to use curl on (though the subject matter is ... different).
Thanks
bash? (Score:1)
This seems a little overkill to me, if you're just going to call wget. If you are going to fetch lots and lots of images like this, the quickest way is bash:
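The inline script looks like it got eaten (slashcode again?), so here's a guess at the shape of a bash version: generate the same 26x16 URL list as the Perl loops and hand it to a single wget call. `urls.txt` is just a scratch filename, not from the original post:

```shell
# Same bounds as the Perl loops: i in 1..26, j in 1..16
for i in $(seq 1 26); do
  for j in $(seq 1 16); do
    echo "www.sweetamateurbabes.com/teen/$i/$j.jpg"
  done
done > urls.txt
wc -l < urls.txt        # 26 * 16 = 416 URLs
# wget -i urls.txt      # then one wget invocation fetches the lot
```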
Of course, nothing is going to stop you from using LWP (as mentioned above) to do something nice. I used to abuse wget for fetching comic books from websites (that have long ago disappeared). When I need a quick hack I'll use wget.