Perl

Journal Porn Whitelist's Journal: Some PERL for pr0n

Here's something different - a perl script ...

#!/usr/bin/perl

# grab 1.jpg .. 16.jpg from directories 1 .. 26
for ($i = 1; $i < 27; $i++) {
    for ($j = 1; $j < 17; $j++) {
        `wget -r www.sweetamateurbabes.com/teen/$i/$j.jpg`;
    }
}

For those of you new to perl, those are backquotes around the wget command - they tell perl to interpolate the variables into the string and then hand the command to the shell to run :-)
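The shell's own backticks work the same way, if you want to see the substitute-then-run behaviour without Perl (example.com here is just a stand-in host, not from the script):

```shell
# variables are substituted into the string first, then the command
# runs and its output is captured - same idea as Perl's backticks
i=3; j=5
url=`echo www.example.com/teen/$i/$j.jpg`
echo "$url"
```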

(I hate how slashcode takes out the indentations. If I use nested blockquotes, you'll end up with embedded junk and the script won't work. Any workarounds?)

Instructions:

  1. cut and paste the script using your fav. editor
  2. save it as get.babes.pl
  3. chmod u+x get.babes.pl
  4. ./get.babes.pl
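If you'd rather not do the editor dance, steps 1-4 collapse into a heredoc (sketched here with a harmless stand-in script in /tmp, not the real downloader):

```shell
# write the script, mark it executable, run it - same as steps 1-4
cat > /tmp/get.demo.sh <<'EOF'
#!/bin/sh
echo "would fetch images here"
EOF
chmod u+x /tmp/get.demo.sh
/tmp/get.demo.sh
```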

Total for this script: 35 megs

For those who just want to get something using the command line:

wget -r www.pornpicsrus.com -w 2

Total for this command: 48 megs

Total for both scripts: 83 megs.

Total to date: 3.22 gigs


This discussion has been archived. No new comments can be posted.


Comments Filter:
  • use LWP::Simple qw(mirror);
    for my $i (1..26) {
        for my $j (1..16) {
            mirror("http://www.sweetamateurbabes.com/teen/$i/$j.jpg", "$i-$j.jpg");
        }
    }
  • curl http://www.sweetamateurbabes.com/teen/[1-26]/[1-16].jpg -o "#1-#2.jpg"

    and so you don't have to watch the status fly by and can do other tasks, add
    2>/dev/null &
    to that line. curl's onscreen status goes to STDERR, so the '2>...' gets rid of that, and the '&' puts the task in the background right away. Just 'ls' to see if it's doing the job. Not sure if that's the best way to hide STDERR, but it works for me.

    not knocking Perl at all--in fact, it was neat to see two ways to do something rather si
    • Don't worry, I won't post a reply to every single entry
      Are you kidding? Please, keep it up. I figure that at least some of the people reading this will be looking at the code, and learning something.

      Tonight's is a good post to use curl on (though the subject matter is ... different).

      Thanks
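The '2>/dev/null' trick from the curl comment is easy to try on its own - here a subshell stands in for curl, with the progress chatter on STDERR and the useful part on STDOUT:

```shell
# stderr noise is thrown away; only stdout is captured
result=$( ( echo "** progress bar noise **" >&2; echo "16.jpg saved" ) 2>/dev/null )
echo "$result"
```

Append '&' to background it the same way the comment describes.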

  • This seems a little overkill to me, if you're just going to call wget. If you are going to fetch lots and lots of images like this, the quickest way is bash:

    #!/bin/bash
    # arithmetic for loops are a bashism, so use bash, not sh;
    # ranges match the Perl version (dirs 1..26, files 1..16)
    for ((i = 1; i < 27; i++)); do
        for ((j = 1; j < 17; j++)); do
            wget -r www.sweetamateurbabes.com/teen/$i/$j.jpg
        done
    done

    Of course, nothing is going to stop you from using LWP (as mentioned above) to do something nice. I used to abuse wget for fetching comic books from websites (that have long ago disappeared). When I need a quick hack I'll u
