But I REALLY hate how slashcode screws up the code formatting. The <ecode> tag handling sux the bag.
#!/usr/bin/perl
# sample url http://www.hardloveart.com/34/inf8.0015.jpg
for ($i = 1; $i <= 8; $i++) {
    for ($j = 1; $j <= 15; $j++) {
        if ($j < 10) {
            $cmd = "wget www.hardloveart.com/34/inf$i/000$j.jpg -w4";
        } else {
            $cmd = "wget www.hardloveart.com/34/inf$i/00$j.jpg -w4";
        }
        `$cmd`;
    }
}
About 7 megs of pix, 120 images.
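A side note on the script above: the if/else exists only to zero-pad $j to four digits, which a printf-style format does in one step. Here's a sketch in shell (same hardloveart.com paths as above) that prints the identical 120 URLs:

```shell
#!/bin/sh
# Print the same 8 x 15 URL grid as the perl script above,
# letting printf's %04d do the zero-padding instead of an if/else.
urls() {
    for i in $(seq 1 8); do
        for j in $(seq 1 15); do
            printf 'www.hardloveart.com/34/inf%d/%04d.jpg\n' "$i" "$j"
        done
    done
}
urls
```

Feed the list straight to wget with `urls | wget -w 4 -i -`.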
Sorry for the hiatus (holidays, stupid dog that eats EVERYTHING, work, etc.), but I'm back, and here we go with another simple script:
#!/usr/bin/perl
$site = "www.met-art.com";
$dir  = "g1";
$file = "met-art_fl_00";
$ext  = ".jpg -w 4";
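The loop that used those variables didn't make it into the post above, so here's a guess at how they'd plug together, sketched in shell. The file count of 20 and the exact numbering are assumptions, not from the original:

```shell
#!/bin/sh
# Hypothetical reconstruction of the met-art loop; the range 1..20
# is a guess -- the original post only shows the variable setup.
site="www.met-art.com"
dir="g1"
file="met-art_fl_00"
ext=".jpg"
met_urls() {
    for i in $(seq 1 20); do
        echo "$site/$dir/$file$i$ext"
    done
}
met_urls
```

Pipe the list to `wget -w 4 -i -` as before.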
And thanks and welcome to the 25 people who friended Porn Whitelist today
So, here's a script to download more pix.
Yesterday I wrote that if we got this journal up to 50 people friending it, I'd post a script that gets a decent chunk of pr0n.
Well, 47 friends/fans is close enough among friends so:
#!/usr/bin/perl
for ($i = 105; $i <= 859; $i++) {
    `wget -r www.pornodonkey.com/$i/1176.html`;
}
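One caution about the loop above: a bare `wget -r` will happily crawl the whole site from each page. Flags like `-l 1` (one level deep), `-A jpg` (accept only images), and `-np` (never ascend to the parent directory) keep it contained. A dry-run sketch that just prints the commands it would run:

```shell
#!/bin/sh
# Dry run of the pornodonkey loop with the recursion reined in:
# -l 1 limits depth, -A jpg keeps only images, -np stays below each page.
pd_cmds() {
    for i in $(seq 105 859); do
        echo wget -r -l 1 -np -A jpg -w 4 "www.pornodonkey.com/$i/1176.html"
    done
}
pd_cmds
```

Drop the `echo` to actually run them.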
curl http://www.nikkygalore.com/w200412/d10-2/pics/pp[01-20].jpg -o "nikkygalore.com.#1.jpg"
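In that curl one-liner, `[01-20]` is curl's URL globbing, and `#1` in the `-o` name is replaced by whatever the glob matched. If your curl build lacks globbing, the same twenty URLs can be generated by hand:

```shell
#!/bin/sh
# Expand the pp01..pp20 range that curl's [01-20] glob covers.
# seq -w zero-pads to equal width (01, 02, ... 20).
pp_urls() {
    for n in $(seq -w 1 20); do
        echo "http://www.nikkygalore.com/w200412/d10-2/pics/pp$n.jpg"
    done
}
pp_urls
```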
Here's the perl script:
#!/usr/bin/perl
for ($i = 1; $i <= 48; $i++) {
    for ($j = 1; $j <= 16; $j++) {
        $url = "www.fetishhub.net/mip/gal";
        if ($i < 10) { $url = $url . '0'; }
        $url = $url . $i . '/big_';
        if ($j < 10) { $url = $url . '0'; }
        $url = $url . $j . '.jpg';
        `wget -r $url`;
    }
}
#!/usr/bin/perl
for ($i = 1; $i < 27; $i++) {
    for ($j = 1; $j < 17; $j++) {
        `wget -r www.sweetamateurbabes.com/teen/$i/$j.jpg`;
    }
}
For those of you new to perl: those are backquotes around the wget command - they tell perl to hand the string to the shell and run it, after doing variable substitution.
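The same backquote mechanism exists in the shell itself, which is a quick way to see it in action: the string inside the backquotes is substituted, run as a command, and its output captured:

```shell
#!/bin/sh
# Backquotes run the command inside and capture its output,
# after variable substitution -- same idea as perl's backticks.
cmd="echo hello"
result=`$cmd`
echo "got: $result"
```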
Arithmetic is being able to count up to twenty without taking off your shoes. -- Mickey Mouse