Journal: 116 pics of assorted ladies 8
But I REALLY hate how slashcode screws up the code formatting. The <ecode> tag handling sux the bag.
#!/usr/bin/perl
# sample url http://www.hardloveart.com/34/inf8.0015.jpg
# galleries inf1 through inf8, images 0001.jpg through 0015.jpg
for ($i = 1; $i <= 8; $i++) {
    for ($j = 1; $j <= 15; $j++) {
        if ($j < 10) {
            $cmd = "wget www.hardloveart.com/34/inf$i/000$j.jpg -w4";
        } else {
            $cmd = "wget www.hardloveart.com/34/inf$i/00$j.jpg -w4";
        }
        `$cmd`;
    }
}
About 7 megs of pix, 120 images.
Sorry for the hiatus (holidays, stupid dog that eats EVERYTHING, work, etc.), but I'm back, and here we go with another simple script:
#!/usr/bin/perl
$site = "www.met-art.com";
$dir  = "g1";
$file = "met-art_fl_00";
$ext  = ".jpg -w 4";
# loop guessed: 20 pics, numbered met-art_fl_0001.jpg and up - adjust to taste
for ($i = 1; $i <= 20; $i++) {
    $n = sprintf("%02d", $i);
    `wget $site/$dir/$file$n$ext`;
}
And thanks and welcome to the 25 people who friended Porn Whitelist today.
So, here's a script to download more pix.
Yesterday I wrote that if we got this journal up to 50 people friending it, I'd post a script that gets a decent chunk of pr0n.
Well, 47 friends/fans is close enough among friends, so:
#!/usr/bin/perl
# grabs gallery pages 105 through 859 recursively
for ($i = 105; $i <= 859; $i++) {
    `wget -r www.pornodonkey.com/$i/1176.html`;
}
This one uses curl instead of wget - the [01-20] makes curl fetch pp01.jpg through pp20.jpg, and the #1 in the output name gets replaced by the number it matched:
curl http://www.nikkygalore.com/w200412/d10-2/pics/pp[01-20].jpg -o "nikkygalore.com.#1.jpg"
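If you don't have curl handy, the same gallery can be grabbed with a perl loop like the others here - a rough sketch (wget just saves the pics under their original pp01.jpg ... pp20.jpg names, and the -w 4 wait is my habit, not part of the curl line):
#!/usr/bin/perl
# rough sketch: same pp01.jpg..pp20.jpg range as the curl line above
for ($i = 1; $i <= 20; $i++) {
    $n = sprintf("%02d", $i);    # zero-pad to match pp01, pp02, ...
    `wget http://www.nikkygalore.com/w200412/d10-2/pics/pp$n.jpg -w 4`;
}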
Here's the perl script:
#!/usr/bin/perl
# galleries gal01..gal48, pics big_01.jpg..big_16.jpg, zero-padding done by hand
for ($i = 1; $i <= 48; $i++) {
    for ($j = 1; $j <= 16; $j++) {
        $url = "www.fetishhub.net/mip/gal";
        if ($i < 10) { $url = $url . '0'; }
        $url = $url . $i . '/big_';
        if ($j < 10) { $url = $url . '0'; }
        $url = $url . $j . '.jpg';
        `wget -r $url`;
    }
}
#!/usr/bin/perl
# 26 galleries, 16 pics each
for ($i = 1; $i < 27; $i++) {
    for ($j = 1; $j < 17; $j++) {
        `wget -r www.sweetamateurbabes.com/teen/$i/$j.jpg`;
    }
}
For those of you new to perl, those are backquotes around the wget command - they tell perl to do the variable substitution and then hand the command string to the shell to run, and they give you back whatever the command prints.
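If that's not clear, here's a little toy example - it doesn't download anything, just echoes, so it's safe to paste anywhere:
#!/usr/bin/perl
# perl fills in $i, the shell runs the echo, and the backquotes
# hand whatever it printed back to perl as a string
for ($i = 1; $i <= 3; $i++) {
    $out = `echo this would be download number $i`;
    print $out;
}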
So, here's a (very) small leech: 1.3 meg - just type it in a terminal:
wget -r http://www.milfmagazine.com/wives/153evelyn/
To make up for that small one, here's 87 meg of girls with a little extra:
wget -r http://www.sweetauditions.com/images -w 1
As promised earlier, here's a wget for 386,000 thumbnails, 2.3 gigs:
wget -r http://www.spacethumbs.com/tgp/thumbs -w 10
Why the -w 10? It tells wget to wait 10 seconds between requests - with 386,000 files to grab, you really don't want to hammer their server and get yourself blocked.
Here, as promised, is today's wget command, good for 87 meg.
wget -r http://fetishsexorgy.com/tranny/ -w 2
And for those looking for smaller images for their cellphone, PDA, etc.: coming soon, the wget to leech a couple of gigs of thumbnails. You'll definitely need to be running a unixish system (386,000 files in one directory), and just run it as a background job (nohup it, or leave it going in a screen session) for a week or two.
Total to date: 741 megs
NOTE THAT SLASHCODE put an extra SPACE in jansgalleries.com
#!/bin/bash
site=http://freepornvideos.jansgalleries.com/48
# the rest of this one didn't survive the formatting - a plain recursive grab of $site is just my guess
wget -r -w 4 "$site"
#!/bin/bash
# suckmebitch1.mpg through suckmebitch210.mpg
for i in $(seq 1 210); do
    wget http://www.8sluts.com/smb/a/suckmebitch$i.mpg
done
Instructions
Total of 297 megs of video mpegs.