User Journal

Journal: Okay friends, here's some more

Journal by Porn Whitelist
And this time NO TYPOS (I hope).


# sample url
for ($i = 1; $i <= 8; $i++) {
    for ($j = 1; $j <= 15; $j++) {
        if ($j < 10) {
            $cmd = "wget$i/000$j.jpg -w4";
        } else {
            $cmd = "wget$i/00$j.jpg -w4";
        }
        `$cmd`;
    }
}

About 7 megs of pix, 120 images.
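As an aside, the if/else zero-padding in the script above can be collapsed into a single printf with a %04d format. A quick shell sketch (it only illustrates the padding - it doesn't fetch anything):

```shell
# %04d pads to four digits, replacing the manual "000"/"00" prefixes
for j in 1 9 10 15; do
  printf '%04d.jpg\n' "$j"
done
# prints 0001.jpg 0009.jpg 0010.jpg 0015.jpg
```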


Journal: I'm B-A-A-A-C-K !!!

Journal by Porn Whitelist
Corrected script - I transposed an "el" and a "one" (thanks to Nucleardog for pointing out I made a fuckup). Corrected script below :-)

Sorry for the hiatus (holidays, stupid dog that eats EVERYTHING, work, etc.), but I'm back, and here we go with another simple script:


$dir = "g1";
$ext = ".jpg -w 4";


Journal: Close enough - here's 11,000 pix

Journal by Porn Whitelist
Posted under "Upgrades", so you can (ahem) upgrade your collection.

Yesterday I wrote that if we got this journal up to 50 people friending it, I'd post a script that gets a decent chunk of pr0n.

Well, 47 friends/fans is close enough among friends, so:


for ($i = 105; $i <= 859; $i++) {
    `wget -r$i/1176.html`;
}
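For the Perl-averse, the loop above works as a plain shell one-liner with seq. In this sketch, example.com is just a stand-in (the real host didn't survive in the post), and echo is used instead of actually running wget:

```shell
# seq counts 105..859 (755 pages); swap echo for wget to actually fetch
for i in $(seq 105 859); do
  echo "wget -r http://example.com/$i/1176.html"
done
```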


Journal: Some curl

Journal by Porn Whitelist
As promised, I've found something fairly large (several hundred meg - thousands of pix, both thumbnails and full-sized) for when we've got 50 people who have friended this blog. In the meantime, here are some quickie curl commands to get 5 meg of porn (or converting them to wget is left as an exercise for the reader).

curl[01-20].jpg -o ""
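The [01-20] part is curl's URL globbing - it expands into twenty zero-padded requests, and -o can use #1 in the filename to substitute in the current range value (the output template in the line above seems to have been lost, so it's left as-is). As for the wget conversion exercise, a hedged sketch - example.com standing in for the host, echo standing in for wget:

```shell
# seq -w zero-pads to equal width, matching curl's [01-20] expansion
for n in $(seq -w 1 20); do
  echo "wget http://example.com/$n.jpg"
done
```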


Journal: 47 meg of ... um ... something.

Journal by Porn Whitelist
I'm not going to comment on what this is - it's not up to me to judge. That's why it's filed under "X".

Here's the perl script:


for ($i = 1; $i <= 48; $i++) {
    for ($j = 1; $j <= 16; $j++) {
        $url = "";
        if ($i < 10) { $url = $url . '0'; }
        $url = $url . $i . '/big_';
        if ($j < 10) { $url = $url . '0'; }
        $url = $url . $j . '.jpg';
        `wget -r $url`;
    }
}
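All that string concatenation is really just two-digit zero-padding; a single printf with %02d does the same job. A shell sketch (it prints the generated paths rather than fetching them):

```shell
# %02d pads i and j to two digits, e.g. 9 -> 09, matching the if ($i < 10) logic
for i in 1 9 10 48; do
  printf '%02d/big_%02d.jpg\n' "$i" 16
done
# prints 01/big_16.jpg 09/big_16.jpg 10/big_16.jpg 48/big_16.jpg
```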


Journal: Some PERL for pr0n

Journal by Porn Whitelist
Here's something different - a perl script ...


for ($i = 1; $i < 27; $i++) {
    for ($j = 1; $j < 17; $j++) {
        `wget -r$i/$j.jpg`;
    }
}

For those of you new to perl, those are backquotes around the wget command - they tell Perl to hand the command string to the shell for execution, after doing variable substitution :-)
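Perl borrowed those backquotes from the shell, where command substitution does the same thing - the command runs, and its output replaces the expression:

```shell
# $(...) runs the command and captures its output, like Perl's backticks;
# $j is interpolated before the command ever runs
j=3
name=$(echo "$j.jpg")
echo "$name"
# prints 3.jpg
```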


Journal: MILFs, etc.

Journal by Porn Whitelist
Remember American Pie? How the guys were going nuts over Stiffler's mother? I had never heard of the phrase "MILF" before - "Mother I'd Like (to) Fuck".

So, here's a (very) small leech: 1.3 meg - just type it into a terminal:

wget -r

To make up for that small one, here's 87 meg of girls with a little extra ...


Journal: A quickie ...

Journal by Porn Whitelist
Lights, camera, and ... action. Here's a quickie to finish off the weekend, for those who are up late ...

wget -r -w 1


Journal: Here's a big one - to celebrate the first freak!

Journal by Porn Whitelist
Posted under "Space", 'cause you better have some free space for this one ...

As promised earlier, here's a wget for 386,000 thumbnails, 2.3 gigs ... suitable for all sorts of things, in celebration of Porn Whitelist's first freak:

wget -r -w 10

Why the -w 10? It tells wget to wait ten seconds between retrievals - with 386,000 files to grab, you don't want to hammer the server.


Journal: Today's wget

Journal by Porn Whitelist
Morning, all.

Here, as promised, is today's wget command, good for 87 meg.

wget -r -w 2

And for those looking for smaller images for their cellphone, pda, etc... coming soon, the wget to leech a couple of gigs of thumbnails. You'll definitely need to be running a unixish system (386,000 files in 1 directory), and just run it as a background job for a week or two.

Total to date: 741 megs
