
Journal: Last Day at Work, Graduation, New Job

Journal by nullard

Today is my last day at work. I'll be leaving in a few hours. I've enjoyed this job, but I'm leaving for a much better one. I graduate (BSCS) in less than two weeks. I already miss being a student. Now I get to go to the "real world" of mortgages and less flexible work schedules.

I'm going to be working for a well-known company, so I may choose to retire this /. account and start a new one so that people don't associate my past comments with my new employer. It's not that I don't stand by my comments; I just don't want to be seen as misrepresenting my company, particularly since its products sometimes appear in articles and comments.


Journal: Submitted: You too can build a software empire!

Journal by nullard
In order to make my submissions to slashdot feel less futile, I'm posting them in my journal. At least this way, if they get rejected, someone will see them.
----------------------------
All this hard work designing software is a smokescreen to keep the little man down. Apparently I've been wasting my time as a programmer. I've learned over a dozen languages, taught programming courses at the local community college, and even assisted the author of a popular textbook with its latest edition. I guess I was going about things all wrong. According to this book, there's no need to know anything about code or software engineering, or even to be intelligent. You too can build a software empire!

Great. Now I'm gonna be out of a job.

Journal: google & wget

Journal by nullard

I've discovered that many people don't protect the directories on their web servers from directory browsing. Even though it's easy to prevent with server directives or a simple index.html file, many don't bother.
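
For example, on Apache a single directive turns listings off (a minimal sketch; it assumes AllowOverride permits Options overrides in .htaccess):

    # .htaccess in the directory you want to protect
    Options -Indexes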

Finding such directories can be lots of fun.

For example, try a Google search for "Index of" mpeg. This will turn up plenty of normal site results along with a bunch of directory listings. If you find a listing that contains files you'd like to have, you can use wget to grab the whole directory.

First, navigate to the topmost directory that you want to grab. Then execute wget -r -l inf -np -N $SITE_TO_GRAB, where $SITE_TO_GRAB is the URL of the directory listing.

The parameters:
-r      Recursive web suck
-l inf  Infinite depth (replace inf with a number to set a limit)
-np     Don't ascend to the parent directory
-N      Only download files that are missing or newer than your local copies (great for resuming an interrupted grab or updating your mirror)
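
Putting it together, a run against a hypothetical open directory (the URL below is made up for illustration; substitute the listing you found) might look like:

    wget -r -l inf -np -N http://example.com/music/

Re-running the same command later skips everything you already have, courtesy of -N.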

There are many hundreds of gigs of data out there for the taking, everything from music and movies to pr0n. Sure, it's not very well targeted, but you may find someone with tastes similar to your own who allows directory browsing.
