I've discovered that many people don't protect directories on their web servers from directory browsing. Even though it's easy to prevent with server directives or a simple index.html file, they don't bother.
Finding such directories can be lots of fun.
For example, try a Google search for "Index of" mpeg. This will turn up many normal site results along with a bunch of directory listings. If you find a listing that contains files you'd like to have, you can use wget to grab the whole directory for you.
First navigate to the topmost directory you want to grab. Then run wget -r -l inf -np -N $SITE_TO_GRAB, where $SITE_TO_GRAB is the URL of the directory listing.
-r Recursive web suck
-l inf Infinite depth (replace inf with a number to set a limit)
-np Don't go to the parent directory
-N Only download files that are newer than your local copies (great for picking up where an interrupted grab left off, or for updating your local mirror later)
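Putting the flags together, a minimal sketch might look like the following. The URL is a placeholder for whatever open directory listing you actually found; the echo lets you sanity-check the command before committing to a big download (delete it to really run wget):

```shell
#!/bin/sh
# Placeholder URL -- substitute the directory listing you found.
SITE_TO_GRAB="http://example.com/media/mp3/"

# -r      recurse into subdirectories
# -l inf  no depth limit
# -np     never ascend into the parent directory
# -N      skip files that are not newer than local copies
echo wget -r -l inf -np -N "$SITE_TO_GRAB"
```

Re-running the same command later only fetches files that changed on the server, thanks to -N.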
There are many hundreds of gigs of data out there for the taking, everything from music and movies to pr0n. Sure, it's not very well targeted, but you may find someone with tastes similar to your own who allows directory browsing.