Is that it limits information sharing.
The biggest problem that the internet caused is that it destroyed culture. Worldwide.
Everyone has this common generic culture now.
This kind of culture didn't exist before the internet. Before the internet, you actually had societies develop and advance the arts. But, if you haven't noticed already, culture has pretty much been frozen since around 1995.
People wear the same clothes as they did in 1995. Style hasn't advanced like it did from the 50's to the 70's, or from the 70's to the 90's.
People listen to the same kinds of music.
They use the same grammar and language as 20 years ago.
And so on.
It's a pretty well documented phenomenon, and a great Vanity Fair article from a couple years ago describes this perfectly: http://www.vanityfair.com/style/2012/01/prisoners-of-style-201201
The whole idea of information being free and shared by everyone is actually destructive to society: when culture becomes democratic, information becomes devalued. It devalues professional tastemakers, causing populist sensibilities to take hold, which is the exact cause of cultural stagnation. Democratic sensibilities are always obvious, and can never advance the state of the art the way professional tastemakers can.
So, not everyone needs to see the same movies, listen to the same music, and so on. It is perfectly fine to limit these items, to make sure there ARE "have-nots". People don't HAVE to have every single goddam song in their library.
We really do need to limit the spread of information, through costs, DRM, or other means, to cause society to advance. Right now the world is frozen in 1995, because information is too open.
Seriously, it is perfectly fine not to know things or not to have things. Your life is going to be just fine. But the democratic population wants everything.
Why is this modded -1? It's actually a pretty interesting argument, and one I hadn't heard before. Moderators, using your points as a means of censorship makes YOU the bad guy.
The point of the salt is that previously generated and downloadable rainbow tables are of no use. Making new ones would kind of defeat the purpose, since at that point you're effectively brute-forcing a tough, salted-and-hashed password anyway.
This is why it's good practice. It helps mitigate complexity concerns over user-supplied passwords, and can make cracking multiple accounts' password hashes unrealistic.
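To make the parent's point concrete, here's a minimal sketch of per-user salting with a slow KDF, using PBKDF2 from Python's stdlib (the helper names are just illustrative, not from any particular codebase). Because each account gets a fresh random salt, a precomputed rainbow table can't be reused across accounts — every hash has to be attacked from scratch:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted, slow hash. A fresh random salt per user means
    precomputed rainbow tables are useless against the stored hashes."""
    if salt is None:
        salt = os.urandom(16)  # random 16-byte per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    # constant-time comparison to avoid timing leaks
    return hmac.compare_digest(digest, expected)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("hunter3", salt, digest))  # False
```

Note that hashing the same password twice yields different digests (different salts), which is exactly what defeats table reuse across accounts.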
I should have just modded this up instead of posting my one word comment. My bad.
Congratulations. You've flunked encryption 101. You never send the plaintext password over the wire, because you can't trust the middleman. Salt and encrypt on the client end, then salt and encrypt on the server end.
SSL is better than anything you could cook up on the client-side, ya dummy.
Alas, it appears I am the fool on this one.
No. Nowadays, organizations and individuals around the world can register second-level and, in some cases, third-level domain names. (In a domain name such as maps.google.com, "google" is the second-level domain and "maps" is a third-level domain.) They simply need to find an accredited registrar, comply with the registrant terms and conditions, and pay registration and renewal fees. The application for a new gTLD is a much more complex process. An applicant for a new gTLD is, in fact, applying to create and operate a registry business supporting the Internet's domain name system. This involves a number of significant responsibilities, as the operator of a new gTLD is running a piece of visible Internet infrastructure.
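A quick sketch of the level terminology, using plain label splitting (note this naive approach doesn't handle multi-label public suffixes like .co.uk — real code would consult the public suffix list):

```python
# Labels in a domain name are read right to left by level.
labels = "maps.google.com".split(".")
tld = labels[-1]     # "com"    -- top-level domain (gTLD)
second = labels[-2]  # "google" -- second-level domain
third = labels[-3]   # "maps"   -- third-level domain
print(tld, second, third)  # com google maps
```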
I guess that settles it.
What was your point then?
Mine was that these new gTLDs will be treated exactly like an expensive bracket of