Small OSS projects like Linux, GHC and more? What are independent devs? Rebases are done in local repositories to make the changes clearer. If everyone uses it properly, the history is actually a LOT easier to audit, and you can use software like Gerrit to help enforce such things. SVN is worse than git in every sphere; I can't think of a good reason to use it instead of git. If you want to check out only parts of a repository, there are git submodules. If you want to add large binary files to your repository, maybe you want to use git-annex. The only reason SVN is still used is legacy: migrating to a new VCS may be very expensive.
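As a minimal sketch of why local rebases keep history auditable (assuming `git` is on your PATH; the repo, file, and branch names are made up for the demo): the feature branch replays on top of main, so the log stays a straight line with no merge commits to wade through.

```python
import pathlib, subprocess, tempfile

# Sketch only: build a throwaway repo, diverge, then rebase locally.
# Identity is passed per-command so no global git config is needed.
def git(repo, *args):
    return subprocess.run(
        ["git", "-c", "user.name=demo", "-c", "user.email=demo@example.com",
         *args],
        cwd=repo, check=True, capture_output=True, text=True,
    ).stdout

repo = tempfile.mkdtemp()
git(repo, "init", "-q")
git(repo, "checkout", "-q", "-b", "main")

pathlib.Path(repo, "a.txt").write_text("base\n")
git(repo, "add", "a.txt")
git(repo, "commit", "-q", "-m", "base")

git(repo, "checkout", "-q", "-b", "feature")   # local topic branch
pathlib.Path(repo, "b.txt").write_text("feature\n")
git(repo, "add", "b.txt")
git(repo, "commit", "-q", "-m", "feature work")

git(repo, "checkout", "-q", "main")            # main moves on meanwhile
pathlib.Path(repo, "c.txt").write_text("mainline\n")
git(repo, "add", "c.txt")
git(repo, "commit", "-q", "-m", "mainline work")

git(repo, "checkout", "-q", "feature")
git(repo, "rebase", "-q", "main")              # replay instead of merge

merges = git(repo, "log", "--oneline", "--merges")
print("merge commits:", merges.strip() or "(none)")  # linear history
```

The same three commits end up in the log, just in a straight line; a `git merge main` at that last step would have added a merge commit instead.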
I think the regular pirate doesn't care if things change or not, because he's already getting things his way. Besides, it doesn't look like those producers will be able to end piracy any time soon. So I think the battle is already over. Pirates won.
Have you read tfa?
When you're doing something you like doing, you always want to add one cool new thing, or fix something that really bugs you. The developers probably don't care much how long things take, as long as everything is done right. That's different from a corporate environment, where you really want to write as little as possible and hope to spend the shortest amount of time at the office. Moreover, you want to deal with the fewest problems, so you want the fewest features.
They'd need to steal your email too, because the download links come by email. If they have your email and Facebook accounts, I don't think being able to download your wall posts is the biggest issue. Moreover, they could still download your messages, which probably contain more sensitive information, since those are private, as opposed to the wall, which is more or less public.
I downloaded the archive, but my wall posts were not there either.
Well, it's more delegated than it is decentralized. In the end you still have an authority dictating the rules. Anyway, DNS has a very simple query and a very simple, straightforward answer. However, if you want to find some content on the Internet, the query and the answer are much more sophisticated, once you take into account spam, algorithm improvements and so on. It looks very hard to think of a decentralized index that could possibly work and provide better results than Google.
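To make the contrast concrete, a name lookup really is just "name in, addresses out". A tiny sketch (it resolves `localhost`, which typically goes through the hosts file rather than a real DNS server, precisely so it runs offline):

```python
import socket

# DNS-style resolution: one small question (a name), one small answer
# (a list of addresses). "localhost" is used so no network is needed.
infos = socket.getaddrinfo("localhost", None, proto=socket.IPPROTO_TCP)
addresses = sorted({info[4][0] for info in infos})
print(addresses)
```

Compare that with a web-search query, where the "answer" is a ranked, spam-filtered list computed over billions of documents.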
Then downforeveryoneorjustme.com should consider the HTTP status code of the response. I'm sure it's not 200 when something bad happens.
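A quick sketch of what such a check could look like. A throwaway local server stands in for the site being tested so the snippet runs anywhere; the paths are made up:

```python
import http.server, threading, urllib.error, urllib.request

# Throwaway local server standing in for the site being checked.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

def status_of(url):
    """Return the HTTP status code, even for 4xx/5xx responses."""
    try:
        return urllib.request.urlopen(url, timeout=5).status
    except urllib.error.HTTPError as e:
        return e.code  # error responses still carry a meaningful code

up = status_of(base + "/")           # directory listing -> 200
down = status_of(base + "/missing")  # no such path -> 404
print(up, down)
server.shutdown()
```

The point is just that "the page loaded" and "the page answered 200" are different checks; anything in the 4xx/5xx range is a signal the checker could surface.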
Some centralization is required, though. There must be at least an index, like Google, to find the IPs of relevant servers with the information you need. How would you solve the information-retrieval problem without a client-server platform? I think it's likely to be the cheapest solution to the problem. That being said, there's really no need for everyone to share the same email server, for instance. I agree with you that there are more client-server designs than necessary, but some of them are needed. They can also be the simplest technical solution, which is good in itself. A social network like Facebook, for instance, didn't need to be centralized, and it would be a lot better if it weren't.
At first sight centralization looks like a great way to reduce costs; however, there are problems that are likely to make the cost actually rise. I'm unaware of any studies showing when it's good to centralize and when it's bad; that would be a very interesting study to read. While you may need fewer machines to run 20 websites together than you'd need to run them separately, you'll also need more qualified staff. An average 16-year-old with a bit of time on his hands and an interest in technology can easily configure a machine capable of handling hundreds of requests per second, which is more than most sites need. As you strive to share more and more resources, you'll need more and more qualified staff, capable of figuring out how to manage the resource sharing, and resource sharing is one of the most difficult problems in computer science. With machine prices being so low, is it really worth solving those problems for all but really big, Google-sized sites?
Moreover, other studies have found that emergency surgeries over the weekend are more likely to result in death than those on other days of the week, so the hypothesis that it could be explained simply by how doctors schedule surgeries can be ruled out. Besides, it seems that care over the weekend is actually poorer than care during weekdays. So it seems a surgeon would not want to schedule trickier surgeries on Friday. Also, the surgeon himself would like to rest over the weekend, so it is more likely that he would want the opposite: on Friday, he would prefer the surgeries least likely to result in complications.
One of the articles, the one from GCN, states: "He proposes deploying antennas using inflatable balloons, and using the cold environment of the dark side of the moon to cool the computers."
Isn't the moon being hit by impacts all the time? Isn't it risky to leave a supercomputer there?
That would be an incredible thing to do. I bet it would be interesting to use its idle time for projects like SETI.
The way I see it, the criminal is indeed being penalized. Perhaps he really should be penalized, because stealing is not good for society, but it's a punishment nonetheless. A robber may even be arrested as well, which is yet another punishment.