Several of the older movies were set in times that have already passed (Blade Runner, Colossus, 2001) but depicted technology far beyond anything we have now. Blade Runner: organic humanoid robots, flying cars, interstellar travel. Colossus: an AI-like computer that can control the world. 2001: interplanetary travel by humans, suspended animation of humans, an AI-like computer.
So we waited and the AI and other technologies never came. What does it mean when our dystopian sci-fi was too optimistic?
Maybe a more realistic view of the future is that we never create AI, or flying cars, or interplanetary/interstellar travel for humans because we are too busy wasting our resources on killing each other.
Requiring users to log in via Google+ or Facebook in order to establish a real-world identity.
Yes, because everyone uses their "real" identity on G+ and FB!
This is the same weird logic used in health insurance, which also wants to charge more or less based on individual risk. So if we follow their logic...
Extrapolating this out, they eventually end up charging each individual exactly what it will cost the insurance company to pay that individual's claims, plus a profit margin. At that point, the insurance company is a useless middleman and everyone may as well be self-insured.
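The endpoint of that extrapolation can be sketched in a few lines. This is purely illustrative: the claim amounts and the 15% margin are hypothetical numbers, not real insurance data.

```python
def individual_premium(expected_claims, margin=0.15):
    """Premium when risk is priced fully per person.

    With no risk pooling left, each person's premium converges to
    their own expected claims plus the insurer's margin -- at which
    point the insurer adds nothing but overhead.
    """
    return expected_claims * (1 + margin)

# Hypothetical individuals with different expected annual claims:
for claims in (1_000, 5_000, 20_000):
    print(f"expected claims {claims} -> premium {individual_premium(claims):.2f}")
```

Everyone ends up paying more than their own expected costs, so every individual would be better off self-insuring, which is the point of the comment above.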
Add to that, some satellite internet services use DSL for the upstream connection, which wouldn't work at all for a remote station in South America.
GlobalStar is a low-Earth-orbit (about 1,400 km, or roughly 875 miles, up) satellite communications system that can carry internet traffic. Latency will be much lower than with a geostationary satellite, but speed will be low (about the same as a phone modem) unless you tie several channels together. To keep satellite costs down, the system is a "bent pipe" -- each satellite just relays straight back down -- so availability will depend on whether GlobalStar has a ground station somewhere near where you are using it. Having to license ground stations in hundreds of different countries is what really held back development of this system.
Iridium is also LEO, but has more complex satellites that route calls from satellite to satellite until the call is over a ground station in the US, then route it down to the ground. Last I heard, it had been appropriated by the US military (they liked that all calls went through the US instead of through ground stations in other countries). I don't know whether civilian service is available any more. But it would probably also be a pretty slow link, since it was originally designed for phone calls.
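The latency difference between the two orbits is easy to put numbers on. A minimal sketch, assuming a straight-overhead "bent-pipe" path (user to satellite to ground station and back, i.e. four traversals of the altitude) and ignoring processing and queuing delays:

```python
C = 299_792.458  # speed of light in vacuum, km/s

def min_rtt_ms(altitude_km):
    """Lower bound on round-trip propagation delay for a bent-pipe link.

    Four hops of the (best-case, straight-up) slant range:
    user -> satellite -> ground station, then the reply back.
    Real paths are longer, so real latency is higher.
    """
    return 4 * altitude_km / C * 1000

print(f"LEO (~1400 km):  {min_rtt_ms(1400):.0f} ms")   # ≈ 19 ms
print(f"GEO (35786 km): {min_rtt_ms(35786):.0f} ms")   # ≈ 477 ms
```

Even this best-case bound shows why a geostationary link adds close to half a second to every round trip, while a LEO bent pipe stays in the tens of milliseconds.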
Considering the number of things the NSA has completely missed (e.g. the Boston bomber, Snowden, Benghazi, etc.), I'm beginning to wonder if the NSA really has any decent spying capabilities at all. What if this is much like a banana republic, where the government puffs up its chest and parades around a bunch of military men and equipment to try to scare its citizens into line? But actually they are totally outnumbered by the citizenry, have very little real power, and they know it.
All these "leaks" about the NSA spying on everyone in the world could just be a desperate attempt to keep people in line by a government that realizes it has very little real control over them. Sure, they might be collecting a lot of data, but storage and analysis may be such a monumental task that they can really only figure things out in retrospect, which doesn't give them much advantage over classic investigation techniques. But hey, some tech companies are probably getting rich off this.
Not trying to flamebait here, and I admit I stopped following Linux kernel development about 10 years ago. It seemed to me, at least in the past, that a major roadblock to Linux being useful for audio, video, or real-time applications was that kernel-mode execution was non-preemptible. I remember there were some forks that made Linux more of a real-time OS, but I never heard of any of that being incorporated into any of the major distributions. When I asked a Linux apologist about this he acted like I was crazy and said, "Of course you can't interrupt the kernel, it's in kernel mode!" Funny, because Windows has been doing it since Windows NT.
Is this still the case with Linux? If it is, how can an application guarantee that audio and video won't experience hiccups? Just by throwing lots of CPUs and processor power at the problem? Better drivers would not solve this problem.
Writing software is more fun than working.