It also imagines direct connections in South-East Asia that actually route via Hong Kong and Singapore. Haven't they run a traceroute? This tube map seems to be more of an artistic project than the submarine cable map.
The far more likely explanation is that these people thought they were stomping on the brake when they were in fact stomping on the accelerator. Something similar actually happened to me: a passenger kicked over a folding sun shade and it wedged (unknown to me) so that every time I pressed the brake, it also pressed the accelerator. The car would lurch forward whenever I started braking, but nothing worse happened, because when I jammed down the brake pedal the brake overpowered the engine and the car came to a stop. The engine was revving at an uncomfortably high RPM, but the car was stopped.
The Toyota Avensis I used to drive had some protection against this: when I pressed the accelerator pedal all the way down quickly, the electronic injection control would refuse to accelerate abruptly and instead performed a gradual acceleration. This was very annoying when I actually wanted to accelerate quickly; I had to learn to press the pedal gradually, at just the right speed.
In our experience, every installer version since Leopard has upgraded the previous version without checking anything except that it was Apple hardware. iTunes doesn't care. Our institution eventually paid for OS upgrade licenses once a year, but by that time we already had the latest version installed. It seems to be Apple policy to move users to the latest OS version whether you pay for it or not; now they are just making it official for the latest upgrade.
Nuclear accidents have not been proven to have killed a single person.
That's not true. There have been many documented deaths.
There are reasonable estimates that, in total, as many as a couple of hundred people have died from radiation released by power plants.
So what, you are telling us only 200 people suffered after Chernobyl? Have you counted the early liquidators? What about the hundreds of kids from that region being treated for cancer who come to the local clinic each year? Is that just some unlucky coincidence?
There have been many studies that correlate radiation dose to cancer risk; you just need to multiply the per-person risk by the number of people exposed and you get some nasty numbers. Some experts have calculated 1M-4M extra cancers from Fukushima. If the reactor #3 explosion was significantly laced with plutonium, that number could be even higher.
A hundred THOUSAND people are known to die from immediate effects of fossil fuel use every single year. Most of that is coal, which only a total idiot would use to power their home. Coal even releases more radiation into the environment than nuclear power does: coal contains tiny amounts of radioactive particles, and when you burn it, you release those particles into the air. They usually settle around the plant, affecting only the poor schmucks stuck working or living near the coal-burning power plant.
Apart from blowing reactor buildings sky high, the Japanese have also been incinerating radioactive debris, and the radioactive particles are carried away by the jet stream. So just inhale deeply if it's so much cleaner than coal smoke. Also, you might want to store the spent fuel somewhere close to you; it will warm you up, and surely there won't ever be any accident with it in the next few thousand years while it cools down and decays.
Learn math. It is your friend. It will keep you from doing stupid things like objecting to a safe, clean power source because it involves complex physics that you don't understand.
Learn nuclear physics. Learn chemistry. Learn biochemistry. Then redo your math. If more people understood them, there would be violent demonstrations at every nuclear installation already. People like you should be conscripted to clean up after the accidents.
Voice data is analyzed for keywords using automation. (Think about when you call your credit card company and can input your CC number by voice.)
If no keyword flags are raised, delete the conversation after X time (or immediately; who knows?)
You forgot one important step: voice data is converted to a very low-bitrate, phoneme-like representation that is good enough for subsequent approximate searches and voice-based analytics (speaker recognition, etc.).
20 kbps is enough if you compress speech with Speex or Opus.
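A quick sketch of what that bitrate implies for storage (my own back-of-envelope arithmetic, not from the thread; it ignores container overhead):

```python
# Storage cost of keeping speech compressed at a steady 20 kbps
# (roughly what Speex or Opus needs for intelligible speech).

KBPS = 20
bytes_per_hour = KBPS * 1000 // 8 * 3600        # 9,000,000 bytes = 9 MB per hour
gb_per_year = bytes_per_hour * 24 * 365 / 1e9   # ~79 GB for a year of nonstop talk
print(bytes_per_hour, round(gb_per_year, 1))
```

At those rates, retaining even a low-bitrate representation of every call is cheap.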
In other news, the Fukushima Daiichi plant chief at the time of the accident died of cancer a few days ago. What a coincidence; maybe it has something to do with radiation.
There's unpredictable (random) noise and there's predictable noise. You can't do much about random noise except try to determine how much of it there is in particular frequency bands. But you can work around predictable noise. The general idea is for the telco equipment to run a bundle of connections in sync. It can then correlate the noise coupling from connection to connection within the bundle, and modify the signal transmitted on each connection to include an anti-noise component: the negative of the signal that the other nearby connections are expected to radiate onto it. In practice all the signals have to be modified together to run each connection optimally, but the math can be done.
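A toy version of that idea, for a bundle of just two synchronized lines (the coupling coefficient is made up for illustration, and real vectoring works per frequency bin over many pairs):

```python
# If the coupling between two pairs is known, the transmitter can
# pre-distort both signals so the crosstalk cancels at the receivers.

def precode(s0, s1, c):
    """Solve H @ x = s by hand for the 2x2 coupling H = [[1, c], [c, 1]]."""
    det = 1.0 - c * c
    return (s0 - c * s1) / det, (s1 - c * s0) / det

def channel(x0, x1, c):
    """Each line receives its own signal plus c times its neighbor's."""
    return x0 + c * x1, x1 + c * x0

x0, x1 = precode(1.0, -0.5, 0.2)
y0, y1 = channel(x0, x1, 0.2)   # receivers see the intended 1.0 and -0.5
```

The precoder is just the inverse of the coupling matrix: the "anti-noise" each line carries is exactly the negative of what its neighbor will leak onto it.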
I've been on a 1 Gbps connection for about 5 years now. The nice things about it are:
- you don't need to find that DVD, downloading is faster
- moving around disk and VM images is a one minute job
- you can do everything over remote desktop including video playback and editing
- low latency is nice for interactive applications like videoconferencing (no stupid late echoes)
- a large torrent downloads while you are using the toilet
- video on demand is a non-issue; that's just a few Mbps, and you can stream quite a few IP cameras all the time just for fun
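The arithmetic behind those conveniences is simple (my own figures, ignoring protocol overhead, so real transfers run a bit slower):

```python
# Transfer times at a full 1 Gbps link.

def seconds_at_gbps(size_bytes, gbps=1.0):
    return size_bytes * 8 / (gbps * 1e9)

dvd = seconds_at_gbps(4.7e9)      # single-layer DVD: ~38 s
vm_image = seconds_at_gbps(8e9)   # an 8 GB VM image: ~64 s, about a minute
```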
There are several levels of abstraction that one can pursue when modeling things. We already know a lot about things at all of these levels, only not in a fully comprehensive way. Modeling and simulation are an excellent way to give insight into the gaps in our knowledge and to direct further research.
And each of the 250+ neurotransmitters has different physico-chemical dynamics. Does that mean we need to know everything before we make an overall functional model? Definitely not.
Do I have to take into consideration every car in existence to make a model of congestion on roads? No. Now bring me my spherical cow please.
If anything, recent neuroscience research has shown us how little we know about how the brain works. Even for the parts whose function we do know, we don't know the actual principles of operation. This is not even close to comprehensive understanding. Basically, we know the functionality of the first few layers of neurons closest to the receptors; we think we know bits about the next few layers; we know we don't know how learning, adaptation and top-down processes work; and the further up you go, the less we know.
So while in principle I do agree that quantitative modeling going hand in hand with neuroscience research is the way to go, pretending we can build a somewhat functioning model of the whole brain is a bit of a joke. It's OK as a far-fetched goal, but we should really go step by step, understanding how the parts work.
About two months ago, Koreans published a similar success; they also found that the surface trick works as a good anti-reflective coating:
But if both the game server and the client leave Nagling on, that often adds another semi-random 200+ milliseconds. I personally think Nagling belongs in the past and should no longer be enabled by default; it causes more problems than it solves. It is a kludge that does something at the network layer that should more properly be done at the application layer.
No sane programmer using TCP sockets for real-time interaction would keep Nagling on. Flash protocols (used by many web games) disable it by default. For best performance it's not really necessary to roll your own UDP protocol: one can use UDP for loss-tolerant, low-latency updates and a TCP side channel for the rest. In my experience, TCP without Nagling works just fine until your connection bandwidth is overloaded, at which point performance degrades no matter what protocol you use.
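Turning Nagling off is a one-liner on most platforms; here is a minimal Python sketch using the standard TCP_NODELAY socket option:

```python
import socket

# Disable Nagle's algorithm so small writes (player input, position
# updates) leave immediately instead of being coalesced into a
# delayed larger segment.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
# ... connect() and exchange small messages as usual ...
```

The same option exists in C (`setsockopt` with `TCP_NODELAY`), Java (`Socket.setTcpNoDelay`), and most other socket APIs.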
Just writing to the output register takes 4 cycles. The minimum of bit shuffling takes 2 cycles, so one could get new bits on the line every 6 cycles or so. Clock drift during longer transfers can be handled by checking for an edge every so many cycles and compensating accordingly. So the routine would start by sending a few 0-1 transitions to synchronize, then transfer X bytes, re-synchronize on an edge, send the next X bytes, and so on.
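The resynchronization idea can be shown in a small simulation (all numbers are illustrative: a 6-cycle bit period and a receiver clock that gains a fraction of a cycle per bit):

```python
# Simulate a receiver whose clock gains `drift` cycles per bit while
# sampling a stream with a `period`-cycle bit time. Free-running, the
# accumulated error eventually exceeds half a bit period and bits are
# read from the wrong slot; snapping the phase back to an observed edge
# every `resync_every` bits keeps the error bounded.

def receive(bits, period=6.0, drift=0.05, resync_every=0):
    out, err = [], 0.0
    for i in range(len(bits)):
        t = i * period + period / 2 + err           # receiver's sample instant
        idx = min(int(t // period), len(bits) - 1)  # transmitted bit actually hit
        out.append(bits[idx])
        err += drift                                # clock runs slightly fast
        if resync_every and (i + 1) % resync_every == 0:
            err = 0.0                               # edge found: phase snapped back
    return out

bits = [i % 2 for i in range(200)]   # alternating pattern, an edge every bit
no_resync = receive(bits)            # misreads bits once drift exceeds 3 cycles
with_resync = receive(bits, resync_every=16)   # stays correct throughout
```

With a drift of 0.05 cycles per bit, the error passes half a bit period after 60 bits; resyncing every 16 bits caps it at 0.75 cycles, well within the safe margin.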
Factories don't work on internet time. Once a large, expensive piece of industrial equipment is installed, it's there for a loooong time. I used to work on upgrading software for four machines that were 15 years old at the time. Two new machines were ordered (with a price tag of around 4M), and the customer wanted the control software to be compatible with the old machines. That was about a decade ago; the plan was not to upgrade until the old control computers started failing. As far as I know, they are still working.
Microsoft's idea sounds much like DoubleRecall's, except with a twist: they hope they can filter bad responses well enough to get useful statistics from survey answers. DoubleRecall just makes you retype the advertiser's words.