"In war, the first casualty is truth."
Aeschylus (525 BC - 456 BC)
Res eo magis mutant quo manent. ("The more things stay the same, the more they change.")
Actually, by definition, faith is:
"strong belief or trust in someone or something."
Thus your ability to confirm rests on a certain trust in the validity of the scientific process. That does not make it unreasonable; it simply means it concerns something you cannot observe yourself.
As for 'duplicated independently': certainly that increases the validity of the measurement. But the question arises: what if there is only one instrument that can measure the phenomenon (such as CERN)? How much is it really duplicated independently? If the ruler is marked wrong, everyone will be measuring wrong. The foundation of science is more subtle than the mere ability to duplicate observations.
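A toy simulation makes the point (the numbers are illustrative, not from any real experiment): averaging many independent repetitions removes random noise, but a calibration error shared by every replication survives the averaging untouched.

```python
import random

random.seed(0)

TRUE_VALUE = 10.0  # the quantity actually being measured
BIAS = 0.5         # shared calibration error ("the ruler is marked wrong")
NOISE = 0.2        # independent random error in each measurement

# 10,000 "independent" replications that all use the same miscalibrated instrument
measurements = [TRUE_VALUE + BIAS + random.gauss(0, NOISE) for _ in range(10_000)]
mean = sum(measurements) / len(measurements)

# Averaging cancels the random noise but not the shared systematic bias:
print(round(mean, 1))  # ≈ 10.5, not 10.0
```

No amount of repetition with the same ruler moves the average back toward 10.0; only an independent instrument, calibrated differently, could expose the bias.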
The fact that you COULD observe it doesn't mean you actually will. Thus, until you actually observe it yourself, your knowledge of reality is still coming through faith: you believe that the person telling you these things actually knows what he is talking about, and also that he is not attempting to lie to you. I very much doubt that many could afford a telescope that could see Titan, and so their knowledge will never rise above a simple belief that the scientist knows better than they do and is not being deceptive.
Faith in a human being can go wrong, but let's be honest: there just isn't enough time, talent, energy or equipment to verify everything that the experts say. Our knowledge of these things comes through hearing them from others, and thus implies at least a rudimentary faith in their competence and veracity. I might add that the confidence we have in the findings of others is necessary for the progress of human knowledge. No one would get very far if each of us had to rediscover calculus or remeasure the basic physical constants of the universe. It is faith in the metaphysical assumptions of truth, veracity and verifiability that makes science possible, but the vast corpus of observation rests largely on confidence in other human beings.
I might add that the criterion of 'duplication' in many of the most advanced areas of physics is nearly impossible to satisfy for all but a very select few. Not everyone can build a hadron collider in their backyard...
Although tested in only one person, the discovery suggests that a single area – the claustrum – might be integral to combining disparate brain activity into a seamless package of thoughts, sensations and emotions. It takes us a step closer to answering a problem that has confounded scientists and philosophers for millennia – namely, how our conscious awareness arises.
When the team zapped the area with high-frequency electrical impulses, the woman lost consciousness. She stopped reading and stared blankly into space; she didn't respond to auditory or visual commands, and her breathing slowed. As soon as the stimulation stopped, she immediately regained consciousness with no memory of the event. The same thing happened every time the area was stimulated during two days of experiments.
What happens when a government can do this to a person remotely, or en masse? Tin foil hat time."
Link to Original Source
The article doesn't really specify how the 90% were spied upon. It could simply be a consequence of recording the telephone of a known suspect. I imagine that even a terrorist's normal activity consists of many mundane things that involve innocent people: they order pizza, they go to bars, they buy things in stores, etc. Of course, if someone is under surveillance, all these innocent people also get involved by the simple fact that they become possible accessories in his crime. I would imagine that 90% of the activity of any criminal, including organised crime, is fairly innocuous, and innocent people will also be recorded because of this.
What I would really like to know is how much of this information was gathered in the course of surveilling a particular suspect, and how much came from mass collection of data about everyone with a filter applied afterwards. If the suspect is already under surveillance, I imagine that the innocent population would tolerate a loss of privacy simply because that person is a threat. If it is the other way around, that is, if information is gathered indiscriminately in order to search for possible suspects, then it is extremely dangerous.
The fact that the Post does not describe these findings in detail makes the article more sensational than useful, in my opinion.
Most of the world today has developed some level of immunity to the 2009 pandemic flu virus, which means it can now be treated as a less dangerous "seasonal flu". Professor Kawaoka intentionally set out to see whether it was possible to convert it back to a pre-pandemic state in order to analyze the genetic changes involved.
The study has not been published; however, some scientists who are aware of it are horrified that Dr Kawaoka was allowed to deliberately remove the only defense against a strain of flu virus that has already demonstrated its ability to create a deadly pandemic, one that killed as many as 500,000 people in the first year of its emergence."
To be honest, the game was not that extraordinary, but it had its appeal. I think, however, it was the music that really made the game popular. Given that the game was all about matching up blocks that fit together, it was very poetic that they used a Russian folk song about courtship.
Back then, programmers had a bit of culture. These hipsters are just faking it.
Did you know you can actually use COBOL to write iOS apps?
Having a beard even makes programming hotter... seriously, in summertime, the beard is a no-go.
Well, either he's created the mother of all LISP macros, or it's simply vaporware. I'd love to see it when they publish it. Code or it didn't happen.
Here is the obligatory xkcd, panel two.
I think the technical problems of driving a car by computer can be engineered to well within safety requirements. In many factories, robots perform far more hazardous work with impeccable precision.
However, driving a car is not just an engineering problem. There are many more problems to solve than just the software for the computer:
1/ Who is responsible in the event of an accident? Will it be the operator of the computer or the programmer?
2/ Even if the computer is flawless, it does not guarantee that the OTHER DRIVERS will be flawless. Shit will happen. What happens if an automated car crashes into another automated car? Who decides whose algorithm/program is at fault?
3/ Who is going to be sued if the automated car kills someone? The computer programmer, the one who installs it, the one who builds the car, or all of the above?
4/ Who is going to insure an automated car? The only reason an insurance company will insure something is that SOMEONE is responsible for the monthly payments, and the company can recover from someone else if something goes wrong. I have yet to see an insurance company insure a computer program.
When all is said and done, the problem with driving a car is above all a human problem, not a technical one.
One thing about a human doctor, though, is that they often know what it feels like to be in pain. A robot doesn't; it only has an algorithm. A nurse, when she sticks the needle in, will notice how you react and whether you feel pain. I would think the robot would need some way of sensing whether it is doing something harmful or painful to the patient.
However, I have had doctors and nurses who are completely insensitive to their patients, so if the robot can get it right each time, it might be a better alternative. I've had sessions where it took four tries for the nurse to get the intravenous line in correctly. It was not a very pleasant experience. I'd let the robot have a try after that.
From the article:
before being streamed across 500 miles of Australia's National Broadband Network to the Pawsey Centre, which gets rid of most of it as quickly as possible.
Get rid of the data? Don't you mean route the data to its destination? And you would hope the Pawsey Centre actually DID something with the data rather than just getting rid of it.