I think that if a robust quantum computer were developed, specific algorithms would follow. Quantum computing lets you exploit a larger computational basis than classical computing for a given number of (qu)bits. The extra internal degrees of freedom in a quantum state, thanks to entanglement between its elements, mean that the state space is much larger than for a classical system, and the promise of using that as a computational basis is hard to ignore.
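The size of that state space can be made concrete with a toy calculation (my own illustration, not from the post above):

```python
# Toy comparison of classical vs. quantum state descriptions
# (my own illustrative sketch; numbers are not from the post).

def classical_state_count(n_bits):
    # An n-bit classical register is in exactly one of 2**n states.
    return 2 ** n_bits

def quantum_amplitude_count(n_qubits):
    # An n-qubit pure state needs 2**n complex amplitudes to describe:
    # it lives in a 2**n-dimensional Hilbert space, and entanglement
    # means those amplitudes generally can't be factored qubit-by-qubit.
    return 2 ** n_qubits

# Same count of 2**n, but the classical register occupies one state at
# a time, while a quantum state can weight all 2**n basis states at once.
for n in (8, 32, 64):
    print(n, quantum_amplitude_count(n))
```

The point of the comparison: simulating a 64-qubit state classically already means tracking 2^64 complex numbers, which is why the state space is "hard to ignore" as a computational resource.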
You'd have to account for the index of refraction of rock at the wavelength you're using. Not easy, given that the density and composition of the rock change over the distance between the detectors. Also, that's assuming rock is reasonably transparent at some wavelength with a period >> 60 ns.
Pretty sure most of that distance is through solid rock. Rock is more-or-less opaque.
In any case, as other posters have noted, GPS has no problem resolving to much better than 20 m.
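For scale, the 60 ns figure converts to distance like this (my own arithmetic; the refractive index is a made-up placeholder, since rock is effectively opaque anyway):

```python
c = 299_792_458              # speed of light in vacuum, m/s

dt = 60e-9                   # the 60 ns timing discrepancy, in seconds
d_vacuum = c * dt            # distance light covers in that time
print(f"{d_vacuum:.1f} m")   # ~18 m, so "better than 20 m" matters

# If rock were transparent with some refractive index n (hypothetical
# value, purely for illustration), light through it would be slower:
n = 1.8
print(f"{c / n * dt:.1f} m")
```

So the whole discrepancy corresponds to roughly 18 m of light travel, which is exactly why the GPS baseline resolution mentioned above is the relevant comparison.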
It's probably just easier to safeguard the operators with hazmat suits than to introduce an extra link in the communications chain. The thing is, taking some low-level gamma radiation isn't all that bad. As long as you're not ingesting or absorbing radioactive materials, there's not a lot of danger in spending modest amounts of time in an elevated radiation field. It certainly needs to be monitored, but the added risk can be kept below statistically significant levels.
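A rough sense of scale, using entirely illustrative numbers of my own choosing (the dose rate and hours are hypothetical; the 50 mSv/yr figure is a common occupational dose limit):

```python
# Simple dose bookkeeping: dose = dose rate x time, compared against
# an annual occupational limit. All field values here are hypothetical.
dose_rate_mSv_per_h = 0.1   # hypothetical elevated gamma field (100 uSv/h)
hours_per_year = 50         # hypothetical total time spent in that field
annual_limit_mSv = 50.0     # a common occupational dose limit

dose_mSv = dose_rate_mSv_per_h * hours_per_year
print(dose_mSv, dose_mSv <= annual_limit_mSv)   # 5.0 True
```

With numbers like these the accumulated dose stays an order of magnitude under the limit, which is the sense in which monitored, time-limited exposure can be managed.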
I think that in-person teaching and online resources make a great combination. To be truly useful, however, the online answer system should be moderated and commented on by the professor.
Often, students learn best from other students (and from teaching their classmates), but misconceptions can arise. Letting those misconceptions remain visible, with the correct solution clearly indicated, is a great teaching method.
I would speculate computer inability is rooted in the whole GUI paradigm
Hardly. It's just that people who don't have the need or interest to memorize key sequences can now use computers effectively. The abilities of the interested are only increased by the addition of more extensive graphical tools, but the average ability has decreased because more computer-disinterested people now use computers as part of daily life.
Hopefully. Dark matter is a very inelegant solution to observations that don't agree with theory. Even so, working out what properties it must have, should it exist, is a useful exercise because it clarifies the problem more thoroughly.
There seems to be a common misconception that incorrect theories were stupid ideas from the get-go. That's really not the case: until new evidence or new ideas come along, the incorrect theories are every bit as valid as the ones that may turn out to be correct, and the differences between the various competing theories may point the way to interesting new experiments.
This new theory is probably wrong, but it's founded on an assumption that, while not currently accepted as true, is experimentally verifiable: that antimatter and matter have gravitational fields of opposite sign. An experiment to determine the truth of that would be very interesting.
Yeah, I was really wondering what discipline the poor guy studies. He really needs access to some proper equipment if this gizmo is related to his Ph.D.
It seems to me that S&P, along with the other credit rating agencies, lost a lot of credibility when they were giving AAA ratings to the guys holding bundles of sub-prime mortgages in the lead-up to the financial crisis. I don't doubt that they play a useful role in rating smaller organisations, but when it comes to rating governments and financial heavyweights they're playing politics more than they're making objective assessments.
It's worth noting that the panel was considering the consequences of the most likely failure mode under average conditions, not a worst-case scenario. If a reactor managed to explode and destroy the containment vessels, I'm sure their earlier estimates of the death toll would still apply.
The Fukushima accident suggests that Three Mile Island was actually more of a real disaster than a narrowly avoided one; a contained meltdown with some radiation release is a normal failure mode and not tremendously hazardous. On the other hand, the NRC report didn't consider less likely types of failure, which could still produce much worse contamination. It's very tough to say beforehand how likely a given type of disaster is, and very easy to look back in hindsight and say that there had been a disaster waiting to happen.
Sooner or later I'm sure a worst-case nuclear disaster will occur and the result will be a handful of acute radiation sickness deaths and a few million people who end up with a statistically-insignificant increase to their chances of getting cancer.
It only requires one motor rotating at a constant velocity and two actuators; that's hardly a complex wheel. The extreme simplicity should make it useful in a number of applications and hobbyist designs. It will, however, probably leave little rubber smudge marks on your floor.
The trouble is that publications are used as a metric by outside agencies to gauge productivity when assigning funding or offering new positions. It's simply not possible for everyone who is assessing applications to be knowledgeable enough about particular research fields to judge the merit of past publications individually so they fall back on impact factor.
It's well and good to decide to take the moral high road and make your contribution to moving science in a more open direction by only publishing in lower tier journals, but it hurts the careers of every author who doesn't already have tenure as well as the future grant prospects for your lab. I think that a move to a new publication system is necessary, but it's hard for individual scientists to move the process along. A journal with published reviewer comments is a good step in the right direction.
Think we'll be able to go to Microsoft.apple?
They happen to be using it as a mixer, but the article clearly says that it's a FET (which certainly qualifies as a non-linear device). It might not be suitable for digital logic yet, but I believe it is a transistor. Also, 10 GHz for a proof-of-concept is damn fast.