Submission + - The Quantum Experiment That Simulates A Time Machine

KentuckyFC writes: One of the extraordinary features of quantum mechanics is that one quantum system can simulate the behaviour of another that might otherwise be difficult to create. That's exactly what a group of physicists in Australia have done in creating a quantum system that simulates a quantum time machine. Back in the early 1990s, physicists showed that a quantum particle could enter a region of spacetime that loops back on itself, known as a closed timelike curve, without creating grandfather-type paradoxes in which time travellers kill their grandfathers, thereby ensuring they could never have existed to travel back in time in the first place. Nobody has ever built a quantum closed timelike curve but now they don't have to. The Australian team have simulated its behaviour by allowing two entangled photons to interfere with each other in a way that recreates the behaviour of a single photon interacting with an older version of itself. The results are in perfect agreement with predictions from the 1990s--there are no grandfather-type paradoxes. Interestingly, the results are entirely compatible with relativity, suggesting that this type of experiment might be an interesting way of reconciling it with quantum mechanics.

Submission + - Is Spacetime Countable--And Why It Matters?

KentuckyFC writes: One of the big problems with quantum gravity is that it generates infinities that are hard to deal with mathematically. They come about because quantum mechanics implies that accurate measurements of the universe on tiny scales require high energies. But when the scale becomes very small, the energy density associated with a measurement is so great that it should lead to the formation of a black hole, which would paradoxically ruin the measurement that created it. So physicists have invented a technique called renormalisation to get rid of the infinities. They assume there is a minimum scale beyond which nothing can be smaller, the so-called Planck scale. This limit ensures that energy densities never become high enough to create black holes. This is equivalent to saying that spacetime is not infinitely divisible. Instead it must be discrete, or as a mathematician might put it, countable. In other words, it is possible to allocate a number to each discrete volume of spacetime, making it countable, like grains of sand on a beach or atoms in the universe. Many physicists are uncomfortable with this idea and now they may have an alternative. A small group of cosmologists is developing a new theory of gravity, called shape dynamics, in which spacetime is infinitely divisible and so uncountable. This ignores many ordinary features of physical objects, such as their position within the universe. Instead, it focuses on objects' relationships to each other, such as the angles between them and the shape that these make (hence the term shape dynamics). These angles and shapes are scale invariant--they are the same at whatever scale you look at them. And that's why spacetime in this model is infinitely divisible. It's early days for shape dynamics but a growing number of theorists have high hopes for the theory following a recent proof that special relativity is its mathematical equivalent.

Submission + - The Paradoxes That Threaten To Tear Modern Cosmology Apart

KentuckyFC writes: Revolutions in science often come from the study of seemingly unresolvable paradoxes. So an interesting exercise is to list the paradoxes associated with current ideas in science. One cosmologist has done just that by exploring the paradoxes associated with well-established ideas and observations about the structure and origin of the universe. Perhaps the most dramatic of these paradoxes comes from the idea that the universe must be expanding. What’s curious about this expansion is that space, and the vacuum associated with it, must somehow be created in this process. And yet nobody knows how this can occur. What’s more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that the energy must increase as well. So much for conservation of energy. And even the amount of energy associated with the vacuum is a puzzle with different calculations contradicting each other by 120 orders of magnitude. Clearly, anybody who can resolve these problems has a bright future in science but may also end up tearing modern cosmology apart.

Submission + - The Strange Story Of The First Quantum Art Exhibition in Space

KentuckyFC writes: When Samantha Cristoforetti blasted towards the International Space Station in November last year, she was carrying an unusual cargo in the form of a tiny telescope just 4 centimetres long and 1 centimetre in diameter attached to an unpowered CCD array from a smartphone camera. The telescope is part of an art project designed by the Dutch artist Diemut Strebe in which he intends to invoke quantum mechanics to generate all of the art ever made. Now MIT physicist Seth Lloyd has stepped forward to provide a scientific rationale for the project. He says the interaction of the CCD with the cosmic background radiation ought to generate energy fluctuations that are equivalent to the array containing all possible images in quantum superposition. Most of these will be entirely random but a tiny fraction will be equivalent to the great works of art. All of them! What's more, people on Earth can interact with these images via a second miniature telescope on Earth that can become correlated with the first. Lloyd says this is possible when correlated light enters both telescopes at the same time. Strebe plans to make his quantum space art exhibition available in several places before attaching the second telescope to the James Webb Space Telescope and blasting that off into space too. Whatever your view on the art, it's hard not to admire Strebe's powers of persuasion in co-opting the European Space Agency, NASA and MIT into his project.

Submission + - The Mystery Of Glenn Seaborg's Missing Plutonium: Solved

KentuckyFC writes: In the early 1940s, Glenn Seaborg made the first lump of plutonium by bombarding uranium-238 with neutrons in two different cyclotrons for over a year. The resulting plutonium, chemically separated and allowed to react with oxygen, weighed 2.77 micrograms. It was the first macroscopic sample ever created and helped win Seaborg a Nobel prize ten years later. The sample was displayed at the Lawrence Hall of Science in Berkeley until the early noughties, when it somehow disappeared. Now nuclear detectives say they've found Seaborg's plutonium and have been able to distinguish it from almost all other plutonium on the planet using a special set of non-destructive tests. The team say the sample is now expected to go back on display at Seaborg's old office at Berkeley.

Submission + - Entanglement Makes Quantum Particles Measurably Heavier, Says Quantum Theorist

KentuckyFC writes: Physicists have long hoped to unify the two great theories of the 20th century--general relativity and quantum mechanics. And yet a workable theory of quantum gravity is as far away as ever. Now one theorist has discovered that the uniquely quantum property of entanglement does indeed influence a gravitational field and this could pave the way for the first experimental observation of a quantum gravity phenomenon. The discovery is based on the long-known quantum phenomenon in which a single particle can be in two places at the same time. These locations then become entangled--in other words they share the same quantum existence. While formulating this phenomenon within the framework of general relativity, the physicist showed that if the entanglement is tuned in a precise way, it should influence the local gravitational field. In other words, the particle should seem heavier. The effect for a single electron-sized particle is tiny--about one part in 10^37. But it may be possible to magnify the effect using heavier particles, ultrarelativistic particles or even several particles that are already entangled.

Submission + - Cause And Effect: How a Revolutionary New Statistical Test Can Tease Them Apart

KentuckyFC writes: Statisticians have long thought it impossible to tell cause and effect apart using observational data. The problem is to take two sets of measurements that are correlated, say X and Y, and to find out if X caused Y or Y caused X. That's straightforward with a controlled experiment in which one variable can be held constant to see how this influences the other. Take, for example, a correlation between wind speed and the rotation speed of a wind turbine. Observational data gives no clue about cause and effect, but an experiment that holds the wind speed constant while measuring the speed of the turbine, and vice versa, would soon give an answer. But in the last couple of years, statisticians have developed a technique that can tease apart cause and effect from the observational data alone. It is based on the idea that any set of measurements always contains noise. However, the noise in the cause variable can influence the effect but not the other way round. So the noise in the effect dataset is always more complex than the noise in the cause dataset. The new statistical test, known as the additive noise model, is designed to find this asymmetry. Now statisticians have tested the model on 88 sets of cause-and-effect data, ranging from altitude and temperature measurements at German weather stations to the correlation between rent and apartment size in student accommodation. The results suggest that the additive noise model can tease apart cause and effect correctly in up to 80 per cent of cases (provided there are no confounding factors or selection effects). That's a useful new trick in a statistician's armoury, particularly in areas of science where controlled experiments are expensive, unethical or practically impossible.
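The asymmetry the additive noise model exploits can be sketched in a few lines of Python. This is a toy version on synthetic data: the published test uses a proper independence measure (such as HSIC), whereas here a crude correlation between residual magnitude and the predictor stands in for it, and the regression is a simple polynomial fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data in which X causes Y via a nonlinear function plus additive noise.
n = 2000
X = rng.uniform(-1, 1, n)
Y = X**2 + 0.1 * rng.uniform(-1, 1, n)

def dependence(resid, predictor):
    """Crude dependence score: |corr| between residual magnitude and predictor."""
    return abs(np.corrcoef(np.abs(resid), predictor)[0, 1])

def anm_score(cause, effect, deg=3):
    """Regress effect on cause; return how much the residuals depend on cause."""
    coef = np.polyfit(cause, effect, deg)
    resid = effect - np.polyval(coef, cause)
    return dependence(resid, cause)

forward = anm_score(X, Y)  # residuals ~ the true noise, independent of X
reverse = anm_score(Y, X)  # residuals inherit structure from Y

direction = "X->Y" if forward < reverse else "Y->X"
print(direction, round(forward, 3), round(reverse, 3))
```

The direction whose residuals look more independent of the predictor is declared the cause, which is exactly the asymmetry described above.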

Submission + - How Data From The Kepler Space Telescope Is Changing The Drake Equation

KentuckyFC writes: The Drake equation describes how the number of other extraterrestrial civilisations in the galaxy depends on factors such as the percentage of stars with planets, the percentage of those that are capable of hosting life, the percentage of these on which life actually forms, and so on. It has been a famous rallying point for the search for extraterrestrial intelligence since the early 1960s when Frank Drake first formulated it. Since then, critics have argued that many of the parameters are unknown so the equation produces numbers that are little better than guesses. Now one astronomer points out that the Kepler Space Telescope is changing that. Kepler was specifically designed to find Earth-like planets around other stars, something it has done remarkably well. For example, the Kepler data suggests that up to 15 per cent of Sun-like stars have Earth-like planets in the habitable zone. These kinds of figures dramatically change the inferences that can be made using Drake's equation. For instance, the new data applied to the Drake equation suggests that the nearest life-bearing planet may be within 10 light years of here. But it also suggests that the nearest civilisation is likely to be thousands of light years away.
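The kind of arithmetic behind a "nearest life-bearing planet" estimate can be sketched as follows. All parameter values here are illustrative assumptions, not figures from the paper; the distance comes from the standard nearest-neighbour estimate for a given spatial density (the radius of a sphere expected to contain one such planet).

```python
import math

# Illustrative parameter values (assumptions for the sketch):
stellar_density = 0.004  # stars per cubic light year in the solar neighbourhood
f_sunlike = 0.1          # fraction of stars that are Sun-like
f_hab = 0.15             # Kepler-era estimate: Sun-like stars with an
                         # Earth-like planet in the habitable zone
f_life = 1.0             # optimistic: life arises wherever conditions allow

# Drake-style chain of fractions gives a density of life-bearing planets.
n = stellar_density * f_sunlike * f_hab * f_life  # planets per cubic light year

# Expected distance to the nearest one: sphere holding one planet on average.
d_nearest = (3 / (4 * math.pi * n)) ** (1 / 3)
print(f"nearest life-bearing planet ~ {d_nearest:.0f} light years")
```

Making `f_life` smaller pushes the nearest planet further away, which is how the same chain of fractions yields "within tens of light years" for life but "thousands of light years" for civilisations.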

Submission + - High Temperature Superconductivity Record Smashed By Sulphur Hydride

KentuckyFC writes: Physicists at the Max Planck Institute for Chemistry in Germany have measured sulphur hydride superconducting at 190 kelvin or -83 degrees Celsius, albeit at a pressure of 150 gigapascals, about half the pressure at the Earth's core. If confirmed, that's a significant improvement over the existing high-pressure record of 164 kelvin. But that's not why this breakthrough is so important. Until now, all known high temperature superconductors have been ceramic mixes of materials such as copper, oxygen, lithium and so on, in which physicists do not yet understand how superconductivity works. By contrast, sulphur hydride is a conventional superconductor that is described by the BCS theory of superconductivity, first proposed in 1957 and now well understood. Most physicists had thought that BCS theory somehow forbids high temperature superconductivity--the current BCS record-holder is magnesium diboride, which superconducts at just 39 kelvin. Sulphur hydride smashes this record and will focus attention on other hydrogen-bearing materials that might superconduct at even higher temperatures. The team behind this work point to fullerenes, aromatic hydrocarbons and graphane as potential targets. And they suggest that instead of using high pressures to initiate superconductivity, other techniques, such as doping, might work instead.

Submission + - Cultural Fault Lines Determine How New Words Spread On Twitter

KentuckyFC writes: The global popularity of Twitter allows new words and usages to spread rapidly around the world. And that has raised an interesting question for linguists: is language converging into a global "netspeak" that everyone will end up speaking? Now a new study of linguistic patterns on Twitter gives a definitive answer. By looking at neologisms in geo-located tweets, computational linguists have been able to study exactly how new words spread in time and space. It turns out that some neologisms spread like wildfire while others are used only in areas limited by geography and demography, just like ordinary dialects. For example, the word "ard", a shortened version of "alright", cropped up in Philadelphia several years ago but even now is rarely used elsewhere. By contrast, the abbreviation "af", meaning "as fuck", as in "this food is good af", has spread across the US in just a couple of years. The difference in the way new words spread is the result of the geographic and demographic characteristics of the communities in which the words are used. The work shows that the evolution of language on Twitter is governed by the same cultural fault lines as ordinary communication. So we're safe from a global "netspeak" for now.

Submission + - Mathematical Trick Helps Smash Record For The Largest Quantum Factorisation

KentuckyFC writes: One of the big applications for quantum computers is finding the prime factors of large numbers, a technique that can help break most modern cryptographic codes. Back in 2012, a team of Chinese physicists used a nuclear magnetic resonance quantum computer with 4 qubits to factorise the number 143 (11 x 13), the largest quantum factorisation ever performed. Now a pair of mathematicians say the technique used by the Chinese team is more powerful than originally thought. Their approach is to show that the same quantum algorithm factors an entire class of numbers with factors that differ by 2 bits (like 11 and 13). They've already discovered various examples of these numbers, the largest so far being 56153. So instead of just factoring 143, the Chinese team actually quantum factored the number 56153 (233 x 241, which differ by two bits when written in binary). That's the largest quantum factorisation by some margin. The mathematicians point out that their discovery will not help code breakers since they'd need to know in advance that the factors differ by 2 bits, which seems unlikely. What's more, the technique relies on only 4 qubits and so can be easily reproduced on a classical computer.
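The arithmetic behind the claim is easy to check classically. The sketch below merely verifies the bit difference and searches for semiprimes of this special class; it is not the quantum algorithm, and the search bounds are chosen just for illustration.

```python
def is_prime(n):
    """Trial-division primality test, fine for small numbers."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def bit_difference(a, b):
    """Number of bit positions in which a and b differ (Hamming distance)."""
    return bin(a ^ b).count("1")

# Verify the headline example: 56153 = 233 x 241, factors two bits apart.
p, q = 233, 241
assert p * q == 56153 and is_prime(p) and is_prime(q)
print(bin(p), bin(q), bit_difference(p, q))

# Search for semiprimes under 60000 whose odd prime factors differ by 2 bits.
specials = [(a, b) for a in range(3, 250, 2) if is_prime(a)
            for b in range(a + 2, 250, 2)
            if is_prime(b) and a * b < 60000 and bit_difference(a, b) == 2]
print(len(specials), "such pairs found, including", (233, 241))
```

The restriction to factors differing in 2 bits is what collapses the factoring problem to a handful of equations, which is also why the trick is no threat to real cryptographic keys.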

Submission + - Stars Travelling Close to Light Speed Could Spread Life Through the Universe

KentuckyFC writes: Stars in the Milky Way typically travel at a few hundred kilometres per second relative to their peers. But in recent years, astronomers have found a dozen or so "hypervelocity stars" travelling at up to 1000 kilometres per second, fast enough to escape our galaxy entirely. And they have observed stars orbiting the supermassive black hole at the centre of the galaxy travelling at least an order of magnitude faster than this, albeit while gravitationally bound. Now a pair of astrophysicists have discovered a mechanism that would free these stars, sending them rocketing into intergalactic space at speeds in excess of 100,000 kilometres per second. That's more than a third of the speed of light. They calculate that there should be about 100,000 of these stars in every cubic gigaparsec of space and that the next generation of space telescopes will be sensitive enough to spot them. That's interesting because these stars will be cosmological messengers that can tell us about the conditions in other parts of the universe when they formed. And because these stars can travel across much of the observable universe within their lifetimes, they could also be responsible for spreading life throughout the cosmos.

Submission + - Single Pixel Camera Takes Images Through Breast Tissue

KentuckyFC writes: Single pixel cameras are currently turning photography on its head. They work by recording lots of exposures of a scene through a randomising medium such as frosted glass. Although seemingly random, these exposures are correlated because the light all comes from the same scene. So it's possible to number-crunch the image data looking for this correlation and then use it to reassemble the original image. Physicists have been using this technique, called ghost imaging, for several years to make high resolution images, 3D photos and even 3D movies. Now one group has replaced the randomising medium with breast tissue from a chicken. They've then used the single pixel technique to take clear pictures of an object hidden inside the breast tissue. The potential for medical imaging is clear. Curiously, this technique has a long history dating back to the 19th century when Victorian doctors would look for testicular cancer by holding a candle behind the scrotum and looking for suspicious shadows. The new technique should be more comfortable.
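The correlation step can be sketched with synthetic data. This is a toy computational ghost-imaging simulation, assuming known random illumination patterns and a single detector that records only the total transmitted intensity; it is not the group's actual optical setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden 8x8 "object" (a simple cross), unknown to the camera.
obj = np.zeros((8, 8))
obj[3:5, :] = 1.0
obj[:, 3:5] = 1.0

# Many random illumination patterns; the single pixel records only the
# total light transmitted through the object for each pattern.
n_patterns = 4000
patterns = rng.random((n_patterns, 8, 8))
signals = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))

# Ghost-imaging reconstruction: correlate each pattern pixel with the
# fluctuation of the single-pixel signal about its mean.
recon = np.tensordot(signals - signals.mean(),
                     patterns - patterns.mean(axis=0),
                     axes=([0], [0])) / n_patterns

# Thresholding the correlation map recovers the hidden object.
recovered = (recon > recon.mean()).astype(int)
print(recovered)
```

The point the experiment makes is that the random patterns can come from any scrambling medium, frosted glass or chicken breast alike, as long as the correlations can be measured.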

Submission + - Halting Problem Proves That Lethal Robots Cannot Correctly Decide To Kill Humans

KentuckyFC writes: The halting problem is to determine whether an arbitrary computer program, once started, will ever finish running or whether it will continue forever. In 1936, Alan Turing famously showed that there is no general algorithm that can solve this problem. Now a group of computer scientists and ethicists have used the halting problem to tackle the question of how a weaponised robot could decide to kill a human. Their trick is to reformulate the problem in algorithmic terms by considering an evil computer programmer who writes a piece of software on which human lives depend. The question is whether the software is entirely benign or whether it can ever operate in a way that ends up killing people. In general, a robot could never decide the answer to this question. As a result, autonomous robots should never be designed to kill or harm humans, say the authors, even though various lethal autonomous robots are already available. One curious corollary is that if the human brain is a Turing machine, then humans can never decide this issue either, a point that the authors deliberately steer well clear of.
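The diagonal argument at the heart of this kind of result can be sketched as a Python toy. Here a hypothetical "harm decider" is handed a program built to do the opposite of whatever the decider predicts about it; this illustrates the reduction in spirit only and is not the authors' formal construction.

```python
def make_contrary(would_harm):
    """Given any candidate harm-decider, build a program it misjudges."""
    def contrary():
        # Acts harmfully exactly when the decider claims it is safe.
        return "HARM" if not would_harm(contrary) else "SAFE"
    return contrary

# Whatever the candidate decider answers, the contrary program defeats it.
for decider in (lambda prog: True, lambda prog: False):
    contrary = make_contrary(decider)
    prediction = decider(contrary)       # True means "this program harms"
    actually_harms = contrary() == "HARM"
    print(prediction, actually_harms)    # the two always disagree
```

Only constant deciders are shown, but the construction works against any decider that halts with an answer, which is why no general algorithm can certify a weaponised robot as safe.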

Submission + - How Baidu Tracked The Largest Seasonal Migration of People on Earth

KentuckyFC writes: During the Chinese New Year earlier this year, some 3.6 billion journeys were made across China, making it the largest seasonal migration on Earth. These kinds of mass movements have always been hard to study in detail. But the Chinese web services company Baidu has managed it using a mapping app that tracked the location of 200 million smartphone users during the New Year period. The latest analysis of this data shows just how vast this mass migration is. For example, over 2 million people left the Guangdong province of China and returned just a few days later--that's equivalent to the entire population of Chicago upping sticks. The work shows how easy it is to track the movement of large numbers of people with current technology--assuming they are willing to allow their data to be used in this way.
