70949681
submission
KentuckyFC writes:
Various researchers have attempted to paste an expression from one face onto another, but so far with mixed results. Problems arise because these algorithms measure the way a face distorts when it changes from a neutral expression to the one of interest. They then attempt to reproduce the same distortion on another face. That's fine if the two faces have similar features. But when the faces differ in structure, as most do, this kind of distortion looks unnatural. Now a Chinese team has solved the problem with an algorithm that divides a face into different regions for the mouth, eyes, nose and so on, and measures the distortion in each area separately. It then distorts the target face in these specific regions while ensuring the overall proportions remain realistic. At the same time, it works out which muscle groups must have been used to create these distortions and calculates how they would change the topography of the target face with wrinkles, dimples and so on. It then adds the appropriate shadows to make the expression realistic. The result is a way to clone an expression and paste it onto an entirely different face. The algorithm opens the way to a new generation of communication techniques in which avatars can represent the expressions as well as the voices of humans. The film industry could also benefit from an easy way to paste the expressions of actors onto the cartoon characters they voice.
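To make the region-by-region idea concrete, here is a minimal sketch in Python of the geometric step alone, assuming 2D facial landmarks in the common 68-point layout. The region boundaries and the rescaling rule are illustrative assumptions, not the team's published method, and the muscle-group inference, wrinkles and shadows described above are omitted.

```python
import numpy as np

# Landmark index ranges follow the common 68-point convention; a real
# system would use a landmark detector and finer-grained regions.
REGIONS = {
    "mouth": list(range(48, 68)),
    "left_eye": list(range(36, 42)),
    "right_eye": list(range(42, 48)),
    "nose": list(range(27, 36)),
}

def transfer_expression(src_neutral, src_expr, tgt_neutral):
    """Map the source's neutral-to-expression distortion onto the target
    face region by region, rescaled for the target's own proportions."""
    tgt_expr = tgt_neutral.copy()
    for indices in REGIONS.values():
        idx = np.array(indices)
        # Distortion measured separately in this region.
        delta = src_expr[idx] - src_neutral[idx]
        # Rescale by the relative size of the target region so the
        # overall proportions stay plausible on a different face shape.
        src_size = np.ptp(src_neutral[idx], axis=0)
        tgt_size = np.ptp(tgt_neutral[idx], axis=0)
        scale = tgt_size / np.maximum(src_size, 1e-9)
        tgt_expr[idx] = tgt_neutral[idx] + delta * scale
    return tgt_expr
```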
70769635
submission
KentuckyFC writes:
One of the great puzzles of biology is how the molecular machinery of life is so finely coordinated. Even the simplest cells are complex three-dimensional biochemical factories in which a dazzling array of machines pump, push, copy and compute in a dance of extraordinarily detailed complexity. Indeed, it is hard to imagine how the ordinary processes of electron transport allow this complexity to emerge, given the losses that arise in much simpler circuits. Now a group of researchers led by Stuart Kauffman have discovered that the electronic properties of biomolecules are entirely different to those of ordinary conductors. It turns out that most biomolecules exist in an exotic state called quantum criticality that sits on the knife edge between conduction and insulation. In other words, biomolecules belong to an entirely new class of conductor that is not bound by the ordinary rules of electron transport. Of course, organic molecules can be ordinary conductors or insulators, and the team have found a few biomolecules that fall into these categories. But evolution seems to have mainly selected biomolecules that are quantum critical, implying that this property must confer some evolutionary advantage. Exactly what this could be isn't yet clear, but it must play an important role in the machinery of life and its origin.
70586981
submission
KentuckyFC writes:
The history of pop music is rich in anecdotes, folklore and controversy. But despite the keen interest, there is little in the form of hard evidence to back up most claims about the evolution of music. Now a group of researchers have used data analysis tools developed for genomic number crunching to study the evolution of US pop music. The team studied 30-second segments of more than 17,000 songs that appeared on the US Billboard Hot 100 between 1960 and 2010. Their tools categorised the songs according to harmonic features such as chord changes, as well as qualities of timbre such as whether the sound is guitar-based, piano-based, orchestra-based and so on. They then used a standard algorithm for discovering clusters within networks of data to group the songs into 13 different types, which turned out to correspond with well-known genres such as rap, rock, country and so on. Finally, they plotted the change in popularity of these musical types over time. The results show a clear decline in the popularity of jazz and blues since 1960. During the same period, rock-related music has ebbed and flowed in popularity. By contrast, rap was rare before 1980 before becoming the dominant musical style for 30 years until declining in the late 2000s. The work answers several important questions about the evolution of pop music, such as whether music industry practices have led to a decline in the cultural variety of new music and whether British bands such as The Beatles and The Rolling Stones triggered the 1964 American music revolution [spoiler: no in both cases].
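As a rough sketch of that pipeline under stated assumptions: the feature vectors and chart years below are random placeholders, and plain k-means stands in for the network community detection the team actually used.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical inputs: one row of harmonic/timbral descriptors per song
# (chord-change and timbre features) plus the year each song charted.
rng = np.random.default_rng(0)
features = rng.normal(size=(17000, 20))    # placeholder feature vectors
years = rng.integers(1960, 2011, size=17000)

# Group the songs into 13 musical types (k-means here as a stand-in
# for the clustering-on-networks step described above).
labels = KMeans(n_clusters=13, n_init=10, random_state=0).fit_predict(features)

# Popularity over time: the fraction of each year's chart entries
# falling into each cluster, and the year each type peaked.
for cluster in range(13):
    share = [np.mean(labels[years == y] == cluster) for y in range(1960, 2011)]
    print(f"type {cluster}: peak popularity in {1960 + int(np.argmax(share))}")
```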
70426981
submission
KentuckyFC writes:
When Christopher Nolan teamed up with physicist Kip Thorne of Caltech to discuss the science behind his movie Interstellar, the idea was that Thorne would bring some much-needed scientific gravitas to the all-important scenes involving travel through a wormhole. Indeed, Thorne used the equations of general relativity to calculate the various possible shapes of wormholes and how they would distort the view through them. A London-based special effects team then created footage of a faraway galaxy as seen through such a wormhole. It showed the galaxy fantastically distorted as a result, just as relativity predicts. But when it came to travelling through a wormhole, Nolan was disappointed with the footage. The problem was that the view of the other side when travelling through a wormhole turns out to be visually indistinguishable from a conventional camera zoom, and utterly unlike the impression Nolan wanted to portray, which was the sense of travelling through a shortcut from one part of the universe to another. So for the final cut, special effects artists had to add various animations to convey that impression. "The end result was a sequence of shots that told a story comprehensible by a general audience while resembling the wormhole's interior," admit Thorne and colleagues in a paper they have published about wormhole science in the film. In other words, they had to fudge it. Nevertheless, Thorne is adamant that the visualisations should help to inspire a new generation of students of film-making and of relativity.
70378581
submission
KentuckyFC writes:
When physicists attempt to calculate the energy density of the universe from first principles, the number they come up with using quantum mechanics is 10^94 g/cm^3. And yet the observed energy density is about 10^-27 g/cm^3. In other words, our best theory of reality misses the mark by 120 orders of magnitude. Now one researcher says the paradox can be resolved by considering the information content of the universe. Specifying the location of each of the 10^25 stars in the visible universe to an accuracy of 10 cubic kilometres requires some 10^93 bits. And using Landauer's principle to calculate the energy associated with all these bits gives an energy density of about 10^-30 g/cm^3. That's not a bad first-principles result. But if each location has to be specified to the Planck length, then the energy density is about 117 orders of magnitude larger. In other words, the nature of information should lie at the heart of our best theory of reality, not quantum mechanics.
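The ~10^-30 g/cm^3 figure is easy to check. The sketch below supplies two assumptions the summary leaves implicit: that the relevant temperature is the cosmic microwave background's (~2.7 K) and that the visible universe has a radius of roughly 4.4 x 10^26 m.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 2.7              # assumed temperature: the CMB, in kelvin
c = 2.998e8          # speed of light, m/s
bits = 1e93          # bits needed to locate 10^25 stars to ~10 km^3

energy = bits * k_B * T * math.log(2)   # Landauer: E = N * kT * ln 2
mass_g = energy / c**2 * 1e3            # convert via E = mc^2, in grams

radius_cm = 4.4e26 * 100                # assumed radius of visible universe
volume_cm3 = 4 / 3 * math.pi * radius_cm**3

print(f"energy density ~ {mass_g / volume_cm3:.1e} g/cm^3")
# roughly 8e-31 g/cm^3, i.e. the ~10^-30 figure quoted above
```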
70047273
submission
KentuckyFC writes:
One of the more exciting predictions from "braneworld" theories of high energy physics is that matter can leak out of other universes into our own, and vice versa. The basic idea is that our three-dimensional universe or brane is embedded in a much larger multi-dimensional cosmos. These branes can become coupled so that a quantum particle such as a neutron can exist in a superposition of states in both universes at the same time. When the neutron collides with something, the superposition collapses and the particle must suddenly exist in one brane or the other. That means neutrons from our universe can leak into other branes and then back again. Now physicists are devising an experiment to look for this neutron leakage. They plan to put a well-shielded neutron detector next to a shielded neutron-producing nuclear reactor at a research facility in France. All this shielding means the detector should not see any neutrons from inside the reactor. However, if the neutrons are leaking into another brane and then back into our world, they can bypass this shielding and trigger the detector. The team has not yet set a date for the experiment, but the discovery of neutrons (or anything else) leaking into our universe would be huge.
70015975
submission
KentuckyFC writes:
Beauty is in the eye of the beholder. But what if the beholder is a machine? Scientists from Yahoo Labs in Barcelona have trained a machine learning algorithm to pick out beautiful photographic portraits from a collection of not-so-beautiful ones. They began with a set of 10,000 portraits that had been rated by humans and then allowed the algorithm to "learn" the difference by taking into account personal factors such as the age, sex and race of the subject, as well as technical factors such as the sharpness of the image, the exposure, the contrast between the face and the background, and so on. The trained algorithm was then able to reliably pick out the most beautiful portraits. Curiously, the algorithm does this by ignoring personal details such as age, sex, race, eye colour and so on, and instead focuses only on technical details such as sharpness, exposure and contrast. The team say this suggests that any subject can be part of a stunning portrait regardless of their looks. It also suggests that "perfect portrait" algorithms could be built into the next generation of cameras, rather like the smile-capturing algorithms of today.
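As a sketch of what the learning step might look like, the snippet below trains a regressor on technical features alone and then asks which inputs it relies on. The synthetic data, the three-feature set and the choice of a random forest are all assumptions for illustration; the paper's actual features and model are not given here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical training data: technical measurements per portrait and a
# human beauty rating that (by construction) depends only on them.
rng = np.random.default_rng(1)
X = rng.uniform(size=(10000, 3))   # columns: sharpness, exposure, contrast
y = 0.5 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(scale=0.1, size=10000)

model = RandomForestRegressor(n_estimators=100, random_state=0)
print("rating prediction score:", cross_val_score(model, X, y, cv=5).mean())

# Inspect which inputs the trained model actually uses; the analogous
# analysis in the paper pointed at technical rather than personal factors.
model.fit(X, y)
print(dict(zip(["sharpness", "exposure", "contrast"],
               model.feature_importances_.round(2))))
```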
69780527
submission
KentuckyFC writes:
One of the extraordinary features of quantum mechanics is that one quantum system can simulate the behaviour of another that might otherwise be difficult to create. That's exactly what a group of physicists in Australia have done in creating a quantum system that simulates a quantum time machine. Back in the early 1990s, physicists showed that a quantum particle could enter a region of spacetime that loops back on itself, known as a closed timelike curve, without creating grandfather-type paradoxes in which time travellers kill their grandfathers thereby ensuring they could never have existed to travel back in time in the first place. Nobody has ever built a quantum closed timelike curve but now they don't have to. The Australian team have simulated its behaviour by allowing two entangled photons to interfere with each other in a way that recreates the behaviour of a single photon interacting with an older version of itself. The results are in perfect agreement with predictions from the 1990s--there are no grandfather-type paradoxes. Interestingly, the results are entirely compatible with relativity, suggesting that this type of experiment might be an interesting way of reconciling it with quantum mechanics.
69473829
submission
KentuckyFC writes:
Revolutions in science often come from the study of seemingly unresolvable paradoxes. So an interesting exercise is to list the paradoxes associated with current ideas in science. One cosmologist has done just that by exploring the paradoxes associated with well-established ideas and observations about the structure and origin of the universe. Perhaps the most dramatic of these paradoxes comes from the idea that the universe must be expanding. What's curious about this expansion is that space, and the vacuum associated with it, must somehow be created in this process. And yet nobody knows how this can occur. What's more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that the energy must increase as well. So much for conservation of energy. And even the amount of energy associated with the vacuum is a puzzle, with different calculations contradicting each other by 120 orders of magnitude. Clearly, anybody who can resolve these problems has a bright future in science but may also end up tearing modern cosmology apart.
69202707
submission
KentuckyFC writes:
When Samantha Cristoforetti blasted towards the International Space Station in November last year, she was carrying an unusual cargo in the form of a tiny telescope just 4 centimetres long and 1 centimetre in diameter attached to an unpowered CCD array from a smartphone camera. The telescope is part of an art project designed by the Dutch artist Diemut Strebe, who intends to invoke quantum mechanics to generate all of the art ever made. Now MIT physicist Seth Lloyd has stepped forward to provide a scientific rationale for the project. He says the interaction of the CCD with the cosmic background radiation ought to generate energy fluctuations that are equivalent to the array containing all possible images in quantum superposition. Most of these will be entirely random but a tiny fraction will be equivalent to the great works of art. All of them! What's more, people on Earth can interact with these images via a second miniature telescope on the ground that can become correlated with the first. Lloyd says this is possible when correlated light enters both telescopes at the same time. Strebe plans to make the quantum space art exhibition available in several places before attaching the second telescope to the James Webb Space Telescope and blasting that off into space too. Whatever your view on the art, it's hard not to admire Strebe's powers of persuasion in co-opting the European Space Agency, NASA and MIT into the project.
69075753
submission
KentuckyFC writes:
In the early 1940s, Glenn Seaborg made the first lump of plutonium by bombarding uranium-238 with neutrons in two different cyclotrons for over a year. The resulting plutonium, chemically separated and allowed to react with oxygen, weighed 2.77 micrograms. It was the first macroscopic sample of plutonium ever created and helped win Seaborg a Nobel Prize ten years later. The sample was displayed at the Lawrence Hall of Science in Berkeley until the early noughties, when it somehow disappeared. Now nuclear detectives say they've found Seaborg's plutonium and have been able to distinguish it from almost all other plutonium on the planet using a special set of non-destructive tests. The team say the sample is now expected to go back on display at Seaborg's old office at Berkeley.
68930045
submission
KentuckyFC writes:
Physicists have long hoped to unify the two great theories of the 20th century--general relativity and quantum mechanics. And yet a workable theory of quantum gravity is as far away as ever. Now one theorist has discovered that the uniquely quantum property of entanglement does indeed influence a gravitational field, and this could pave the way for the first experimental observation of a quantum gravity phenomenon. The discovery is based on the long-known quantum phenomenon in which a single particle can be in two places at the same time. These locations then become entangled--in other words, they share the same quantum existence. While formulating this phenomenon within the framework of general relativity, the physicist showed that if the entanglement is tuned in a precise way, it should influence the local gravitational field. In other words, the particle should seem heavier. The effect for a single electron-sized particle is tiny--about one part in 10^37. But it may be possible to magnify the effect using heavier particles, ultrarelativistic particles or even several particles that are already entangled.
68154703
submission
KentuckyFC writes:
Statisticians have long thought it impossible to tell cause and effect apart using observational data. The problem is to take two sets of measurements that are correlated, say X and Y, and to find out whether X caused Y or Y caused X. That's straightforward with a controlled experiment in which one variable can be held constant to see how this influences the other. Take, for example, a correlation between wind speed and the rotation speed of a wind turbine. Observational data gives no clue about cause and effect, but an experiment that holds the wind speed constant while measuring the speed of the turbine, and vice versa, would soon give an answer. But in the last couple of years, statisticians have developed a technique that can tease apart cause and effect from observational data alone. It is based on the idea that any set of measurements always contains noise. However, the noise in the cause variable can influence the effect but not the other way round. So the noise in the effect dataset is always more complex than the noise in the cause dataset. The new statistical test, known as the additive noise model, is designed to find this asymmetry. Now statisticians have tested the model on 88 sets of cause-and-effect data, ranging from altitude and temperature measurements at German weather stations to the correlation between rent and apartment size in student accommodation. The results suggest that the additive noise model can tease apart cause and effect correctly in up to 80 per cent of cases (provided there are no confounding factors or selection effects). That's a useful new trick in a statistician's armoury, particularly in areas of science where controlled experiments are expensive, unethical or practically impossible.
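Here is a self-contained sketch of the additive noise model test. A polynomial fit stands in for whatever nonparametric regressor a real implementation would use, and a simple HSIC score stands in for a proper independence test; both substitutions are assumptions for illustration.

```python
import numpy as np

def hsic(x, y):
    """Hilbert-Schmidt independence score with Gaussian kernels and a
    median-heuristic bandwidth; larger means more dependent."""
    n = len(x)
    def gram(v):
        d2 = (v[:, None] - v[None, :]) ** 2
        return np.exp(-d2 / np.median(d2[d2 > 0]))
    K, L = gram(x), gram(y)
    H = np.eye(n) - 1.0 / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def anm_direction(x, y, degree=4):
    """Regress each way and prefer the direction whose residuals
    look independent of the putative cause."""
    def residual_score(a, b):
        fit = np.polyval(np.polyfit(a, b, degree), a)   # nonlinear fit
        return hsic(a, b - fit)                         # residual dependence
    return "x->y" if residual_score(x, y) < residual_score(y, x) else "y->x"

# Toy example where x causes y through a nonlinear function plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = x ** 3 + rng.normal(scale=0.5, size=300)
print(anm_direction(x, y))   # expected: "x->y"
```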
67675191
submission
KentuckyFC writes:
Physicists at the Max Planck Institute for Chemistry in Germany have measured sulphur hydride superconducting at 190 kelvin, or -83 degrees Centigrade, albeit at a pressure of 150 gigapascals, about half that at the Earth's core. If confirmed, that's a significant improvement over the existing high-pressure record of 164 kelvin. But that's not why this breakthrough is so important. Until now, all known high temperature superconductors have been ceramic mixes of materials such as copper, oxygen, lithium and so on, in which physicists do not yet understand how superconductivity works. By contrast, sulphur hydride is a conventional superconductor that is described by the BCS theory of superconductivity, first proposed in 1957 and now well understood. Most physicists had thought that BCS theory somehow forbids high temperature superconductivity--the current BCS record-holder is magnesium diboride, which superconducts at just 39 kelvin. Sulphur hydride smashes this record and will focus attention on other hydrogen-bearing materials that might superconduct at even higher temperatures. The team behind this work point to fullerenes, aromatic hydrocarbons and graphane as potential targets. And they suggest that instead of using high pressures to initiate superconductivity, other techniques, such as doping, might work instead.
67595641
submission
KentuckyFC writes:
The global popularity of Twitter allows new words and usages to spread rapidly around the world. And that has raised an interesting question for linguists: is language converging into a global "netspeak" that everyone will end up speaking? Now a new study of linguistic patterns on Twitter gives a definitive answer. By looking at neologisms in geo-located tweets, computational linguists have been able to study exactly how new words spread in time and space. It turns out that some neologisms spread like wildfire while others are used only in areas limited by geography and demography, just like ordinary dialects. For example, the word "ard", a shortened version of "alright", cropped up in Philadelphia several years ago but even now is rarely used elsewhere. By contrast, the abbreviation "af", meaning "as fuck", as in "this food is as good as fuck", has spread across the US in just a couple of years. The difference in the way new words spread is the result of the geographic and demographic characteristics of the communities in which the words are used. The work shows that the evolution of language on Twitter is governed by the same cultural fault lines as ordinary communication. So we're safe from a global "netspeak" for now.