ananyo writes "By analysing the chemical structure of a drug, researchers can see if it is likely to bind to, or ‘dock’ with, a biological target such as a protein. Researchers have now unveiled a computational effort that used Google's supercomputers to assess billions of potential dockings on the basis of drug and protein information held in public databases. The effort will help researchers to find potentially toxic side effects and to predict how and where a compound might work in the body.
“It’s the largest computational docking ever done by mankind,” says Timothy Cardozo, a pharmacologist at New York University’s Langone Medical Center, who presented the project at the US National Institutes of Health’s High Risk–High Reward Symposium in Bethesda, Maryland. The result, a website called Drugable, is still in testing, but it will eventually be available for free, allowing researchers to make such predictions purely on the basis of a compound's chemical structure."Link to Original Source
ananyo writes "Bowing to scientists' near-universal scorn, the journal Food and Chemical Toxicology has fulfilled its threat to retract a controversial paper that claimed that a genetically modified (GM) maize causes serious disease in rats after the authors refused to withdraw it.
The paper, from a research group led by Gilles-Eric Séralini, a molecular biologist at the University of Caen, France, and published in 2012, showed “no evidence of fraud or intentional misrepresentation of the data”, said a statement from Elsevier, which publishes the journal. But the small number and type of animals used in the study means that “no definitive conclusions can be reached”. The known high incidence of tumours in the Sprague-Dawley rat “cannot be excluded as the cause of the higher mortality and incidence observed in the treated groups”, it added.
Today’s move came as no surprise. Earlier this month, the journal’s editor-in-chief, Wallace Hayes, threatened retraction unless Séralini withdrew the paper, and Séralini announced his refusal to do so at a press conference in Brussels this morning. Séralini and his team remain unrepentant, and allege that the retraction stems from the journal's editorial appointment of biologist Richard Goodman, who previously worked for biotechnology giant Monsanto for seven years."Link to Original Source
ananyo writes "Science has a much publicized reproducibility problem. Many experiments seem to be failing a key test of science — that they can be independently verified by another lab. But now 36 research groups have struck a blow for reproducibility by successfully reproducing the results of 10 out of 13 past experiments in psychology. Even so, the Many Labs Replication Project found only weak support for the outcome of one experiment, and it could not replicate two of the experiments at all."Link to Original Source
ananyo writes "New genome sequences from two extinct human relatives suggest that these ‘archaic’ groups bred with humans and with each other more extensively than was previously known.
The ancient genomes, one from a Neanderthal and one from a different archaic human group, the Denisovans, were presented at a meeting at the Royal Society in London. They suggest that interbreeding went on between the members of several ancient human-like groups living in Europe and Asia more than 30,000 years ago, including an as-yet unknown human ancestor from Asia.
“What it begins to suggest is that we’re looking at a ‘Lord of the Rings’-type world — that there were many hominid populations,” says Mark Thomas, an evolutionary geneticist at University College London who was at the meeting but was not involved in the work."Link to Original Source
ananyo writes "Nature has an article up about how crowdsourcing and social media were integrated as never before by the United Nations, the Red Cross and Doctors without Borders in the relief efforts following typhoon Haiyan:
After typhoon Haiyan smashed into the Philippines on 8 November, an army of volunteers mobilized and worked around the clock to help guide relief efforts. But these were no boots on the ground. Instead, they were citizens from around the world who quickly analysed satellite imagery and other data, generating maps to provide relief agencies with invaluable crowdsourced information.
Crowdsourced disaster response, until a few years ago informal and often haphazard, is now getting more organized, and is being embraced by official humanitarian organizations and integrated into relief operations. Volunteer efforts have multiplied thanks to the arrival of online mapping tools, the increasing popularity of social networks such as Twitter and Facebook, and the spread of mobile phones. A suite of volunteer groups are emerging that contribute to disaster response in tight coordination with conventional relief organizations."Link to Original Source
ananyo writes "Research on Borrelia burgdorferi, the bacterium that causes Lyme disease, shows that the capacity to evolve can itself be the target of natural selection.
B. burgdorferi can cause a chronic infection even if its animal host mounts a strong immune response — evading those defences by tweaking the shape and expression of its main surface antigen, VlsE. A series of unexpressed genetic sequences organized into ‘cassettes’ recombine with the VlsE gene, changing the resulting protein such that it escapes detection by the host’s immune system.
The researchers studied the molecular evolution of the cassettes’ genetic sequences in 12 strains of B. burgdorferi. They found that natural selection seemed to favour bacteria with more genetic variability within their cassettes, and hence a greater capacity to generate different versions of the antigen.
“Greater diversity among the cassettes in itself shouldn’t be a selective advantage considering they aren’t expressed and don’t do anything else,” says lead author Dustin Brisson. “But we did find evidence of selection, so the question is what else could it be for besides evolvability?”"Link to Original Source
ananyo writes "One of the cornerstones of quantum theory is the principle that you cannot measure any property of an object without affecting the object itself. Physicists, however, have now devised a way to detect single photons of visible light without changing any of the information that they carry. Others had done the same with microwave photons, but this is the first time that it has been done in the part of the spectrum that could matter for a future 'quantum Internet'."Link to Original Source
ananyo writes "When Europe’s Large Hadron Collider (LHC) started up in 2008, particle physicists would not have dreamt of asking for something bigger until they got their US$5-billion machine to work. But with the 2012 discovery of the Higgs boson, the LHC has fulfilled its original promise — and physicists are beginning to get excited about designing a machine that might one day succeed it: the Very Large Hadron Collider (VLHC).
The giant machine would dwarf all of its predecessors (see ‘Lord of the rings’). It would collide protons at energies around 100 teraelectronvolts (TeV), compared with the planned 14 TeV of the LHC at CERN, Europe’s particle-physics lab near Geneva in Switzerland. And it would require a tunnel 80–100 kilometres around, compared with the LHC’s 27-km circumference. For the past decade or so, there has been little research money available worldwide to develop the concept. But this summer, at the Snowmass meeting in Minneapolis, Minnesota — where hundreds of particle physicists assembled to dream up machines for their field’s long-term future — the VLHC concept stood out as a favourite."Link to Original Source
ananyo writes "The plague of non-reproducibility in science may be mostly due to scientists’ use of weak statistical tests, as shown by an innovative method developed by statistician Valen Johnson, at Texas A&M University. Johnson found that a P value of 0.05 or less — commonly considered evidence in support of a hypothesis in many fields, including social science — still means that as many as 17–25% of such findings are probably false. He advocates that scientists use more stringent P values of 0.005 or less to support their findings, and thinks that the use of the 0.05 standard might account for most of the problem of non-reproducibility in science — even more than other issues, such as biases and scientific misconduct."Link to Original Source
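Johnson's own analysis uses his 'uniformly most powerful Bayesian tests' to calibrate P values against Bayes factors; a rough sketch of why P = 0.05 is such weak evidence can be had from the older Sellke–Bayarri–Berger bound, which caps the Bayes factor any p-value can provide against the null. This is an illustrative substitute for Johnson's method, not his calculation, and it assumes 50:50 prior odds between the null and the alternative:

```python
import math

def bayes_factor_bound(p):
    """Sellke-Bayarri-Berger upper bound on the Bayes factor
    (alternative vs. null) implied by a p-value; valid for p < 1/e."""
    assert 0 < p < 1 / math.e
    return 1.0 / (-math.e * p * math.log(p))

def min_posterior_null(p, prior_null=0.5):
    """Lower bound on P(null is true | data), i.e. the chance the
    'finding' is false, given the stated prior probability of the null."""
    bf = bayes_factor_bound(p)                    # best case for the alternative
    prior_odds = prior_null / (1 - prior_null)    # odds in favour of the null
    post_odds_null = prior_odds / bf              # Bayes' rule on the odds scale
    return post_odds_null / (1 + post_odds_null)

for p in (0.05, 0.005):
    print(p, round(min_posterior_null(p), 3))
# A p-value of 0.05 leaves at least a ~29% chance the null is true,
# while 0.005 drives that floor down to under 7%.
```

Even this best-case bound lands in the same ballpark as Johnson's 17–25% figure for P = 0.05, and shows why tightening the threshold to 0.005 changes the picture so sharply.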
ananyo writes "Key members of the US House of Representatives are seeking to require the National Science Foundation (NSF) to justify every grant it awards as being in the “national interest”. The proposal, included in a draft bill from the Republican-led House Committee on Science, Space, and Technology and obtained by Nature, would force the NSF to document how its basic science grants benefit the country.
The requirement is similar to one in a discussion draft circulated in April by committee chairman Lamar Smith (Republican, Texas). At the time, scientists raised concerns that ‘national interest’ was defined far too narrowly. The current draft bill provides a more expansive definition that includes six goals: economic competitiveness, health and welfare, scientific literacy, partnerships between academia and industry, promotion of scientific progress, and national defence.
But many believe that predicting the broader impacts of basic research is tantamount to gazing into a crystal ball. “All scientists know it’s nonsense,” says John Bruer, president of the James S. McDonnell Foundation and former co-chair of an NSF task force that examined requiring scientists to state the 'broader impacts' of their work in grant applications."Link to Original Source
ananyo writes "First came reports of earthquakes caused by hydraulic fracturing and the reinjection of water during oil and gas operations. Now US scientists are reporting that tremors may have been caused by the injection of carbon dioxide during oil production.
The evidence centres on a sudden burst of seismic activity around an old oil field in the Permian Basin in northwest Texas. From 2006 to 2011, after more than two decades without any earthquakes, seismometers in the region registered 38 tremors, including 18 larger quakes ranging from magnitude 3 to 4.4, scientists report in the Proceedings of the National Academy of Sciences. The tremors began just two years after injections of significant volumes of CO2 began at the site, in an effort to boost oil production.
“Although you can never prove that correlation is equal to causation, certainly the most plausible explanation is that [the tremors] are related to the gas injection,” says Cliff Frohlich, a seismologist at the University of Texas Institute for Geophysics in Austin, who co-authored the study."Link to Original Source
ananyo writes "A US team that claims to have built the world’s most sensitive dark matter detector has completed its first data run without seeing any sign of the stuff. In a webcast presentation today at the Sanford Underground Laboratory in Lead, South Dakota, physicists working on the Large Underground Xenon (LUX) experiment said they had seen nothing statistically compelling in 110 days of data-taking. “We find absolutely no events consistent with any kind of dark matter,” says LUX co-spokesman Rick Gaitskell, a physicist at Brown University in Providence, Rhode Island.
Physicists know from astronomical observations that 85% of the Universe’s matter is dark, making itself known only through its gravitational pull on conventional matter. Some think it may also engage in weak but detectable collisions with ordinary matter, and several direct detection experiments have reported tantalizing hints of these candidate dark matter particles, known as WIMPs (Weakly Interacting Massive Particles). Gaitskell says that it is now overwhelmingly likely that earlier sightings were statistical fluctuations.
Despite the no-shows at XENON-100 and LUX, Laura Baudis, a physicist on XENON-100 at the University of Zurich in Switzerland, says physicists are not ready to give up on the idea of detecting WIMPs. They may simply have a lower mass, or may be more weakly interacting than originally hoped. “We have some way to go,” she says."Link to Original Source
ananyo writes "He founded two genetic-sequencing companies and sold them for hundreds of millions of dollars. He helped to sequence the genomes of a Neanderthal man and James Watson, who co-discovered DNA’s double helix. Now, entrepreneur Jonathan Rothberg has set his sights on another milestone: finding the genes that underlie mathematical genius.
Rothberg and physicist Max Tegmark, who is based at the Massachusetts Institute of Technology in Cambridge, have enrolled about 400 mathematicians and theoretical physicists from top-ranked US universities in a study dubbed ‘Project Einstein’. They plan to sequence the participants’ genomes using the Ion Torrent machine that Rothberg developed.
Critics say that the sizes of these studies are too small to yield meaningful results for such complex traits. But Rothberg is pushing ahead. “I’m not at all concerned about the critics,” he says, adding that he does not think such rare genetic traits could be useful in selecting for smarter babies.
Some mathematicians, however, argue that maths aptitude is not so much born as made. “I feel that the notion of ‘talent’ may be overrated,” says Michael Hutchings, a mathematician at the University of California, Berkeley."Link to Original Source
ananyo writes "Using data pulled from online genealogy sites, a renowned ‘genome hacker’ has constructed what are probably the biggest family trees ever assembled. The researcher and his team now plan to use the data — including a single uber-pedigree comprising 13 million individuals, which stretches back to the 15th century — to analyse the inheritance of complex genetic traits, such as longevity and facial features.
In addition to providing the invitation list to what would be the world’s largest family reunion, the work presented by computational biologist Yaniv Erlich at the American Society of Human Genetics annual meeting in Boston could provide a new tool for understanding the extent to which genes contribute to certain traits. The pedigrees have been made available to other researchers, but Erlich and his team at the Whitehead Institute in Cambridge, Massachusetts, have stripped the names from the data to protect privacy."Link to Original Source
ananyo writes "Allen Nicklasson has had a temporary reprieve. Scheduled to be executed by lethal injection in Missouri on 23 October, the convicted killer was given a stay of execution by the state’s governor, Jay Nixon, on 11 October — but not because his guilt was in doubt. Nicklasson will live a while longer because one of the drugs that was supposed to be used in his execution — a widely used anaesthetic called propofol — is at the centre of an international controversy that threatens millions of US patients, and affects the way that US states execute inmates.
Propofol, used up to 50 million times a year in US surgical procedures, has never been used in an execution. If the execution had gone ahead, US hospitals could have lost access to the drug because 90% of the US supply is made and exported by a German company subject to European Union (EU) regulations that restrict the export of medicines and devices that could be used for capital punishment or torture.
This is not the first time that the EU’s anti-death-penalty stance has affected the US supply of anaesthetics. Since 2011, a popular sedative called sodium thiopental has been unavailable in the United States.
“The European Union is serious,” says David Lubarsky, head of the anaesthesiology department at the University of Miami Miller School of Medicine in Florida. “They’ve already shown that with thiopental. If we go down this road with propofol, a lot of good people who need anaesthesia are going to be harmed.”"Link to Original Source