Bitcoin

Submission + - World's first bitcoin endowment fund has launched (lifeboat.com)

Maria Williams writes: The March 2013 EU–IMF seizure of 40% of large bank accounts in Cyprus, accounts that held government-backed money, convinced the Lifeboat Foundation that the time for 21st-century money had arrived, prompting it to launch the world's first bitcoin endowment fund. Brian Cartmell will match the next 500 BTC in donations.
Science

Submission + - Battling Asteroids, Nanobots, and A.I. (nytimes.com)

Maria Williams writes: The Lifeboat Foundation is a nonprofit that seeks to protect humanity from catastrophic technology-related events. It funds research aimed at preventing scenarios in which technology runs amok, sort of like a pre-Fringe Unit.

The organization is looking into a wide range of areas, from artificial intelligence to asteroids. A particular interest is building shields, and lots of them, such as the Neuroethics Shield, intended "to prevent abuse in the areas of neuropharmaceuticals, neurodevices, and neurodiagnostics."

Space

Submission + - A Space Elevator in 7 Years (lifeboat.com)

Maria Williams writes: An interesting article on how a space elevator could be built within seven years.

The article ends with: "I can imagine that any effort like this would get caught up in a tremendous amount of international political wrangling that could easily add years on to the project. We should not let this happen, and we should remind each other that the space elevator is just the railroad car to space — the exciting stuff is the cargo inside and the possibilities out there. A space elevator is not a zero sum endeavor: it would enable lots of other big projects that are totally unfeasible currently. A space elevator would enable various international space agencies that have money, but no great purpose, to work together on a large, shared goal. And as a side effect it would strengthen international relations."

Security

Submission + - 10th anniversary of Y2K (lifeboat.com) 1

Maria Williams writes: The 2009 Lifeboat Foundation Guardian Award has been given to Peter de Jager on the tenth anniversary of Y2K, which he helped avert. The award recognizes his 1993 warning, which alerted the world to the potential disaster that might have occurred on January 1, 2000, and his efforts in the following years to create global awareness of the problem and its possible solutions. His presentations, articles, and more than 2,000 media interviews contributed significantly to the world's mobilization to avoid that fate.
Robotics

Submission + - Google and NASA back the Singularity University

Maria Williams writes: Google and NASA are throwing their weight behind a new school for futurists in Silicon Valley to prepare scientists for an era when machines become smarter than people.

The new institution, known as the "Singularity University," is to be headed by Ray Kurzweil.

Google and NASA's backing demonstrates the growing mainstream acceptance of Kurzweil's views, which include the claim that before the middle of this century artificial intelligence will outstrip human beings, ushering in a new era of civilization.

To be housed at NASA's Ames Research Center, a stone's throw from the Googleplex, the Singularity University will offer courses on biotechnology, nanotechnology, and artificial intelligence.
It's funny.  Laugh.

Submission + - Participants Riot at Foresight Unconference 1

Maria Williams writes: Found at the Lifeboat Blog: AP, Nov. 3 — The Foresight Unconference announced today that scientists have discovered that there is no world below 1 nanometer. "It's just space," said noted scientist Eric Drexler. "We don't know what to do!" Participants left dazed. Many decided to go back to philosophy, one attendee said.

"Like I said, It's full of stars!" said novelist Arthur C. Clarke, reached in Sri Lanka. "There's no there, there."

In a tragic note, famous author Ray Kurzweil was committed to an insane asylum, ranting, "The Singularity isn't near! This is horrible!!!"

Participants then staged a bonfire outside of Yahoo! headquarters, burning copies of The Singularity Is Near and Engines of Creation. "We've been totally misled!" screamed one woman, ripping her clothes off and jumping into the fire. "It's all over!!!"

"It's my fault," admitted conference organizer Christine Peterson. "By calling it the 'unconference', I cancelled out the entire science of nanotechnology."
Security

Submission + - Apocalypse Soon? Naval Group to Discuss Extinction 2

Maria Williams writes: Wired asks: Should the U.S. military be thinking more about asteroid shields, lifeshield bunkers, and antimatter weapon shields? Oh, and an alien shield.

If these defensive systems and catastrophic scenarios are something you feel the Navy should be pondering, visit the Lifeboat Foundation's plea for donations. The Lifeboat Foundation is dedicated "to helping humanity survive existential risks." The Chief of Naval Operations Strategic Studies Group contacted the foundation because it wants its future leaders to have the "opportunity to gain insights into the activities of the Lifeboat Foundation and have discussion about different programs you have to help 'safeguard humanity'."
Security

Submission + - James Martin is 2007 Guardian Award winner

Maria Williams writes: KurzweilAI.net reports: "The Lifeboat Foundation has presented its 2007 Guardian Award to Dr. James Martin in recognition of the achievements of his Future of Humanity Institute in studying global catastrophic risks and the impacts of future technologies.

The award is given annually to a respected scientist or public figure who has warned of a future fraught with dangers and encouraged measures to prevent them.

James Martin, author of 102 textbooks, founded the James Martin Institute for Science and Civilization, the James Martin 21st Century School at the University of Oxford, and The Future of Humanity Institute, which studies how anticipated technological developments may affect the human condition in fundamental ways, and how we can better understand, evaluate, and respond to radical change. FHI currently runs four interrelated research programs: human enhancement, global catastrophic risks, methodology and rationality, and impacts of future technologies."
Sci-Fi

Submission + - The Top Ten Transhumanist Technologies (instapundit.com)

Maria Williams writes: InstaPundit reports that transhumanists advocate the improvement of human capacities through advanced technology. Not just technology as in gadgets you get from Best Buy, but technology in the grander sense of strategies for eliminating disease, providing cheap but high-quality products to the world's poorest, improving quality of life and social interconnectedness, and so on. Technology we don't notice because it's blended into the fabric of the world, but whose absence we would immediately notice if it became unavailable. (Ever tried to travel to another country on foot?) Technology needn't be expensive — indeed, if a technology is truly effective, it will pay for itself many times over.

Transhumanists tend to take a longer-than-average view of technological progress, looking not just five or ten years into the future but twenty years, thirty years, and beyond. We realize that the longer you look forward, the more uncertain the predictions get, but one thing is quite certain: if a technology is physically possible and obviously useful, human (or transhuman!) ingenuity will see to it that it gets built eventually. As we gain ever greater control over the atomic structure of matter, our technological goals become increasingly ambitious, and their payoffs more and more generous. Sometimes new technologies even make us happier in a long-lasting way: the Internet would be a prime example.

This top 10 list was created by Michael Anissimov, who was featured in The Singularity Is Near.
