Defending Against Harmful Nanotech and Biotech

Maria Williams writes "KurzweilAI.net reported that this year's recipients of the Lifeboat Foundation Guardian Award are Robert A. Freitas Jr. and Bill Joy, who have both been proposing solutions to the dangers of advanced technology since 2000. Robert A. Freitas, Jr. has pioneered nanomedicine and the analysis of self-replicating nanotechnology. He advocates "an immediate international moratorium, if not outright ban, on all artificial life experiments implemented as nonbiological hardware. In this context, 'artificial life' is defined as autonomous foraging replicators, excluding purely biological implementations (already covered by NIH guidelines tacitly accepted worldwide) and also excluding software simulations, which are essential preparatory work and should continue." Bill Joy wrote "Why the future doesn't need us" in Wired in 2000, and with 2005 Guardian Award winner Ray Kurzweil he wrote the editorial "Recipe for Destruction" in the New York Times (reg. required), in which they argued against publishing the recipe for the 1918 influenza virus. In 2006, he helped launch a $200 million fund directed at developing defenses against biological viruses."
  • In summary... (Score:0, Insightful)

    by Anonymous Coward on Monday March 13, 2006 @11:17AM (#14907552)
    Pompous blowhard who no longer does any real research gives award to two other pompous blowhards who also no longer do any real work.
  • Pandora's Box (Score:5, Insightful)

    by TripMaster Monkey ( 862126 ) * on Monday March 13, 2006 @11:18AM (#14907558)
    From TFS:
    He [Robert A. Freitas, Jr.] advocates "an immediate international moratorium, if not outright ban, on all artificial life experiments implemented as nonbiological hardware."
    Sorry to disagree with a Lifeboat Foundation Guardian Award winner, but this approach is doomed to failure. Every prohibition creates another underground. If a moratorium or ban is imposed, then only the people with contempt for the ban will be the ones doing the research...and these are precisely the people who are more apt to unleash something destructive, either accidentally or maliciously.

    A moratorium or ban is the worst possible thing we could do at this juncture. The technology is available now, and if we want to be able to defend ourselves against the problems it can cause, we have to be familiar enough with it to be able to devise a solution. Burying our heads in the sand will not make this problem go away. Like it or not, Pandora's Box is open, and it can't be closed again...we have to deal with what has escaped.
  • Obviously... (Score:5, Insightful)

    by brian0918 ( 638904 ) <brian0918.gmail@com> on Monday March 13, 2006 @11:19AM (#14907575)
    Obviously, to stop potential misuse of advancing technology, we must stop technology from advancing, rather than stop those who are likely to misuse it from having access to it and the power to misuse it...
  • Re:Pandora's Box (Score:4, Insightful)

    by Daniel Dvorkin ( 106857 ) * on Monday March 13, 2006 @11:24AM (#14907597) Homepage Journal
    Bingo. I was particularly amused (in a sad, laughing-because-it-hurts way) by the note that Joy "helped launch a $200 million fund directed at developing defenses against biological viruses" -- this is kind of like the X-Prize foundation calling for a moratorium on development of rocket engines.
  • by fishdan ( 569872 ) * on Monday March 13, 2006 @11:37AM (#14907720) Homepage Journal
    The thing is, if you read Why the future doesn't need us [wired.com], or even think about it a little, the possibility of killing machines being a real threat to humanity is not that far-fetched.

    We have done a good job (IMHO) of keeping our nuclear power plants relatively safe, but that's mainly because the kid down the street can't build a nuclear power plant. But he can build a robot [lego.com].

    And imagine the robot you could build now with the resources of a rogue state. Or even a "good" state worried about its security. Now imagine what they'll be able to build in 20 years. I could easily imagine Taiwan thinking that a deployable, independent (not remotely controlled) infantry-killing robot might make a lot of sense for them in a conflict with China. And Taiwan clearly has the ability to build state-of-the-art stuff.

    I'm not a Luddite, and I'm not even saying don't make killer robots. I'm just saying that, just as the guys working on The Manhattan Project [amazon.com] were incredibly careful (in fact, a lot of their genius lies in the fact that they did NOT accidentally blow themselves up), programmers working on the next generation of devices need to realize that there is a very credible threat that mankind could build a machine that could malfunction and kill millions.

    There is no doubt in my mind that within 20 years, the U.S. Military will deploy robots with the ability to kill in places that infantry used to go. Robots would seem very likely to be incredibly effective as fighter pilots as well. Given these things as inevitable, isn't it prudent to be talking NOW about what steps are going to be taken to make sure that we don't unleash a terminator? I personally don't trust governments to be good about this either -- I'd like to make sure that the programmers are at least THINKING about these issues.

  • Re:Pandora's Box (Score:5, Insightful)

    by Otter ( 3800 ) on Monday March 13, 2006 @11:48AM (#14907836) Journal
    Every prohibition creates another underground. If a moratorium or ban is imposed, then only the people with contempt for the ban will be the ones doing the research...and these are precisely the people who are more apt to unleash something destructive, either accidentally or maliciously.

    In fact, early on in the development of recombinant DNA research, there was a voluntary moratorium until appropriate ethical and safety methods were put in place. Those measures were enacted in an orderly, thought-out way, research started up again and it turned out that the fears were wildly exaggerated.

    If a moratorium or ban is reasonably short-term and includes all serious researchers (voluntarily or through law), there's no reason why it can't be effective. Your vision of an underground is true for products like alcohol and marijuana, not for truly cutting edge research. There's no underground to do things that are genuinely difficult.

    (Not, by the way, that I'm saying there should be such a ban.)

  • Re:Pandora's Box (Score:4, Insightful)

    by Scarblac ( 122480 ) <slashdot@gerlich.nl> on Monday March 13, 2006 @12:11PM (#14908061) Homepage

    Close - it's like people who are so enthusiastic about the prospects of space travel, that they believe quantum warp megadrives may well be invented within a few months! And society isn't quite ready for that (perhaps in a couple of years?) - so we'd better call for a moratorium!

    In another post you called them Luddites, I think they're just about the total opposite of that. These are the names you always see in the forefront of strong AI and nanotech speculation, the fringe that would be the lunatic fringe if they weren't so ridiculously intelligent. Does KurzweilAI.net [kurzweilai.net] look like a Luddite site to you?

  • A dose of reality (Score:5, Insightful)

    by LS ( 57954 ) on Monday March 13, 2006 @12:13PM (#14908082) Homepage
    Why don't we start making regulations for all the flying car traffic while we're at it? How many children and houses have to be destroyed in overhead crashes before we do something about it? And what about all the countries near the base of the space elevator? What if that thing comes down? I certainly wouldn't want that in MY backyard. How about:

    * Overpopulation from immortality
    * Quantum computers used to hack encryption
    * Dilithium crystal pollution from warp drives

    Come on! Are you aware of the current state of nano-tech? We've got nano-bottle brushes, nano-gears, nano-slotcar motors, nano-tubes. In other words, in the way of nano-progress we've got zilch. We are a LONG FUCKING WAY from any real problems with this tech; in fact, we're so far off that we will likely encounter problems with other technology before nanotech ever bites us. Worrying about this is like worrying about opening a worm-hole and letting dinosaurs back onto the earth because some physicist wrote a book about time-travel.

    We've got a few dozen other issues 1000 times more likely to kill us. Sci-Fi fantasy is an ESCAPE from reality, not reality itself.
  • You underestimate (Score:3, Insightful)

    by RealProgrammer ( 723725 ) on Monday March 13, 2006 @12:32PM (#14908260) Homepage Journal
    the power of Chance.

    Sooner or later, all numbers come up.
  • by elucido ( 870205 ) on Monday March 13, 2006 @12:33PM (#14908275)
    Rogue states? No, rogue individuals are what we have to worry about.
    You have to worry about terrorists of the future getting hold of this. It's debatable whether there are any true "rogue" states, as communist states are sanctioned and isolated. North Korea is a threat, but China has influence over North Korea, and it's not in China's best interest to allow North Korea to go terrorist. I don't think we have to worry about the Middle East anymore; it is being liberated as we speak, and by the time this technology comes along, the Middle East will be as democratic as Japan.

    The war on terrorism is necessary to PREVENT people from abusing these kinds of technology. INDIVIDUALS, not rogue states. You talk about states as if states aren't made up of people.

  • by dpilot ( 134227 ) on Monday March 13, 2006 @12:38PM (#14908317) Homepage Journal
    Asimov's Three Laws were always nifty tools for fiction, and certainly provided grounds for constructing interesting plots.

    But the hard point about the Three Laws, and the short shrift given to implementing them, is that it would be *hard* to do. At the most elementary level, *how* do you recognize a human being? How do you tell one from a robot or a mannequin, so that when there's imminent danger you go right past them and save the amputee with bandages on his face? How do you evaluate whether the orders given by one human won't cause harm to another? "We're testing this rocket against that abandoned building; shoot it," when the so-called abandoned building is actually in use.

    Or an even simpler problem - recognizing and interpreting a spoken command.

    Killer Robots are a heckuva lot easier to create than ones that will obey the Three Laws.
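
    To make the point concrete, here's a toy sketch (mine, not anything a real robot runs; every score and threshold below is invented for illustration) of how a naive "is this a human?" check fails on exactly the cases above:

        # Toy illustration only: invented scores, invented threshold.
        from dataclasses import dataclass

        @dataclass
        class Entity:
            name: str
            skin_like: float   # 0..1: how human the surface looks
            moves: bool        # shows autonomous movement
            speaks: bool       # produces speech-like sound

        def looks_human(e: Entity, threshold: float = 0.7) -> bool:
            # Naive rule: "human" if it looks human enough AND moves or speaks.
            return e.skin_like >= threshold and (e.moves or e.speaks)

        cases = [
            Entity("bandaged amputee", skin_like=0.3, moves=False, speaks=True),
            Entity("store mannequin", skin_like=0.9, moves=False, speaks=False),
            Entity("talking android", skin_like=0.9, moves=True, speaks=True),
        ]
        for e in cases:
            print(f"{e.name:18s} -> human? {looks_human(e)}")
        # The real human (the amputee) fails the check, while the android
        # passes it. First Law compliance is only as good as this classifier.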
  • Re:In summary... (Score:5, Insightful)

    by Jerf ( 17166 ) on Monday March 13, 2006 @01:54PM (#14909030) Journal
    If that were possible, don't you think that evolution would have come up with it already?

    The rest of your argument is good, but this is not a valid point. Evolution can only progress from point to point in the space of possible life forms in very small increments, when measured appropriately. (Earth evolution only, for instance, uses DNA, so Earth evolution can be measured fairly accurately by "DNA distance", but technically that's just a small part of the life-form space.)

    There are, presumably, life forms that are possible, but can not be evolved to, because there is no path from any feasible starting life form to the life form in question by a series of small steps. Presumably, given the huge space of "possible life forms", the vast majority in fact belong to this class, just as the vast majority of "numbers" aren't integers (although not with the same ratio; presumably the set of viable life forms is finite, if fuzzy).

    It is entirely possible that a "grey goo" machine, which would fulfill most definitions of life, can't be incrementally evolved to, yet it could still exist. It is also possible that it could be evolved to, but simply hasn't yet.

    For all the complexity that evolution has popped out, it has explored an incomprehensibly small portion of the space of possible life forms.
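
    As a back-of-the-envelope illustration (mine; the parameters are arbitrary), a process limited to single-bit "mutations" visits a vanishingly small fraction of the space of possible genomes, no matter how long it runs:

        # Toy model: 64-bit "genomes", one random bit-flip per generation.
        import random

        rng = random.Random(42)
        n_bits = 64
        genome = [0] * n_bits                    # a fixed starting life form
        visited = set()
        for _ in range(100_000):                 # 100,000 generations
            visited.add(tuple(genome))
            genome[rng.randrange(n_bits)] ^= 1   # one small incremental step
        total = 2 ** n_bits                      # full space of possible genomes
        print(f"visited:  {len(visited):,}")
        print(f"possible: {total:,}")
        print(f"fraction: {len(visited) / total:.1e}")   # on the order of 1e-15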
  • by Kadin2048 ( 468275 ) <slashdot.kadin@xox y . net> on Monday March 13, 2006 @02:01PM (#14909103) Homepage Journal
    I think that you are forgetting that there are people out there -- some quite intelligent people, actually -- who can be totally aware and cognizant of the fact that they're doing something immoral, and do it anyway. In the extreme, we call them sociopaths, but even "average" people will do amoral things if they're compensated correctly.

    There are lots of people who worked in weapons development who were probably good, law-abiding, go-to-church-on-Sunday people who believe killing is wrong; I'm willing to bet all of them probably thought they were moral.

    People have an incredible ability to compartmentalize their lives; you can try to indoctrinate some researcher that making a new strain of superflu or a bigger bomb is bad, and they'll still go in to work on Monday and do it. Maybe they'll develop a drinking problem, beat their wife, or get depressed, but I think they'd still do it. People work on projects because, aside from the tangible benefits (paycheck, nice car/house, etc.), it's a technical challenge. No amount of moralizing is going to make that less attractive to people who are really good at, and interested in, the subject matter.

    And you'd always have the quasi-sociopaths who just don't give a damn about morality and can happily say "yep, I'm building a bigger bomb, it's going to kill millions of people at once, but isn't it beautiful?" It doesn't matter what the government does to encourage or discourage them; they're going to do their thing one way or another. If they weren't given cubicles to fill in some US weapons-research lab, the brightest and most highly motivated would find someone else to do it for. (E.g., Gerald Bull, the guy behind the Iraqi 'Supergun,' before he was assassinated.) I would much prefer to have people like that working in a bunker somewhere in Nevada rather than in Manchuria.

    It's naive to think that the people who work on weapons would just work on vaccines or solving world hunger; I think you need to consider the possibility that many of them may enjoy their jobs, and do them with the full knowledge of what their inventions are used for.
  • by santiago ( 42242 ) on Monday March 13, 2006 @02:22PM (#14909291)
    As someone with a graduate degree in robotics from the largest robotics research center in North America, I find the concept of robots posing any sort of threat to more than a handful of humans at a time completely laughable for now and the foreseeable future. Even if we were to produce robots sufficiently competent to cause intentional lasting harm, it would only be at the behest of their controllers, due to the amount of maintenance required to keep them running. A self-maintaining, much less self-replicating, robotic threat of any sort is decades away, at a minimum. The current pinnacle of robotic deadliness is the cruise missile, a robotic suicide bomber that will kill you dead but in no way poses a threat to humanity as a whole.
  • by ajnsue ( 773317 ) on Monday March 13, 2006 @02:37PM (#14909420)
    Is there a clever name for intelligent people who feel compelled to voice an opinion outside their field of expertise?
    I want to hear more people admit they are not qualified to comment authoritatively on important issues. I've got a really good mechanic, but I don't ask his opinion on my termite problem, even though I am sure he may have some better-than-average insight.
    Unfortunately, our celebrity-obsessed culture reinforces this problem by churning the same pot of opinion and viewpoint. What does Oprah think about the Middle East? What does Ray Kurzweil think about biotechnology?
    Yes, yes, they are smart. But I would like smart people to defer to other smart people. There are no one-stop geniuses anymore.
  • by Kadin2048 ( 468275 ) <slashdot.kadin@xox y . net> on Monday March 13, 2006 @03:28PM (#14909930) Homepage Journal
    Well that's not entirely what I meant. Ethical considerations will affect some people's actions, perhaps even most people's actions, so teaching (indoctrinating) them is worthwhile on a large scale. Teaching people that stealing and violence are wrong keeps society from falling apart overnight. However, that doesn't mean that stealing and violence don't occur.

    With something like weapons development, what I'm saying is that it's better to take the people who are going to make bigger and better killing machines (or viruses, or whatever) and give them a way to do it that minimizes the risk of those same weapons being used against us. Or really, do something to insulate the people who are going to make weapons from the people who are going to use them indiscriminately. Since you're not going to stop people from either, you can at least try to keep them far apart.

    Thus I'm quite happy to have the U.S. Government -- being a U.S. citizen, I prefer to be on the side with all the guns, of course your opinions can vary from mine -- keep the weapons researchers gainfully employed in our labs. If we did as some peaceniks occasionally suggest, and just stopped paying for it tomorrow, we'd have a situation like they had/have in Russia, where suddenly a lot of nuclear scientists/engineers are FedExing their resumes to North Korea and Iran.

    You may not think that the U.S. is necessarily the best entity to be in control of all those weapons, and I might even agree with you in theory, but there are a whole lot of worse people to have them.
  • by humina ( 603463 ) on Monday March 13, 2006 @05:59PM (#14911180)
    Individuals do not have the resources to develop and deploy weapons the same way that states do. People die on a mass scale because of rogue states, not rogue individuals. The more money that states pour into developing weapons, the more likely they are to use them. The US has developed and used many different chemical, biological, and nuclear weapons, and I could easily see the US developing more devastating and lethal weapons and then using them.

    "the middle east will be as Democratic as Japan."

    We can all dream. I'm hopeful this will happen but skeptical that it will.

    "The war on terrorism is necessary to PREVENT people from abusing these kinds of technology. INDIVIDUALS, not rogue states."

    The war on terrorism is going so well that Osama is still not caught after about 4.5 years. That war is not going well. US operations in Iraq act as a wonderful recruitment tool for terrorist organizations. Preventing someone from becoming a terrorist is all about winning hearts and minds, and we lost that battle a while ago. Terrorists don't use advanced technology to kill people. They use the guns and weapons the US sold them in the '60s, '70s, and '80s, or whenever we sold them weapons, or they use other methods, such as planes.

    A lot of people make the mistake of saying that the war on terrorism is working because we have not had a nuclear or biological attack in the US. That's like saying I've never seen flying pink cows because I sprinkled magic dust in the air; the dust must work, because I haven't seen one of those cows yet. The fact is that terrorists won't use nukes or biological weapons. They will use crude weapons, because that's all they can afford. You can't prevent someone from obtaining a crude weapon, so you HAVE to prevent that person from becoming a terrorist in the first place.

  • by Anonymous Coward on Tuesday March 14, 2006 @12:53AM (#14913520)
    ...blood feuds. When the axis of maximum profits kills a Muslim over there, ALL that guy's relatives want revenge. Unless the US and UK engage in total, 100% genocide, they will never "win". People who understand the Middle East tried to tell those neocon MORONS this before they started; a lot of their own hired analysts, civil and military, told them this, and they all got FIRED for talking out of turn. Of course, if you read those Trotskyite neocon nutjobs' previous essays and foreign policy papers, that was the PLAN, the "clash of the civilizations", and 9/11 was the EXCUSE they allowed to go down on purpose to implement this harebrained scheme. They are not only stupid, insane, and as completely loony as the raggiest-headed Muslim fundy, but they are murderers of the highest order, and traitors to boot.

    It will completely destroy the West over time, then the rest of the planet. It's the worst possible foreign policy gambit of the last several hundred years, bar none. You'll see it hitting hard over the next year as the phony economy continues to crumble; then you'll see the rest of the planet pulling away from the dollar; then you'll see them get really desperate and start as many wars as they think it will take to divert attention and stay in power. And when they try that stunt, a lot of the larger nations elsewhere on the planet will temporarily ally with each other and nuke the living snot out of the US and UK.

    It is the mother of all screwups.
