Slashdot is powered by your submissions, so send in your scoop

Comment Re:Big bags of water... that's what we are. (Score 0) 156

Yes, if you had a self-sustaining moon colony it would survive most global disasters. But so would a Mars colony, and a Mars colony would be much easier to establish. Once you have boosted out of Earth's gravity well, the difference between going to Luna and going to Mars is minimal, especially for non-living supplies that don't require life support.

Comment Re:Big bags of water... that's what we are. (Score 0) 156

We have landers now that can soft-land thousands of pounds at once. Much heavier payloads can be balloon-bounce landed at higher shock forces, and many items, including food and reserve oxygen, could withstand such forces. As to how much... you can't have too much. Start sending the stuff there and keep sending it for 100 or 1,000 years, as much as is needed. There is no higher priority than the survival of the species. We spent $2 trillion on Iraq and got NOTHING for it. So if it costs $10 trillion or $100 trillion or $1,000 trillion to save the species, that would be a good deal.

Comment Big bags of water... that's what we are. (Score 3, Interesting) 156

The human body is a fragile bag of water, not well suited to radiation exposure, temperature extremes, changes in air pressure, high acceleration forces, or long periods of isolation from a sustaining biosphere. Almost anything that can be done in space is better done by robots. The ONLY reason for people to venture into space is to get to the surface of another habitable planet, the kind of place for which we evolved. And there is only one such place in reach: MARS!

Yes, there are good reasons for going to Mars. Greatest among them is to safeguard the species from any catastrophic impact on Earth that would extinguish us. We have the technology to colonize Mars now. To make it economical, colonization should be a one-way pioneering trip. Nobody comes back, ever. (I made this suggestion to NASA 17 years ago and was told that NASA does not do suicide missions. Now, many folks at NASA have come around to my point of view.)

Rep. Culberson has not learned the crucial lesson from the demise of the Apollo program... that political motivations for exploring space are not sustainable in the minds of a fickle constituency that wants to be entertained by a list of new "American Firsts in Space". Colonization of Mars requires the serious dedication of the best scientists of Earth to the mission of human survival.

Forget the moon. In terms of the fuel required to reach it on a one-way mission, it is not really much closer than Mars, and it has far less to offer as a base for a new sustainable human civilization. (Although I'm sure it would make a nice military base from which to shoot stuff at Earth.) The fact that Rep. Culberson is talking about returning to the moon is the best indication that he is not a serious thinker about why NASA should be involved in human space travel.

Comment Revolution (Score 4, Insightful) 628

The last line of the article is the most important. " Otherwise, people will find a solution — human beings always do — but it may not be the one for which we began this technological revolution." The word "revolution" here is a double entendre. He is saying something you will not often read in the Harvard Business Review: that automation is going to destabilize society to an extreme extent.

mbone, you are exactly correct that this is about concentration of wealth. The concentration of wealth is self-limiting because nobody will have money to buy the goods being produced. But conditions of life at the limit will be unbearable. So a new mechanism to distribute the bounty of society will have to be developed. But the rich will not recognize that until the mobs with pitchforks are breaking into their gated communities.

Comment Mr Barone: (Score 2) 417

1) already rich --
So what? I never said his motivation was personal profit. There are many motivations to be self-serving.

2) the head of an extremely well-funded (Paul Allen money) NON-PROFIT, with the business model of "let's try to do some cutting edge AI research with open source code"
So what? He wants, for whatever reason, to make AIs. Is he ignoring the fact that programmers (and, since this is open source, everyone will have the source code) will decide to intentionally make the AI autonomous? In fact, I'm pretty sure someone will be working on autonomy long before the AI actually becomes functional.

and 3) an actual world-class expert in the field, rather than a smart person prognosticating about something he only casually understands
He is an expert in something that does not yet exist? That's ridiculous. There is clearly a real danger here. That does not necessarily mean stop, but it certainly does mean that extreme caution is warranted.

Comment Until the angry mob.... (Score 0) 162

figures out the algorithm. The problem here is that the mob has already figured out how to abuse hashtags. So how long would it take for the mob to also figure out that it can take over a hashtag by making "thoughtful" ratings on the mob's favorite meme? The mob can iterate this process faster than the tweet masters can fix it.

Comment And another word for "Darwinian Evolution" is: (Score 2) 417

"Evolution." There is nothing un-Darwinian about non-biological evolution. Natural selection applies to any variable system in which survival or propagation success can depend on modifications to the system. In fact, evolution in a self-aware AI could proceed at an exponentially higher rate because:
1. the generation time could be measured in milliseconds rather than decades, and
2. the AI could intentionally direct the changes to maximize success rather than depending on the MUCH slower process of chance mutation.
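Both points can be illustrated with a toy evolutionary loop. This is purely my own sketch, not anything from the post: the "genome," the bit-string target, and the `evolve` function are all invented for illustration. It shows a (1+1) evolutionary strategy where each generation takes microseconds, and where mutations are kept only if they improve fitness, i.e. the "directed change" the post contrasts with blind biological mutation.

```python
import random

# Toy objective (an assumption for this sketch): evolve a genome that
# matches a fixed target bit string.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

def fitness(genome):
    """Count of bits matching the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(generations=1000, seed=0):
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in TARGET]
    for _ in range(generations):
        # One "generation" per loop iteration: microseconds, not decades.
        child = genome[:]
        child[rng.randrange(len(child))] ^= 1  # flip one random bit
        # Directed selection: accept the change only if it does not hurt,
        # rather than leaving its fate to chance as in biology.
        if fitness(child) >= fitness(genome):
            genome = child
        if genome == TARGET:
            break
    return genome

print(evolve())
```

Even this crude greedy scheme converges on an 8-bit target in far fewer iterations than the generation budget allows, which is the post's point about speed: the bottleneck in software evolution is compute, not reproductive lifespan.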

Comment I guess Elon Musk and Stephen Hawking (Score 5, Interesting) 417

didn't think of that. Two of the smartest people on the planet apparently just forgot to consider the blindingly obvious fact that programmers are not going to intentionally program AI to have its own agenda. Except that:
1. Some programmers at some point will try to program autonomy and
2. Shit happens

Musk and Hawking are clearly smart enough to have considered the autonomy argument and then DISCARDED it. I, for one, welcome our cybernetic overlords, but let's not pretend that AI autonomy is not a threat. Mr. Etzioni has his own self-serving reasons to pooh-pooh warnings that could interfere with his business model. And I am so happy that I finally got to use the term "pooh-pooh" in a /. post.
