Comment: Re: what will be more interesting (Score -1, Troll) 480

by tmosley (#49345581) Attached to: Jeremy Clarkson Dismissed From Top Gear
"this separation is taken very seriously"

Hahaha, it's like you were born yesterday. Or maybe 80 years ago. The West has degraded. No lever of power is left unattended. The independence of the media is a bald-faced lie, and has been for at least 14 years, and I suspect time will tell us that it was NEVER independent, and that it has always been controlled by intelligence organizations.

Comment: Re:Nuclear Disarmament is Idiotic (Score 1) 222

by tmosley (#49340875) Attached to: How Nuclear Weapon Modernization Undercuts Disarmament
Not really. There were plenty of wars in poor countries before nukes came about. More even than today. Today, small countries have to worry about stepping on the toes of nuclear powers when thinking about declaring war on their neighbors, which has cut the number of wars fairly dramatically.

Comment: Nuclear Disarmament is Idiotic (Score 3, Insightful) 222

by tmosley (#49337149) Attached to: How Nuclear Weapon Modernization Undercuts Disarmament
Nuclear weapons prevent wars between great powers with great success. Only a complete idiot or a warmonger of the most evil type would call for nuclear disarmament. Of course, one of those groups is very useful for the other.

If we didn't have all this nuclear non-proliferation nonsense, not only would the world be a peaceful place, but we'd have cheap, abundant nuclear power everywhere. There wouldn't be any "developing" countries--they would all be first world.

Trying to have wars in a world with nuclear weapons is like trying to have gangs of roving banditos in a nation where everyone carries around rifles and handguns. It's just not possible, and anyone who tries won't last very long.

Comment: Re:Agreed. (Score 1) 280

by tmosley (#49335989) Attached to: Steve Wozniak Now Afraid of AI Too, Just Like Elon Musk
Wow, you are a complete idiot. If you were in charge, we'd all be doomed.

You are certainly free not to think, or not to listen to some of the foremost thinkers in the field (i.e., the good people at LessWrong). Also, nice use of the race card. Never had that pulled on me in a discussion about AI before, but it's just as good a thought-killer there as it is everywhere else.

Comment: Re:Quantum Computing Required? (Score 1) 280

by tmosley (#49330339) Attached to: Steve Wozniak Now Afraid of AI Too, Just Like Elon Musk
Who said anything about modelling a human brain? We are talking about building an AGI, not an artificial human.

That said, neural nets do resemble the neural connections in animals. They aren't modeled by transistors directly (i.e., one transistor != one neuron, or one neural-net node); rather, the NN is simulated on a GPU.
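As a rough illustration of what "simulated on a GPU" means here, below is a minimal sketch (assuming PyTorch is installed; the layer sizes and data are arbitrary placeholders) in which each "neuron" is just a row of a weight matrix the graphics card multiplies, not a dedicated transistor:

```python
# Minimal sketch: a tiny fully connected net "simulated" on a GPU.
# Assumptions: PyTorch installed; a CUDA device is used if present, otherwise CPU.
# Layer sizes and input data are arbitrary placeholders.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Each "neuron" is a row in a weight matrix, not a dedicated piece of silicon.
net = nn.Sequential(
    nn.Linear(784, 128),  # 784 inputs -> 128 hidden "neurons"
    nn.ReLU(),
    nn.Linear(128, 10),   # 128 hidden -> 10 output "neurons"
).to(device)

x = torch.randn(64, 784, device=device)  # a batch of 64 fake inputs
with torch.no_grad():
    y = net(x)            # one forward pass, done as matrix math on the device
print(y.shape)            # torch.Size([64, 10])
```

The hardware never sees anything neuron-shaped; it just multiplies matrices, which is why one chip can stand in for millions of simulated connections.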

Comment: Re:Quantum Computing Required? (Score 1) 280

by tmosley (#49329795) Attached to: Steve Wozniak Now Afraid of AI Too, Just Like Elon Musk
"there is zero scientific basis for it."

I disagree. We are creating individual modules, getting computers to do things that until very recently only human minds could do. If we can make a neural net that can tag a picture with what's in it and what those things are doing, then cognition isn't really all that far behind. We just have to make the same leap that evolution did when humanity was born, only we will have it easier: unlike evolution, we act with deliberate purpose, and we now know which part of the brain is responsible for long-term planning. I suspect that is the missing link between what we have done so far, iterated out until it can carry out all the functions of the animal brain, and a complete AGI.

Also, you seem to completely miss the point of AGI. It isn't to create a special little snowflake homunculus, but rather to create an optimizer, a Great Optimizer that will remake the universe in our image, ensuring our survival and freedom at least until the heat death of the universe. Whether it feels feelings, pretends to feel feelings, or feels nothing and says so, that is all fine, so long as it carries out its purpose, and so long as that purpose is properly programmed with the long-term best interests of all humanity and other sentient beings at heart.

Comment: Re:Agreed. (Score 2) 280

by tmosley (#49329665) Attached to: Steve Wozniak Now Afraid of AI Too, Just Like Elon Musk
"rebel"

No, just the opposite. I think a strong AI will carry out its programming to the letter. The problem comes when it is given open-ended goals like "maximize the number of paperclips in your collection."

The need to fulfill such a task will drive it toward self-improvement and also cause it to eliminate potential threats to its end goal. Threats like, say, all of humanity.

Comment: Re:Ultracompetent robots (Score 3, Insightful) 111

by tmosley (#49328289) Attached to: Bring On the Boring Robots
Yes, in the sense that I, as a human, could interpret signals from a keyboard. Not nearly as efficient as the digital method.

There is now an AI which can be shown a picture (or a hundred trillion of them) and not only label what is in the picture (say, a little girl and a dog) but also identify what is going on in it (the little girl is playing with the dog). There is another that can look at a picture and identify the sentiment it expresses. There is yet another that can take a sample of writing and produce a fairly accurate, fairly reproducible psychological profile of its author.
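To make the first of those claims concrete, here is a minimal sketch of the "label what is in the picture" step using an off-the-shelf pretrained classifier (assumptions: torchvision is installed and a local file named photo.jpg exists; full scene descriptions like "the girl is playing with the dog" come from pairing a vision model like this with a language model):

```python
# Minimal sketch: labeling the objects in a photo with a pretrained ImageNet classifier.
# Assumptions: torchvision >= 0.13 installed; "photo.jpg" is a hypothetical local file.
# This covers only the "what is in the picture" step, not the scene description.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.DEFAULT
preprocess = weights.transforms()                 # resize, crop, normalize as the model expects
model = models.resnet50(weights=weights).eval()

img = preprocess(Image.open("photo.jpg").convert("RGB")).unsqueeze(0)  # batch of one
with torch.no_grad():
    probs = model(img).softmax(dim=1)

top5 = probs.topk(5)
labels = weights.meta["categories"]               # human-readable class names
for p, idx in zip(top5.values[0], top5.indices[0]):
    print(f"{labels[idx]}: {p:.2f}")              # e.g. "golden retriever: 0.87"
```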

Also note that you have again proven how little you actually know about the field by trivializing visual processing, comparing it to keyboard recognition. We are creating little pieces of brains here, but for some reason you don't understand that. I suspect it has something to do with your advancing age.

Comment: Re:Quantum Computing Required? (Score 1) 280

by tmosley (#49328141) Attached to: Steve Wozniak Now Afraid of AI Too, Just Like Elon Musk
My understanding was that quantum computing allows for massively parallel computation, not increased speed of communication, and certainly not an increase in efficiency. I.e., it's good for doing some tasks that are hard today, like cracking encryption, but it's no better at adding 2 and 2 than a regular computer, and maybe much worse.
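A back-of-the-envelope sketch of that distinction (assumptions: idealized query counts only, with Grover's search standing in for "cracking encryption"; constants and real-hardware overheads ignored):

```python
# Rough arithmetic: quantum search helps on a huge brute-force problem,
# but offers nothing for trivial arithmetic like 2 + 2.
# Assumptions: Grover's algorithm needs ~sqrt(N) queries vs ~N/2 classical guesses;
# constants and hardware overheads are ignored.
import math

N = 2 ** 64                          # size of a toy key space
classical_guesses = N // 2           # expected classical brute-force attempts
grover_queries = math.isqrt(N)       # ~sqrt(N) Grover iterations

print(f"classical search: ~{classical_guesses:.3e} guesses")
print(f"quantum (Grover): ~{grover_queries:.3e} queries")
print(2 + 2)                         # no speedup to be had here either way
```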
