




Comment: With a decent connection (Score 1) 453

by binkzz (#42192599) Attached to: Over 1000 Volunteers For 'Suicide' Mission To Mars

I'd consider it if I had enough stuff to keep me occupied up there. At the least, a decent internet connection.

Although, it'd be kind of problematic trying to register on websites.

"Country: Eh.. United State of Mars?"
"Continent: Shit I don't know, should have paid better attention at the briefing."

What laws would I have to adhere to?

Could I legally pirate software?

And would it be illegal for people to download from my Mars server?

Would I be the king of Mars?

Comment: Re:Progress. (Score 1) 104

by binkzz (#40382381) Attached to: A Faster Jigsaw Solving Algorithm

Still don't get it? Why would we even allow it to evolve? These machines will ultimately be created and controlled by us; it would be almost trivial for us to control their evolution and decision-making processes. Why would we do the "let the AIs build smarter versions of themselves" thing to begin with? Why not just let them design smarter versions and then build them ourselves, after we put in the limitations we decide they need?

Because we don't have a choice. If we decide to build a new AI based on the instructions of the "old" AI with an IQ of, say, 500, there's no way we could be sure we could control it or know exactly how its decision-making process works. I think it's a little naive and optimistic to think we could control something several times smarter than us, even if we keep it caged.

Comment: Re:Progress. (Score 1) 104

by binkzz (#40368035) Attached to: A Faster Jigsaw Solving Algorithm

I think there are a number of key differences in drive. AI motivation is unlikely to work the same way as ours (or we're doomed from the start). But once they're smarter than us, we will have them design even smarter versions, and once they reach IQs of over 400, there's no way we could possibly hope to imagine what goes on inside their brains. We could be manipulated as easily as ants.

But I still think that if we're given enough time on this planet, super intelligent and fully autonomous AI is inevitable. It excites me more than it scares me.

Still, as a Christian, I don't believe we'll ever get that far. And maybe we shouldn't, either.

Comment: Re:Progress. (Score 1) 104

by binkzz (#40367731) Attached to: A Faster Jigsaw Solving Algorithm

I never really understood this kind of fear of 'artificial intelligence'. I mean, yes, we have all seen HAL 9000 and Skynet in the movies, but what I never understood about those (aside from why they thought it was a good idea to put both systems in full control of mission-critical stuff) was that they were supposed to be even remotely human-like. Even if we do create artificial intelligence, it'll be *nothing* like human intelligence. First of all, there is no reason to make a computer that might decide it does not want to do whatever it is you ask it to do. Secondly, this hypothetical AI would interact with and perceive the world in a completely different way from humans. I don't think there is any reason at all to fear the AI.

You must have misread me. I'm not afraid of AI. Given enough time we'll make an AI that's way smarter than us, and it'll likely be able to govern us much better and more fairly than we can govern ourselves. Or it will annihilate us completely, without any real way for us to defend ourselves.

Comment: Could be very useful (Score 1) 84

by binkzz (#39901053) Attached to: Researchers Push Implanted User Interfaces

Carrying implanted chips could be very useful, but it's also very scary and dangerous. What if we're forced by law to have these things inserted? We could be tracked and monitored continuously in the name of safety from terrorism. Though part of me knows chipping is inevitable: it is too powerful for governments and large corporations to resist.

" . . . and cause that as many as would not worship the image of the beast should be killed. And he causeth all, both small and great, rich and poor, free and bond, to receive a mark in their right hand, or in their foreheads: and that no man might buy or sell, save he that had the mark, or the name of the beast, or the number of his name." -- Revelation 13:15-17

Comment: Re:What the bloody goddamned fuck? (Score 1) 169

by binkzz (#39448939) Attached to: Hobbit Pub Saved By Actors Stephen Fry and Sir Ian McKellen

So why not tell them to stop doing that rather than trying to sue them out of existence?


He said: "When it's an established business, we like to get the company to acknowledge they are using our trademarks, stop selling infringing articles and then we will grant them a licence for a nominal fee - approximately $100 a year." It features characters from Tolkien's stories on its signs, has "Frodo" and "Gandalf" cocktails on the menu and the face of Lord of the Rings film star Elijah Wood on its loyalty card. A letter from SZC had asked the pub to remove all references to the characters.

They weren't sued out of existence, just asked to pay the licence fee and stop using trademarked items.
