Comment Re:Good points, plus ..... (Score 1) 786

(the Kennedy administration) was responsible for initiating the NSA/moon project

Well, that certainly puts the Apollo project into perspective.

"Tranquillity Intercept Station here... the cable has landed... That's one... small... step for an analyst; one... giant leap... for signals intelligence..."

"Beautiful, beautiful... magnificent information..."

Comment Re:We're not machines (Score 1) 401

Being free means that there's nothing else, typically heredity, environment, or coercion, that dictates how we must behave.

I'm not sure that that's actually a good definition of freedom. Everyone has a heritage and an environment, and everyone is coerced in many overt and covert ways by the society and economy in which they must function. We're shaped very strongly by our natural and built environment as well. Doesn't mean we're not "free"; but on the other hand, we're not totally detached either.

Why does your honest man refuse to lie? Was he taught as a child that lying was wrong? Did his parents set an inspiring example by refusing to lie when it would be to their advantage? Are these not causative factors in his upbringing that reduced his freedom?

It is a good point that unpredictability of behaviour is not necessarily the same as freedom. But I think freedom can only be defined in relation to some context. This person or environment is not coercing my behaviour at this moment: but my behaviour, even though internalised as what I believe to be a self-selected code of honour, may still be the direct result of events in my history that I didn't choose to experience.

Comment Re:Important Pedantic Correction (Score 1) 401

That does not preclude a general theorem prover from generating useful results on code about halting behavior (or any other behavior), it just can't answer every question about every piece of code, when given any possible input.

To be fair, neither can a human, given the same code.

Or are there indeed programs where a human programmer can intuit the behaviour of complex code without running it, but an algorithm can't?

Not entirely a silly question; after all, we still have human mathematicians and haven't managed to replace them all with Matlab scripts. What is it that the human mind is doing that our computers so far aren't?
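The point that a prover can be useful without being complete can be made concrete with a partial decision procedure: it answers "halts within N steps" for many inputs and honestly returns "unknown" when its step budget runs out. A minimal sketch (the Collatz iteration is just my stand-in for "code whose halting is hard to prove"):

```python
def collatz_steps(n, budget):
    """Return the number of steps for n to reach 1, or None when
    the step budget is exhausted: 'unknown', not 'loops forever'."""
    steps = 0
    while n != 1:
        if steps >= budget:
            return None  # honest partial answer
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(6, 100))  # 8: provably halts within budget
print(collatz_steps(27, 10))  # None: budget too small, verdict unknown
```

No general algorithm can replace every one of those None verdicts with a guaranteed yes/no for arbitrary programs, which is exactly what the halting theorem says; it doesn't stop the tool being useful on the cases it can decide.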

Comment Re:Sam Harris (Score 1) 401

Our actions either regress to prior causes and we are ultimately not responsible for them (you didn't control the circumstances of your birth or upbringing), or they arise from randomness inherent in a chaotic system; and we can't be held accountable for randomness either.

I'm not sure what precisely "can't be held responsible/accountable" means practically.

A white-tail spider might be a purely deterministic biological machine, but I'm still going to either squash it or (if I'm feeling merciful) catch it in a jar and toss it outside if it comes into my house. Because I know its behaviour is likely to injure me; I don't have to "hold it responsible" for its behaviour in some moral/spiritual sense in order to extrapolate its future actions from its present state, and intervene.

A botnet on my computer certainly is a completely deterministic machine, with not a shred of agency or accountability, and I'm going to squash it even harder than the spider, and with even fewer regrets. I don't consider it to have any kind of moral responsibility - but I know that it's a thing, that it exists, that it has an inside and an outside, that its inside includes certain predictable behaviours, and that those behaviours are hostile to my interests. I'm going to recognise it and judge it not for its metaphysical stack-backtrace but for what it is right now, and what it will do.

Why do we need to have any idea of moral "accountability" before we can judge and act on another human's behaviour? Inferring their current state from their past actions, and predicting from that state their future actions, seems to be enough for all practical purposes.

Granted, humans do have the ability to change their behaviour toward other humans, which to me is the entire point of not being harsh and hateful in our justice system; I'm in favour of forgiveness, but "they're not responsible for their actions" doesn't make any sense to me. Criminal justice is a clear-headed pragmatic matter of preventing people from doing bad things in the future - or becoming a cause of bad things in the future by way of inspiration - not metaphysical retroactive assignment of ultimate blame. Isn't it?

Comment Re:Siri doesn't have free will (Score 1) 401

If we had no free will we would have no need for Governments, armies, laws, etc.

Conversely, if we had totally free will we would also have no need for Governments, armies and laws, since all of this machinery is based around one group of humans limiting and controlling the expressed will of others; if it were impossible to control another's will, nobody would ever try.

To me the question is not 'do we have free will?' but 'how free is our will?' It's not a 0%/100% question. It's clear to me that we have some freedom of will. It's also clear to me that we do not have total freedom of will. We're not free to choose our race, birthplace, or parents; we're not free to choose many elements of the education that will form the contents of our thoughts. But neither are we completely controlled by external forces after our birth. From this limited freedom we each seem to evolve our own separate ideas and viewpoints, some more divergently than others. The paradox seems to lie in most formulations of the question wanting the answer to be 'yes' or 'no', when in reality it's somewhere in the middle.

Comment Re:appearing to have free will (Score 1) 401

"But is there really any difference between having free will and appearing to have free will?"

Are you sure? There's a fairly simple thought experiment:

Is there a difference between being able to write a novel, and appearing to be able to write a novel? That is, if the end product is actually a novel?

If you were given only a text file to read (and, e.g., unlimited Google Books access to make sure that it wasn't trivially plagiarised), would you be able to tell the difference between an "actual novel" and an "apparent but not-actual novel"? I mean, if the novel turned out to be better written than something by Dan Brown, and not obviously spambot gibberish?

At what point would you allow yourself to decide that "this is actually a novel, and not just something that appears to be identical to a novel, but isn't"?

Now extrapolate "novel" to "any behaviour". At what point does behaving identically to a person with free will start to become different from being a person with free will?

I for one don't understand how anyone can separate behaviour from being. If it looks like a duck and it quacks like a duck and every experimental test performed on it returns 'isduck=true'...
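In programming terms this is just duck typing: identity judged by observable behaviour rather than declared ancestry. A toy sketch (the class and function names are mine, purely illustrative):

```python
class Duck:
    def quack(self):
        return "quack"

class PerfectDuckSimulator:
    # shares no ancestry with Duck; only its behaviour matches
    def quack(self):
        return "quack"

def passes_duck_test(thing):
    """Judge by experiment: does it behave like a duck?"""
    try:
        return thing.quack() == "quack"
    except AttributeError:
        return False

print(passes_duck_test(Duck()))                  # True
print(passes_duck_test(PerfectDuckSimulator()))  # True
print(passes_duck_test(42))                      # False
```

From the outside, no experiment distinguishes the simulator from the "real" duck; the distinction only exists in metadata the test can't see.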

Comment Re:So what is this about? (Score 3, Insightful) 242

I guess that's why he's keen on embarrassing the US rather than say Russia or China.

Well, since he worked for the USA and didn't work for Russia or China, I'd imagine the number of insider documents he has about the intelligence services of Russia and China is zero.

"But why doesn't Jeff Bezos talk about Google's operations, hmm? Why is it always Amazon that he wants us to think about? What is it that he has to hide? He's obviously a Google double agent, isn't he?"

Comment Re:Why (Score 1) 143

Would anyone in their sane state want this:

"This ability would allow a quantum computer to decrypt many of the cryptographic systems in use today."

Nobody sane, no, but the NSA and GCHQ would love that. While lighting a cigar under the "no smoking next to the nuclear weapons" sign in the pool of suspicious green ooze at the abandoned military experiment base codenamed Icarus 13 that was formerly the Lovecraft House for Angry Psychic Orphans built on top of a desecrated Indian burial ground.
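For what it's worth, the quoted claim is about Shor's algorithm: a quantum computer can quickly find the multiplicative order r of a modulo N, and that order is all the classical post-processing needs to factor an RSA-style modulus. A toy sketch, with the quantum step replaced by brute force (fine for N = 15, hopeless at real key sizes):

```python
from math import gcd

def order(a, n):
    # smallest r > 0 with a**r % n == 1; the quantum part of Shor's
    # algorithm finds this quickly, here we just brute-force it
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def try_factor(n, a):
    """Classical post-processing of Shor's algorithm for one guess a."""
    g = gcd(a, n)
    if g != 1:
        return g                      # lucky guess already shares a factor
    r = order(a, n)
    if r % 2 or pow(a, r // 2, n) == n - 1:
        return None                   # bad guess: retry with another a
    return gcd(pow(a, r // 2, n) - 1, n)

print(try_factor(15, 7))  # 3, so 15 = 3 * 5
```

The whole scheme stands or falls on how fast you can compute order(); classically it's believed to be infeasible at key sizes, quantumly it's polynomial, hence the agencies' interest.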

Comment Re:Moo (Score 1) 438

I never understood why Star Trek ships had to establish a "standard orbit" to begin with. They have enormous amounts of power available along with the magic warp field. So why couldn't they keep themselves suspended in one spot above a planet, regardless of gravity?

Especially since they can apparently move from one planet to the next in a system in a matter of minutes - even using 'impulse engines' - which, if they were obeying standard Newtonian physics, would take days at best if they accelerated at 1G all the way. Well, I suppose they could be accelerating at many Gs, since they've got wacky warp drive inertial compensators, but at that point any pretense that 'impulse drive is just standard Newtonian chucking mass out the back' is long gone.
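The "days at best" figure is easy to check with a constant-1G brachistochrone: accelerate to the halfway point, flip, and decelerate, giving t = 2*sqrt(d/a). Taking 1 AU as a representative planet-to-planet hop (my assumption, since the show never specifies):

```python
import math

AU = 1.496e11   # metres
g = 9.81        # m/s^2

def brachistochrone_time(d, a):
    # accelerate for d/2, then decelerate for d/2: t = 2 * sqrt(d / a)
    return 2 * math.sqrt(d / a)

t = brachistochrone_time(AU, g)
print(t / 86400)  # roughly 2.9 days
```

About three days at 1G for one AU, versus minutes on screen, so the ships are pulling hundreds of Gs or impulse drive isn't Newtonian at all.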

Comment Re:Idiot pruf (Score 1) 228

Yes government should get involved in the design of routers, and write laws about software code vetting.

Yes. They should.

That is, if you want your router to be fit for the purpose for which it was sold rather than be a dangerous toy that gets your home network rooted and your bank account drained, your files seized, your webcam activated and used to take compromising photos which are then used for extortion...

Plus, your personal network becomes my problem if it gets rooted and used to launch botnet attacks at me. Computer network security is a public security issue, and that's a valid role for government.
