Comment Re:your friend still has the copyright (Score 1) 227

Yes, they have to comply if he asks a reasonable amount for a picture in a weekly with a circulation like Elsevier's. That doesn't mean that suing them would be profitable. On the other hand, being the defendant in a copyright infringement case would be bad publicity for Elsevier, which has much more to lose.

Comment your friend still has the copyright (Score 1) 227

Your friend gave permission to use the picture with proper attribution, and Elsevier isn't giving proper attribution. Your friend still holds the copyright on the picture, so he can offer Elsevier permission to use it in their weekly paper for, say, $100,-, provided they immediately start giving proper attribution on the website. If they don't pay, Elsevier may not distribute their weekly (that's what copyright means).

Comment Re:You can't trust Asians (Score 1) 132

With the net someone from anywhere has just as much access to all the information you'd need to learn how to do this. there's nothing special about the chinese, the russians or the americans, hackers come from everywhere.

There is something special about "the americans": a lot of them are rather monolingual. It is harder for a monolingual non-X speaker to crack a computer system run in language X than for a multilingual non-X speaker, and someone who speaks or understands some X has an even bigger advantage. Most people from Malaysia know Malay, Lin Mun Poo probably knew Chinese, and selling data in "a diner" probably requires some fluency in English.

Comment Re:Workable? (Score 1) 200

How will that work if, say, a European citizen complains that Facebook (based in the U.S.) has been mis-using their personal data?

Facebook removes the personal data: problem solved.

If they don't remove it, they will probably be sued for damages. They will lose, because they are breaking the law. The compensation will be high, because the damage from loss of privacy is high and Facebook was knowingly breaking the law. If they pay, more people will ask them to remove personal data. If they do remove it, the problem is solved. If they don't, they can't keep paying.

If they don't pay, Facebook will, in Europe, be treated like a criminal organization. Not the end of the world, but it does restrict the choice of holiday destinations.

Comment Re:What about other people's data about me? (Score 1) 200

That whole page untrue?

Not the whole page, but Lenin wasn't exactly known for giving only sound bites; he spoke for quite some time, and Trotsky and Kamenev didn't just stand there like statues during the whole speech. Using photographs without Trotsky and/or Kamenev is not the same thing as altering photos to remove them. Comments like "In this file the viewing direction of trotzki has been altered. In the original image he is viewing directly to the camera." kind of show my point: not only Trotsky's viewing direction has been altered, almost everyone has moved their head. ;-)

Comment Re:Agreed (Score 1) 200

[...] the people that care about having their data abused will stop using that company.

Perhaps your time is almost free, but most people don't want to check the entire chain of production for every cup of coffee or sandwich they buy. That's why they want some reasonable lower limits for hygiene. Having to read a "privacy policy", which the company can abuse anyway, raises the transaction costs too much.

Comment Re:What about other people's data about me? (Score 1) 200

That's US black propaganda. See for example http://commons.wikimedia.org/wiki/Commons:Administrators'_noticeboard/Archives/User_problems_14#User:Erik_Warmelink. The person who noted some obvious holes in the story of newseum.org (full disclosure: that would be yours truly) is now blocked indefinitely from Commons.

Comment Re:This fooled someone? (Score 1) 257

Quite a large part of this is how you define "intelligence".

Yep.

So for a chatbot not to have the ability to factorise large numbers (or in fact do any intense mathematical calculations) quickly doesn't necessarily mean it would be dumbed down.

It would; artificially removing a mental ability is dumbing down. You are wetware, aren't you?

Comment Re:This fooled someone? (Score 1) 257

The whole idea behind the chatbot is to pretend to be human. Teaching it to factorise quickly would be counterproductive.

The Turing test was designed to see if computers are intelligent. If chatbots have to act dumber than they are (acting as if they are as slow as wetware) to succeed, the test is flawed.

Comment Re:This fooled someone? (Score 1) 257

Valid. That would depend largely on the software, though.

But humans can't be programmed to factorize well; in fact, they are so bad at factorization that "intelligent human" is an oxymoron. A chatbot can be taught to factorize in a few minutes, while most humans don't understand it at all, and those few who do are awfully slow.

Comment Re:This fooled someone? (Score 1) 257

That said, that number wasn't too big, so a conventional computer should handle it just fine.

My point is that it is trivial to ask a question which weeds out the humans. The Turing test is terribly carbocentric: if the judge were a computer and had to tell which is the computer and which is the human, the judge would be done almost instantly.

And frankly, I consider being able to factorize 12010258260 a bit more a sign of intelligence than knowing the name of a candidate in some elections.
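To illustrate how little this asks of a machine: even the most naive algorithm, trial division, factorizes a number of this size in well under a millisecond on any modern computer. A minimal Python sketch (the function name is my own, purely illustrative):

```python
def trial_division(n):
    """Return the prime factors of n (with multiplicity) by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(12010258260))
# [2, 2, 3, 3, 3, 5, 7, 11, 13, 17, 1307]
```

So the answer the judge is looking for is 2^2 * 3^3 * 5 * 7 * 11 * 13 * 17 * 1307; a chatbot that claims it can't work that out within a few seconds is either lying or deliberately playing dumb.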

Comment Re:Motives (Score 2, Insightful) 260

I have to agree. I know a former State Department official who was relatively far up the chain and he's told me the same thing: People tend to vastly overestimate the capabilities of the US, particularly on the intelligence and global influence fronts.

I know a Secretary of State who told the UN Security Council that Iraq had weapons of mass destruction.

He lied.
