
Google To Predict Accuracy of Political Statements

pestario writes "Google CEO Eric Schmidt talks about a service which can give the probability of the accuracy of statements made by politicians, among other things. From the Reuters article, Schmidt says: 'We (at Google) are not in charge of truth but we might be able to give a probability.' Can Google's 'truth predictor' bring an end to sound bites and one-liners? I'm not holding my breath..."
  • by ndogg ( 158021 ) <the@rhorn.gmail@com> on Thursday October 05, 2006 @07:56AM (#16319105) Homepage Journal
    Someone should make a show out of that.

    Why do execs say such funny things away from their engineering teams? And why do I get the sneaking suspicion that some group at Google has actually figured out how to do this?

    Anyway, until this is beyond hype, I find the Annenberg Fact Check [factcheck.org] to be the most reliable source out there.
  • by t0xic@ ( 156547 ) on Thursday October 05, 2006 @08:06AM (#16319177)
    I guess Psychohistory is here! I just wish Isaac Asimov would have lived a bit longer.
  • by hal2814 ( 725639 ) on Thursday October 05, 2006 @08:21AM (#16319303)
    Is Google going to be backing up the true and false statements with sources? Furthermore, what sources are they going to use? How will they evaluate statements that are viewed as true by some sources but false by others? I don't know about you guys but I don't exactly trust Google to give me some sort of percentage true or false without justifying their position. I also don't entirely trust Google not to abuse such a position. Often the truth is what you make of it and I'm not so sure I'll buy into Google-branded truth. I think that researching what the politicians say yourself is your best line of defense in determining how much they lie.
  • by Anonymous Coward on Thursday October 05, 2006 @08:26AM (#16319341)
    Indeed they will. Ushering in a deluge of "Truth Engine Optimization" consultants. And anyone who doesn't have the money to hire them will be seen as less and less trustworthy. I think we all see this as a fool's errand.

    I find it interesting the reflection this shows of where we are with net content in the days of Search Engine Optimization. In utopian theory, the web is perfectly democratised content where anyone can post anything. The search engines are supposed to match users to sites based solely on the content of the sites, and rank them solely on relevance. However, in this age of Search Engine Optimization, it's possible for someone with enough knowledge and/or consultants to claw their way to the top of the pile, which (necessarily) is at the expense of less optimized sites that are more relevant. Which, of course, forces the more relevant sites to optimize themselves in return to "restore the balance."

    We're now somewhat a level abstracted from the utopian purity of "relevance" in Google (and other search engine) results, just as this new tool would abstract us away from the purity of actual "truth."
  • Won't mean anything (Score:3, Interesting)

    by Shotgun ( 30919 ) on Thursday October 05, 2006 @08:45AM (#16319579)
    Just like the politicians' statements.

    It was one of the Asimov books that talked about an area of science that analyzed politicians' statements. They analyzed a particular politician's two-hour speech and discovered he had not said anything. That is the art of politics: convincing people that you are on their side without making any promises.

    I predict the Google tool will predict 0% truth in most statements, because a prerequisite will be that something was stated.
  • by misanthrope101 ( 253915 ) on Thursday October 05, 2006 @09:56AM (#16320481)
    There are auditory and visual cues to detecting a lie
    I think there are auditory and visual cues to detecting a willful falsehood, but what about people who are sincere? I think VP Cheney means what he says, really, and he doesn't seem affected by what we call "reality." No matter how many CIA or Defense Department studies or reports contradict what he's saying, he still stays on-message. The more prominent a role religion plays in public life, the more frequently we see what I call "faith-based reality." People believe whatever the hell they want, and they don't consider fact, expertise, education, or even the glaringly obvious to threaten their worldview in any way. They're used to believing things based on their gut feeling, they've grown up in a culture where they're told to trust that inner voice and distrust "the secular world," and lo and behold, that's what they do. A lie detector isn't going to catch someone who sincerely believes something that isn't true.

    For example, parts of the country (the Bible Belt comes to mind) that rely more on abstinence-only education have a higher teen pregnancy rate, but that doesn't dissuade religious people from thinking that abstinence-only education is better. You don't have to collect data or analyze trends if you just know, and people who just know things based on their "conscience" aren't really lying. They're just using a kind of thinking that doesn't rely on objective reality. What's more, their confidence will actually be higher than "secularists," because the secular worldview always entails the awareness of our own fallibility, thus an element of self-doubt, which doesn't plague those who feel they are instruments of divine providence. They more sincerely and steadfastly believe in their faith-based reality than you do in your reality-based reality. So you'd be tripped up by your device long before they would be.

  • by Doc Ruby ( 173196 ) on Thursday October 05, 2006 @10:24AM (#16320951) Homepage Journal
    Of course they will. That "natural selection" of political speech in the media environment has (d)evolved it into the useless ruler of the meme pool now governing us.

    Before we ever actually produce "artificial intelligence", the machines will have taken over. Maybe we're better off, since politics is a job for computers, not humans, just like chess.
  • How do you tell a politician is lying? Easy, his lips are moving

    The more interesting question is how to tell when a search engine is lying.

    There seems to be an assumption that an algorithm is immune to "lying" because code is somehow objective. I think that's a naive position and an outright fallacy. A lie? Well, that would be a subjective judgment, wouldn't it?

    For one thing, the mere notion that you can reduce "accuracy" to a single number is questionable.

    How many people are happy in the US? Well, that depends on happiness, polling techniques, etc.

    How many people are unemployed? Well, that depends on the definition of unemployment. Does working at McD's count as employed if you were formerly a rocket scientist? Does not being on unemployment rolls count?

    Do we have a sound economy? How is Google going to rate that when experts presently disagree? Probabilities? Probabilities of what? A crash? Rich people losing money? Poor people? The strongest evidence you have that the answer Google says it will offer is likely to be inaccurate is the dimensionality of the response... if it returns a single response when there are many subjective answers, then that itself is evidence of bias.

    I seem to recall someone saying that the only real probability is 1 or 0, and everything else is a fiction we construct based on our belief that we have set up the problem with the correct analysis and independent variables. Google does not have independent variables at its disposal. Google has the world's largest set of interconnected variables, feeding back on each other. It's more likely that we will define what Google says to be true than find that Google is right, since Google's opinion will become accepted as truth and will then itself influence outcomes. Accuracy loses meaning in the presence of such a feedback loop.

    I could go on, and might do so in another forum, but fortunately some others (here [slashdot.org] and here [slashdot.org] and surely others) have done so. For now I'll just point to the old quote variously attributed to Twain or Disraeli [quotationspage.com]: There are three kinds of lies: lies, damned lies, and statistics.

    People's willingness to blindly turn to Google for the answer to everything borders creepingly (if not creepily) on religion, and Google is so enthralled with the fun it is having that it seems always to be pushing the line on what is ethically reasonable.

    The assessment of truth is one topic that we, as humans, should not outsource to machines. As soon as we believe machines can do that, we might as well all just execute a "shutdown" and wait until we're needed again.

    p.s. If you're wondering about my subject line, it's the title of a Star Trek episode in which a character asks "Is truth not truth for all?" As a child, I had learned from this episode that there was just one truth, not to be hidden. But on reflection, now older, I don't know that that's really true. Nor do I want to live in a world where a "child intelligence" (Google) is busy making the globally visible mistakes necessary to learn the next higher order truths about truth.
