OpenCyc 1.0 Stutters Out of the Gates

moterizer writes "After some 20 years of work and five years behind schedule, OpenCyc 1.0 was finally released last month. Once touted on these pages as "Prepared to Take Over World", the upstart arrived without the fanfare that many watchers had anticipated — its release wasn't even heralded with so much as an announcement on the OpenCyc news page. For those who don't recall: "OpenCyc is the open source version of the Cyc technology, the world's largest and most complete general knowledge base and commonsense reasoning engine." The Cyc ontology "contains hundreds of thousands of terms, along with millions of assertions relating the terms to each other, forming an upper ontology whose domain is all of human consensus reality." So are these the fledgling footsteps of an emerging AI, or just the babbling beginnings of a bloated database?"
  • by CrazyJim1 ( 809850 ) on Thursday August 10, 2006 @12:51PM (#15882130) Journal
    Cyc is only words and descriptors. If you attach them to 3D shapes and actions in a 3D world, the program can imagine what you're saying. It could even obey and perform tasks if hooked up to a robotic body that can scan the room. That requires the technology to scan its environment and then run something like the programs used to find text inside images; instead of finding text inside images, it's finding objects inside an environment. Pretty simple once you understand the basics, but it will take a lot of work. A longer description of this can be found at: AI page [geocities.com] Cyc isn't a waste, but you need to do something harder to make it into AI: you need to attach 3D objects to every noun, apply 3D actions to every verb, and so on. I'd say that's in the realm of next to impossible, so yeah, what they've done really doesn't advance AI at all.
  • by dthulson ( 904894 ) on Thursday August 10, 2006 @01:10PM (#15882308) Homepage
    According to this FAQ entry [opencyc.org], it's not fully open-source...
  • by johndcyc ( 864700 ) on Thursday August 10, 2006 @01:41PM (#15882637)
    Cyc does need to collect massive amounts of data with the help of people and other smart programs (parse that however you like).

    The Cyc Foundation, a new independent non-profit org, has been working for several months on a game for collecting knowledge, but we will need your help. You can help now by working on game interfaces and/or programming. Or you can help later by playing the game.

    Listen in on our Skypecast tonight (every Thursday night) at 9:30pm EST. Look for it on the list of scheduled Skypecasts at skype.org.
  • by johndcyc ( 864700 ) on Thursday August 10, 2006 @01:52PM (#15882731)
    We plan to exploit the N-grams in our knowledge collection work at the Cyc Foundation.

    If you hadn't seen me mention it already :-), you can join our Skypecast tonight.
  • Re:I Don't Get It (Score:3, Informative)

    by Prof.Phreak ( 584152 ) on Thursday August 10, 2006 @02:35PM (#15883228) Homepage
    Unfortunately, however, this is still a long way from sentient AI.

    Not only that, it's based on an assumption that you can use symbolic rules to represent knowledge. Which is a pretty big assumption, considering that our brains don't have a list of these rules.
  • by Prof.Phreak ( 584152 ) on Thursday August 10, 2006 @03:02PM (#15883530) Homepage
    Is one to assume that the way to common sense logic in a machine is via linguistic/symbolic knowledge representation?

    Umm... those researchers need jobs too. Yes, I absolutely agree with your point, but unfortunately it's not a very popular one in many research circles.

    I discussed this issue with a logic dude once... and what I got was that the rules of logic don't have any specific granularity... so technically, you can model neural networks, Bayes nets, etc. via millions of very simple 'logical' rules. Just as when you run some statistical learning thing on a *computer*, the computer itself uses logic gates in the CPU to perform the computation. Thus, you can model anything using logic and symbolic manipulation (sort of a lame excuse to have symbolic AI...)
  • Re:I Don't Get It (Score:3, Informative)

    by TheGreek ( 2403 ) on Thursday August 10, 2006 @03:21PM (#15883683)
    If you had just spelled "capitol" correctly 20 years ago, you would have gotten an answer.
    The Capitol building sits in the nation's capital, Washington, D.C., you fuckwit.
  • by sean.geek.nz ( 735084 ) on Thursday August 10, 2006 @06:54PM (#15885216)

    The solution is contextualization.

    No, the problem is contextualization. The solution is something CYC doesn't come close to.

    Your "vampire" example is a typical AI researcher's example: it's too trivial to show the real problem. That's because with "vampire" you can get the context of "fiction" from the word. So let's take a more typical word: "tree"

    Basic ontology: A tree is a plant.

    Basic fact: A plant requires air and water to live.

    Have you watered your red-black binary tree today? How about your boxed Christmas tree? Your family tree? Your oak tree?

    You can solve this by saying that only *some* trees are living things. But then you lose the power CYC's ontology-and-logic combination was supposed to give you, because you can no longer reach useful conclusions based on the fact that this palm tree is a tree.

    Or you can solve it by deciding the problem is English and its foolish use of the word 'tree' for several different things, so you invent your own words: tree(1), tree(2), tree(3), etc. But that just moves the problem of understanding the world out of your system and onto your users. Your clear logical rules and ontology become an unmaintainably complex hodge-podge of exceptions. And worse, it misses the fact that all these trees really are trees even if they're not real trees. It's not an accident that we call a Christmas tree a tree.
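  • The inheritance failure described in the "tree" comment above is easy to reproduce. Here's a minimal toy sketch in Python (entirely hypothetical; this is not CycL and has nothing to do with Cyc's actual machinery) showing how a naive is-a hierarchy over-generalizes the moment "binary tree" is filed under "tree":

    ```python
    class Ontology:
        """A deliberately naive is-a hierarchy with property inheritance."""

        def __init__(self):
            self.isa = {}    # term -> parent term
            self.facts = {}  # term -> set of asserted properties

        def add_isa(self, term, parent):
            self.isa[term] = parent

        def assert_prop(self, term, prop):
            self.facts.setdefault(term, set()).add(prop)

        def properties(self, term):
            """Collect properties by walking up the is-a chain."""
            props = set()
            while term is not None:
                props |= self.facts.get(term, set())
                term = self.isa.get(term)
            return props

    ont = Ontology()
    ont.add_isa("tree", "plant")
    ont.add_isa("oak tree", "tree")
    ont.add_isa("binary tree", "tree")   # the fatal over-generalization
    ont.assert_prop("plant", "needs water")

    print("needs water" in ont.properties("oak tree"))     # True  (correct)
    print("needs water" in ont.properties("binary tree"))  # True  (wrong!)
    ```

    Cyc's actual answer to this is contexts (microtheories) that scope which assertions hold where; the commenter's point is that once you start carving those scopes, the clean ontology-plus-logic picture stops being clean.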
