OpenCyc 1.0 Stutters Out of the Gates
moterizer writes "After some 20 years of work and five years behind schedule, OpenCyc 1.0 was finally released last month. Once touted on these pages as "Prepared to take Over World", the upstart arrived without the fanfare that many watchers had anticipated — its release wasn't even heralded with so much as an announcement on the OpenCyc news page. For those who don't recall: "OpenCyc is the open source version of the Cyc technology, the world's largest and most complete general knowledge base and commonsense reasoning engine." The Cyc ontology "contains hundreds of thousands of terms, along with millions of assertions relating the terms to each other, forming an upper ontology whose domain is all of human consensus reality." So are these the fledgling footsteps of an emerging AI, or just the babbling beginnings of a bloated database?"
AI needs a 3d environment to work (Score:4, Informative)
Not really open source? (Score:4, Informative)
Re:Web games much better for collecting this info (Score:2, Informative)
The Cyc Foundation, a new independent non-profit org, has been working for several months on a game for collecting knowledge, but we will need your help. You can help now by working on game interfaces and/or programming. Or you can help later by playing the game.
Listen in on our Skypecast tonight (every Thursday night) at 9:30pm EST. Look for it on the list of scheduled Skypecasts at skype.org.
Re: It's not either/or (Score:2, Informative)
If you hadn't seen me mention it already
Re:I Don't Get It (Score:3, Informative)
Not only that, it's based on an assumption that you can use symbolic rules to represent knowledge. Which is a pretty big assumption, considering that our brains don't have a list of these rules.
Re:Waste of Time and Money. Sorry. (Score:3, Informative)
Amm... Those researchers need jobs too. Yes, I absolutely agree with your point, but unfortunately it's not a very popular point in many research circles.
I discussed this issue with a logic dude once... and what I got was that the rules of logic don't have any specific granularity... so technically, you can model neural networks, Bayes nets, etc., via millions of very simple `logical' rules. Just as when you run your statistical learning system on a -computer-, the computer itself uses logic gates in the CPU to perform the computation --- thus you can model anything using logic and symbolic manipulation (sort of a lame excuse for doing symbolic AI...)
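To make that granularity point concrete, here's a minimal sketch (my own illustration, nothing to do with Cyc's actual code): building AND/OR/XOR out of a single NAND rule, then wiring them into a half-adder --- the kind of arithmetic any "statistical" learner ultimately runs on. Every name here is made up for the example.

```python
# Everything below is derived from one primitive "rule": NAND.
def nand(a, b):
    return not (a and b)

def and_(a, b):
    # AND = NOT(NAND) = NAND of NAND with itself
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    # OR via De Morgan: NAND of the negations
    return nand(nand(a, a), nand(b, b))

def xor(a, b):
    # Standard 4-NAND XOR construction
    inner = nand(a, b)
    return nand(nand(a, inner), nand(b, inner))

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return xor(a, b), and_(a, b)

print(half_adder(True, True))  # (False, True), i.e. 1 + 1 = binary 10
```

The point being: once you allow millions of rules this simple, "symbolic logic can model it" is trivially true of anything a CPU can compute --- which is exactly why it's a lame defense of symbolic AI in particular.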
Re:I Don't Get It (Score:3, Informative)
Re:Conflict of intent (Score:2, Informative)
The solution is contextualization.
No, the problem is contextualization. The solution is something CYC doesn't come close to.
Your "vampire" example is a typical AI researcher's example: it's too trivial to show the real problem, because with "vampire" you can get the context of "fiction" from the word itself. So let's take a more typical word: "tree".
Basic ontology: A tree is a plant.
Basic fact: A plant requires air and water to live.
Have you watered your red-black binary tree today? How about your boxed Christmas tree? Your family tree? Your oak tree?
You can solve this by saying that only *some* trees are living things. But then you lose the power CYC's ontology-and-logic combination was supposed to give you, because you can no longer reach useful conclusions based on the fact that this palm tree is a tree.
Or you can solve it by deciding the problem is English and its foolish use of the word 'tree' for several different things, so you invent your own words: tree(1), tree(2), tree(3), etc. But that just moves the problem of understanding the world out of your system and onto your users. Your clear logical rules and ontology become an unmaintainably complex hodge-podge of exceptions. And worse, it misses the fact that all these trees really are trees, even if they're not real trees. It's not an accident that we call a Christmas tree a tree.