
Comment Re:How does it work? (Score 3, Insightful) 46

DARPA funded the project, and DARPA funds lots of projects. I think a debate about whether DARPA is good or bad is pretty out of scope for this particular work: we made a game that might show how software verification could be crowdsourced.

The games do try to be fun; that's why none of them is "look at this loop and write an invariant". Xylem dresses up the problem statement as logic puzzles about growing exotic plants. I don't have an iPad to play the final version of Xylem on, but we tried hard to come up with a compelling game.

I don't believe the expected player base really cares whether the project was funded by DARPA. I understand if that bothers you, but then I think you would also have to stop using the Internet, which itself grew out of a DARPA-funded project :)

Comment Re:How does it work? (Score 3, Interesting) 46

I worked on Xylem when I was a grad student at UCSC. I was not on the team when it launched, so my info may be out of date.

What players are being asked to do is find loop invariants for code. Useful invariants are hard for a computer to come up with, but easier to check within certain bounds. So there is no predetermined win state: each answer is checked server-side to see whether it holds within those bounds (or, if the answer is already known, the cached result is returned). A complex invariant that holds scores highly; a trivial one that holds scores lower. If it doesn't hold, the player is shown an instance where it fails.
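To make that concrete, here is a minimal sketch (my own illustration, not Xylem's actual server code) of bounded invariant checking: run a simple loop for all small inputs, test the candidate invariant at every iteration, and return a counterexample if it ever fails. The loop, the bound, and the function names are all hypothetical.

```python
def check_invariant(invariant, bound=50):
    """Check a candidate invariant for the loop
        s = 0; for i in 1..n: s += i
    for all n up to `bound`. Returns None if the invariant holds,
    or a counterexample (n, i, s) where it fails."""
    for n in range(bound + 1):
        s = 0
        for i in range(1, n + 1):
            s += i
            if not invariant(i, s):
                # This is the instance that would be shown to the player.
                return (n, i, s)
    return None

# A trivial invariant holds (low score); a precise one also holds
# (high score); a wrong one fails with a concrete counterexample.
print(check_invariant(lambda i, s: s >= 0))                # None (trivial)
print(check_invariant(lambda i, s: s == i * (i + 1) // 2)) # None (precise)
print(check_invariant(lambda i, s: s == i * i))            # (2, 2, 3)
```

Note the check is only exhaustive within the bound, which matches the idea above: the server doesn't prove the invariant, it just fails to refute it on small cases, which is cheap enough to do per-submission.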

Does this help?

Programming

Submission + - StarCraft AI Competition Results 2

bgweber writes: The StarCraft AI Competition [http://games.slashdot.org/story/09/11/12/1729217/StarCraft-AI-Competition-Announced] announced last year has come to a conclusion [http://eis-blog.ucsc.edu/2010/10/starcraft-ai-competition-results]. The competition received 28 bot submissions from universities and teams around the world. The winner was UC Berkeley's submission, which executed a novel mutalisk micromanagement strategy. During the conference, a man-versus-machine exhibition match was held between the top-ranking bot and a former World Cyber Games competitor. While the expert player was able to defeat the best bot, less experienced players were not as successful. Complete results, bot releases, and replays are available at the competition website [http://eis.ucsc.edu/StarCraftAICompetition].

Real Time Strategy (Games)

Submission + - AIIDE 2010 StarCraft AI Competition 2

bgweber writes: The 2010 conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE 2010) will be hosting a StarCraft AI competition as part of the conference program. This competition enables academic researchers to evaluate their AI systems in a robust, commercial RTS environment. The competition will be held in the weeks leading up to the conference. The final matches will be held live at the conference with commentary. Exhibition matches will also be held between skilled human players and the top performing bots.

Competition details are available at http://users.soe.ucsc.edu/~bweber/starcraft.html
