Journal: E-voting scam

Activist: E-voting to be a 'train wreck'
SAN JOSE, Calif. -- Ambushing registrars and tracking down executives at their homes and offices, a literary publicist has uncovered conflicts of interest and security flaws inside the companies that make electronic ballot machines. Searching the Web and poring over newspaper clippings, Bev Harris has unearthed obscure arrest records, ties to conservative political groups and other embarrassing secrets of senior executives at voting companies.

Journal: Java

Article here. His results show that Java is significantly faster than optimized C++ in many cases. "They also show that no one should ever run the client JVM when given the choice," Lea adds. ("Everyone has the choice," he says. To run the server VM, see the instructions in the Using the Server JVM section below.)

Using the Server JVM

Every form of Sun's Java runtime comes with both the "client VM" and the "server VM." "Unfortunately, Java applications and applets run by default in the client VM," Lea observes. "The Server VM is much faster than the Client VM, but it has the downside of taking around 10% longer to start up, and it uses more memory."

Lea explains the two ways to run Java applications with the server VM as follows:

1. When launching a Java application from the command line, use java -server [arguments...] instead of java [arguments...]. For example, use java -server -jar beanshell.jar.

2. Modify the jvm.cfg file in your Java installation. (It's a text file, so you can use Notepad or Emacs to edit it.) It is located in C:\Program Files\Java\j2reXXX\lib\i386\ on Windows and in /usr/java/j2reXXX/lib/i386/ on Linux. You will see two lines:

    -client KNOWN
    -server KNOWN

Change them to:

    -server KNOWN
    -client KNOWN

This change will cause the server VM to be run for all applications, unless they are run with the -client argument.

He can be contacted at keith@kano.net.
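
(Not from Lea's write-up, just a quick way to see the difference yourself: the tiny microbenchmark below is a workload I'm assuming for illustration; the class name, loop sizes, and warm-up pass are arbitrary. Compile it, then run it once with each flag and compare the timings.)

    // VmBench.java -- hypothetical microbenchmark, not Lea's test suite.
    // Compile with: javac VmBench.java
    // Then compare: java -client VmBench   vs.   java -server VmBench
    public class VmBench {
        // A numeric inner loop gives the server VM's optimizing compiler something to work on.
        static double work(int n) {
            double sum = 0.0;
            for (int i = 1; i <= n; i++) {
                sum += Math.sqrt(i) / (i + 1);
            }
            return sum;
        }

        public static void main(String[] args) {
            work(1000000); // warm-up pass so the JIT has compiled the loop before timing
            long start = System.currentTimeMillis();
            double result = work(50000000);
            long elapsed = System.currentTimeMillis() - start;
            System.out.println("result  = " + result);
            System.out.println("elapsed = " + elapsed + " ms");
        }
    }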

Journal: Technology

Chemical, Printable RFIDs
The RFID Journal says that CrossID, an Israeli startup, has developed an RFID system that can be printed using an inkjet printer.

RFID Tags For The Rich
Greedo writes "While reading this piece about designing 'experiences' in the Globe and Mail, I came across this interesting tidbit: If you're a frequent Prada shopper (and who on /. isn't?), the loyalty card in your wallet or purse contains an RFID tag that announces your arrival in the store. When you encounter a saleswoman, her handheld computer brings up your tastes, buying history, vital statistics and personalized suggestions from in-stock and coming inventory; the handhelds also place orders and book change rooms. Every item for sale bears an RFID tag."

RSA Keeps RFID Private
RSA Security Inc. will unveil a finished version of its RFID "Blocker Tag" technology that prevents radio-frequency identification tags from being read.
PARC's New Networking Architecture
PARC has a new networking architecture, named Obje, that aims to establish a device-independent networking system. Essentially, it allows two devices to teach each other how to talk to one another. It does this by sending actual code over the network.
It's About Connectivity Not The Internet
I've been trying to avoid writing about the Internet as such. With "At the Edge" I'm looking at larger issues but can't escape writing more directly about the Internet.

The 'Pervasive Computing' Community
"Most of us are using computers, but also PDAs and cell phones. And this trend is accelerating in our increasingly networked wireless world. We might use hundreds of computing devices by the end of this decade. Still, we are slaves to our machines. With every new device, we have to learn new commands, languages or interfaces. The Cambridge-MIT Institute (CMI), a strategic alliance between the University of Cambridge in the UK and the Massachusetts Institute of Technology in the U.S., has had enough of it and wants to give control back to the users."

Journal: Genetic algorithms

DISCOVER Vol. 24 No. 8 (August 2003)

Darwin in a Box
Are you ready for computers that speed up the process of evolution and teach themselves to think?
By Steven Johnson


On the screen, an animated figure takes a step forward and tries to walk. Instead it collapses immediately, falls on its back, and flails its legs helplessly. Then it reappears at the left of the screen, takes a few delicate baby steps, and falls again. Returning to the screen, it raises its knees, takes six or so confident strides, and drops on its side. After trying over and over again to walk, the figure finally marches successfully across the screen as though its motions had been captured directly from videos of a human walking.

This little film won't win an Oscar for Best Animated Short, but the software that generated it stands as a small miracle of computer programming. The figure was not taught how to walk by an offscreen animator; it evolved the capacity for walking on its own. The intelligence to do so came from some clever programming that tries to mimic nature's ability to pass along successful genes.

The idea is called a genetic algorithm. It creates a random population of potential solutions, then tests each one for success, selecting the best of the batch to pass on their "genes" to the next generation, including slight mutations to introduce variation. The process is repeated until the program evolves a workable solution. Originally developed in the 1960s by John Holland at the University of Michigan, genetic algorithms are increasingly being harnessed for real-world tasks such as designing more efficient refrigerators.
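
(As a concrete illustration of that loop, here is a minimal toy genetic algorithm in Java that evolves a 32-bit genome until every bit is 1: random population, fitness test, selection of the winner, mutated copies into the next generation. It is only a sketch of the general idea described above, not Holland's formulation or the animation software discussed below; the population size and mutation rate are arbitrary.)

    import java.util.Random;

    // Toy genetic algorithm: evolve a bit-string "genome" until every bit is 1.
    public class ToyGA {
        static final Random RNG = new Random();
        static final int GENES = 32, POP = 50;

        // Fitness function: count the 1 bits; higher is better.
        static int fitness(boolean[] genome) {
            int score = 0;
            for (boolean gene : genome) if (gene) score++;
            return score;
        }

        static boolean[] randomGenome() {
            boolean[] g = new boolean[GENES];
            for (int i = 0; i < GENES; i++) g[i] = RNG.nextBoolean();
            return g;
        }

        // Mutation: copy a parent and flip each bit with a small probability.
        static boolean[] mutate(boolean[] parent) {
            boolean[] child = parent.clone();
            for (int i = 0; i < GENES; i++)
                if (RNG.nextDouble() < 0.03) child[i] = !child[i];
            return child;
        }

        public static void main(String[] args) {
            boolean[][] pop = new boolean[POP][];
            for (int i = 0; i < POP; i++) pop[i] = randomGenome();

            for (int gen = 0; ; gen++) {
                // Selection: find the fittest individual of this generation.
                boolean[] best = pop[0];
                for (boolean[] g : pop) if (fitness(g) > fitness(best)) best = g;
                System.out.println("generation " + gen + ": best fitness " + fitness(best));
                if (fitness(best) == GENES) break;   // a workable solution has evolved

                // Next generation: keep the winner, fill the rest with mutated copies of it.
                pop[0] = best;
                for (int i = 1; i < POP; i++) pop[i] = mutate(best);
            }
        }
    }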

Genetic algorithms make it possible for computers to do something profound, something that looks an awful lot like thinking. And that little animated figure learning how to walk showcases some design developments that permit computers to make their own decisions--without guidance from humans.

The payoff is immediate and obvious for creators of popular entertainment. Most big-budget Hollywood movies or action-oriented video games are teeming with walking (and running and jumping) computer-rendered figures. For these characters to seem believable, they have to move in convincing ways, which means that somehow they have to be taught how to walk. Until recently, filmmakers either had to instruct each limb to move in a particular way or they had to map a real person's movements in three dimensions and apply that information to a virtual character. You can see the latter approach in the way the character Gollum moves in Lord of the Rings: The Two Towers. That laborious approach creates convincing results, but it's notoriously inflexible. If animators record someone walking downhill for one scene, and then decide later that the character needs to trip over a rock along the way, they have to go back and choreograph the whole sequence all over again.

Instead, Torsten Reil, an Oxford researcher turned animation entrepreneur, decided to borrow a page from nature and use the power of evolution to solve the problem of making a digitized character move convincingly. "First, we created a simple stick figure: It's got gravity; it's got joints," he explains. "Then we put virtual muscles in and a neural network that controlled the muscles. The problem is: How do you get the network to do what you want it to do? If you just have a randomly assembled neural network, it will send quite complex signals to the muscles, but that's usually not walking--it's more like some random twitches." The muscles all work, and they're wired up to the central nervous system, but the character still doesn't know anything about walking.

The character's body plan involved 700 distinct parameters that needed to be optimized to teach it how to walk like a human. "If you look at that system with your human eyes, there's no way you can do it on your own, because the system is just too complex," Reil says. "That's where evolution comes in."

Reil and his team created a genetic algorithm to explore the potential ways that the figure's control system could be refined. The ingredients of a genetic algorithm are actually relatively simple: a population of "organisms," each with a distinct set of "genes"; rules for the mutation and recombination of those genes; and a "fitness function" to evaluate which organisms are the most promising in each generation. In this case, the fitness function was "distance traveled from origin without falling over."

The algorithm generated 100 animated characters, each with a randomly assembled neural network controlling its muscles. Then the algorithm let them all try walking. Predictably enough, the first generation was almost completely inept. But a few figures were slightly better than the rest--they took one hesitant step before crumbling to the ground. By the standards of the fitness function, they became the winners of round one. The software made 20 copies of their neural networks, introduced subtle mutations in each of them, added 80 new participants with randomly wired networks, and started the next generation walking.
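
(A sketch of that generation step, assuming the figure's control parameters are flattened into an array of doubles and that some scoring function stands in for the physics simulation; the mutation scale and the idea of cycling the 20 copies through the top 5 winners are my assumptions, not details from the article.)

    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.Random;

    // One generation of the scheme described above: rank 100 genomes by fitness,
    // carry the winners forward as 20 mutated copies, and add 80 fresh random genomes.
    class WalkerGeneration {
        static final Random RNG = new Random();

        // Hypothetical fitness hook: "distance traveled from origin without falling over."
        interface Fitness { double score(double[] genome); }

        static double[][] next(double[][] population, Fitness fitness) {
            // Rank the current population, best walkers first.
            double[][] ranked = population.clone();
            Arrays.sort(ranked, Comparator.comparingDouble(fitness::score).reversed());

            int copies = 20, fresh = population.length - copies;  // 80 when the population is 100
            double[][] next = new double[population.length][];

            // 20 slightly mutated copies of the winners (cycling through the top 5 here).
            for (int i = 0; i < copies; i++) {
                next[i] = mutate(ranked[i % 5]);
            }
            // 80 brand-new, randomly wired genomes keep the gene pool diverse.
            for (int i = 0; i < fresh; i++) {
                next[copies + i] = randomGenome(ranked[0].length);
            }
            return next;
        }

        // Mutation: nudge every parameter by a small random amount.
        static double[] mutate(double[] parent) {
            double[] child = parent.clone();
            for (int i = 0; i < child.length; i++) child[i] += RNG.nextGaussian() * 0.01;
            return child;
        }

        static double[] randomGenome(int length) {
            double[] g = new double[length];
            for (int i = 0; i < length; i++) g[i] = RNG.nextGaussian();
            return g;
        }
    }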

Like organic life, genetic algorithms come in two primary flavors: those that feature sex and those that don't. Some algorithms "mate" fitness-function survivors, recombining genes in the process. Others clone the most successful solutions and introduce variation purely through mutations.
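
(In code, the "sexual" flavor just adds a recombination step before mutation: a child inherits each gene from one of two fit parents. The uniform crossover below is one common scheme, chosen here for illustration rather than taken from the article.)

    import java.util.Random;

    class Crossover {
        static final Random RNG = new Random();

        // Uniform crossover: each gene is drawn at random from one of the two parents.
        static double[] mate(double[] mom, double[] dad) {
            double[] child = new double[mom.length];
            for (int i = 0; i < child.length; i++) {
                child[i] = RNG.nextBoolean() ? mom[i] : dad[i];
            }
            return child;
        }
        // The asexual flavor skips mate() entirely and relies on cloning plus mutation.
    }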

Genetic algorithms invariably have surprises. Reil's animations rapidly advanced in their ability to travel without falling, but they didn't always walk. "We got some creatures that didn't walk at all but had these very strange ways of moving forward: crawling or doing somersaults." The creatures were playing by the rules of the game, so Reil had to change the rules. "We had to put in a few exceptions: It's not just distance traveled, it's distance traveled without the center of mass going below a certain point."
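
(The revised fitness test might look something like the sketch below, assuming a hypothetical simulation step that reports both the distance covered and the lowest height the center of mass reached during the trial; the names and the threshold value are made up for illustration.)

    // Hypothetical result of simulating one candidate walker.
    class TrialResult {
        double distanceTraveled;     // how far the figure got from its starting point
        double lowestCenterOfMass;   // lowest height of the center of mass during the trial
    }

    class WalkingFitness {
        static final double MIN_HEIGHT = 0.7;  // assumed cutoff: below this counts as crawling or tumbling

        // Reward distance only if the figure stayed upright the whole time.
        static double score(TrialResult trial) {
            if (trial.lowestCenterOfMass < MIN_HEIGHT) {
                return 0.0;  // somersaults and crawling no longer count as walking
            }
            return trial.distanceTraveled;
        }
    }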

Eventually, Reil optimized the procedure to take only 20 generations and a few minutes of computation time. The team created a short time-lapse video that shows sample clips from several generations along the way, including the best walker from generation one (the initial figure flailing on the ground) and ending with the successful striding figure in the 20th generation.

This is one of those situations in which reinventing the wheel is a good thing. Watching the time-lapse video clips, one can't help but marvel at how this virtual evolution is roughly analogous to the real-world evolution of our ancestors millions of years ago, when they first began to walk upright across the savannas of Africa. The stick figure strides convincingly not because someone engineered it to do so but because an evolutionary process allowed the figure itself to find its way to that distinct pattern of movement and muscular control.

The genetic algorithm doesn't make the computer self-aware in a HAL 9000 kind of way, but it does make the computer genuinely creative, capable of imaginative leaps and subtle connections that might elude the minds of human engineers. And the end result is a useful product, now incorporated into an animation software package called Endorphin.

Reil and his team are not alone in unleashing genetic algorithms on practical tasks. Bill Gross and his team of inventors at Idealab in Pasadena, California, are using genetic algorithms to develop a new solar energy device (see "Catch the Fire," page 52). Gross believes genetic algorithms have the potential to revolutionize engineering. Instead of using software as merely a visualization tool that helps draw a contraption, he envisions genetic algorithms that can handle the entire design process. You define your organism, your genes, and your fitness function and let the software do the hard work of actually figuring it out.

"I think this is the way engineering should be done: Instead of defining your part or your circuit board, define your objective and let the software evolve the answer. Let's say I want a table. Instead of drawing out a table, you say, My constraints are these: I want a plane at this height, with this sideways rigidity, and so on. And then you tell the software, OK, you've got bars, beams, screws, bolts--make the best thing you can at the lowest cost."

Genetic algorithm advocates often talk about their software in the language of ecosystems: predators and prey, species and resources. But Gross has another idea--less rain forest and more assembly line. "Let's say you give the software access to the entire McMaster-Carr industrial supply catalog. They have 400,000 parts in stock: screws, bolts, hinges, everything. So you've got the whole gene pool of those parts available." Somewhere in that mix is the machine you're dreaming of, and simulated evolution may well be the fastest way to find it.

"You state your objectives, let the thing evolve with the optimum combination of parts at the lowest price, and the machine will be there this afternoon," Gross says, his voice rising with excitement. "That's an extreme exaggeration--but not that extreme!"
