jbrodkin writes: "The same IBM processors in your Xbox 360, PS3, the car you drive and some of the world's fastest supercomputers are leaving for Mars today to support a NASA mission searching for extraterrestrial life. And this is no mere coincidence. Lessons learned from the incredible video throughput of the PlayStation 3 and the extreme scalability and reliability of mainframes factor into the processors being used on the Phoenix Mars Lander. Similarly, the experience building processors that make the most efficient use of energy on a spacecraft is helping IBM make data centers on Earth more efficient in a time when limitations of space and power are increasingly important.
"This is the onboard machine that runs all of the functions that will have to be performed somewhat autonomously on Mars when it lands," explains Dave McQueeney, chief technology officer for IBM's federal contracting business. "These are the computers inside the spacecraft that are responsible for the navigation, control, scientific instruments, power management... the things that are the brains of the Lander itself.""
Ellis D. Tripp writes: "An Australian court has ruled that an eBay seller cannot back out of an auction sale once it has successfully completed. The court has ordered a seller to hand over a vintage airplane to an eBayer who bid just over the reserve price of $128,000, despite a subsequent non-eBay offer of over $200,000. More details here."
jg21 writes: According to this article in Social Computing Magazine, the increasingly common label "Enterprise 2.0" signifies, above all, software enabling collaboration (what TFA calls "a many-to-many communication medium that creates interaction") and therefore does *NOT* really include email, which the author characterizes as a "one-to-one communication medium... more about instruction." What are the realistic chances that actual collaboration will become the number one form of communication in the enterprise, displacing email?
Davemania writes: I am working for a research group that needs to do a large amount of data analysis (each file can be up to 1 GB in size). We're planning on buying up to 10 PCs running Linux to do this scientific processing (MATLAB, etc.), but what would be the best configuration for these 10 Linux boxes? Preferably, we would like to keep all the data on one server and send it out to the Linux boxes for processing automatically. Which approach would be best: clustering with OpenMosix or Rocks? Or simply dropping files into a shared folder and using remote desktop?
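One low-tech take on the "shared folder" option is a central dispatcher that farms data files out to the worker nodes round-robin. A minimal Python sketch of that idea follows; the host names, file names, and the `process_data` command are placeholders for illustration, not anything from the submission:

```python
#!/usr/bin/env python
# Hypothetical sketch: assign data files on the central server to worker
# nodes round-robin, then kick off processing on each node (e.g. via ssh).

import itertools

WORKERS = ["node01", "node02", "node03"]  # placeholder host names

def assign(files, workers):
    """Pair each data file with a worker node, cycling through the workers."""
    return list(zip(files, itertools.cycle(workers)))

jobs = assign(["run_a.dat", "run_b.dat", "run_c.dat", "run_d.dat"], WORKERS)
for datafile, host in jobs:
    # In practice this line would launch the job remotely, e.g.:
    #   subprocess.run(["ssh", host, "process_data", "/shared/" + datafile])
    print(host, datafile)
```

A real setup would add failure handling and load awareness, which is exactly what batch schedulers and the cluster distributions mentioned above provide out of the box.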
Soft writes: "Did you hear the one about (...) the astronaut who became so despondent after his orbital experiment failed that his colleagues feared he would blow the hatch on the space shuttle?" Jon Clark, a former NASA flight surgeon, tells Alan Boyle's Cosmic Log about a number of horror stories that happened in space over the course of the space program.
(To ward off predictable jokes, there are none with diapers; that didn't happen in space, anyway.)
Ant writes: "The San Francisco Chronicle says Stanford researcher Caitlin O'Connell discovered that elephants can hear with their feet. They are specialists in seismic communication, relying on sound waves that travel through the surface of the ground instead of through the air.
Seen on Neatorama."