
Comment: Re:Yes, we need to revisit everything. (Score 2) 50

by mo (#41275869) Attached to: Researchers Create First All Optical Nanowire NAND Gate
It's not that humans aren't adaptable, it's that parallel computing is hard for humans to reason about. Linear execution lends itself to all kinds of easy abstractions: loops, branches, methods, etc. Parallel computing, not so much. Mutexes are awful. The best we've got is message passing and functional programming, and even those are hard to design so that a program is both understandable and able to exploit its inherent parallelism.
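As a tiny sketch of the message-passing style (the function names and structure here are my own, for illustration only): workers share no mutable state and talk only through queues, so no mutexes appear in user code.

```python
import queue
import threading

def worker(tasks: queue.Queue, results: queue.Queue) -> None:
    # Each worker owns no shared mutable state; it communicates
    # only through the two queues, so no explicit locking is needed.
    while True:
        n = tasks.get()
        if n is None:          # sentinel: shut down this worker
            break
        results.put(n * n)

def parallel_squares(numbers, n_workers=4):
    tasks, results = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for n in numbers:
        tasks.put(n)
    for _ in threads:
        tasks.put(None)        # one sentinel per worker
    for t in threads:
        t.join()
    # Drain exactly as many results as there were inputs.
    return sorted(results.get() for _ in numbers)
```

The catch, of course, is that even this toy hides the hard part: deciding how to carve real programs into independent messages in the first place.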

Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was worked out by Turing before we had even built a computer. It's easy to visualize how it works. My brain explodes, though, trying to imagine a fuzzy-logic analog equivalent of a Turing machine.
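The "easy to visualize" part is almost literal: a Turing machine fits in a few lines of code. A minimal sketch in Python (the transition-table format and the example machine are my own, purely illustrative):

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    # transitions maps (state, symbol) -> (new_state, write_symbol, move),
    # where move is -1 (left), +1 (right), or 0; state "halt" stops it.
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A one-state machine that inverts a binary string, then halts on the blank.
INVERT = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
```

Discrete states, discrete symbols, one step at a time. There's no comparably obvious "minimal program" for a continuous analog machine, which is exactly the problem.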

I used to think that AI research combined with neuroscience would turn up a simple solution to this problem, but it increasingly seems like, no, it's complicated even in the brain.

So people can pine for analog memristor computation and analog optical computing all they want, but the hardware is the easy part here. Get the software side solved, and if you build it they will come. It's not that we aren't used to these problems; it's that these problems are really, really hard.

Comment: Re:So it's remote? (Score 5, Insightful) 403

by mo (#38055818) Attached to: Siri Protocol Cracked
Speech recognition isn't too CPU intensive, but it's *massively* memory intensive. It's not unreasonable for a speech recognition engine to eat up a gigabyte of RAM, and the 4S only has 512 MB. Push it to a server with lots of RAM, though, and it can handle lots and lots of simultaneous speech recognition queries. It's tailor-made to be a server-side task. At least until phones have gigabytes of memory to spare.
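The back-of-envelope arithmetic that makes this a server-side win (all the numbers below are hypothetical, for illustration only): the big model is loaded once and shared, so each additional in-flight query costs only a small decoding buffer.

```python
# Hypothetical figures: a ~1 GB acoustic/language model that every
# request on the server shares, versus one full copy per phone.
MODEL_MB = 1024
PHONE_RAM_MB = 512          # an iPhone 4S-class device

def server_memory_mb(concurrent_queries, per_query_overhead_mb=8):
    # One shared model plus a small per-query decoding buffer --
    # the marginal cost of an extra query is tiny compared to the model.
    return MODEL_MB + concurrent_queries * per_query_overhead_mb
```

On these (made-up) numbers, a hundred simultaneous queries fit in under 2 GB of server RAM, while a single on-device copy wouldn't even fit in the phone.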

Comment: Nonsensical question (Score 0) 255

by charlie (#35773226) Attached to: What If America Had Beaten the Soviets Into Space?
For geopolitical reasons the Eisenhower administration wanted the USSR to be first to orbit a satellite -- because it would set a precedent for free orbital flight over any territory, thus allowing the USA to orbit the Corona spy satellites without the USSR being legally free to pop off ASAT weapons at them.
In practice, Von Braun was ordered to ballast Thor IRBM tests with concrete to prevent them "accidentally" making orbit prematurely.

Comment: memristor-based analog computers (Score 2) 347

by mo (#35549590) Attached to: Michio Kaku's Dark Prediction For the End of Moore's Law
Even with transistors the same size, there are so many avenues to explore in processor design. Just off the top of my head: how about a memristor-based analog co-processor for tasks like facial detection or language/speech recognition? How about processors with asynchronous clocks, or clockless designs? Sure, they're harder to build, but once transistor sizes fixate, you might as well spend the effort, because designs will have a much longer lifecycle.

Comment: Re:None of the above. (Score 2, Interesting) 342

by mo (#33533534) Attached to: My Camera ...

There are a few things a DSLR will get you that no point-and-shoot has.

First, a big form factor means a big sensor, which means good shots in low light at fast exposures. Point-and-shoots are at a huge handicap at sporting events for this reason.

Secondly, big lenses let you get a shallow depth of field. With P&S cameras, generally everything in frame is in focus. Being able to use focus to pull your subject out and blur the background is hugely valuable.
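The depth-of-field gap can be put in numbers with the standard thin-lens approximations (hyperfocal distance, then near/far limits). The specific camera parameters below are hypothetical examples, not measurements:

```python
def depth_of_field_mm(focal_mm, f_number, subject_mm, coc_mm):
    # Standard thin-lens approximation: hyperfocal distance H, then the
    # usual near/far-limit approximations (valid when subject >> focal).
    H = focal_mm**2 / (f_number * coc_mm) + focal_mm
    near = H * subject_mm / (H + subject_mm)
    far = H * subject_mm / (H - subject_mm) if subject_mm < H else float("inf")
    return far - near

# Illustrative comparison at a 2 m subject distance: a 50mm f/1.8 lens on
# a full-frame DSLR (circle of confusion ~0.03 mm) vs. a small-sensor
# compact at 8mm f/2.8 (CoC ~0.006 mm), framing roughly the same shot.
dslr = depth_of_field_mm(50, 1.8, 2000, 0.03)       # ~170 mm in focus
compact = depth_of_field_mm(8, 2.8, 2000, 0.006)    # meters in focus
```

With these numbers the compact's in-focus zone comes out more than an order of magnitude deeper, which is exactly why everything in a P&S frame looks sharp, for better or worse.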

Space

Supermassive Black Hole Is Thrown Out of Galaxy 167

Posted by samzenpus
from the moving-to-better-quarters dept.
DarkKnightRadick writes "An undergrad student at the University of Utrecht, Marianne Heida, has found evidence of a supermassive black hole being tossed out of its galaxy. According to the article, the black hole — which has a mass equivalent to one billion suns — is possibly the culmination of two galaxies merging (or colliding, depending on how you like to look at it) and their black holes merging, creating one supermassive beast. The black hole was found using the Chandra Source Catalog (from the Chandra X-Ray Observatory). The direction of the expulsion is also possibly indicative of the direction of rotation of the two black holes as they circled each other before merging."
Databases

Cassandra and Voldemort Benchmarked 45

Posted by timothy
from the rifling-the-file-cabinet dept.
kreide33 writes "Key/value storage systems are gaining in popularity, largely because of features such as easy scalability and automatic replication. However, there are several to choose from, and performance is an important deciding factor. This article compares the performance of two of the most well-known projects, Cassandra and Voldemort, using several different mixes of access types, and compares both throughput and latency."
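A minimal sketch of the kind of harness such a comparison needs, with a toy in-memory store standing in for the real clients (all names here are my own; a real run would wrap the Cassandra or Voldemort client's put/get calls instead):

```python
import random
import time

class ToyKVStore:
    # Stand-in for a real key/value client; only put/get matter
    # to the benchmark harness below.
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

def benchmark(store, n_ops=10_000, read_ratio=0.9, seed=42):
    # Time each operation individually so we can report both
    # aggregate throughput and mean per-op latency.
    rng = random.Random(seed)
    latencies = []
    for _ in range(n_ops):
        key = f"key-{rng.randrange(1000)}"
        t0 = time.perf_counter()
        if rng.random() < read_ratio:
            store.get(key)
        else:
            store.put(key, "x" * 100)
        latencies.append(time.perf_counter() - t0)
    total = sum(latencies)
    return {"throughput_ops_s": n_ops / total,
            "mean_latency_us": 1e6 * total / n_ops}
```

The `read_ratio` knob is the interesting part: as the article's access-mix comparisons suggest, read-heavy and write-heavy workloads can rank the same two stores in opposite orders.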

Comment: Re:12 year old product compares to iPad, and couri (Score 1) 293

by charlie (#31852688) Attached to: The iPad vs. Microsoft's "Jupiter" Devices
The Sharp Mobilon was also sold as the Vadem Clio.

I owned one, back in the day.

I assure you, the claimed 10-16 hour battery life is a ludicrous exaggeration. In reality, it was good for 4.5-6 hours on a charge.

(Battery life claims are a lot more conservative these days; I remember the first-gen Apple Powerbooks, where the PB100's claimed life of "two and a half hours" was closer to 40 minutes -- and they were by no means the worst of the bunch!)

Also: the thing was near-as-dammit unusable due to crappy design decisions. For example, WinCE 2.11 had the window "close" button right next to the "Maximize" button -- and the pen digitizer was inaccurate enough that if you didn't calibrate the screen very carefully you'd end up hitting "close" instead of "maximize" about 50% of the time!

Comment: What SoftMaker is *really* for ... (Score 1) 110

by charlie (#31639634) Attached to: SoftMaker Office 2010 For Linux Nearing Release
I've used it on and off for about eight years now.

SoftMaker Office isn't really a decent replacement for OO.o on Linux. But there is one place where it's indispensable: if you have a WinCE or Windows Mobile PDA/smartphone, it's miles better than the Pocket version of Microsoft Office. It actually makes my old HP iPaq 214 useful for writing.

Comment: Re:Reading the disk will be tricky. (Score 4, Interesting) 325

by charlie (#31557150) Attached to: Need Help Salvaging Data From an Old Xenix System
... However, as I remember from back when I worked at SCO (years before the name and some assets were sold to the lunatics from Utah), Xenix filesystem and partition table support was rolled into SCO UNIX SVR3.2/386. And Open Desktop. And ODT came with a proper working TCP/IP stack. It's probably overkill, but once you've tried using uucp to get the files off the BBS, you might want to pull the ST506 drive (presumably an MFM-encoded one, not RLL-encoded) and stick it into a shiny new 386 with, say, 4Mb of RAM and a 40Mb disk with SCO UNIX installed. That should enable you to mount the filesystems and export them via NFS. It's a lot of work, though.

Comment: One fundamental point ... (Score 4, Informative) 350

by charlie (#29636327) Attached to: Will Books Be Napsterized?
One fundamental point that tends to get overlooked is that unlike CDs or cassette tapes before them, books traditionally came with built-in DRM, insofar as copying them (via scan/OCR/proofread) was a really tedious process. Whereas it's relatively easy to crack the DRM on, for example, MobiPocket or Microsoft Reader books (and probably ePub by now). So the DRM'd formats are easier to pirate than the previous "analog"-analog format. What this portends for the future remains to be seen, but wearing my full-time novelist hat, I'm a bit worried. The music industry has efficiently trained people to grab files without throwing money at the artists, by bringing the role of publishers into disrepute. Now we're all set to repeat the experience, and unlike a rock band, most authors don't perform well on stage.

Comment: Re:It didn't bring people to the platform (Score 1) 364

by mo (#29289637) Attached to: Game Over For Sony and Open Source?

Let me preface this answer by revealing that I no longer work in the video game industry, as I did not enjoy it enough to stay. A lot of people cut their teeth writing Windows stuff for fun, maybe working on mods, but a fair number of developers worked their way up from QA. At least where I worked, it seemed like there were way too many people wanting to get into the video game industry, and once they did get in, they worked their asses off. People would learn to code out of their love of games, not because they liked coding. There seemed to be a lot of very bright high-school guys who, instead of doing the whole computer-science thing at a university, would work QA and then progress up to developer. These people were highly respected because of their commitment.

There was another group, the more senior developers, who got started in academia. People who worked on the engines usually had PhDs in computer science with an emphasis on graphics. I would think graduate work on game theory or AI would put you in this group.

Being an old-school Linux hacker who cut his teeth contributing to OSS projects, I felt a bit out of place. Most of the guys in the industry never leave, because to them the idea of working on something other than video games is distasteful. Me, I find lots of engineering problems satisfying.
