
Comment Re:Yes, we need to revisit everything. (Score 2) 50

It's not that humans aren't adaptable, it's that parallel computing is hard for humans to reason about. Linear execution lends itself to all kinds of easy abstractions: loops, branches, methods, etc. Parallel computing, not so much. Mutexes are awful. The best we've got is message passing and functional programming, and even those are hard to design in a way that's both understandable and actually exploits the inherent parallelism.
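To make that concrete, here's a minimal sketch of what I mean by the message-passing style (Python's multiprocessing, with squaring numbers standing in for real work): workers only talk through queues, so there's no shared mutable state and no mutex to get wrong.

    # Minimal message-passing sketch: workers communicate only through
    # queues, so there is no shared mutable state and no locking.
    from multiprocessing import Process, Queue

    def worker(inbox, outbox):
        for item in iter(inbox.get, None):   # None is the shutdown sentinel
            outbox.put(item * item)          # stand-in for real work

    if __name__ == "__main__":
        inbox, outbox = Queue(), Queue()
        workers = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
        for p in workers:
            p.start()
        for n in range(100):
            inbox.put(n)
        for _ in workers:
            inbox.put(None)                  # one sentinel per worker
        results = [outbox.get() for _ in range(100)]
        for p in workers:
            p.join()
        print(sum(results))

The hard part isn't writing something like this; it's carving a real problem into pieces that actually flow through queues like that.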

Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was invented by Turing before we had even built a computer. It's easy to visualize how it works. My brain explodes, though, trying to imagine a fuzzy-logic analog equivalent of a Turing machine.

I used to think that AI research combined with neuroscience would figure out a simple solution to this problem, but increasingly it seems like, no, it's complicated even in the brain.

So people can pine for analog memristor computation and analog optical computing all they want, but the hardware is the easy part here. Get the software side solved, and if you build it, they will come. But it's not because we aren't used to these problems; it's because these problems are really, really hard.

Comment Re:So it's remote? (Score 5, Insightful) 403

Speech recognition isn't too CPU intensive, but it's *massively* memory intensive. It's not unreasonable for speech recognition engines to eat up a gig of RAM, and the 4S only has 512 MB. However, push it to a server with lots of RAM and it can handle lots and lots of simultaneous speech recognition queries. It's tailor-made to be a server-side task, at least until phones have gigabytes of memory to spare.
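If you want to picture the split, here's a rough sketch of the phone side (Python for brevity; the endpoint URL and the JSON field are made-up placeholders, not Apple's actual protocol): record locally, ship the audio to the server, and let the memory-hungry recognition engine live on the box with the RAM.

    # Sketch of server-side recognition from the client's point of view.
    # The URL and the "transcript" field are hypothetical placeholders.
    import requests

    def recognize(audio_path, url="https://speech.example.com/recognize"):
        with open(audio_path, "rb") as f:
            resp = requests.post(url, files={"audio": f}, timeout=30)
        resp.raise_for_status()
        return resp.json()["transcript"]

    if __name__ == "__main__":
        print(recognize("query.wav"))

The device's footprint stays tiny, and the server can amortize that gig of model data across many simultaneous queries.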

Comment memristor-based analog computers (Score 2) 347

Even with transistors staying the same size, there are so many avenues to explore in processor design. Just off the top of my head: how about a memristor-based analog co-processor for tasks like facial detection or language/speech recognition? How about processors with asynchronous clocks, or clockless designs? Sure, they're harder to build, but once transistor sizes stop shrinking, you might as well spend the effort, because designs will have a much longer lifecycle.

Comment Re:None of the above. (Score 2, Interesting) 342

There are a few things a DSLR will get you that no point-and-shoot has.

First, a big form factor means a big sensor, which means good shots in low light and at fast shutter speeds. Point-and-shoots are at a huge handicap at sporting events for this reason.

Second, big lenses let you get a shallow depth of field. With P&S cameras, generally everything in the frame is in focus. Being able to use focus to pull your subject out and blur the background is hugely valuable.
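To put rough numbers on it, here's a back-of-the-envelope comparison (Python, using the thin-lens approximation DoF ~= 2*N*c*s^2/f^2, valid when the subject is much closer than the hyperfocal distance; the figures are typical values I picked, not measurements):

    # Approximate depth of field: dof ~= 2*N*c*s^2 / f^2
    # N = f-number, c = circle of confusion (scales with sensor size),
    # f = focal length, s = subject distance. All lengths in mm.
    def dof_mm(f_mm, N, c_mm, s_mm):
        return 2 * N * c_mm * s_mm ** 2 / f_mm ** 2

    subject = 3000  # subject 3 m away

    # Full-frame DSLR: 50 mm lens at f/1.8, c ~ 0.030 mm
    print(round(dof_mm(50, 1.8, 0.030, subject)))   # ~389 mm in focus

    # Small-sensor compact framed the same: ~10 mm lens at f/1.8, c ~ 0.006 mm
    print(round(dof_mm(10, 1.8, 0.006, subject)))   # ~1944 mm in focus

Same f-number, same framing, but the small sensor keeps roughly five times as much of the scene in focus, which is exactly why backgrounds never blur on a point-and-shoot.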

Space

Supermassive Black Hole Is Thrown Out of Galaxy 167

DarkKnightRadick writes "An undergrad student at the University of Utrecht, Marianne Heida, has found evidence of a supermassive black hole being tossed out of its galaxy. According to the article, the black hole — which has a mass equivalent to one billion suns — is possibly the culmination of two galaxies merging (or colliding, depending on how you like to look at it) and their black holes merging, creating one supermassive beast. The black hole was found using the Chandra Source Catalog (from the Chandra X-Ray Observatory). The direction of the expulsion is also possibly indicative of the direction of rotation of the two black holes as they circled each other before merging."
Databases

Cassandra and Voldemort Benchmarked 45

kreide33 writes "Key/value storage systems are gaining in popularity, largely because of features such as easy scalability and automatic replication. However, there are several to choose from, and performance is an important deciding factor. This article compares two of the most well-known projects, Cassandra and Voldemort, under several different mixes of access types, measuring both throughput and latency."
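For readers who haven't benchmarked a key/value store before, a test like the one in the article roughly takes the shape sketched below (Python; the client object with get()/put() methods is a generic stand-in, not either project's actual API):

    # Rough shape of a key/value benchmark: run a fixed read/write mix
    # against a client and record throughput plus latency percentiles.
    # `client` is a stand-in with get(key) and put(key, value) methods.
    import random, time

    def run_mix(client, n_ops=10000, write_ratio=0.2, keyspace=1000):
        latencies = []
        start = time.perf_counter()
        for _ in range(n_ops):
            key = "key-%d" % random.randrange(keyspace)
            t0 = time.perf_counter()
            if random.random() < write_ratio:
                client.put(key, b"x" * 1024)   # 1 KB values
            else:
                client.get(key)
            latencies.append(time.perf_counter() - t0)
        elapsed = time.perf_counter() - start
        latencies.sort()
        return {
            "throughput_ops_per_s": n_ops / elapsed,
            "median_ms": 1000 * latencies[n_ops // 2],
            "p99_ms": 1000 * latencies[int(n_ops * 0.99)],
        }

Varying write_ratio is what produces the different access mixes the summary mentions, while the returned numbers cover both the throughput and the latency sides of the comparison.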

Comment Re:It didn't bring people to the platform (Score 1) 364

Let me preface this answer by revealing that I no longer work in the video game industry, as I did not enjoy it enough to stay. A lot of people cut their teeth writing Windows stuff for fun, maybe working on mods, but a fair number of developers worked their way up from QA. At least where I worked, it seemed like there were way too many people wanting to get into the video game industry, and once they did get in, they worked their asses off. People would learn to code out of their love of games, not because they liked coding. There seemed to be a lot of very bright high-school guys who, instead of doing the whole computer-science thing at a university, would work QA and then progress up to developer. These people were highly respected because of their commitment.

There was another group of people, forming the more senior developers, who got started in academia. People who worked on the engines usually had PhDs in computer science with an emphasis on graphics. I would think graduate work on game theory or AI would put you in this group.

Being an old-school Linux hacker who cut his teeth contributing to OSS projects, I felt a bit out of place. Most of the guys in the industry don't leave, because to them the idea of working on something other than video games is distasteful. Me, I find lots of engineering problems satisfying.

Comment It didn't bring people to the platform (Score 4, Interesting) 364

I used to work for Sony developing PS2 games. The number of people I met who cut their teeth writing code on the Linux kit before getting into the business was exactly zero. I might have been the only person I knew who even had a modchipped PS2; everybody else just didn't care, since they had the PS2Tool on their desk for development. Sony is probably discontinuing Linux because it didn't spark the development push they had hoped for. Still, I would think this will limit the number of supercomputer clusters that use PS3s. You'd think the marketing benefit of being a platform in the top 100 supercomputers would be valuable, but perhaps Sony is still willing to work with academic institutions to make this possible.


Getting the job done is no excuse for not following the rules. Corollary: Following the rules will not get the job done.
