
Comment: Fail (Score 1) 99

by smolix (#36090390) Attached to: Consumer Device With Open CPU Out of Beta Soon

I know this is going to be flamebait. But before you flame me, consider the following: I'm a researcher and get paid for what I do. I've released quite a bit of code as open source and invented a number of algorithms which are not patented and are used in many applications (think email spam filters, face recognition, etc.). And I've worked in both industry and academia, for almost two decades. So I know both open and closed source.

First off, ideas have value. As in dollar value. Take NVIDIA, for instance: they don't own a semiconductor fab, so they send their chip layouts to a place like TSMC, GlobalFoundries, or Samsung to have the files turned into chips. These places are like modern printing presses. If their masks, VHDL, or layout information were open source, they wouldn't be able to reap the benefit of their investment in designing the next generation of chips. Or, as a more extreme case, take ARM. They design processor cores and license the microarchitecture to other (possibly fabless) design companies such as Apple which, in turn, tweak the design, add more to it, and then ship it to the foundries. In other words, all the good stuff is in the plans, much less in the actual hardware.

So, designing an open source CPU is probably not going to work. Why not? Well, unlike with software, there's a massive barrier to entry: think millions of dollars rather than the few hundred it takes to buy a laptop and install some version of GCC on it. Few users can afford this, which pretty much kills the model where many users take advantage of a good idea and share it to make it better. Yes, there are good ideological reasons, but most people don't do things for ideology (note the emphasis on most). They do them for fun, profit, fame, convenience, or some other less noble goal.

As for the piece of hardware itself, hmmm, I'm not sure why I would want to buy an overpriced, function-limited, and incompatible device.

Comment: Attend only if it's a good conference (Score 1) 244

by smolix (#35340550) Attached to: Is Attending a CS Conference Worth the Time?

Attending a conference (computer science or otherwise) doesn't mean much. You get to travel, stay at a fancy hotel (or a youth hostel if your university is poor) and present things. So what! There's that extra line on your CV.

It's worth it, though, if the people attending the conference are experts and you get to discuss your work with them. Or if others see your work and build on it. Or if your work gets cited a lot as a result of attending. Or if you manage to start an exciting joint research project. I've been to about 50 conferences so far and have published over 100 papers, and the good ones are really worth it.

I'm not so sure about CCSC, though. Beyond that, I'm not a big fan of PhD conferences or sessions. If the work is good, everyone will want to hear it, so it'll be featured in the main conference anyway. If it isn't, having a special session won't help you.

Comment: Directional Antenna Problem (Score 2, Insightful) 373

by smolix (#32984856) Attached to: Death Grip Tested On iPhone Competitors

Besides a) attenuation due to holding the phone and b) a change in antenna characteristics due to bridging, there's a third problem which really exacerbates the first two: the antenna of the iPhone 4 is highly directional. In other words, it matters a LOT which way you point the phone. Sometimes even small changes around it can make a big difference in whether you get data or not.

You can test this out (assuming you've got access to an iPhone 4) by running a speed test application (there are plenty in the App Store) while holding or pointing the phone in different ways. I can trigger signal loss even without holding the phone. No bumper whatsoever is going to fix that problem; this is plain and simple bad antenna design. I lose far more data when streaming radio on the 4 than I did on the 3G, even though the bandwidth is (potentially) much higher.

Moon

LRO Photographs Soviet Lunar Landers From the '70s 24

Posted by Soulskill
from the i-can-see-my-house-from-here dept.
braindrainbahrain writes "Photographs of the Sea of Crises on the Moon taken by the Lunar Reconnaissance Orbiter show the Soviet lunar landers Luna 20, Luna 23 and Luna 24, which landed on the Moon in the 1970s. In addition to the landers, it is possible to see the tracks made by the Lunokhod lunar rover! The Soviet Lunokhod lunar rover predates the first successful Mars Rover by some 30 years. (Note: Very cool old-style artists' drawings of the Soviet craft at the Wikipedia links above.)"

College To Save Money By Switching Email Font 306

Posted by samzenpus
from the smallest-things dept.
The University of Wisconsin-Green Bay has come up with an unusual way of saving money: changing their email font. The school expects to use 30% less ink by switching from Arial to Century Gothic. From the article: "Diane Blohowiak is the school's director of computing. She says the new font uses about 30 percent less ink than the previous one. That could add up to real savings, since the cost of printer ink works out to about $10,000 per gallon. Blohowiak says the decision is part of the school's five-year plan to go green. She tells Wisconsin Public Radio it's great that a change that's eco-friendly also saves money."
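The article gives the savings fraction and the per-gallon ink cost but not the printing volume, so the actual dollar figure depends on assumptions. A minimal back-of-envelope sketch, with pages per year and pages per gallon as purely illustrative guesses:

```python
# Rough estimate of annual ink savings from a 30% reduction in ink use.
# INK_COST_PER_GALLON and SAVINGS_FRACTION come from the article;
# pages_per_year and pages_per_gallon are assumed figures for illustration.

INK_COST_PER_GALLON = 10_000.0  # USD per gallon, per the article
SAVINGS_FRACTION = 0.30         # Century Gothic vs. Arial, per the article

pages_per_year = 500_000        # assumption: campus-wide printing volume
pages_per_gallon = 75_000       # assumption: pages printable per gallon of ink

gallons_per_year = pages_per_year / pages_per_gallon
baseline_cost = gallons_per_year * INK_COST_PER_GALLON
savings = baseline_cost * SAVINGS_FRACTION

print(f"Baseline ink cost:  ${baseline_cost:,.0f}/yr")
print(f"Estimated savings:  ${savings:,.0f}/yr")
```

Under these assumptions the school prints about 6.7 gallons of ink per year, so the 30% cut is worth real money, but the estimate scales linearly with whatever printing volume you plug in.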

Testing can show the presence of bugs, but not their absence. -- Dijkstra
