
The Panic Over Fukushima 536

An anonymous reader points out an article in the Wall Street Journal about how irrational fear of nuclear reactors made people worry much more about last year's incident at Fukushima than they should have. Quoting: "Denver has particularly high natural radioactivity. It comes primarily from radioactive radon gas, emitted from tiny concentrations of uranium found in local granite. If you live there, you get, on average, an extra dose of .3 rem of radiation per year (on top of the .62 rem that the average American absorbs annually from various sources). A rem is the unit of measure used to gauge radiation damage to human tissue. ... Now consider the most famous victim of the March 2011 tsunami in Japan: the Fukushima Daiichi nuclear power plant. Two workers at the reactor were killed by the tsunami, which is believed to have been 50 feet high at the site. But over the following weeks and months, the fear grew that the ultimate victims of this damaged nuke would number in the thousands or tens of thousands. The 'hot spots' in Japan that frightened many people showed radiation at the level of .1 rem, a number quite small compared with the average excess dose that people happily live with in Denver. What explains the disparity? Why this enormous difference in what is considered an acceptable level of exposure to radiation?"
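The comparison in the summary is simple arithmetic, and it's worth making explicit. A minimal sketch using only the dose figures quoted above (rem values are from the article; the variable names are mine):

```python
# Annual radiation doses quoted in the article, in rem.
avg_us_background = 0.62    # average American, all sources
denver_extra = 0.30         # additional dose from living in Denver
fukushima_hotspot = 0.10    # level measured at the feared "hot spots"

# The excess dose Denver residents happily accept is three times
# the level that caused panic at the Japanese hot spots.
ratio = denver_extra / fukushima_hotspot
print(f"Denver excess is {ratio:.1f}x the hotspot level")
print(f"Denver total: {avg_us_background + denver_extra:.2f} rem/yr")
```

In other words, a Denver resident's total annual dose (roughly 0.92 rem) is several times the hot-spot reading, which is exactly the disparity the article is asking about.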

US CIO/CTO: Idea of Hiring COBOL Coders Laughable 265

theodp writes "If you're a COBOL programmer, you're apparently persona non grata in the eyes of the nation's Chief Information and Chief Technology Officers. Discussing new government technology initiatives at the TechCrunch Disrupt Conference, Federal CIO Steven VanRoekel quipped, 'I'm recruiting COBOL developers, any out there?,' sending Federal CTO Todd Park into fits of laughter (video). Lest anyone think he was serious about hiring the old fogies, VanRoekel added: 'Trust me, we still have it in the Federal government, which is quite, quite scary.' So what are VanRoekel and Park looking for? 'Bad a** innovators — the baddest a** of the bad a**es out there,' Park explained (video), 'to design, create, and kick a** for America.' Within 24 hours of VanRoekel's and Park's announcement, 600 people had applied to be Presidential Innovation Fellows."

Comment Re:heh (Score 1) 1091

Here's one area I have to give Apple some props in: their OSX interface puts some damn pretty and friendly makeup on the pig that was the old FreeBSD interface

Small correction: the OS X interface descends from NeXT's OpenStep, not from FreeBSD.

What Apple's acquisition did was give the NeXT team money to update OpenStep for a new generation of hardware, and marketing dollars to put it in front of people. Don't get me wrong: Apple's post-acquisition work on updating the interface was fantastic, but let's give credit where credit is due.

Comment Wrong Conference (Score 1) 49

I really hate the reporting around Hadoop. Most of these people have absolutely no clue what they are talking about, and this article is just another example of that. Any bit of simple research would have revealed that the actual open source community of developers around Hadoop, Hive, Solr, etc, can be found at ApacheCon. Of course Strata is amazingly commercial: O'Reilly, being a corporate entity, is trying to make cash around the latest craze. If they weren't, they'd make sure the ASF and the other OSS organizations that help make the software had some space and would actually attend.

Open Source

Big Data's Invisible Open Source Community 49

itwbennett writes "Hadoop, Hive, Lucene, and Solr are all open source projects, but if you were expecting the floors of the Strata Conference to be packed with intense, bootstrapping hackers you'd be sorely disappointed. Instead, says Brian Proffitt, 'community' where Big Data is concerned is 'acknowledged as a corporate resource', something companies need to contribute back to. 'There is no sense of the grass-roots, hacker-dominated communities that were so much a part of the Linux community's DNA,' says Proffitt."

Comment Startups, Mailing Lists, etc. (Score 1) 506

The easiest way to find a company that hires for open source work is to look at who is actually submitting patches back, participating on mailing lists, filing bugs, etc. From my own experience, it seems as though almost every Bay Area startup or former startup from the past 10 years (though clearly not all of them) is doing open source work, either out in the open or behind closed doors. Many positions don't have open source in big bright letters, so you might need to just flat out ask. If you are outside of the Bay Area, those companies exist but will require more legwork.

Comment Re:BSD license was always more permissive, so grea (Score 1) 808

I don't see why anyone would not want to use the GPL if they want their software to be free and open. Why create something, give it out for free, and then allow businesses to take your work, profit from it, and give nothing back? Maybe these developers are hoping to get bought out by a large company someday?

There are many businesses that want to profit from their own open source projects by including them or parts of them in other, proprietary works. The GPL essentially makes that impossible.

Comment Re:Hadoop HDFS (Score 1) 320

but I see no reason that it couldn't serve you well as a large personal file service.

HDFS is not POSIX-compliant or mountable, so actually using the data from anything that expects POSIX is going to be painful. "But there is a FUSE plug-in!" Yes, there is, but you'll take a roughly 60% performance hit using it, assuming it even still works in newer versions of Hadoop. Since none of the hardcore devs actually use it, there is a very good chance it is completely busted.

In any case, there are still problems: losing the fsimage, no real HA for the NameNode, needing quite a bit of RAM for any significant number of files, and don't forget that 8TB now turns into at least 24TB once you count the 3x replication factor, etc, etc, etc.

So no, really this isn't a solution for this particular problem.

Comment Re:Buffalo WZR-HP-G300NH (Score 1) 196

+1 on this one.

We've been using the Buffalo-modified version of DD-WRT for a few months now. It replaced a Linksys E3k that was continually dropping connections. Overall, we're pretty happy with it (QoS, DHCP, etc). I'll definitely check on the link speed, although it is connected to a DSL modem that can't do gigabit anyway. :)

Comment Re:Not wanting to put a dampener on things... (Score 1) 67

This isn't about Microsoft getting involved with open source. This is about Microsoft not getting left out. Beyond the countless startups, Apache Hadoop already has major players like Amazon, Dell, EMC, HP, IBM, NetApp, Oracle, VMware, ... trying to make a dent in the community in some form or another. Hell, I have a SuperMicro catalog on my desk emblazoned with the Apache Hadoop logo all over it. Like Oracle, they are coming in very late to the party and now need to play catch-up. Buying off Hortonworks is a very fast way to do that.

Comment Re:Hadoop is written in Java (Score 1) 67

Actually, there is an ever-increasing amount of JNI (read: C) code in Hadoop that sits in the critical path for security and performance features, and most of that code is not very portable. So either MS is going to pay for a major overhaul of that code, or for a completely new code branch that replicates its functionality, or MS Hadoop is going to be severely lacking in features and performance.
