(...) That their plan is to just dump the bacteria in local mud and have it generate electricity.
What I tried to point out is that further consideration is needed on whether the environment needs to be prepared/sterilized, i.e. made noncompetitive in an ecological sense, for the bacteria to do their job. It's not naive, it's biotechnology 101.
(...) pass the mud/waste water/etc through the fuel cell to produce electricity.
As for wastewater, it may be a good idea; the technology for doing that already exists. Mud, on the other hand, is dense, so mass transport would be extremely energy-consuming.
That's halfway down in the article
WHAT is halfway down in the article? I don't get it. You've quoted half of the article and what? What should I focus on?
I imagine there may even be filters in place where the waste comes in, to make sure that any natural predators are weakened or killed, to continue allowing the organisms to thrive.
Filtering wastewater is a bit tricky: if you're pushing it through a microporous membrane to get rid of organisms bigger than the pore diameter, that requires applying extra pressure, so it needs energy. Sterilization with UV light: same thing. One can imagine getting rid of competing organisms by injecting a chemical compound to which our bacteria are resistant but which is toxic to the others. Don't get me wrong, I'm not debunking the whole idea; I'm just trying to point out that they're in the middle (well, maybe further along, because it is a real breakthrough) of preparing something that can be used efficiently in the real world.
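To put a rough number on the membrane point: the ideal (loss-free) pumping energy is just pressure times volume, E = ΔP·V. A minimal back-of-envelope sketch, assuming a purely hypothetical transmembrane pressure drop of 1 bar (real microfiltration pressures vary, and pump losses would add on top):

```python
def pumping_energy_kwh_per_m3(delta_p_pa):
    """Ideal (loss-free) energy to push 1 m^3 of water through a
    membrane at the given transmembrane pressure drop, in kWh.
    E = deltaP * V; 1 kWh = 3.6e6 J."""
    volume_m3 = 1.0
    energy_joules = delta_p_pa * volume_m3
    return energy_joules / 3.6e6

# At an assumed 1 bar (1e5 Pa) drop, this gives roughly 0.028 kWh
# per cubic meter -- a floor that the fuel cell's output would have
# to beat before any other losses are counted.
energy = pumping_energy_kwh_per_m3(1e5)
```

This is only a lower bound; it illustrates why the energy cost of pretreatment has to be part of the overall balance.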
The license to publish at Nature Publishing Group (the publishing house behind the "Nature" series of journals, a really big player in the field of natural sciences) draws my favorable attention. The point is that the author isn't required to give away the copyright of their published contributions; instead, authors grant NPG a license to publish their paper. When it comes to reusing parts of published papers in future work, the publisher's prior permission isn't mandatory. This doesn't apply to review papers, which are commissioned by the publisher and for which NPG is granted full copyright.
Does a license to publish make any difference? Yes, because six months after publication the author has the right to archive the manuscript in a free-access repository, even on NPG's own server.
There's one more thing, which however applies only to the biological sciences. Since 2008, those papers in Nature which publish an organism's genome for the first time are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike Unported license.
To conclude, it's worth noting that the academic world is pushing publishers towards less strict publishing policies, and this is a big example of that.
People using this NIST data do so because it has the NIST stamp on it, so they don't risk depending on tabulated values from a source that hasn't been exhaustively verified. If you're rewriting the source code, you should take care to establish a means by which users can check that the data is unaltered with respect to what the NIST servers contain. If you work for a renowned institute, that should be easy: just store the database on your server and sync it with NIST, along with the sources of the data cited on the NIST website.
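One common way to let users check that a mirrored file is unaltered is to publish a cryptographic digest alongside it and have the client recompute it. A minimal sketch (the file path and the idea of a published SHA-256 digest are assumptions for illustration, not anything NIST specifically provides for this data):

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 16):
    """Compute the SHA-256 digest of a file, reading it in chunks
    so large data files don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_mirror(path, expected_digest):
    """Return True if the local copy matches the published digest."""
    return sha256_of_file(path) == expected_digest
```

The institute would publish `expected_digest` next to the synced database; any user can then confirm their copy matches what the upstream source contains.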
As for Fortran programming, it's an optimal language for scientific computing. Modern dialects have some of the power of C (allocatable arrays, long subroutine names, free-format code, modules, interoperability with C), but, which is preferable in scientific computing, the programmer isn't encouraged to tinker with machine-specific stuff. Many existing codes are written in Fortran, e.g. the powerful LAPACK library and many computational chemistry packages, so for many physicists/chemists/engineers Fortran is the only language they know and care about. Moreover, Fortran has in recent years gained parallel-programming functionality thanks to OpenMP (it's provided with features equivalent to those in C/C++).
The big upshot of this is that it helps weed out the websites that are cheating the system and trying to get themselves as the #1 Google hit so they can show you ads. So a large part of what they are doing is tracking spam websites, not real ones.
Actually, this calls for further explanation, because manual tweaking of results would produce bias and legal concerns. As a guy from Google said:
We don't use any of the data we gather in that way. I mean, it is conceivable you could. But the evaluation site ratings that we gather never directly affect the search results that we return. We never go back and say, 'Oh, we learned from a rater that this result isn't as good as that one, so let's put them in a different order.' Doing something like that would skew the whole evaluation by-and-large. So we never touch it.
Mankind's knowledge stands on the shoulders of Google, so they can't just hire, say, a thousand students and use this evaluation as a significant weighting factor. It's rather an evaluation of the algorithms for the sake of their further improvement, which is itself done fully by algorithms.
Life is a healthy respect for mother nature laced with greed.