I was under the impression that we're living amid a second generation of stars. Early-universe stars were very large, very bright, and have all died already, giving birth to the current generation of cooler, longer living stars we see in our night sky. Granted, I'm basing that off of reading the Xeelee sequence, so who knows.
Which makes this approach seem similar to that heavy breathing about black holes not existing due to quantum corrections. It's headline hyperbole. So mathematical singularities may not exist, and there is very likely a minimum length in the universe.
This paper seems to exchange one infinity (density) for another (time in the past). If there was no initial bang and no crunch to come, what was the universe doing for that infinity of past time before it expanded?
The *potential* harm is that you buy Acme ThingMaker 5.0 and make great works of meaningful art, and then Acme requires you to pay more for ThingMaker 6.0 just to access the works you created with tools you already bought, because those tools have "expired". Intentional software obsolescence was a real problem 10 years ago. RMS is part of the reason it is less of a problem now.
I still think he's wrong about Emacs interoperating with LLVM, but he wasn't wrong about the dangers of proprietary software and the need for someone to keep vendors honest.
And herein lies the rub. Richard is arguing in favor of restricting the functionality of GNU software in hopes of ensuring others don't get some unintended use out of it.
On principle, that flies in the face of everything he has built.
What point is there in pouring resources, be they money or sweat equity, into software that is less useful by design? (Again reminiscent of FSF's "Broken by Design" campaign.)
The trouble is you've forgotten who's sitting between you and the developers. The developers themselves often don't make the call between fixing a bug and adding a feature. Unhealthy Scrum practices often let "stakeholders" (usually Product Management or Marketing) prioritize features over defects and technical debt.
I've seen the organizational vertical-slice approach work very well in the past, but you have to have a management team that embraces responsibility. Once the company goes down the matrix-management path, it's more about spreading blame around and abdicating accountability.
That would make some sense if the projects themselves were intended for highly concurrent operation: hence the choice of language, and hence the defects in that category, because that's what the code is for.
I will say that all of those languages have very particular models for concurrency, such that misunderstanding the model can lead to design errors in the code. Harder problems plus clever code often yield brutal bugs.
So which connector do I need to flush to get Google fiber?
Step one: corner the maple syrup market.
The mathematics of Ebola propagation is clear. So are reasonable responses to limit its spread. But we let politics decide against science. Who are the science-deniers now?
Needs a banana for scale.
"A giant steaming pile of awesome."
Nah... Noonian Hutt, Jabba's son.
You forgot to mention that the Democrats had provisions that allowed the FCC to dictate "balance" for political opinion sites. They tried to sneak in censorship provisions, and a fast-lane (in the deal with Google), in the final version of "Net Neutrality".
There is zero correlation between the name of the proposed law and its effect in practice. If you draft legislation and call it "The Save the Babies Act", then include provisions for removing arsenic regulation in the water supply coupled with a gag order, the law is bad and should be struck down. This is especially true when one side blocks amendments that would fix it (*cough* Democrats). You'll have to suffer the opposition and the media trumpeting about how you "hate babies" and have investments in pitchfork factories, but you really should work to kill that proposed law.
... whereas catastrophic failures of nuclear plants are always going to happen and are likely to be injurious to human health. In conclusion, I don't think we can conclude that nuclear is safest; from those Forbes figures, wind looks safer.
Fukushima: Built in the late 60s and early 70s.
Chernobyl: Began operation in the 70s.
Three Mile Island: Constructed in the late 60s and early 70s.
We have far newer designs for nuclear reactors that are physically incapable of melting down or failing catastrophically. 40+ years of experience in how to do something better can count for an awful lot. For example: http://en.wikipedia.org/wiki/Pebble_bed_modular_reactor