The fact that they never touch the philosophical issue of "droid rights" makes me classify Star Wars as Fantasy rather than Science Fiction. It takes place in a universe where, apart from some engineering progress towards bigger weapons, no scientific progress is made (except maybe the midichlorian lapse), and technology itself is never questioned but is just a plot device. Just like droids.
No, warmer would be bad.
A warm-blooded animal, such as a mammal with its core temperature of ~37°C (a few degrees more for birds), constantly produces heat. That heat must go somewhere, otherwise it would lead to overheating. So the only choice is to run at a temperature above that of the environment. Once those temperatures come too close to each other, animals reduce their activity more and more to prevent said overheating.
So, a jump in global temperature, i.e. one that is faster than evolution can keep pace with, would pose a serious threat to animals in areas where the gap between their core temperature and the environment is reduced.
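Newton's law of cooling makes the squeeze concrete: passive heat loss scales with the gap between core and ambient temperature. The heat-transfer coefficient and surface area below are made-up round numbers, purely for illustration:

```python
# Toy model: passive heat loss by Newton's law of cooling,
# Q = h * A * (T_core - T_env). h and A are invented values.
def heat_loss_watts(t_env_c, t_core_c=37.0, h=10.0, area_m2=1.8):
    return h * area_m2 * (t_core_c - t_env_c)

for t_env in (20, 30, 36):
    # The closer the environment gets to core temperature,
    # the less metabolic heat can be dumped passively.
    print(t_env, "°C ambient ->", heat_loss_watts(t_env), "W dissipated")
```

At 36°C ambient the same animal can shed only a small fraction of the heat it could at 20°C, which is exactly the squeeze described above.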
How far along is multiarch support, i.e. being able to install 32-bit and 64-bit packages alongside each other, and not just on the Intel architecture but on any CPU that supports both 32 and 64 bits?
Have any other distros pulled this off?
Sounds like the return of the Sapir-Whorf hypothesis
Just what I was thinking when I read the article. And then I had to think of Marain, a fictional constructed language in the Culture universe. I wonder if a society would actually decide to change its language if there was sufficient evidence that it hinders their cultural development. Sort of like the switch to the Latin alphabet that Vietnamese and Turkish made, only a bit more invasive.
tailor a bacteria to attack or compete with a bacteria which you needed to control
This already exists in the form of a virus that attacks bacteria, also known as a bacteriophage. It doesn't even have to be programmed from the outside to keep up with evasive, evolving bacteria; it just evolves as well. And even if you wanted to "program" this feature, you'd have to deal with the nasty problem of protein folding in silico. Better to leave this entire process highly parallel in wetware.
programmable immune system
Also known as vaccination, and it happens naturally after every infection. And again you don't have to program anything; the immune system uses a random walk to find matching antibodies, which attach themselves to the bugs.
This discovery will sooner result in a very parallel but, clock-rate-wise, very slow computer than in immunological advances. And if this gets used in the human body via gene therapy, it will be used to regulate genes, i.e. as an if/else block, not to calculate anything fancy.
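That if/else view of gene regulation can be caricatured in a few lines. This is a deliberately crude boolean sketch, loosely inspired by the lac operon; none of it describes anyone's actual engineered system:

```python
# Gene regulation reduced to conditional logic, lac-operon style:
# the gene is expressed only when its substrate is present AND the
# preferred energy source (glucose) is absent.
def lac_operon_expressed(lactose_present, glucose_present):
    # Repressor releases the operator only when lactose is present;
    # cAMP/CAP activation only kicks in when glucose is scarce.
    return lactose_present and not glucose_present

for lactose in (False, True):
    for glucose in (False, True):
        print(lactose, glucose, "->", lac_operon_expressed(lactose, glucose))
```

Which is to say: in-cell "computation" of this kind is a handful of boolean gates, not a general-purpose calculator.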
- Antibodies are much larger than your typical antibiotic molecule. The latter is like jamming a wrench into a very specific part of the cellular machinery to grind it to a halt. If a mutation in the machinery changes the spot where your wrench used to fit, you have a resistant bacterium. Because antibodies are larger, a single mutation usually doesn't throw them off. This, however, also means antibodies can only attach themselves to the surface, and that usually doesn't kill the bacterium but flags it for the immune system. Small molecules can pass through membranes and attach themselves anywhere. Finding the spot and designing a fitting molecule is the hard part. And since that is even harder for larger antibodies, i.e. proteins, my guess is they want to take the ones found in nature and multiply them.
- The immune system has its own evolutionary process to counter the problem of a moving target (somatic hypermutation; side note: the other idea here is to use bacteria-eating viruses, phages, which evolve on their own). One way to jumpstart that is plain old vaccination; maybe there are plans to introduce those blueprints faster.
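The footprint argument in the first point can be shown with a toy calculation. The "epitope" string and the 3-residue "pocket" below are invented, but the arithmetic is the point: one mutation costs a large binder 1/20 of its contacts and a small binder 1/3 of its contacts:

```python
def binding_fraction(site, probe):
    # Fraction of probe positions that still match the target site.
    return sum(a == b for a, b in zip(site, probe)) / len(probe)

epitope = "ACDEFGHIKLMNPQRSTVWY"   # made-up 20-residue antibody footprint
pocket = epitope[:3]               # a small-molecule-sized binding pocket

mutated = "X" + epitope[1:]        # one point mutation in the target

print(binding_fraction(mutated, epitope))  # antibody: 19/20 positions still fit
print(binding_fraction(mutated, pocket))   # small molecule: 2/3 positions fit
```

A binder that loses 5% of its contacts likely still sticks; one that loses a third of them likely doesn't, which is the wrench-no-longer-fitting scenario.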
However, once in these fields, there is the entirely different issue of the glass ceiling, i.e. not getting promoted beyond a certain level.
Light pollution is turning us into the Krikkiters!
Oh, but you do get mutations! In fact, mutations which allow you to defeat H1N1! And not just a single replaced amino acid, no, lots more! Now how does that silly virus look?
When an immune system's B-cells find something they don't like, such as a virus, they go into a feedback loop, mutating so that some copies will dislike said virus even more. In the end you have an immune system against which this virus doesn't stand a chance, even though it was a completely unknown pathogen hours earlier. And this response will remain intact for years! (See: vaccination.) This is called somatic hypermutation. On the downside, "somatic" means it won't make it into your germ line, so your children will have to mutate all on their own again (though IIRC some of the mother's immune system cells make it into the child to help out a bit).
When the curve became flatter, less understanding was required, but more people started using it. So I wonder if the mass adoption of technology compensates for the reduced required depth, i.e. whether the first easy steps encouraged more people to take a deeper look at things, compared to when you had no choice but to do that.
Data on the percentage of computer users in each generation who were hobby programmers at a certain age would be interesting.
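A minimal sketch of that tabulation, with entirely invented survey numbers (the cohorts, counts, and cutoff age are all placeholders):

```python
# Share of computer users in each cohort who programmed as a hobby
# by some cutoff age. All figures below are made up for illustration.
survey = {
    # cohort: (computer_users, hobby_programmers)
    "1970s": (1_000, 300),
    "1980s": (10_000, 1_500),
    "1990s": (100_000, 8_000),
    "2000s": (1_000_000, 40_000),
}

for cohort, (users, hobbyists) in survey.items():
    print(f"{cohort}: {100 * hobbyists / users:.1f}% hobbyists")
```

With numbers like these you could test the hypothesis directly: the absolute count of hobbyists grows even while the percentage shrinks.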
As much as nuclear energy would help reduce CO2 emissions, the anti-nuclear crowd has to be seen as a "force of nature" making new power plants less likely. The idealist would fight that irrationality, but as a realist I would redirect that energy elsewhere, e.g. against the NIMBYs who think wind turbines ruin the coastlines and kill birds or bats.
Also, if oil is non-renewable because it takes millions of years to re-form, then nuclear fuels are the ultimate non-renewable, with a "when is the next supernova due?" regeneration period. And the energy density and relative ease of use are just too good to waste on powering our washing machines and Slashdot browsing. Maybe in a few hundred years outer solar system exploration will be in a serious crunch for lack of a good power source, after all the uranium, thorium, plutonium etc. has been used up.
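Rough numbers back up the energy-density point; the figures below are approximate textbook values (complete fission of U-235 vs. combustion of crude oil):

```python
# Back-of-envelope energy densities, approximate values:
oil_j_per_kg = 4.2e7     # ~42 MJ/kg, combustion of crude oil
u235_j_per_kg = 8.0e13   # ~80 TJ/kg, complete fission of U-235

ratio = u235_j_per_kg / oil_j_per_kg
print(f"fission beats combustion by roughly {ratio:,.0f}x per kilogram")
```

A factor of nearly two million per kilogram is exactly why burning through the fissile inventory on household electricity would sting later.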
Ubuntu 8.04 was great, but it took me some time to get used to it, and sometimes it didn't feel like Unix. It had its own way of doing things, and customizing things wasn't so simple. It was doable, but you had to do it the Debian way and the Ubuntu way; just knowing Unix wasn't enough. Some things seemed unnecessarily complex.
9 added even more tricks, but was still ok.
I recently upgraded many of my systems to 10.04. They decided to change everything again. Ubuntu has become unnecessarily complex. With this upstart crap, they obliterated 30 years of Unix tradition. Many things are so buried behind poorly documented Ubuntu ways of doing things that you have to dig for hours to find out how something is actually done.
Yes, it works, and it looks great, and it's a fantastic modern operating system. But it isn't Unix anymore. What used to be accomplished by a simple symlink (and undone by deleting that symlink) has now been replaced by tons of little, seemingly isolated shell scripts. They keep changing the way things are done, implementing new abstraction layers, mostly through shell scripting. But they sometimes maintain compatibility with the original locations of the files you are looking for (through yet more scripts).
For instance, delete the symlinks to
So I ask Slashdot: do we really need this? Is this moving-away-from-Unix trend really necessary? Or are we just reinventing the wheel and needlessly alienating old-school sysadmins?