Then you don't want spinning platters in your laptop draining your battery; you want a full-on SSD that barely sips power with no loss of speed.
Nor have I seen anyone attach i- to anything and not have it be a reference to Apple.
Weren't around in the 1999-2000 era then, huh? Compaq had the iPaq running WinCE / WinMobile at about the same time Apple came out with the iMac, and the HP / Compaq device is a lot closer to what a modern tablet is than the iMac could ever be considered.
I actually miss my iPaqs. It was nice having a built-in stylus + holder - a decent small-tipped stylus no less, not like the crap you can get today that is as big as your thumb - and a processor that could literally rival a desktop (my last iPaq was a 633MHz with enough RAM to truly multitask, plus SD + CF card expansion with Bluetooth, wifi, and extra components that used the SD interface, back when desktops usually ran at ~1GHz unless you dumped a ton of money into a 1.6-2.0GHz system). Modern tablets had been getting close to the same usefulness again, at least until the Nexus 7/10 shipped without expansion....
I know what stable / testing / unstable means with Debian; I've run mostly testing since Etch went into the testing branch back when Sarge went stable. I'm just saying it would make more sense, manpower-wise, to simply track testing for all of the releases.
The LTS releases would not be affected, since this is how they are done now, and the every-six-months releases would require less debugging if they pulled from testing, since the major showstoppers are pretty much all found before packages can migrate from Sid > testing. That would allow them more time to concentrate on the changes / software / default settings they develop that make it more "user friendly" OOTB.
That is the way I would do it anyway; that way seems like the least work: not pulling in tons of patches from upstream* and being able to concentrate (more, since testing will still need some patches) on the in-house codebase without losing much, if any, of the "bleeding edge freshness". Except for the freeze right before a new stable release, testing is for the most part only a few weeks behind upstream releases at most, and even then, if absolutely necessary, "backporting" from Sid > testing is usually as simple as installing the newer package and maybe a few dependencies from Sid.
*Assuming that your devs are not upstream devs for the packages and are writing the major patches themselves and then shipping them upstream. While I'm sure some packages are like that, the likelihood of all the devs being like that is pretty low.
I did like SuSE 10.0 when I ran it on a laptop that had a picky Trident CyberBlade video card that was a pain to get working on any other distro at the time. Then came 10.1, which ran about as fast as a quadriplegic dog that had died a day before the race...
That said, the one thing I liked best about SuSE? apt4rpm could be used in place of YaST for installing stuff. YaST was AWESOME for configuring the system (I still miss that in Debian after all these years), but I found it a pain to use for installing stuff after being used to apt with Debian.
That said, YaST was still light-years ahead of RH5's RPM cyclic-dependency hell.
It's Ubuntu. Don't take my bashing it the wrong way; it is a good thing to have an intro-level distro for new users, as well as a push to make Linux more mainstream and user-friendly, but....
The way Ubuntu does things is, in my opinion, insane. They track Debian unstable snapshots, which are only minimally tested, then introduce their own bugs on top of the existing bugs in unstable, then try to iron out the worst of them before the next point in the 6-month release cycle comes due. This does not lend itself all that well to a truly stable user experience. You can even see that at work by tracking users' reactions to releases: there have been flop releases that pushed users to jump ship to pure Debian (same look / feel / package-management experience, just less general hand-holding) or to roll back to previous releases and refuse to update.
I know they can't really track stable, since Debian has a much longer release cycle, but at the very least they should track testing. Testing generally has the worst of the major bugs worked out (or the packages wouldn't have been able to move out of unstable) while still remaining "fresh" enough, with updated packages, when not in release freeze.
Secondly, it depends. With bug-free code you shouldn't be able to crash an OS beyond repair unintentionally; unfortunately Ubuntu, like every other piece of software out there, is not bug-free. It is also possible to be updating sensitive files when something else causes a full-blown kernel panic instead of a recoverable oops, leaving said sensitive files in an unstable / unbootable state. Not knowing exactly what the OP was doing at the time means we can't just point and say "it was this".
It is a mixed bag of all the above, plus what you listen to as well. If you listen to stuff that is all in the midranges, you won't care if the lows and highs get clipped off, since there really are no lows and highs... If you have decent ears + decent equipment + decent surroundings + listen to something with a full range, you probably won't want bitrates much lower than 256-320kbps, depending solely on what you personally hear.
Once you start removing parts of the equation, the needed bitrate can drop, since you won't be able to hear / reproduce the sounds anyway. Ideally, when buying (for audiophiles anyway), you would get a very-high-bitrate lossy file or a FLAC file for your main listening area, and a lower-bitrate (smaller file size) but still decent file for your mobile setup.
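To put some rough numbers on the high-bitrate-at-home / lower-bitrate-on-mobile trade-off, here is a minimal sketch of how bitrate translates into file size. The function name and the 4-minute track length are my own illustrative choices, not anything from the posts above:

```python
def track_size_mb(bitrate_kbps, minutes):
    """Approximate encoded file size in megabytes (1 MB = 10**6 bytes)."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1_000_000

# A 4-minute track at a home-listening bitrate vs. a mobile bitrate:
print(round(track_size_mb(320, 4), 2))  # 9.6
print(round(track_size_mb(128, 4), 2))  # 3.84
```

So the mobile copy comes in at well under half the size, which is the whole point of carrying a second, lower-bitrate encode.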
I propose an exchange. We in the U.S. will adopt metric if the Europeans start driving on the right ( as in proper, pun intended ) side of the road.
That said, it is not only the sciences that would make decent use of the metric system. Any type of precise work can benefit from it, from anything in the sciences all the way down to machining. I know this for a fact, since I was a fabricator for ~11 years; much of the smaller work toward the end was in metric units... and let me tell you, it is easier to visualize in your head a length or width of 4mm than 5/32nds or 11/64ths of an inch.
Plus, once you really start getting down to tight tolerances, there isn't a major difference in how you calculate everything. It really doesn't matter whether you use hundredths or thousandths of an inch or start using smaller and smaller metric units, so why bother using fractions for the larger parts when you will go to decimals for the fiddly little super-precise parts anyway? It just causes more headaches.
All that is beside the fact that, 9 times out of 10, if you are building something for an international company not based here in the U.S., it will be in metric anyway.
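The fraction-vs-metric point above is easy to check numerically. A minimal sketch (the helper function is mine, the 25.4 mm/inch conversion is exact by definition):

```python
# Convert fractional-inch sizes to millimetres. 1 inch = 25.4 mm exactly.
MM_PER_INCH = 25.4

def frac_inch_to_mm(numerator, denominator):
    """Return a fractional-inch dimension expressed in millimetres."""
    return numerator / denominator * MM_PER_INCH

# 5/32" and 11/64" both land in the neighbourhood of 4 mm, but the
# metric figure is far easier to hold in your head on the shop floor.
print(round(frac_inch_to_mm(5, 32), 3))   # 3.969
print(round(frac_inch_to_mm(11, 64), 3))  # 4.366
```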
Stop spewing shit you don't know. We have adequate representation through different geological strata to identify times across the globe to within ~50-100 Ma. Add radioisotope dating into the mix and you get a very accurate progressive timeline, meaning we can view taxonomical changes in a species throughout its existence, even in geological time.
As for your "proof" drivel rant, you just "prove" even further that you have no comprehension of what science really is. Go take a good science class at a local university and find out. Science does not seek to "prove" "facts". Science attempts to answer a question with our best observations, tests and refines our ideas until they are well tested and accepted, changes those ideas if new data comes to light, and turns up more questions to answer in the course of researching the last one.
You are a fucking idiot. Not only do you NOT know what a theory is, you also know absolutely nothing about evolution. Evolution doesn't say an ape got pregnant and popped out a human, so there is no "missing link". Evolution just means that there were (beneficial) mutations in the offspring of a founder species, and the mutated offspring had an equal or better chance to survive and reproduce more offspring carrying the same mutations. The founder species does not even have to die off, although genetic drift between the species may mean that, after time and more inevitable mutations, they can no longer interbreed. Thus your argument is invalid.
As for your "science / evolution falls flat on how a dinosaur can evolve into a bird": you might want to do a little research into the relatively recent collagen testing done on some of the better-preserved fossils. It supports the hypothesis that dinosaurs DID in fact either share a common ancestor with our modern birds or ARE their direct ancestors. Add in the other shared bone-structure characteristics and you have a pretty well-defined argument for that particular evolutionary path.
But go ahead, what do I know? I'm only one of those fool scientists who believes in what I have empirically observed, rather than blindly believing what I read out of a book written and pieced together from many disparate and unsourced stories, hundreds of years after the main character was supposed to have lived, by a group that was ascending to power.
And you are aware that an ordained CATHOLIC PRIEST came up with the Big Bang theory, at a Catholic university no less, in the first place? Yeah, the theory that theists rail against the most was proposed by one of their own clergy.
PERL has lines? I always just sat on my keyboard.... it looks the same.
Realistically, what more do you need from what the BIOS already does?
Processor available: check
RAM available (and optional quick check): check
Network / disk / USB boot available: check
minimal power management: check
password protection / boot device selection without having to change boot order in BIOS: check
Hand everything else off to the OS boot sector / boot sequence.
Personally, I think having the winflash utilities is a very stupid idea too. It's just another way for something to go wrong while writing to the chip; flashing should be done from within the BIOS, from disk / USB, just like it used to be.
The kernel has had (or had; I haven't built one in a while) experimental NTFS write support since at least the 2.6.18 branch, or prior... I don't remember exactly when it was introduced. It is completely useless, but it is there.
That said NTFS-3G really is the way to go. It also, as an added bonus, fits the UNIX philosophy: "Do one thing and do it right".
The in-kernel write support for NTFS only allows you to overwrite an existing file on disk, and only with exactly the same amount of data as the existing file's size, keeping its name. Not very useful, since you cannot even create files.
No, I'm not lying. They are not pumping a million+ gallons of any of the pure chemicals listed. The chemicals are diluted in a water solution, since water is "cheap". Even if some of those chemicals managed to migrate to an aquifer, the molecule count would most likely be in the parts per trillion, and that is assuming that chemical leaching and natural filtration didn't turn them into something harmless by the time they reached the aquifer.
As I said, you are more likely to find sulfides and arsenides that occur naturally.
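The parts-per-trillion argument above comes down to simple arithmetic. A minimal sketch; the function and the specific numbers plugged in are purely illustrative assumptions on my part, not field measurements:

```python
def diluted_ppt(initial_fraction, dilution_factor):
    """Concentration in parts per trillion after further dilution.

    initial_fraction: the chemical's fraction of the injected fluid
    dilution_factor:  how many times the plume is diluted on its way
                      to the aquifer (an illustrative assumption)
    """
    return initial_fraction / dilution_factor * 1e12

# An additive making up 0.1% of the fluid, diluted a billion-fold by
# the time it reaches groundwater, ends up at ~1 part per trillion:
print(diluted_ppt(0.001, 1e9))  # 1.0
```

Whether the real-world dilution factor is anywhere near that large is exactly what the thread is arguing about; the sketch only shows why pure-form toxicity figures don't transfer directly to trace concentrations.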
If every one of those chemicals were so dilute that it only made up 1-2 parts per trillion, they would not have the effects that are listed beside them.
That was the whole point... the chemicals MAY cause those effects in pure form, but the forms you would see, _if_ they migrated at all, are going to be extremely diluted.
Even someone with virtually no knowledge of chemistry whatsoever probably has a clue how much propylene glycol it takes to have an anti-freeze effect since they put it in their car.
Most people are lucky to understand the difference in octane ratings of the fuel they put in their cars, much less what antifreeze is made from. The dealership takes care of all that messy stuff when they get their oil changed.
I also didn't say anything about the sky falling, or about what the actual effects of fracking fluid might be. I simply showed that your claim that all those chemicals are harmless was a lie. Your response to my post demonstrates why you get modded as a troll.
One, I never claimed anything upthread; you replied to my first post on the page. Two, it can't be a lie... see point one. And three, you might want to look again: there are plenty of mods that can understand satire and hyperbole, hence my comment standing at (score:3) as of this writing.
Well, there is a fourth: your writing of "all those dangerous chemicals" makes you come off as scared and screaming that the sky is falling. As I pointed out, there are quite a few worse things that occur naturally in our groundwater. That doesn't mean we should dump anything and everything into our aquifers; but using something that is harmful in highly pure concentrations doesn't automatically equal poisoning our drinking-water supplies, especially since contamination here is only a possibility.
A non-Geologist speaking.
What could you possibly get right?