To be fair, the original poster said it's easy to avoid being a bad manager; that's not the same as it being easy to be a good manager. There are very specific tendencies that bad managers share.
Having a large house battery has other uses, such as riding out power outages. But consider this: an electric car has a huge battery capacity, but it can charge from mains at only a trickle. That's fine for a commuter who charges overnight, but for heavy use (say, a moving weekend) you need to charge it faster. A house battery with a rapid-charging connection to the car (like the stand-alone fast chargers) fixes that problem: park for half an hour, and you're ready to go with nearly a full charge.
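The difference between trickle and rapid charging is just arithmetic. A minimal sketch, with illustrative numbers I've assumed (pack size and charging rates are not from the comment):

```python
# Rough charging-time arithmetic for the scenario above.
# All figures are assumptions for illustration, not real specs.
PACK_KWH = 60.0      # assumed EV battery capacity
MAINS_KW = 2.4       # "trickle" from an ordinary wall socket (~10 A at 240 V)
DC_FAST_KW = 100.0   # rapid DC connection fed from a charged house battery

hours_mains = PACK_KWH / MAINS_KW    # 60 / 2.4  = 25.0 h - fine overnight only
hours_dc = PACK_KWH / DC_FAST_KW     # 60 / 100  = 0.6 h  - "park for half an hour"

print(f"mains: {hours_mains:.1f} h, DC fast: {hours_dc:.1f} h")
```

The house battery's role is simply to have accumulated energy slowly from the grid so it can release it into the car quickly.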
The early hype around microkernels was that they could emulate other OSes, and that's what led to IBM's attempt. What killed that and all the other attempts was primarily that, to emulate an OS, you essentially had to re-implement the target OS on top of the microkernel - the microkernel did not end up replacing much, if anything, in the emulated OS. It was just an added abstraction layer under code you would need as part of the OS anyway, so you saved nothing.
That in itself wouldn't have been bad - a waste of time, but not otherwise harmful. But microkernels do have inherent performance problems. Specifically, any time tightly coupled components have to share data, a monolithic kernel shares the data directly and uses locks to protect it, while a microkernel passes it by messages. Even when optimized down to just buffer copying, that can add up to a lot when the data is huge, like a video display.
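The contrast can be sketched in a few lines. This is a toy illustration of the two approaches described above, not real kernel code; the buffer size and function names are my own:

```python
# Toy contrast between the two designs the comment describes.
# Monolithic style: components share one buffer, guarded by a lock.
# Microkernel style: each "message" copies the buffer between components.
import threading

FRAME = bytearray(1920 * 1080 * 4)   # a video-frame-sized buffer (~8 MB)
lock = threading.Lock()

def monolithic_update(offset, value):
    # Shared memory: touch the data in place; the lock is the only cost.
    with lock:
        FRAME[offset] = value

def microkernel_send(buffer):
    # Message passing: even the optimized path copies the whole buffer
    # into the receiver's address space - the copy IS the overhead.
    return bytes(buffer)

monolithic_update(0, 255)    # cheap regardless of buffer size
msg = microkernel_send(FRAME)  # copies ~8 MB per "message"
```

For small messages the copy is negligible, which is why microkernels stay competitive in general; it's the large, frequently shared buffers (like a display) where the per-message copy dominates.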
The overhead normally isn't that big, so in itself a microkernel can be competitive with a monolithic kernel, and there are technical advantages to compensate - QNX (used in BlackBerry 10, the more reliable automotive entertainment systems, etc.) is an example of that. The problem is that the overhead just makes your emulated OS worse, and none of the microkernel's advantages carry over to the monolithic kernel code running on top of it. That makes the whole exercise pointless: it's just as much work to implement, there's little to no saving, and it's measurably slower.
So in summary, the hype around the stupid idea of emulated OSes tainted the idea of microkernels, but there's nothing wrong with microkernels themselves.
A fair trial is what he has asked for since the beginning. But under current U.S. law, almost all evidence would be hidden under claims of "national security" - essentially a secret trial, apart from knowing that it took place. That is, if it were even a trial rather than a "tribunal" as happened to Manning - no discovery of evidence, no jury, no impartial judge, just a panel of officers, all hidden from view.
The government wouldn't even have to charge him with anything related to the issues involved. Chances are he hasn't filed a U.S. income tax return, as required of all U.S. citizens even outside the country. For that matter, an obscure and rarely enforced law requires government papers to emigrate legally. He could be charged under any number of laws that allow no "public interest" defence, foreclosing the issues he wants to raise.
Are there any filesystems still in regular use that use extensions to file names? Or is everyone just talking about suffixes - that is, any characters that people append to the end of a filename to make a new filename?
File name extensions were something subtly different from that.
The article lists his time in office as ending in 2011, not his death. He's still alive as far as anyone knows for sure.
Vaccine trials first started in 2003. The current best candidate was first tried in a human in 2009 in Germany.
The first vaccine to be used was developed in Canada.
An armed populace practically can't be subjugated by any outright oppressor, be it foreign or domestic. If you have to fight a gun battle with, and kill, most of the populace, then you didn't really 'win' as an oppressor. You can't kill them all.
First, subjugation has many forms. Can you buy a non-low-flush toilet in the U.S. (federally mandated under the first George Bush, in force since 1997), no matter how many guns you own? Can you deposit over $10,000 without being reported to the federal government? Can your land be forcibly purchased to build a shopping centre?
Second, "force" can be coercive, not just physical. So you have guns. Do you have money? Not any more, you don't. Do you have electricity, water, internet, phone service? Nice while they lasted. Can you leave home and go anywhere to get food, gas, or other supplies? Those were the days. No matter how many guns you might have, a siege will eventually end - and for most people, holding out isn't worth it.
Third, both George W. Bush's war in Iraq and Putin's actions against Ukraine show that even in a modern internet-connected world, the vast majority of a country's population can be completely convinced of something that is demonstrably not true (Iraq had no weapons of mass destruction; Ukraine wasn't overthrown by Nazis putting Russians into concentration camps). When Iraq invaded Kuwait, the daughter of the Kuwaiti ambassador to the U.S. testified to Congress that she was a nurse in Kuwait who had watched Iraqi soldiers dump babies out of incubators to die on the floor (no such event was ever confirmed) - nobody asked even the very first question that would have exposed the lie. Opponents of the U.S. government can be adequately demonized, then taken down with overwhelming public support.
Fourth, acting against the entire population might be impractical, but it's much easier to target specific groups one at a time. A large percentage of the U.S. population already has nearly no rights, as a result of nickel-and-diming laws that build up. For example, some states charge court fees to the accused even when they are found innocent (i.e., you used the court to prove your innocence, so you must pay for that service), even for a minor crime like trespassing. The poor often cannot pay, and can be imprisoned for that. There are prison fees, and failing to pay those can extend the term or result in re-incarceration on release. This has built up a population of "un-people" who are otherwise law-abiding but must avoid arrest, relying on a growing underground society of family, friends, and criminals to get illegal work, handle finances, find places to live, and so on. When sick they can't go to the hospital without risking being turned in (they use back-room "clinics"); when the victim of a crime they can't go to the police. They can't use banks, so they need cash, which the police can seize as mentioned in the posted story. Beyond that group, many people are denied voting rights over technicalities like lacking a driver's license or a permanent residence. People caught urinating in public are put on sex offender lists, which now carry such impossible restrictions on where to live and work that many must go into hiding just to survive. Minorities are stopped and searched on New York streets for no reason other than being black or hispanic.
Those are things that are already done. Those laws and actions are supported because the victims are "criminals" and in a black-and-white viewpoint, a "technical criminal" is as much a criminal as a murderer, and deserves no rights (and to be accused is to be a criminal).
All put together, this means that even if the entire free population of the U.S. were armed and trained, it could still be completely subjugated by a government that wanted to do so. Keep in mind that the repressed population of Iraq (before the 2003 overthrow) was also heavily armed (mostly rifles), but that didn't help them against Saddam Hussein's well-organized repression.
The best description I have is that dark energy isn't the explanation; it's the description of the problem.
When a government tries to censor something, it usually means two things:
- 1. It's true.
- 2. They want it to be true.
The problem with software efficiency has always been this: there are millions of applications, programs, libraries, etc. in existence, often redundant and amateur. They all run on one of a handful of CPU core designs. Spending the effort to optimise a CPU speeds up everything that runs on it. Spending the effort to optimise a program speeds up one program (optimising a library helps maybe a handful, and only sometimes).
There are still possibilities for CPU improvement. Transfer-triggered architectures, dataflow, counterflow, asynchronous designs, content-addressable memory, smart memory - many ideas had promise, but Moore's Law (and incompatibility) meant that established techniques improved CPU speeds faster than the new ones could be commercialised (you might remember RISC as the only one that made it, barely). Without Moore's Law, there will be opportunity to work on the alternatives.
[...] I don't see a point to remotely control a washer or toaster over the internet.
There's a sort of blindness people have when they see what exists and can't imagine it being different. The creator of Babylon 5 once described seeing an old SF movie (Flash Gordon, maybe?) where the crew had to abandon a spaceship. They grabbed their laser blasters (handheld), anti-gravity belts (a little box on a belt), and the portable radio - a giant box that needed two people to carry it. Because nobody knew what laser guns or anti-gravity belts looked like, the filmmakers could imagine science making them arbitrarily small; but everyone knew what a radio looked like (tubes and all) and couldn't conceive of the science that could shrink it into, say, a $5 item you can lose in a purse.
Phones are a good example of how much they can change: it won't be long before the "phones" of the past won't even qualify as what anyone understands a phone to be. It's already happened once - even if you think of a phone as something with buttons or a dial used to connect a voice circuit to someone, older "phones" that did nothing but ring an operator you then talked to were the standard for decades, yet wouldn't qualify if you went to buy one today.
As for washers, toasters, fridges, etc., don't think of them as they are now. Think of a future where displays cost about what a laminated decal does. You could look up washing instructions on the washer lid. For that matter, the washer could look up washing instructions for clothes based on microscopic RFID tags (like how those "Tassimo" coffee makers read bar codes from coffee packets now). The fridge could display a recipe - yours or one you looked up - in an app with checkboxes you tap when you've taken an ingredient out, used it, or need it (links to a shopping list on your phone). The toaster? Same recipe - if displays are nearly free and you have a dozen, why not use them all? Heck, put displays on your coffee cups, for no other reason than you can tap it and tell your coffee maker to start a new cup before you walk over to it - and display someone's picture on the mug meanwhile.
Every day I see things that are awful that people accept as normal - mostly device controls and interfaces, but other things too. Take a digital monitor. It has its own display memory, so why does the computer send an image to it 30 times a second? Worse, the same image! Even in a laptop! Something's fundamentally wrong with that very concept.
There's no end to things that need improvement. And as technology gets cheaper, I sure hope that'll finally happen.