But the new corporate headquarters overseas would presumably not be bound by the subpoena. In that case, it sucks to be the guy in the US subsidiary who just got subpoenaed, because he is unable to comply.
That said, the real papers to be on the lookout for are cathode improvements; there is a lot more potential for volume/mass reduction there than in the anode.
Exactly: all the articles I can remember offhand put the capacity of existing cathode chemistries below 200 mAh/g, so the cathode makes up most of the weight of the battery.
If the technology from TFA works out, maybe we can get a 20% - 30% improvement in overall energy density.
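The "cathode dominates the mass" argument can be sanity-checked with a back-of-envelope calculation. The numbers below are illustrative assumptions (typical figures, not from the article): a layered-oxide cathode under 200 mAh/g, graphite's theoretical 372 mAh/g, and a hypothetical improved anode at ~1100 mAh/g.

```python
# Back-of-envelope estimate; capacities in mAh/g are illustrative assumptions.
cathode_capacity = 180       # typical layered-oxide cathode, < 200 mAh/g
anode_capacity = 372         # theoretical capacity of a graphite anode

# Grams of each electrode needed per Ah of cell capacity (1 Ah = 1000 mAh):
cathode_mass = 1000 / cathode_capacity    # ~5.56 g
anode_mass = 1000 / anode_capacity        # ~2.69 g
total = cathode_mass + anode_mass

# Suppose an improved anode roughly triples its capacity to ~1100 mAh/g:
improved_anode_mass = 1000 / 1100         # ~0.91 g
improved_total = cathode_mass + improved_anode_mass

saving = 1 - improved_total / total
print(f"electrode mass saved: {saving:.0%}")  # → electrode mass saved: 22%
```

Even a dramatic anode improvement only shaves about a fifth off the electrode mass, which is roughly the 20-30% ballpark mentioned above (ignoring electrolyte, separator, and packaging, which dilute the gain further).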
Many years ago, I joined a group of audiophile students at my university and we were building a pair of two-way loudspeakers with semi-expensive chassis. Students' budget, but it still had potential. When experimenting with crossover components, we soldered things together at first, then someone had the idea to use alligator clips (two each connected by a cable soldered to the clips) for faster turnaround.
The sound quality, which had been quite good up to that point, suddenly dropped to that of a cheap speaker from some supermarket. The ohmic resistance of the cable between the alligator clips was IMHO too low to have much of an influence.
It must have been the alligator clips; good contacts matter. Since that experience I like to use gold-plated connectors, but with standard cables to connect them. That combination tends to be cheap enough and works for me.
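A rough calculation supports the point that the cable's resistance alone can't explain it. The resistance values below are assumptions for illustration, not measurements from that build: a series resistance forms a voltage divider with the driver, so even a fairly bad contact only attenuates by about a decibel; audible degradation beyond that suggests a nonlinear (distorting) contact rather than simple attenuation.

```python
import math

# Effect of a series contact resistance on a nominally 8-ohm driver.
# Resistance values are illustrative assumptions.
R_SPEAKER = 8.0  # nominal driver impedance in ohms

def level_drop_db(r_contact):
    """Attenuation (dB) from the voltage divider a series resistance forms."""
    return 20 * math.log10(R_SPEAKER / (R_SPEAKER + r_contact))

print(level_drop_db(0.01))  # clean soldered joint: about -0.01 dB, negligible
print(level_drop_db(1.0))   # poor alligator-clip contact: about -1 dB
```

So a clip contact would have to be very bad indeed to change the level much; the more plausible culprit is that an oxidized contact behaves nonlinearly and adds distortion.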
Depends on your main board.
My last purchase, from 2011, an ASUS M4A7LT, has an onboard sound chip that cannot drive headphones at more than low volume.
At low volume it sounds good, and I'm sure it would be adequate to drive the input of an amplifier. But when I plug in my (low-impedance, maybe 30 Ohm) walkman headphones, it fails miserably: severe clipping as soon as I turn up the volume a bit.
Instead of putting an external amplifier on my desk, I put a sound card into the PC. Problem solved.
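The difference between driving an amplifier input and driving headphones is mostly current demand. The figures below are illustrative assumptions (a typical line level and input impedance, not specs of that board):

```python
# Current demand at the same signal voltage; numbers are illustrative.
v_signal = 1.0          # volts RMS, a typical line-level signal

r_line_in = 10_000      # amplifier line input, ~10 kOhm
r_headphones = 30       # low-impedance walkman headphones

i_line = v_signal / r_line_in        # 0.1 mA
i_phones = v_signal / r_headphones   # ~33 mA
print(i_phones / i_line)             # headphones demand ~333x the current
```

An output stage sized for line-level loads simply runs out of current into 30 ohms, which shows up as the clipping described above.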
Tepco itself has estimated the damage at $137 billion; see http://www.bloomberg.com/news/2012-11-07/fukushima-137-billion-cost-has-tepco-seeking-more-aid.html
Considering the low probability of a serious reactor accident, individual utility companies might bet on never having one in their power plants and carry no insurance unless it is mandatory, like compulsory vehicle insurance.
Set the minimum coverage to something that would cover Fukushima (estimated $100B), and there is your market-based solution.
Sure, there is some monkey work at the lower levels of support, especially on a "free" hotline where you don't get billed for calling. Several years ago I met a guy who did first-level "support" for Microsoft, following a script from a database. But even there, I think second level should have some actual skills, as they are the ones who handle the cases that are too complex for the script monkeys.
At my current, relatively small company, the hotline (which AFAIK costs more than peanuts to call) offers what you might find at second-level support in a company that follows the above pattern: people who are familiar with the product and don't need to follow a fixed script. Some of them are actually quite good, based on years of experience.
Cases that are too hard for the hotline go to the "repair team": software testers who otherwise do QA on upcoming releases. I guess they are at least the equivalent of third-level support at a place like Microsoft. The repair team can talk directly to software development and ask for fixes; we trust them to distinguish bogus calls from real bugs.
A stagnant, unspending base of users damages the entire tech ecosystem. They hold back technological progress, creating a tragedy of the commons when it comes to software and web-service features.
Unspending users can only hold back technological progress if software vendors keep maintaining obsolete technology to please them. Which doesn't make much sense, except in the context of trying to keep meaningful competition from arising. But maybe that is exactly what Microsoft is trying to achieve, even at the expense of earning less from the well-paying customers who might embrace faster progress.
There is the following Bill Gates quote:
"And as long as they're going to steal it, we want them to steal ours. They'll get sort of addicted, and then we'll somehow figure out how to collect sometime in the next decade." (Source: http://articles.latimes.com/2006/apr/09/business/fi-micropiracy9)
A clear case of trying to keep competition down even among the ultimate unspending non-customers.
This is beyond obvious by now. I'm somewhat surprised that the two major political parties don't suffer a larger loss of popularity over this (the SPD is gradually losing in the polls, but arguably for other reasons).
That would make it the next flop. Lots of applications are still 32-bit, and there is no reason to force a quick change here. 4 GByte is not enough for everyone, but for many users it is. Take x86 support away, and the complaints will be enormous.
It will take at least another 10 years until a Windows without x86 support is accepted.
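The 4 GByte figure comes straight from the size of a 32-bit address space:

```python
# Distinct byte addresses a 32-bit pointer can form:
max_addr = 2 ** 32
print(max_addr)            # 4294967296
print(max_addr / 2 ** 30)  # 4.0 (GiB)
```

(In practice a 32-bit Windows process sees even less, since the OS reserves part of that address space for itself.)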
For the most common purposes, like text documents and spreadsheets, there is already ODF.
It is even an ISO standard. Unless there are unexpected problems with things like Asian fonts, that should be a no-brainer.
A German company named Alphakat is developing a similar technology.
On their website they claim to have some pilot plants already in production.
In other ways too: a few years ago they started using translation software for the non-English pages of MSDN. The quality is as expected.
Older Word versions (Word 6, Word 2000) were error-prone enough that the number of software crashes exceeded the number of my own stupid mistakes.
Now Office 2010 has changed that for me, so *today* you are right (and LibreOffice is also pretty stable). But historically, the GP had good reasons for his attitude.
Depends on how much that percent of the CPU die holds the rest back in terms of complexity and maybe performance limitations (not really my area of expertise). You may be right that it does not really matter.
On the other hand, "prior to 1992" means DOS and maybe Windows 3.x software. I'm aware that there are still a few DOS-based maintenance tools for the PC around, but otherwise I don't know anyone who still works with DOS software.
I used to work for a company that was really backwards that way: until a few years ago they produced a medical device with DOS-based software as an "implicit real-time system" (no other thread that can steal the CPU). But even they have given up on DOS, as the technical limitations became too bothersome. The successor of that device, now on the market, runs Windows 7 with a real-time extension to the OS.