The quickest numbers I could find say that at the scale of a large power plant, the generator is very efficient, but the turbine not so much, around 50%. This would put the system as a whole at around 40% efficiency, sunlight -> electricity. That's competitive with the best photovoltaic systems tested in the lab, and 50-100% better than practical systems on the market. Assuming their system really does scale up to power-plant sizes, of course.
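To see how the ~40% figure could fall out of a ~50% turbine plus a very efficient generator, stage efficiencies just multiply. The specific numbers below (80% sunlight-to-heat capture, 98% generator) are illustrative assumptions, not figures from the article:

```python
def overall_efficiency(*stages):
    """Multiply per-stage efficiencies to get whole-system efficiency."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Hypothetical breakdown: ~80% sunlight-to-heat capture, ~50% turbine,
# ~98% generator -> roughly the 40% overall figure quoted above.
print(overall_efficiency(0.80, 0.50, 0.98))  # ≈ 0.392
```

The point is only that a mediocre turbine stage dominates: even a near-perfect generator can't lift the system much above half the capture efficiency.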
Microsoft brought over 25,000 Nokia employees in the merger, of which 12,500 are to be laid off in the next 6 months. Probably all that's left is the hardware engineers, with nearly all of software, marketing, and management getting the boot.
"We'll have more information about the gravity attributes and locations of dark matter,"
Both of these experiments aim to detect collisions of dark matter particles with their respective detectors and, if found, give an estimate of the particles' energy. Neither is an astronomical survey that would tell us anything about the gravitational properties or distribution of dark matter.
I would even argue that as long as the students who did most of the work have their names listed as first authors, there is nothing wrong with this arrangement. I dropped out of my master's program after the first semester because I was being pushed to publish, but wasn't being plugged into any existing research programs. Every "unique" idea that I thought of turned out to have already been studied exhaustively back in the 70's or earlier. All the favorite students in the grad program were people who ignored this inconvenient fact and managed to get rehashed bullshit accepted into conferences.
Several years later I went back to school at a large state U that plugged me into the work they were doing, showed me what the state of the art was, and where there were gaps that hadn't been researched in detail. Without building off the ideas of my advisor, I would never have been able to do meaningful research that advanced the state of the art, and would have had nothing worth publishing. He deserved to have his name on my papers.
He recommended deploying an alternative browser, not replacing IE altogether. That way, when IE has a bad vulnerability, you notify everyone to temporarily use the alternate browser on external sites, use Group Policy to disable vulnerable features, or even block it at the firewall, depending on the severity. They can keep using IE internally during that time. Then when a patch comes out, you deploy it and lift the restrictions. The next week, when Firefox has a zero-day, you do the same for it and recommend people use IE for the time being. It is a very sensible way to allow the most productivity possible while staying secure.
If they really need to use ActiveX on external websites during a vulnerability, you can whitelist those sites in Group Policy if needed, but honestly I would just consider the downtime a cost of doing business with outdated, insecure technology in most cases. Cleaning up a bad worm/virus that spread through the entire campus could be much more expensive.
Agreed. I've done this in the past, and starting as close to the original analog telemetry stream as possible is essential. Even if the noise is so bad that analog filtering doesn't recover any new data in the preD, simply knowing where there is missing data, and exactly how much, can help tremendously in reconstructing the data. Their raw MPEG files don't provide any of that information.
Same here. I tried TurboTax one year and it didn't save me any money, didn't really save me any time, and had annoying DRM. You have to research what you can deduct on your own in advance anyway so you can preserve documentation throughout the year, and that is the time-consuming part. So paying money just to have software fill out and submit the form doesn't seem worth it for me.
The translation of a literary work can be purely scholarly or purely artistic, but usually it is a mix of both. Given Tolkien's mastery of both worlds, and the fact that his love of Beowulf went far beyond linguistic and historical study, it is pretty clear that his translation will be of broad literary interest, not just scholarly.
Where I live, the community colleges are inexpensive, but do not have flexible class times for working people, and most of the tracks that have good job prospects have 2-5 year waiting lists. So many students choose to rack up the debt at TVI, PMI, or UoP, where they can start immediately and keep a full-time job.
The problems at our CC are mostly because they can't attract enough instructors. The community college pays them half of what they would make working in the field or teaching at a for-profit college, and is horribly mismanaged. In the electronics department, I frequently heard the instructors complain about pressure to dumb things down to pass more students. The place where I work has started to favor techs from TVI & DeVry because the quality of students from the CC has decreased. When my wife was doing her nursing degree, the department head was constantly changing things (like room locations, curriculum dates, rules about how to evaluate students, etc.) literally the night before class, so the instructors could never be prepared. Many people are willing to take a pay cut to do something that they enjoy more, or work under a horrible boss if the pay is good, but very few are willing to do both.
It would be a delicious irony if people were able to recover some of their lost value due to government regulations.
You mean like what would have happened if they were regulated like a real bank?
This has nothing to do with applying banking regulations to Mt Gox. It is about applying laws against fraud and theft. The difference is that regulations put a burden on innocent and guilty alike but potentially prevent problems before they occur, whereas laws simply attempt to punish the guilty and compensate the victims after the fact. If people do in fact recover any money as a result of this, it won't be particularly ironic, since libertarians fully support laws against fraud, just not banking regulations or complete federal control of currency.
Okay, the cybersecurity negotiator's ignorance is bad; the rest, less so.
I have been a happy man ever since January 1, 1990, when I no longer had an email address. I'd used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime.
Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study.
- Donald Knuth
The role of Supreme Court Justice is also "to be on the bottom of things". It is possible to understand enough about email to make good judgements about it without using it on a daily basis. The justices have to make rulings weekly about subjects with which they have absolutely no interaction in their normal day-to-day life. From technology to finance to agriculture, no one can possibly be an expert on all the issues they hear. It is their job to constantly learn enough about a subject to know what is important from a legal and constitutional point of view. If they are failing to do this, then that is a legitimate complaint. The fact that they weren't familiar with "common knowledge" technologies before encountering them in court, or haven't chosen to incorporate them into their lives, isn't.
The cairo-ickle blog has maintained very interesting benchmarks of the different cairo rendering backends. The short story is that every hardware-accelerated backend except for Sandy Bridge SNA has performed worse than the software implementation. And in some cases the hardware acceleration is significantly less stable. I'm curious to see if this finally pushes Glamor over the hump and makes it faster than the software path.
XP is over 12 years old, that's one hell of a *free* long term support package.
How long it has been since a company sold a product to their first customer is irrelevant. What matters is how long it has been since they sold the product to me. Microsoft stopped retail and OEM sales of XP in June 2008, which was shortly after Vista SP1 was released and most of its problems had been fixed, and a bit more than a year before Windows 7 was released. Those customers got just shy of 6 years of support, which is still pretty darn good. In comparison, Ubuntu offers 3 years of support for an LTS release after its replacement comes out, and OS X tends to be about the same. However, those both offer free or cheap upgrades, so a shorter support cycle is at least somewhat justified.
For corporate customers, the support provided by a Red Hat subscription is entirely comparable. No moderately sized company can get away with using OEM/retail licenses of Windows/Office; they all pay some sort of subscription to MS. RHEL 5 will be supported for just over 6 years after RHEL 6 came out. RHEL 2-4 were each supported for 5 to 5.5 years after their successors. Both MS and RH have extended support for critical security bugs beyond that, but both cost extra money. Recent Solaris releases are as good or better (depending on which support phases you consider comparable).
So for corporate users, XP's support duration was reasonable and in line with the rest of the industry. For consumers, it was much better than the alternatives for people who have to stick with older OSes for compatibility, and hard to compare once you start considering free upgrades (is an OS X point release comparable to a Windows SP release or an OS release, etc.).
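The "just shy of 6 years" figure can be sanity-checked with simple date arithmetic. The June 2008 end-of-sales date is from the comment above; the 2014-04-08 end-of-support date is Microsoft's announced XP EOL. The exact end-of-June cutoff is an assumption for the calculation:

```python
from datetime import date

# Last retail/OEM XP sale (assumed end of June 2008, per the comment)
last_sale = date(2008, 6, 30)
# Microsoft's announced end of XP extended support
end_of_support = date(2014, 4, 8)

days = (end_of_support - last_sale).days
years = days / 365.25
print(round(years, 1))  # ~5.8 years of support for the very last XP buyers
```

About 5.8 years, so "just shy of 6 years" holds for even the last buyers; everyone who bought earlier got more.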