The cairo-ickle blog has maintained very interesting benchmarks of the different cairo rendering backends. The short story is that every hardware-accelerated backend except for Sandy Bridge SNA has performed worse than the software implementation, and in some cases the hardware acceleration is significantly less stable. I'm curious to see if this finally pushes Glamor over the hump and makes it faster than the software path.
XP is over 12 years old; that's one hell of a *free* long-term support package.
How long it has been since a company sold a product to their first customer is irrelevant. What matters is how long it has been since they sold the product to me. Microsoft stopped retail and OEM sales of XP in June 2008, which was shortly after Vista SP1 was released and most of its problems had been fixed, and a bit more than a year before Windows 7 was released. Those customers got just shy of 6 years of support, which is still pretty darn good. In comparison, Ubuntu offers 3 years of support for an LTS release after its replacement comes out, and OS X tends to be about the same. However, those both offer free or cheap upgrades, so a shorter support cycle is at least somewhat justified.
For corporate customers, the support provided by a Red Hat subscription is entirely comparable. No moderately sized company can get away with using OEM/retail licenses of Windows/Office; they all pay some sort of subscription to MS. RHEL 5 will be supported for just over 6 years after RHEL 6 came out, and RHEL 2-4 were each supported for 5 to 5.5 years after their successors. Both MS and RH have extended support for critical security bugs beyond that, but it costs extra money. Recent Solaris releases are as good or better (depending on which support phases you consider comparable).
So for corporate users, XP's support duration was reasonable and in line with the rest of the industry. For consumers who have to stick with older OSes for compatibility it was much better, and it's hard to compare once you start considering free upgrades (is an OS X point release comparable to a Windows SP release or an OS release, etc.).
The way this is set up isn't that you code everything in natural language; rather, it is just a shortcut to look up the correct formal language. Instead of searching or browsing documentation for the exact names of the functions you want and how to chain them, you just type what you want in natural language. If it interpreted you correctly, then great: it saved you several minutes, and now you know the real syntax to use in the future. If not, well, you only lost a couple seconds.
The idea of mixing natural language like this isn't so weird; the first step most programmers take when they don't even know the name of the library the functionality lives in is to do a natural-language web search and go from there. This just takes it one step further and streamlines the process, which is perfect for an interactive language.
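To make that concrete, here is a toy sketch in Python of what such a lookup loop might look like. Everything here is hypothetical (the LOOKUP table and interpret() are stand-ins for a real natural-language interpreter), but it shows the workflow: you see the formal syntax the system chose before you get the result.

```python
# Toy sketch of a natural-language-to-syntax lookup (entirely hypothetical;
# a real system would use a trained interpreter, not a hand-written dict).
LOOKUP = {
    "sort a list descending": "sorted(data, reverse=True)",
    "sum of squares": "sum(x * x for x in data)",
}

def interpret(query, data):
    code = LOOKUP.get(query.lower())
    if code is None:
        print("Could not interpret that; try rephrasing.")  # only seconds lost
        return
    print("Interpreted as:", code)  # the user learns the real syntax here
    print("Result:", eval(code, {"data": data}))  # eval is fine for a toy demo

interpret("sort a list descending", [3, 1, 2])
# Interpreted as: sorted(data, reverse=True)
# Result: [3, 2, 1]
```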
Homeopathy is not silly; it is a lie. If you sell it, you're lying to people. So it matters that Whole Foods sells it, as it casts doubt on their grasp of science, which suggests their "healthy" foods are just marketing to the credulous.
Products in regular supermarkets are also filled with lies, and both have products that are better than the other's in one way or another. Solution: make your own decisions rather than expecting a corporation to base their decisions on science rather than on what sells best.
You don't need a multi-billion dollar plant to make the processors. You just need to pay someone who does. You can get small quantities of ASICs made for around $2-5k by taking advantage of multi-project wafer programs that put many designs from different people on the same wafer.
I just did some queries and the only copyright statement I see is the standard one at the bottom of their page. They do have a legitimate copyright on their pages, including the layout, design, and the content they created. That notice doesn't necessarily imply that they claim to own the facts being displayed. In fact, they frequently provide citations for those facts, which implies that they don't claim to own them.
Personally, I think the Doom concept translates poorly to modern gaming. It's what Tolkien-esque fantasy is to RPGs -- revolutionary in its time, but bland and generic today. Modern games need distinctive characters, settings, stories, and gameplay to succeed (artistically, anyway).
I disagree. The recent Serious Sam releases were great, and showed that the old-school FPS formula still makes for a good game in today's world. Fast-paced, lots of shooting, and a meaningless plot. A good lead character helped, but it was the gameplay that really made it stand out. The problem with Doom 3 wasn't that it failed to add all the things that modern FPSs have, but rather that it failed to replicate the fun gameplay of the originals.
The biggest part of his announcement is that this checking is done client side; your DNS history is not sent to Valve. They also only record MD5 hashes that match the cheat sites they are looking for, not your entire DNS history. Finally, they claim to only check for DNS lookups of servers used by the cheat software itself, not just websites where you might read about and download cheats (although in some cases I imagine these could be the same), and they use this as a second check after the client has already detected a cheat installed on your machine. So simply visiting cheat software websites without using them shouldn't get you banned.
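Mechanically, the matching step he describes is simple. Here is a rough Python sketch of my reading of it; the flagged hash list is made up, and this is obviously not Valve's actual code:

```python
import hashlib

# Hypothetical MD5 hashes of cheat-server hostnames (made-up domain).
FLAGGED_HASHES = {
    hashlib.md5(b"auth.example-cheat-server.net").hexdigest(),
}

def check_dns_cache(cached_hostnames):
    """Return only the hashes of cache entries that match the flagged list;
    everything else is discarded client side and never leaves the machine."""
    matches = []
    for host in cached_hostnames:
        digest = hashlib.md5(host.encode()).hexdigest()
        if digest in FLAGGED_HASHES:
            matches.append(digest)  # the hash is recorded, not the hostname
    return matches

print(check_dns_cache(["auth.example-cheat-server.net", "news.example.com"]))
# Only the cheat server's hash is returned; the news site never gets reported.
```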
Nothing in this bill will require federally funded research to be public access, nor will it fund archival efforts for the raw data. Rather, it requires that the EPA ignore any science that isn't open access, after the GOP spent years fighting against open access.
If you consider just current employees, about half started before me and half after me. But if you consider everyone I have worked with at this job, probably close to 4/5 have since moved on to other jobs. Since most people here either stay all the way till retirement, or for just a few years, without much middle ground, the second measure will continue to grow much faster than the first.
Some of the older Tesla charging stations have SAE J1772 connectors in addition to Superchargers. There is no reason to doubt that he has used them with his Leaf. He's not lying, just being willfully ignorant about the difference between a Supercharger and a standard slow charger.
The W3C standard for Regions has mostly been created by Adobe
CSS Regions is not a W3C standard; it is a Working Draft. The entire point of publishing a Working Draft is to solicit feedback from the community. There have been several Working Drafts that were never promoted to final Recommendations because there was no community consensus that they were a good idea. What Google and Mozilla are doing is a perfectly constructive part of the standardization process.
A new error code won't help, because for that to work the original website would have to send it, and if a link is broken, they already neglected to send a useful response code. This feature is about how the client responds to a 404 error, in which case the most honest thing to do is show the user the 404 message that the site provided, but also let them know that they can access an older version of the page if they wish. Which is pretty much how the existing Wayback plugins work.
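For what it's worth, the lookup those plugins do is easy to reproduce against the Internet Archive's public availability endpoint. The sketch below is my guess at the plugin flow, but the archive.org/wayback/available API itself is real:

```python
import json
import urllib.parse
import urllib.request

def wayback_snapshot(url):
    """Ask the Internet Archive whether it holds a snapshot of a dead URL."""
    api = ("https://archive.org/wayback/available?"
           + urllib.parse.urlencode({"url": url}))
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# On a 404: show the site's own error page, but offer the archived copy too.
snapshot = wayback_snapshot("http://example.com/retired-page")
if snapshot:
    print("This page is gone, but an archived copy exists:", snapshot)
```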
None of those examples should result in a broken link if you are maintaining your website correctly. And this feature is only "fixing" broken links; that is, links that once existed and are now 404'd.
If you want to discontinue a product, then replace those pages with one that explains that the product is discontinued and provides links to similar current products, as well as the support page for the discontinued product. If users click links in reviews or forum posts about your old product and receive 404s, or are redirected to a completely unrelated and unhelpful page on your site, they will be frustrated with or without this feature.
In the second case, just redirect the entire demo website URL tree to a current list of examples.
In the third case, you shouldn't do that without redirecting the old URL to the new one. Seriously, are you trying to make your content hard to find?
Again, redirect to the new menu.
In no case is sending a user a 404 useful or beneficial, nor is it the most appropriate thing to do according to the HTTP standard. If you really want to be pedantic then send a 301 or 303 to perform the redirect; otherwise use URL rewriting, or just change the contents of the existing URL, whichever is easiest. The user should only see a 404 if they clicked an invalid link that was never a real URL for your website. Otherwise, you have failed your users, and it's no one's fault but your own if they choose to use a service that tries to make up for your shortcomings.
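To show just how cheap the fix is, here is a minimal Python/Flask sketch of the discontinued-product case; the routes and product names are made up:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical: the product page was retired, so the old URL answers with a
# permanent redirect to the discontinued-product page instead of a 404.
@app.route("/products/widget-2000")
def old_widget_page():
    return redirect("/support/discontinued/widget-2000", code=301)

@app.route("/support/discontinued/widget-2000")
def discontinued_widget():
    return "Widget 2000 is discontinued; see the Widget 3000 or support docs."

if __name__ == "__main__":
    app.run()
```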