Not entirely true. Ubuntu releases a new hardware support package for the most recent LTS release at about the same time as they release a new version of the distro; that's a backport of the kernel used by the new version. In the case of 12.04 they basically FORCED people to install the new kernel after the release of 14.04; they are no longer doing security updates for the old one. There are also sometimes X server updates for LTS systems that have a GUI installed; there is one for 12.04 that uses the X server from 14.04 and is similarly mandatory.
So... you will be able to have new versions of Chrome and Chromium on 14.04... IF you install the hardware update. You won't be able to have them on 12.04 because the 14.04 hardware support is the last version that release will get. Nor can you have them on 10.04, which is near end of life and scheduled to go out of support next month.
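To make the mechanics concrete: on a 12.04 system, the backported 14.04 kernel and X server come from the LTS enablement packages. A minimal sketch (package names per Ubuntu's LTS Enablement Stack documentation; check your release notes before running anything like this):

```shell
# Install the trusty (14.04) hardware enablement stack on a 12.04 system.
# linux-generic-lts-trusty pulls in the backported kernel;
# xserver-xorg-lts-trusty pulls in the matching X server for GUI installs.
sudo apt-get install --install-recommends \
    linux-generic-lts-trusty xserver-xorg-lts-trusty
```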
The Windows tablets already have HDMI out and USB host ports. The only remaining steps: shrink one down to phone size and add calling capability. These new chips should make it feasible. I hope they go with a separate HDMI port rather than rolling it all up into MHL, because you may also want to connect USB devices when you are using the tablet to run desktop applications. (Using a Bluetooth keyboard and mouse decreases the need.)
A phone won't have the handy full size USB port that my WinBook tablet has (not enough room), but the WinBook's Micro USB port is also a host port; you just need a cable adapter. Interestingly, despite the form factor, the Micro port is not a USB On-The-Go port: it only acts as a host, so you can't connect the WinBook to your PC via USB to sync. But it's full Windows, so it's easy to move files via WiFi.
Windows 8 doesn't have the UI down; it's not quite satisfying as either a desktop OS or a tablet OS. But Windows 10 is moving in the right direction and should get it sorted out. Basically you want a touch-centric phone/tablet UI on those devices, with secondary capability to run desktop applications, while on the PC you want a keyboard-and-mouse-centric desktop UI with secondary capability to run phone/tablet apps. Windows 10 gives you the right interface by default on PCs and pure tablets, and automatically switches the UI on two-in-ones depending on whether the keyboard is attached.
I got curious when I saw the quote, and I figured that if I did, some other reader probably would as well.
Dijkstra is most famous for his letter to the ACM published under the title "Go To Statement Considered Harmful". A recent study showed that the GOTO statement as used by current programmers is not harmful, but that is largely Dijkstra's doing (along with all the other people who pushed for modular programming and better control flow). Nowadays about the only use of GOTO is as a way of breaking out of loop structures if the language doesn't have another way to do it, but I go back far enough to remember the horrible spaghetti code that people used to write. Heck, I wrote some of it myself.
On your quote... the story is more complicated than "it originated in California". Excerpts from the Wikipedia article on OOP ( https://en.wikipedia.org/wiki/... ):
"Terminology invoking "objects" and "oriented" in the modern sense of object-oriented programming made its first appearance at MIT in the late 1950s and early 1960s. In the environment of the artificial intelligence group, as early as 1960, "object" could refer to identified items (LISP atoms) with properties (attributes)..."
"The formal programming concept of objects was introduced in the 1960s in Simula 67, a major revision of Simula I, a programming language designed for discrete event simulation, created by Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Center in Oslo."
"The Smalltalk language, which was developed at Xerox PARC (by Alan Kay and others) in the 1970s, introduced the term object-oriented programming to represent the pervasive use of objects and messages as the basis for computation."
So yes, the term comes from California. But the early work was done elsewhere.
Not really surprising.
Getting the most out of any processor requires processor-specific optimization. Unfortunately for AMD, Intel has the lion's share of the market, so developers pay more attention to getting software to run well on Intel processors. Some of the top tier games that get used for benchmarks have been hand-optimized for Intel, as have productivity applications such as video encoders and Photoshop. (The last two have also benefited historically from Intel having better SIMD implementations. That is probably still true. But an A-series AMD processor with properly optimized OpenCL code might be better still.)
Intel is in the developer tools business as well. They sell a compiler that generates code that is very good for Intel processors and very bad for AMD. Any application that is built with Intel tools is going to make AMD look bad.
Finally, there is the OS issue. Because of the way AMD used paired cores with some shared elements (cache and FPU), getting the most out of the FX series processors requires changes to the process scheduler. (The simplified version: threads of the same process and multiple instances of the same application should be assigned to paired cores, while unconnected applications should be spread to different core pairs whenever possible. That maximizes the effectiveness of the shared cache. The shared FPU is of little concern unless you have applications that do math with long doubles; it can do two 64-bit operations simultaneously but only one 128-bit operation.) The most popular OS on the market, Windows 7, has not made the necessary adjustments, nor has any earlier version. Windows 8 and later have, as have recent Linux kernels. Mac OS probably has not, but Apple has never made a computer with an AMD processor, so it isn't relevant unless you own a Hackintosh.
This restaurant was advertising breakfast any time. So I ordered french toast in the renaissance. - Steven Wright, comedian