The Windows tablets already have HDMI out and USB host ports. The only remaining steps: shrink one down to phone size and add calling capability. These new chips should make it feasible. I hope they go with a separate HDMI port rather than rolling it all up into MHL, because you may also want to connect USB devices when you are using the tablet to run desktop applications. (Using a Bluetooth keyboard and mouse decreases the need.)
A phone won't have the handy full-size USB port that my WinBook tablet has (not enough room), but the WinBook's Micro USB port is also a host port; you just need a cable adapter. Interestingly, despite the form factor, that Micro port is not a USB On-The-Go port: it acts only as a host. You can't connect the WinBook to your PC via USB to sync. But it's full Windows, so it's easy to move files via WiFi.
Windows 8 doesn't have the UI down; it's not quite satisfying as either a desktop OS or a tablet OS. But Windows 10 is moving in the right direction and should get it sorted out. Basically you want a touch-centric phone/tablet UI on those devices with secondary capability to run desktop applications, while on the PC you want a keyboard-and-mouse-centric desktop UI with secondary capability to run phone/tablet apps. Windows 10 gives you the right interface by default on PCs and pure tablets, and automatically switches the UI on two-in-ones depending on whether the keyboard is attached.
I got curious when I saw the quote. If I did, I figured that maybe some other reader would as well.
Dijkstra is most famous for his letter to the ACM titled "Go to statement considered harmful". A recent study showed that the GOTO statement as used by current programmers is not harmful - but that is largely Dijkstra's doing (and all the other people who pushed for modular programming and better control flow). Nowadays about the only use of GOTO is as a way of breaking out of loop structures if the language doesn't have another way to do it, but I go back far enough to remember the horrible spaghetti code that people used to write. Heck, I wrote some of it myself.
On your quote... the story is more complicated than "it originated in California". Excerpts from the Wikipedia article on OOP ( https://en.wikipedia.org/wiki/... ):
"Terminology invoking "objects" and "oriented" in the modern sense of object-oriented programming made its first appearance at MIT in the late 1950s and early 1960s. In the environment of the artificial intelligence group, as early as 1960, "object" could refer to identified items (LISP atoms) with properties (attributes)..."
"The formal programming concept of objects was introduced in the 1960s in Simula 67, a major revision of Simula I, a programming language designed for discrete event simulation, created by Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Center in Oslo."
"The Smalltalk language, which was developed at Xerox PARC (by Alan Kay and others) in the 1970s, introduced the term object-oriented programming to represent the pervasive use of objects and messages as the basis for computation."
So yes, the term comes from California. But the early work was done elsewhere.
Not really surprising.
Getting the most out of any processor requires processor-specific optimization. Unfortunately for AMD, Intel has the lion's share of the market, so developers pay more attention to getting software to run well on Intel processors. Some of the top tier games that get used for benchmarks have been hand-optimized for Intel, as have productivity applications such as video encoders and Photoshop. (The last two have also benefited historically from Intel having better SIMD implementations. That is probably still true. But an A-series AMD processor with properly optimized OpenCL code might be better still.)
Intel is in the developer tools business as well. They sell a compiler that generates code that is very good for Intel processors and very bad for AMD. Any application that is built with Intel tools is going to make AMD look bad.
Finally, there is the OS issue. Because of the way AMD used paired cores with some shared elements (cache and FPU), getting the most out of the FX series processors requires changes to the process scheduler. (The simplified version: threads of the same process and multiple instances of the same application should be assigned to paired cores; unconnected applications should be spread to different core pairs whenever possible. That maximizes the effectiveness of the shared cache. The shared FPU is of little concern unless you have applications that use wide vector math; it can do two 128 bit operations simultaneously but only one 256 bit AVX operation.) The most popular OS on the market, Windows 7, has not made the necessary adjustments, nor has any earlier version. Windows 8 and later have, as have recent Linux kernels. Mac OS probably has not, but Apple has never made a computer with an AMD processor so it isn't relevant unless you own a Hackintosh.
1830 was also republished recently by Mayfair Games. That's one more.
Hasbro is actually pretty good about licensing games when there is continuing interest but not potential for mass market numbers. But they don't own the rights to all the old Avalon Hill games; some were sold off back when AH still existed or were under contracts where the rights reverted back to the designers.
In the US, broadcast radio stations pay no performance royalties at all. That's right, zero. They do pay songwriter royalties. They are also likely to receive promotional funds from record companies that at least offset any royalties they pay.
Spotify is an interesting case because it has both free and premium tiers, and the rate of pay for the two sets of listeners is very different. A listen by a premium listener is currently worth about 10 times as much as one by a free listener. Basically, the way it works is that 70% of the subscription revenue gets divided among all the listens by premium members, and 70% of the advertising revenue gets divided among all the free plays. (I suspect there are a few additional complications but that's close enough for our discussion.) The gap between the two rates may narrow in the future if the company sells more ads and/or manages to charge more for them.
Some people think that both of Spotify's payment rates are too low. Some others think the rate for free plays is too low and wanted to restrict their content to premium members, but Spotify won't let them do that; it's all or nothing. The all or nothing approach may be better in the long term, because it will increase the value of the free tier and make it more attractive to advertisers. Spotify also believes that it is good for business, because it's easier to get people into the fold first and then upsell them on getting rid of ads than it is to make them pay from the start. (Reference: http://www.buzzfeed.com/reggie... )
There is also the question of how the expected upcoming product from Apple will affect the on-demand streaming market. Apple already owns Beats Music but hasn't promoted it heavily since the acquisition, probably because they plan to replace it with a new Apple-branded service. Most analysts believe that Apple won't offer a free tier; Beats does not, though they do offer a free trial. If a significant amount of music bypasses Spotify because artists don't like the low payment rate for free Spotify plays (this has already happened with a few like Taylor Swift), Spotify may have to change its position and allow premium-only content.
I took a computer architecture class where that was the end point. We started by defining a simple architecture. (Each pair of students did their own; the available resources in the FPGA we were using pretty much limited us to 8 bit architectures.) Next was to write an assembler and an emulator for our processors. (We used Java in the class, largely because its cross-platform nature meant that students could code on whatever computer they owned and the TAs would be able to run the programs. Any reasonably modern high level language would have served as well; these were not the kind of programs that used fancy language features.) The final stage was to write a VHDL description of the CPU, load it into a board, and run code on it.
That was the most intense class I took during my education. (It's a graduate level course but I took it as part of an undergraduate degree program.) The one class was nearly a full time job.
Your files are now being encrypted and thrown into the bit bucket. EOF