You did indeed hit a nerve. My funny bone! Your abuse of the word irony, for example.
Since most of the artists I listen to on streaming services are dead, should they be paying me?
Multi-core x86 processors only appeared well after PCI-E had taken hold.
True, but SMP systems are older. I used a dual P3-700 for years, which I picked up cheaply on eBay because it was listed as a 'duel processor' and not many people searched for that (I think eBay now does some phonetic matching in its search). Before that, the ABit BP6 (1999) was quite popular. It ran two Celerons, so you could get a dual-processor machine for less than a single-processor one (though you needed to run Windows NT or *NIX to be able to use the second one, as XP was the first SMP-capable consumer OS from MS). The 300MHz Celerons overclocked to 450MHz by bumping the FSB from 66MHz to 100MHz, making them quite competitive with the P2 (the L2 cache in the Celeron was half the size, but twice the speed, and the core was the same).
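(The arithmetic behind that overclock falls straight out of the Celeron's locked clock multiplier, which was 4.5x on the 300MHz parts; a quick back-of-the-envelope sketch using the figures above:)

```python
# The Celeron 300A's clock multiplier was locked at 4.5x, so the
# core clock scales directly with the front-side bus frequency.
MULTIPLIER = 4.5

for fsb_mhz in (66.67, 100.0):  # stock FSB vs. overclocked FSB
    core_mhz = fsb_mhz * MULTIPLIER
    print(f"{fsb_mhz:g} MHz FSB -> {core_mhz:.0f} MHz core")
```

Running it prints 300 MHz for the stock 66.67 MHz bus and 450 MHz for the 100 MHz bus, which is why a cheap Celeron on a BP6 was such a bargain.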
God I miss 80's computing.
I don't, but if you want to get the same fun without all of the old annoyances there are two things I'd recommend:
The first is to get an FPGA dev board. Bluespec is a nice proprietary high-level HDL that is free for academic use, but if you don't qualify for that then Chisel from Berkeley is also not bad - they're both a nice step above Verilog / VHDL.
The second is the mbed boards from various ARM partners. Some ARM folks handed me one of these to play with a few months back. These are aimed at getting embedded development to people who don't normally do it. They've got all of the fun I/O stuff from the BBC micro (plus some new stuff like USB and Ethernet) and a nicely put together development environment.
I've seen a few things recently that have taken an amusing middle ground and bought ARM cores and used them to run a Z80 emulator, because it was cheaper to get the associated peripherals to attach to the ARM core.
I expect Google to die in the same way that IBM died: it will still be a huge and influential player for a long time, but won't be the company that defines an industry that people care about. The same sort of path as Microsoft.
When I interviewed at Google a few years ago, I was reminded of something that JWZ wrote about Netscape, claiming that it started to decline when it started hiring people who were there because it was a cool place to work, not because they wanted to change the world and believed in the things that the company was doing. Everyone I met at Google told me that I should work there because it was a cool place to work...
Invitations were a mistake in the first place.
It works for email because any email provider interoperates with any other. Having an account on gmail when nobody else does doesn't create any problems for the user.
On the other hand, the very point of social media is that everybody you know is there. Being alone on something like Google+ is completely pointless. Such a service should be grown in exactly the opposite way from the "have people invite each other" idea, using any excuse possible to get people to sign up.
This is going to be a continuing problem until they figure out how to get some separation between church and state. This separation will be difficult to achieve so long as assassination of potential political rivals remains commonplace. The Christian world had the advantage of making the separation back when a King could be reasonably protected against assassination by simply living in a castle and keeping a close eye on his advisers and family. Today, with high-powered sniper rifles and small but powerful bombs available to any random stranger, it is much harder to avoid being assassinated.