"disenchanted/upset customer"? Clearly, you haven't worked in tech support, or known anyone who has, or read any of the blogs or horror stories, or, really, informed yourself in any way about this. Humans have a bell curve of both "crazy" and "mean", and the tail end of either is not something you'd ever want to come into contact with.
My sympathies are with the Comcast tech support on this one. As bad as Comcast can be - which is at least 300 milli-Hitlers bad - the tail end of the people who call customer support can be worse still. Or just too stupid to be allowed to own a computer.
Brady Haran is neither, but he puts actual scientists on his YouTube channels, and they talk about honest science (and occasional amusing trivia), with no CGI or celebrity required. No politics, no manufactured quotes, many Nobel prizes.
I'm not a fan of the television series, but do enjoy the books.
I enjoyed the first few, but the latest book was rubbish and I've entirely lost interest in the story thanks to the pace of his writing. He doesn't seem to have much in the way of original plot ideas, so it's mostly about character moments, and you have to keep that sort of writing coming for me to stay interested in those characters.
The series, however, I rather enjoy. While it's probably the first series to ever make me say "there is such a thing as too much gratuitous nudity", the pacing is vastly better than the books, the important character moments are all there, and the gaps between seasons aren't so long that I forget who everyone is.
More fundamentally: the only reason to insist solar do baseload is quasi-religious.
It's the only thing that can scale, unless fusion ever stops being "just 20 years away". Think of the energy needs of 11 billion people at American consumption levels (~40 TW) - not at all a far-fetched projection, and of course it won't stop there. Even ground-based solar hits scaling issues at that point: it's one thing to shade everything that's already paved, and maybe all the salt flats, but at some point you get significant ecological effects.
Oh, sure, for now - but solar for now can't be baseload anyhow. Orbital can. It will be a while before panels get cheap enough, and free enough of scarce materials, to scale. It seems inevitable now, but it's still a ways off. Meanwhile, private space efforts keep making progress. In 50 years, when solar has wide adoption and we're struggling with baseload at night and in bad climates, I think orbital will be a viable choice versus nuclear or gas.
The only argument for space-based is "it's a way around NIMBY". PG&E did some serious research into it, as there's just nowhere in Northern California they're allowed to build a new power plant, and demand keeps rising. The main reason the plan failed is still NIMBY: they'd need a one-block receiving station for the incoming power, and could never get that approved. Fuck California.
It's also useful in Northern latitudes. In Texas, ground-based makes perfect sense: lots of land, far enough south. In Seattle, not so much - even on the 12 clear days each year, you're too far north for much efficiency.
The chip requires a PIN to be entered. If you don't enter it correctly within three attempts, the card is rendered useless.
And those don't have to be three consecutive attempts.
So even if you have the card, you can't make any purchases with it.
Turns out: not so much. As the security community predicted, there are flaws; after a couple of years the flaws were exploited, and the PIN is retrievable. This cycle has repeated (is chip-and-PIN in its 3rd generation now? It's at least the second).
Chip-and-PIN means only that the bank makes you liable for your stolen money, claiming "the card couldn't possibly have been stolen, because magic". It solves a problem for the banks and makes things worse for the consumer - shocking, I know.
Just because the shady back-alley freeware does it does not in any way make a good excuse for a AAA software vendor to do so.
And AAA vendors don't. Adobe products are simply shady back-alley freeware as proven by their installer. Java too, of course.
Newfags can't triforce
Slashdot supports too few entities to do this right, and forget about UTF-8. But you can get sorta close.
Unless someone can do better?
For lack of mod points let me just say: beautiful!
It's like this in any engineering discipline:
* The apprentice doesn't do things by the book, for he thinks himself clever
* The journeyman does everything by the book, for he has learned the world of pain the book prevents
* The master goes beyond the book, for he understands why every rule is there and no longer needs the rules
Or put another way - the apprentice thinks he knows everything; the journeyman knows how little he knows; the master knows everything in the field, and still knows how little he knows.
It's a problem when the default ASSERT macro compiles away to nothing in release builds via such #ifdefs (no joke - that was the norm everywhere I worked with C/C++). At one place it got so bad that we made using the ASSERT macro a firing offense (not sure why we couldn't just fix the macro; some corporate thing, no doubt).
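A minimal sketch of the pattern being complained about - the ASSERT definition and names here are illustrative, not from any particular codebase:

```cpp
#include <cstdlib>

// The classic #ifdef-ed ASSERT: in a release (NDEBUG) build the whole
// expression vanishes, side effects and all.
#ifdef NDEBUG
#  define ASSERT(expr) ((void)0)
#else
#  define ASSERT(expr) ((expr) ? (void)0 : std::abort())
#endif

static int g_calls = 0;
int do_work() { ++g_calls; return 0; }   // stand-in for real work

void startup() {
    // Debug build: do_work() runs and its result is checked.
    // Release build: do_work() is never called at all.
    ASSERT(do_work() == 0);
}
```

That silent behavior change between build configurations is exactly why the macro ends up a firing offense.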
And I've been there and done that with the "no resource leaks" in C++. When you provide library code that's easier to use than doing it the wrong way, it's easy to enforce the standard in code reviews (since then it's only the new guy who hasn't seen how easy the tools are yet).
For example, if you have a good FileHandle class, it's simple to educate people to write FileHandle foo = fopen(...); instead of FILE* foo = fopen(...);, and then that's it: the file closes when foo goes out of scope. It works just as well as a member variable - no need to remind people that the destructor isn't called if the constructor throws, because fully constructed members are always cleaned up.
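A minimal sketch of what such a FileHandle class might look like - hypothetical code, assuming the implicit-from-FILE* construction style used in the example above (and C++17 for the copy-initialization form):

```cpp
#include <cstdio>
#include <stdexcept>

// Hypothetical RAII wrapper: constructed straight from fopen()'s result,
// closes the file when it leaves scope.
class FileHandle {
public:
    FileHandle(std::FILE* f) : f_(f) {        // allows: FileHandle h = fopen(...);
        if (!f_) throw std::runtime_error("fopen failed");
    }
    ~FileHandle() { std::fclose(f_); }        // runs on return or on a throw

    FileHandle(const FileHandle&) = delete;   // no accidental double-close
    FileHandle& operator=(const FileHandle&) = delete;

    std::FILE* get() const { return f_; }

private:
    std::FILE* f_;
};

void write_greeting(const char* path) {
    FileHandle out = std::fopen(path, "w");   // opened here...
    std::fputs("hello\n", out.get());
}                                             // ...closed here, on every path
```

Deleting the copy operations is what keeps even the new guy from accidentally creating two owners of one FILE*.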
A really minimal system, like a virtual machine running a site, can be reduced to far less than that. For instance, Mirage OS (http://www.openmirage.org).
We've seen a web server running on a Commodore 64. Wasn't that a 12 kB OS? It's been a while, but IIRC the OS was in memory from D000 to FFFF.
I've worked on mainframes where the "recovery OS" fit on one tape block - a hard 32 kB constraint (used for disaster recovery - it would load a program that also had to fit in 32 kB which would restore a system from backups). The normal OS wasn't much bigger. Most device drivers weren't memory resident, for example, and shared 4 kB by swapping in and out, which could lead to some mighty odd behavior by today's standards.
Without exceptions, you would put in an assertion.
Oh? You check for errors in code that gets #ifdef-ed out of the production build? What could possibly go wrong with that plan? (Or do you mean a check, then an assert, after every function call, further burying the few lines of business logic in a huge function?)
It's quite easy to write "all exception safe all the time" code in C++, in ways that even the junior guys can't screw up - but what that coding standard looks like is not obvious. That's the big problem with C++: many have never even seen it done right, so it's very understandable why business largely moved to managed code.
People see RAII and think "oh, instead of allocate at the top and free at the bottom, I'll allocate in the constructor and free in the destructor". No, you're still doing it wrong if you have any non-trivial destructors outside of a bit of well-reviewed library code.
If you're doing it right, the only avenues for screwing up resource management are adding stuff to a global object and forgetting it there, as with every language.
The cloud makes a great backup. If what you're archiving is small, encrypt it and upload it to a variety of cloud-storage services with free tiers - Cloud Drive, OneDrive, Dropbox, etc.
For a moderate amount of data, use (encryption and) Amazon Glacier. If you don't know the trick: Amazon offers "mail us a hard drive" as an upload method for S3 and Glacier, and it's as good a way to do offsite backups as any.
I wouldn't use the cloud as my only archive, but as the offsite copy it's probably more disaster-survivable than most other choices most of us have available. (And affordable if we're talking a few hundred GB of personal stuff, not the entire multi-TB geek archive of "binaries").