Unsurprising after the Windscale Fire that nuclear power is unpopular in the UK
Windscale was 60 years ago, in an air-cooled, open-loop pile whose only purpose was to produce plutonium and other nuclear isotopes as quickly as possible, and damn the consequences.
Most people now don't even remember what Windscale was, or even recognise the name. Of those who do, a lot understand the difference between Windscale and their local nuclear power station.
To the best of my personal knowledge, nuclear power is not unpopular in the UK, Windscale or otherwise. If anything, the attitude appears to be "Get on and build the damn things!" and "Why are we letting the French/Chinese build them? I remember when the UK used to build things!".
In every other language you have to put in braces to make it easier for the parser to understand you
Anyway, I prefer the Hobgoblin myself. Pool tables, you see.
Should nobody be hugging THOSE servers either?
As a former cloud administrator: no. When you have 2000 physical servers, why do you care that 50 of them are currently broken? Why would I care that the hard drive failed in one and I had to re-install it (with an identical image and configuration to the other 1999 servers)?
Hell, we had servers that never worked from the day they were delivered and no one gave a shit: they went on the backlog for the DC guys to diagnose and RMA. Some of them got fixed after 6 months.
How many gasoline cars the same age as a typical Tesla have caught on fire?
The Dev side of me loves Ruby. It's a nice language, it's powerful, the standard library is nicely complete and there are Gems for pretty much everything I could ever need.
The Ops side of me hates Ruby. Managing all those Gems on any given server is just horrible, rbenv and rvm need to die in a fire, there are apparently a hundred different ways to run an application and proxy requests to it, and of course Gems exist outside of the system package manager, which is always bad.
While I agree that some developers are cavalier with rules, consideration of resources is fundamental to writing software
There have been a number of occasions where I've had to say things like "No, you can't have 10 VMware instances with 1TB disk and 140GB of RAM each. Because the VMware cluster doesn't have the resources available, that's why." and "If you'd asked, you'd already know we don't have 2 DL380s with 192GB of RAM and 4TB of RAID1 disk in each datacenter. Yes, I know you 'need' it, but it doesn't exist."
Usually the conversation then has to diverge into an overview of the concept of capacity planning and horizontal scalability.
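The requests above fail for simple arithmetic reasons: multiply the per-VM ask by the VM count and compare against what the cluster actually has free. A minimal sketch of that sanity check (the cluster figures and the `fits` helper are my own illustrative assumptions, not any real tooling):

```python
def fits(request, capacity):
    """Return True only if every requested resource is within free capacity."""
    return all(request[k] <= capacity.get(k, 0) for k in request)

# Hypothetical free capacity on the cluster.
cluster_free = {"ram_gb": 512, "disk_tb": 6}

# The ask from the story: 10 VMs, each with 140GB RAM and 1TB disk.
ask = {"ram_gb": 10 * 140, "disk_tb": 10 * 1}

print(fits(ask, cluster_free))  # False: 1400GB RAM and 10TB disk don't fit
```

Running the numbers before filing the ticket is the whole of capacity planning at its smallest scale; the horizontal-scaling conversation starts when the answer keeps coming back `False`.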
Thankfully those kinds of conversations are rare these days.
I've worked at a Fortune 100 company
Ditto. My previous role was at HP, and our group couldn't have done the work we did in the time we had if we hadn't used a DevOps model to do it.
Developers don't know how to run a production environment.
Yes. That's the problem that DevOps attempts to solve. You're supposed to have both "Developers who do Ops" and "Ops guys who develop" in one team to do "DevOps".
If you're working in a place that's done "We'll just get the developers to do Operations" then they're doing it wrong.
The packet-switching technology was military in origin - they were seeking a new form of communication network that could continue to operate without downtime in the face of massive physical damage, like cities being nuked. Academia soon adopted the technology, and the early internet culture came from there.
No. Wrong. Stop perpetuating this myth. Please, go read Where Wizards Stay Up Late.
The vague concept of packet switching was developed independently by a British Post Office engineer (which is where we get the term "packet switching") and a RAND researcher (which is where we get this ridiculous myth). At no point, however, did ARPA care about building the network to survive a nuclear war; it just happened that packet switching was a good way to make maximum use of the AT&T-provided switched circuits that formed the backbone.