While I agree that some developers are cavalier with rules, consideration of resources is fundamental to writing software.
There have been a number of occasions where I've had to say things like "No, you can't have 10 VMware instances with 1TB of disk and 140GB of RAM each. Because the VMware cluster doesn't have the resources available, that's why." and "If you'd asked, you'd already know we don't have 2 DL380s with 192GB of RAM and 4TB of RAID1 disk in each datacenter. No, I know you 'need' it, but it doesn't exist."
Usually the conversation then has to diverge into an overview of the concept of capacity planning and horizontal scalability.
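The arithmetic behind that kind of refusal is trivially scriptable. A minimal sketch, where the per-VM figures come from the request above but the cluster totals are invented purely for illustration:

```python
def fits(requested_vms, per_vm_ram_gb, per_vm_disk_tb,
         cluster_ram_gb, cluster_disk_tb):
    """Return whether the aggregate request fits within the cluster's capacity.

    The cluster figures here are hypothetical; a real check would query the
    cluster's actual free resources, not hard-coded totals.
    """
    need_ram = requested_vms * per_vm_ram_gb
    need_disk = requested_vms * per_vm_disk_tb
    return need_ram <= cluster_ram_gb and need_disk <= cluster_disk_tb

# 10 VMs at 140GB RAM / 1TB disk each = 1400GB of RAM and 10TB of disk.
# Against a made-up cluster with 512GB RAM and 8TB disk free:
print(fits(10, 140, 1, cluster_ram_gb=512, cluster_disk_tb=8))  # False
```

Which is the whole point of capacity planning: do the multiplication before the request, not after.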
Thankfully those kinds of conversations are rare these days.
I've worked at a Fortune 100 company
Ditto. My previous role was at HP, and our group couldn't have done the work we did in the time we had if we hadn't used a DevOps model to do it.
Developers don't know how to run a production environment.
Yes. That's the problem that DevOps attempts to solve. You're supposed to have both "Developers who do Ops" and "Ops guys who develop" in one team to do "DevOps".
If you're working in a place that's done "We'll just get the developers to do Operations" then they're doing it wrong.
The packet-switching technology was military in origin - they were seeking a new form of communication network that could continue to operate without downtime in the face of massive physical damage, like cities being nuked. Academia soon adopted the technology, and the early internet culture came from there.
No. Wrong. Stop perpetuating this myth. Please, go read Where Wizards Stay Up Late.
The vague concept of packet switching was developed independently and more or less simultaneously by a British Post Office engineer (which is where we get the term "packet switching") and a RAND researcher (which is where we get this ridiculous myth). However, at no point did ARPA care about building the network to survive a nuclear war; it just happened that packet switching was a good way to make maximum use of the AT&T-provided switched circuits that formed the backbone.
After a month of this I was moved to Citalopram. This seemed better; there was less staring at walls, certainly. I spent over two years on Citalopram.
Then one day I stopped. It was kind of an accident; it was Easter weekend, I wasn't paying attention, and I ran out without a prescription for more. So I ended up going cold turkey, which is the thing you're really not supposed to do with any SSRI.
I can tell you now, within three days I felt like I had woken up from a trance. I didn't realise it at the time, but the Citalopram made me feel like I was wrapped in cotton wool and wearing Ray-Bans. The feeling was exactly like the feeling of a dental anaesthetic wearing off, except all over. I hadn't noticed because I'd come from Fluoxetine, which was even worse. So I thought I was onto a good thing with the Citalopram.
So please don't go around calling it a "myth". For some people, SSRIs really do have that kind of effect. In fact I suspect it's more prevalent than people realise, either because of long-term use, because people think it's "normal", or because most people come down gradually and never really notice.
Oh, and for all that, I really do believe I was much better off taking the SSRIs at the time than I would have been without them.
He was specific and correct based on my experience in the UK of 2007.
Based on my experience of the past 34-and-a-bit years in the UK, he was talking complete bollocks. But by all means, continue to talk bollocks; that doesn't change the reality.
Has an intermediate species ever been found?
What the hell is an "intermediate species"? Just out of interest?
The point of hiring contractors is that they're supposed to bring instant expertise to a project. If they don't actually do that, why bother with the extra expense of the middle man?
it's possible that the situation has improved
There are implementations of both available.
There are various incomplete reference encoders for both, but I'm not aware of a single non-alpha and complete implementation of either. vpxenc is close but lacks things like multi-threaded encoding, which will obviously have an impact on encoding speed.
If you know anyone who cannot legally play an MP4 video, I would like to meet them.
I've got a better question for you: why are there still people who are unable to open and play back a WebM video? What's the driver behind not including WebM support in the handful of OSes & devices that have refused so far? It isn't technical. It certainly isn't a cost issue. It surely can't be licensing. So that only leaves, what, ideology?
Are Apple & Microsoft going to continue to make their users' lives more difficult because of their own ideology? Why are they doing that?
the terribly performing FLOSS codec of the day
I'm not sure which codec you're referring to, so I can't answer you there.
I guess my optimism is based on WebM being an open format, allowing anyone to implement it on any future platform, which is not true of various proprietary formats. I mean, does your 'phone support Intel Indeo or RealPlayer G2?
VP9 doesn't even match h.264, let alone h.265
That's really odd, because the benchmarks I've seen show VP8 & h.264 to be evenly matched, and no one has produced a finished h.265 or VP9 codec, so I do wonder how you think you've seen those two codecs fairly benchmarked?
Of course if your outlook is limited to the short term of less than the next 12 months then I guess the decision looks bad, but then you're not thinking about the long game.