...when every programmer (and tech support person, and manufacturing person) in the US can get a job, that's the time for US operations to be looking for foreign help.
But since age, health, formal schooling, in-country location, and credit score are widely and consistently used to deny highly skilled US programmers jobs -- I am very confident in saying that Mr. Graham has not even come close to identifying the "programmer problem" from the POV of actual US programmers. All he's trying to do here is save a buck, while screwing US programmers in the process.
Do it his way, and the US economy will suffer even further at the middle class level as decent jobs go directly over our heads overseas, while, as per usual, corporations thrive.
This is exactly the kind of corporate perfidy that's been going on for some time. Graham should be ashamed. He represents our problem. Not any imaginary lack of US based skills.
2. Exchange is overrated crap.
3. Sure, if 'management' is limited to pretty much locking the desktop background. Try to install and configure non-trivial third-party software through GPOs. Hell, even try to install Microsoft's own Visual Studio.
4. BitLocker is indeed nice.
5. ReFS is still very experimental. Btrfs supports integrity checking at the filesystem level, and the device-mapper layer supports it at the block level.
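To make "integrity checking" concrete, here is a toy Python sketch of the underlying idea (a checksum stored per block and re-verified on every read). The function names and the dict-based storage are invented for illustration; this is not how Btrfs or device-mapper actually lay out checksums.

```python
import hashlib

BLOCK_SIZE = 4096  # a common filesystem block size

def write_block(store, checksums, index, data):
    """Store a block and record a checksum of its contents."""
    store[index] = data
    checksums[index] = hashlib.sha256(data).hexdigest()

def read_block(store, checksums, index):
    """Re-hash on every read; a mismatch means silent corruption
    (a flipped bit, a misdirected write) is caught instead of returned."""
    data = store[index]
    if hashlib.sha256(data).hexdigest() != checksums[index]:
        raise IOError(f"checksum mismatch in block {index}")
    return data

store, checksums = {}, {}
write_block(store, checksums, 0, b"hello" + b"\0" * (BLOCK_SIZE - 5))

# Simulate a single flipped bit: the next read fails loudly
# instead of silently handing back bad data.
corrupted = bytearray(store[0])
corrupted[0] ^= 1
store[0] = bytes(corrupted)
try:
    read_block(store, checksums, 0)
except IOError as err:
    print(err)  # checksum mismatch in block 0
```

The point of doing this in the filesystem (or block layer) rather than in applications is that every read gets verified, so corruption is detected the moment it is touched.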
6. DHCP - the original RFC states that it was written by Ralph Droms ( email@example.com ) in 1993, when Microsoft didn't even have their own IP stack!
8. Btrfs has dedup. NTFS doesn't support dedup itself; it's done at the block level.
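For what block-level dedup means in practice, here is a minimal Python sketch (the class and its layout are invented for illustration): blocks are keyed by their content hash and stored once, while files just hold lists of references.

```python
import hashlib

class DedupStore:
    """Toy block-level dedup: identical blocks are stored once,
    keyed by content hash; files are lists of block references."""
    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}  # content hash -> block bytes
        self.files = {}   # filename -> list of content hashes

    def write(self, name, data):
        hashes = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)  # dedup happens here
            hashes.append(h)
        self.files[name] = hashes

    def read(self, name):
        return b"".join(self.blocks[h] for h in self.files[name])

s = DedupStore()
payload = b"x" * 8192            # two identical 4 KiB blocks
s.write("a.bin", payload)
s.write("b.bin", payload)        # a byte-identical copy
print(len(s.blocks))             # 1 -- four logical blocks, one physical block
```

Real implementations differ a lot (offline vs inline, chunk boundaries, reference counting), but the hash-and-share step above is the core of the technique.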
9. Is anybody still even interested in file sharing?
IBM / Rational, Forrester Research, GE. I could keep going.
Google uses multiple container systems, and a lot of their infrastructure isn't Docker, but Docker is a key component of their containerization strategy. If it were even 10+% Docker, you'd be talking well in excess of 100M Docker containers in production.
We've seen too many "if only" complainers. Every complaint met just creates a new complaint.
the best theory so far is that of Widom-Larsen
Widom-Larsen requires an implausible mix of scales. The effective mass of heavy electrons in the solid state is a collective phenomenon: it arises over distances and time-scales that are large relative to the nucleus and to nuclear time-scales, and it affects the dynamics of the electron's interaction with the lattice on those scales. To impute to these large-scale effects efficacy at the nuclear scale is very unlikely to be correct.
Consider a car analogy: a car moving along a freeway in dense traffic interacts with all the cars around it. If the driver accelerates, they will pull up close to the car ahead, and that driver may speed up a bit too, sending a diminishing wave of acceleration through the traffic; so compared to the same car alone on the road, the car in dense traffic appears to have a much higher effective mass. Alone, you hit the gas and speed up a lot. In traffic, you hit the gas and speed up a little bit. That's what the electron in the surface looks like: a car in traffic.
But on the scale of car-car interactions, the "bare" mass of the car is what matters. If two cars collide you get an energy of 0.5*m*v^2, not 0.5*Meff*v^2.
Yeah, there are multi-car pileups that muddy the analogy, but they add up to nothing like the effective mass of the whole traffic block, so there. And the difference in scales between "cars and traffic" is tiny compared to the difference in scales between "nuclei and the lattice", so the effect that analogy hopefully makes obvious will be that much larger in the latter case.
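The bare-vs-effective-mass point in the collision formula above can be made concrete with invented numbers: a 1500 kg car whose in-traffic "effective mass" we arbitrarily take to be 10x the bare mass.

```python
# Invented illustrative numbers, not measured values.
m_bare = 1500.0       # kg -- the mass that matters when two cars collide
m_eff = 10 * m_bare   # kg -- the collective, traffic-scale response
v = 20.0              # m/s

ke_bare = 0.5 * m_bare * v**2  # the energy actually exchanged in a crash
ke_eff = 0.5 * m_eff * v**2    # the wrong answer you get by using the
                               # large-scale effective mass at the car-car scale
print(ke_bare)  # 300000.0 J
print(ke_eff)   # 3000000.0 J, off by exactly the mass-enhancement factor
```

The same scale mismatch, only vastly larger, is the objection to using a lattice-scale effective electron mass in nuclear-scale kinematics.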
This smells like a scam of some sort
While I don't disagree on the smell, Gates is richer than God, and the first thing I thought on seeing this was that if I had that kind of money I might spend a bit of it on wigged-out ideas, just in case. It's like me throwing a panhandler a buck just 'cause I can.
Except that both languages and "application architectures" are, so to speak, based on usefully constraining the set of valid programs.
Sorry, I don't understand what this means. If you design a data schema that can't scale, no language selection, amount of clustering, sharding, money, or associated BS is going to be of much help... this is just reality.
Only when machines become smart enough to do the designing will this ever change. Computers can do a lot on the margins but ultimately if you want scalability and performance in a non-trivial problem space YOU will have to work for it.
In the long run, though, stuff tends to move into languages, among other things because it allows checking of correctness at the earliest possible moment during development.
What does constraint validation have to do with scalability and concurrency?
Wake me up when you have something falsifiable to say.
If the main text of a religion isn't a reliable guidebook to that religion, how can we determine if anything is?
Obviously, we can't.
What made you think we could?
All major (and most minor) religions present huge diversity. Within Christianity, the bible is taken as everything from vague metaphor to the "inerrant word of God." The Koran for Islam, the same. Buddhist practice ranges from meditative to non, from vegetarian to non, from rigidly scientific to the most laughable crystal-gazing nonsense you've ever heard of. New agers.... that's a basket so broad I don't even have a clue as to what it really means, although I have to say, I've rarely come away from someone's description of their new age ideas thinking "wow, that made sense." OK, actually, never. But I figure it could happen.
In addition to actual sect differences, there are practitioner differences, and they range all the way from non-believers who are there for the social aspect, to rigid adherents to every jot and tittle in every book (and some, like the Catholics, have quite a few books.)
For my part, I figure, if I want to know what someone thinks, just ask them. Unless I have specific relevant evidence, I don't assume people fit into standardized boxes. I have found that to very rarely be true.