Ideally, for most high-level applications, the programmer shouldn't have to use a programming language that forces manual memory management. They can use a language like Python or Java, where memory management happens in the background. And that's how it's supposed to be: there are more relevant problems to deal with.
Now, on the other hand, say the project is some sort of device driver or real-time software: something operating close to the 'core', in an environment with limited resources. Then that programmer will have to use a language like C or C++ and do their own memory management.
It's a good thing that we have an education system that produces both these types of programmers. That said, I think most programmers learn to do both kinds of programming over the course of their career. For me: I started out programming in C/C++ and have since moved on to Python and Java, because those are the kinds of projects I usually get these days. Now, if someone hired me to work on an application that required tight resource management, then I would use C/C++ and do the memory management myself.
From the article: "C#: Microsoft created C# about 15 years ago as a new kind of programming language similar to Java; since then, the platform has grown several times over. "
So to summarize: From the creation of C#, when it had close to 0 users, the platform/language has grown "several times over"...
Fast-forward to today and I think Steve Jobs is fucking a more greedy monster than Bill Gates ever was.
What greedy monster is he f*cking? I'm also curious about what "greedy monster" Gates was f*cking.