From the linked piece:
In hindsight, his remark was a clear sign that the marketing hype around "big data" had peaked.
This is true, and it provides the context missing from TFS: "Big Data" is over as a marketing term. But as a technological term, and as far as actual implementation goes, it is the status quo and forevermore will be.
From a technological perspective, "Big Data" has a simple definition: more data than can be stored on a single machine. And this need will only grow as hard drives (and maybe even SSDs) plateau while enterprise data, of course, only grows.
Indeed, TFA itself states (which TFS omitted):
A particularly hot sector has matured around Hadoop, an open-source analytics software platform. Many tech companies are writing software to make Hadoop industrial strength and integrate it with new and existing types of databases.
So, from TFA itself: Hadoop is hot, but the term "Big Data" is not.
Much that is taught in CS today I had to learn on my own because it hadn't matured enough yet to be incorporated into CS programs: multi-threading, unit testing, OOP, SQL, data mining, all of the web technologies, etc.
But perhaps today's graduates will be complaining ten years hence how new graduates just rely on quantum computing searches and don't know anything about pruning search trees.
Seriously, though, to the point, I'd be more leery of those who graduated ten years ago and had not kept up their skills as opposed to those who graduated recently and did not learn skills from ten years ago.
Should be no surprise to anyone who's ever played a videogame: he's in "flow" mode.
Which raises the question: how is this news for nerds?
Yes, it is true coders have little time to code. But the author misses the primary cause: the ratio of library/framework code to self-written code.
In the old days (say, 25+ years ago), you would pick up a book -- a single book -- of the OS API calls, memorize it, and start coding. Today, with github, it's as if everyone in the world were working on the same single project. Today, a developer needs to learn all these libraries that are coming out daily and how to work with them. In the old days, there was a lot of reinvention and co-invention of the wheel. Today, that is not allowed, because one has an obligation to "buy" (for free) instead of build: a) it saves development time, of course; b) more importantly, one gets updates/upgrades "for free" without having to invest (much) additional development time; and c) one's organization can advertise in the future for developers who already have experience with that particular library/framework.
To address specifically the reasons identified by the author:
But this new development paradigm of the global github hive -- where we're all essentially working on and contributing to this one massive codebase that we all have to understand -- is what the author missed. The amount of custom code to actually code is small now, and the majority of time is spent figuring out how to get the various libraries and frameworks to work.
I tell people I will change jobs for a 30% increase in compensation. That results in a job change every seven years, and here's why. There is a difference between the reported and actual rates of inflation. And annual increases at an existing job more closely track reported inflation, whereas job offers from other companies more closely track actual inflation.
For example, if reported inflation is 3% and actual inflation is 7%, the 4-point gap compounds to roughly a 32% difference after 7 years.
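A quick sanity check of that arithmetic (a Python sketch; the comment doesn't show its working, so treating the 4-point gap as compounding annually is an assumption):

```python
reported, actual, years = 0.03, 0.07, 7

# Compound the 4-point gap directly: (1.04)^7 - 1
gap_compounded = (1 + (actual - reported)) ** years - 1
print(f"compounded gap: {gap_compounded:.1%}")  # ~31.6%, which rounds to 32%

# Exact version: ratio of the two growth paths, (1.07^7 / 1.03^7) - 1
exact_gap = (1 + actual) ** years / (1 + reported) ** years - 1
print(f"exact ratio gap: {exact_gap:.1%}")      # ~30.6%
```

Either way the figure lands near the 30% threshold, which is presumably why the job changes work out to roughly every seven years.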
Without life, Biology itself would be impossible.