I'm mid-career, and this question feels very relevant to me. For the past 15 years or so, I've focused on cross-platform (mostly Linux) C++ programming, with a decent working knowledge of SQL as well.
But the bigger trend seems to be a shift toward frameworks, rather than languages and OSes, as the focus of job-posting requirements. It's no longer true that C++/Java + Linux + SQL experience lets me quickly find a well-paying job in the city of my choice. For that, one also (or instead?) needs competence in something like Hadoop, Cassandra, Ruby on Rails, Amazon Web Services, or Django.
This makes sense as CPUs get faster and software development continues shifting to a more connected, more web-oriented world. But it's a little scary for someone like me, whose day job doesn't offer much opportunity to work with those newer frameworks and develop a marketable level of skill in them.
I'm guessing this is a recurring problem. Ace mainframe programmers found themselves less marketable when desktop development became popular. Desktop developers were partially outmoded by client/server, and have a more serious marketability problem now that most development is aimed at (ultimately) web browsers. It's not that serious C++/Linux/back-end geeks like me can't find work; it's just that we feel a little trapped compared to those whose skills are currently in broader demand.
I guess the question, then, is: do we double down on our current expertise and become indispensable in a small fraction of the job market, or do we accept the pros and cons of partially reinventing our careers (and setting our salaries back) to retool?