I have seen both sides of the story about junior programmers, having spent 10 years in academia and several years now in industry. The failure to adapt to new problems doesn't start only after students graduate. At school, we try to expose our students to a variety of problems, and they either aren't interested or simply get overwhelmed. I think only about 10% of the students really took advantage of the education they got. The really good ones are self-motivated; the rest you simply have to spoon-feed.
Since we can't just fail 90% of the students, to make the best of the situation there are a few key messages we try to hammer into students' heads: the memory hierarchy (amortizing the storage bottleneck), queuing theory (modeling service load), and of course big-O analysis. I can comfortably say these are universal principles that remain true even as technology evolves. We didn't cover critical path analysis, but that has become important since parallel and distributed computing took off; I suspect that's the shift in data requirements and bottlenecks you observed.
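To make the queuing-theory point concrete, here is a minimal Python sketch assuming the classic M/M/1 model (my choice for illustration, not necessarily what any given course uses), where the mean time in the system is 1 / (mu - lambda) and latency blows up as utilization approaches 1:

```python
def mm1_mean_latency(arrival_rate: float, service_rate: float) -> float:
    # M/M/1 mean time in system: W = 1 / (mu - lambda),
    # valid only while the arrival rate stays below the service rate.
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Illustrative numbers only: a server handling 100 requests/second.
service_rate = 100.0
for utilization in (0.5, 0.9, 0.95, 0.99):
    latency = mm1_mean_latency(utilization * service_rate, service_rate)
    print(f"utilization={utilization:.2f}  mean latency={latency * 1000:.0f} ms")
```

The punchline is that going from 50% to 99% load doesn't double latency, it multiplies it fifty-fold, which is exactly the kind of intuition that survives every change in technology.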
Now in industry, I still see the limits of this mostly theoretical approach to computer science. We hired a fresh college graduate who delivered code where all the unit tests passed but the code didn't run. It just didn't do the things real code is supposed to do, because the unit tests mocked everything out. But we also hired an industry veteran who wrote code like that. I'm not sure what the problem is; maybe some people really just shouldn't be programmers.
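To show what I mean, here is a hypothetical Python sketch (the function and the .fetch()/.query() names are made up for illustration): the test passes because the database is mocked, even though the code would crash the first time it touched a real connection.

```python
from unittest import mock

def load_user_email(db, user_id):
    # Hypothetical bug: suppose the real database layer exposes .query(),
    # not .fetch(), so this line fails against a real connection.
    row = db.fetch("SELECT email FROM users WHERE id = ?", user_id)
    return row["email"]

def test_load_user_email():
    # The mock accepts any attribute access, so the nonexistent .fetch()
    # method is never noticed and the test passes anyway.
    fake_db = mock.Mock()
    fake_db.fetch.return_value = {"email": "a@example.com"}
    assert load_user_email(fake_db, 42) == "a@example.com"

if __name__ == "__main__":
    test_load_user_email()
    print("test passed, yet the code has never run against anything real")
```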
The really good ones design optimal algorithms on their first attempt, and their optimizations are about making the code understandable to others who don't know programming as well as they do. But even the best programmers fall into the temptation of writing code they shouldn't have written in the first place. :)