Because whatever you do in the computing world, you are affected by processing power and cost. Growth in those areas drives both new hardware and the new software that goes with it, and any hit to growth will mean lost jobs.
Software (what most of us here create) usually gets written for one of two reasons:
1. Software is created because nobody is filling a need. Companies may build their own version if they want to compete, or a company may contract a customized version if it sees a path to increased efficiency or just has a process it wants to stick to. There used to be a lot of unfulfilled need out there, but that demand has been largely sated in the 21st century.
2. Software is created because a company wants increased performance or new features (the basic need is filled; this is a WANT). Once a new processor or feature becomes available, you either wedge it into existing code or, if it's a big enough improvement, create entirely new software enabled by the new level of performance-per-dollar.
Without continued growth, the industry is in danger of cratering: there's only so much processor-architecture optimization you can do on the same process node, and the same goes for optimized libraries on the software side. In addition, brand-new industries enabled by cost reductions (e.g. the digital FMV explosion in the 1990s, or the movement to track your every move in the 2000s) will no longer be so common, and that will again force people to look elsewhere for employment.
Software engineers won't disappear, but they will be culled. The industry has never had to deal with that in its entire history, so it will be painful. I'm hoping they can hold this off for as long as possible!