There has been a fundamental change of attitude in software development workplaces over the years, in my experience:
In the 1990s, new ground was being paved everywhere, and most software work was focused on doing things that hadn't been done before. On one project, we had to roll our own communications platform: two rounds of DES for speed, stored shared secrets because of the RSA patents. It wasn't great, but we did things nobody else was doing.
The 2000s brought commercialization, especially making everything "secure" because of Sarbanes-Oxley [1]. Then came the 2010s. After the smartphone wars and the patent scramble, from ~2013 or so on, we still saw a lot of programming to build tools and useful stuff in the DevOps arena, even while everything else in the computing world stayed static.
Now, we don't see anything new and cool coming our way, outside of (for the most part) smoke-and-mirrors AI stuff. Do we see anything as groundbreaking as Jira (especially before the 2019 license changes), Git, or even tools like Artifactory? Not really. DevOps has all but died, with nothing to show other than maybe a new feature... if we're lucky.
I would say that until money starts flowing into startups again, so we can get another Veeam or another round of useful tech companies, being a coder, or being in software development as a whole, is pointless. Especially here in the US: if the AI doesn't "replace" you, and the offshore coding firm doesn't either, then the H-1Bs or the B1s will. Other countries are wising up and working toward some technological sovereignty, so those are the markets most likely to grow.
[1]: Holy fsck, SOX compliance back then was a clusterfsck. The suit-wearing chatter primates they called consultants had companies rip out entire server rooms of Linux hardware and replace them with Compaqs running NT or Windows 2000, because Windows was "SOX compliant" and Linux "wasn't". Nobody had even read what SOX was actually about.