Remember that this, like almost everything else in life, is a case of trade-offs: "some win, some lose." You may be on one side or the other, but some use cases/users will benefit and others won't. It's almost never purely black and white - if you think about both sides of the story.
Short version: more than half a century ago, interested organizations - ranging from the military to railroad networks to local police/fire to nascent TV/radio broadcasters - were all given broad swathes of spectrum. This was in low-frequency bands, because those frequencies propagate well over long distances and through buildings, and because the traffic was (except for TV) all narrowband - there wasn't much more to send than voice.
Everything was of course analog rather than digital, which means a weaker signal just gets fuzzier and fuzzier until it fades out entirely. That worked well for everybody - people in areas far from a broadcast tower just got fuzzier radio, or put up a roof antenna for their TV.
Flash forward to thirty years ago, though, and digital shows up. Digital makes vastly more efficient use of spectrum than traditional analog signals. You could, for example, take one patch of VHF channels (in US terms) and, instead of lots of people all watching one TV channel over analog broadcast, have thousands of people in one LTE cell site area watching a thousand different shows over Netflix. And, because cells can reuse the same frequencies, a thousand different people watching a different thousand shows in the next cell tower over.
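To make the "vastly more efficient" claim a bit more concrete, here's a back-of-envelope sketch in Python. The 6 MHz channel width is the real US analog (NTSC) TV channel size; the LTE spectral efficiency and the per-stream video bitrate are rough assumed figures, just to show the shape of the math.

```python
# Back-of-envelope: one 6 MHz analog TV channel vs. the same 6 MHz used for LTE.
# The 6 MHz width matches a US analog (NTSC) TV channel; the LTE spectral
# efficiency and per-stream bitrate below are rough assumed figures, not
# measurements.

CHANNEL_HZ = 6_000_000      # one US analog TV channel's worth of spectrum
LTE_BITS_PER_HZ = 1.5       # assumed average LTE spectral efficiency per cell
STREAM_BPS = 3_000_000      # assumed SD video stream (~3 Mbps)

cell_capacity_bps = CHANNEL_HZ * LTE_BITS_PER_HZ
streams_per_cell = int(cell_capacity_bps // STREAM_BPS)

print("Analog: 1 program per channel, shared by everyone in range")
print(f"Digital: ~{streams_per_cell} independent streams per cell "
      f"({cell_capacity_bps / 1e6:.0f} Mbps), and the same channel is "
      f"reused again in every neighboring cell")
```

Even with these conservative numbers, one channel carries several independent streams per cell instead of one shared program, and the real multiplier comes from reusing that channel across thousands of cells - which is where "thousands of people watching a thousand different shows" comes from.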
But a digital signal is effectively on/off - it decodes perfectly down to some threshold and then fails completely - so those with a receiver outside that range are out of luck. We have winners and losers: those getting signals from cellular/TV/radio towers within range get more service, and those farther away get less.
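Here's a toy sketch of that contrast - analog degrading gracefully versus the digital "cliff." The quality curve and the 15 dB decode threshold are made-up illustrative numbers, not a real receiver model.

```python
# Toy illustration of graceful analog degradation vs. the "digital cliff."
# The curve shape and threshold are illustrative assumptions only.

def analog_quality(snr_db: float) -> float:
    """Analog just gets noisier as SNR drops (quality from 0.0 to 1.0)."""
    return max(0.0, min(1.0, snr_db / 40.0))  # assumed: full quality at 40 dB

def digital_quality(snr_db: float) -> float:
    """Digital decodes perfectly above a threshold, then drops to nothing."""
    THRESHOLD_DB = 15.0                       # assumed decode threshold
    return 1.0 if snr_db >= THRESHOLD_DB else 0.0

for snr in (40, 30, 20, 15, 14, 5):
    print(f"SNR {snr:>2} dB: analog quality {analog_quality(snr):.2f}, "
          f"digital quality {digital_quality(snr):.0f}")
```

At 20 dB the digital viewer gets a perfect picture while the analog one gets a fuzzy one; at 14 dB the analog viewer still has something watchable and the digital viewer has nothing at all.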
If you assume that more people are getting more/better service from more digital towers, then this is a win. If you look at the edge cases - the people who used to get a fuzzy-but-watchable signal and now get nothing - it's a loss. It's purely a situation of "where you sit is where you stand."