Unfortunately, I'm not surprised to see CNET do this. While they're a site that's going down the tubes to begin with (bring back the old logo!), they're also, frustratingly, not wrong to be culling old articles.
I've been in a position to see the results when several smaller sites orchestrated similar cullings for SEO reasons. And despite Google's claims, all of the sites benefited, with search-driven traffic improving by at least a few percent within a week, and rankings on critical terms improving as well. For most of these sites the improvement was not night-and-day, but it was measurable, and it was significant to the bottom line.
With that said, I don't believe Google is actively being malicious - that is, I'm sure they're not trying to punish sites with old content. However, what I've noticed is that long-lived sites with lots of historical content started sinking as Google rolled out its more recent efforts to weed out low-quality content. Low-quality content has always been a problem, but in the last couple of years in particular, unscrupulous operators have been taking advantage of the pandemic employment environment (and more recently the rise of generative AI systems) to double down on rapidly generating new content. Meanwhile, blackhat SEO services have been able to help content mills get their content ranked shockingly high; not high enough to take the top spots, but high enough to get some real, meaningful traffic out of it.
The net impact, I feel, is that Google is currently losing the war on blackhat SEO operations and their associated content mills. That is, Google's index and ranking systems can no longer reliably tell the difference between good content and bad content, as many of the quality signals it previously relied on have been copied by mills. Even Google's more recent efforts to prod sites into supplying E-E-A-T info within articles (Experience, Expertise, Authoritativeness, and Trustworthiness) have quickly been undermined by mills doing the same. The mills are all liars, of course, but Google can't determine that.
What we're seeing is an explosion of content at the same time as the effectiveness of search engines is crashing. Google has to weed through more crap than ever before, and the crap is winning; the signal is harder to find than ever. The worst part is that I'm not sure what Google can do about it. Search is one of the products they make an honest effort to improve upon (since it's what brings all the boys to the yard), but they've been backed into a corner with no obvious escapes. There simply isn't a good signal of quality left on websites for Google to rely on.
Which, to bring things full circle back to CNET, is why we're seeing them purge old content. Since Google can't tell the difference between that content and content from mills (much of which is derived from that old, legitimate content), the old content has become a liability. As best I can tell, the best way right now to show Google that you're not a mill is not to have too many articles, especially old articles, as those are some of the hardest for Google to digest (are they quality articles? Or a mill faking it to look better?). Which means that for legitimate sites to survive, they have to start playing increasingly byzantine SEO games to stay ahead of the mills - and often adopt blackhat-lite SEO strategies to gain an edge over people who aren't playing fairly to begin with.
All of which Sucks (with a capital S). It sucks for the readers, it sucks for the content creators, and it sucks for Google. But Google decides if commercial sites live or die. They are the de facto gateway to the world wide web. So commercial sites will do whatever they need to in order to survive - which is often the same thing content mills are doing to make a buck.
And you wonder why Google is going so hard on generative AI (Bard) now. Their best hope for stopping this madness is to stop directing people to external sites to begin with, that way content mills have no reason to exist. It also means commercial sites will have no reason to exist - and thus won't be there to provide the inputs GenAI needs - but that's tomorrow's problem. Google can always scrape Reddit to get some (usually) human input...