Honestly, though, that's nothing new - other than maybe the degree of hype. It's *always* been true that the public's expectation of ANY new science or cutting-edge technology they hear about should be: "Maybe something useful will come of this in 20+ years, if it ends up being real."
Science news follows the cutting edge of science, which means that 90+% of the time it's later proved wrong - that's just the nature of science. The whole point of scientific publishing is to get other scientists interested in trying to prove you wrong - which they usually succeed at. That scrutiny can't happen until your peers know what you think you've accomplished. And it has little to do with the formal "peer review" that reputable scientific journals do - which is basically lenient armchair quarterbacking by other people in the field, who are only looking for really glaringly obvious problems in your write-up or methodology.
And technology news isn't much better, since it's generally a mix of cutting-edge science (like this one - we haven't gotten it working yet, but believe we can) and "we got this working in the lab, and believe we can scale it up to production levels" - which usually proves far more challenging than anyone expected.
If a layman likes following science news, that's great - but they need to recognize that they're watching the sausage being made, and almost everything they hear will later be proven wrong. And of course, the popular-science media doesn't like drawing attention to that - why would their audience pay good money to hear about bad science? Which means you'll only hear about the "Cool new thing discovered!", and almost never the follow-up years later that the cool new thing was proven to be nonsense.