...and no one knows what to do to fix it.
In 2010 the new Web was all about "user-generated content". Today, the modern mantra is: "Don't read the comments."
Reviews and review sites have almost exactly the same problems as comment sections: there is no way to filter the ignorant and/or malicious from the informed and sincere. Case in point: there are currently exactly two reviews of my book on Amazon. One from a reasonably thoughtful reader (3 stars) and one from a troll who apparently has given Charles Dickens the same rating as me (2 stars).
There was a five-star review which was from someone who had read the book and genuinely liked it, but Amazon determined it was from someone I knew (likely because I bought her a book on the site a few years ago) and removed the review. This is a ridiculous practice--it would invalidate a huge number of reviews in traditional publications--but is made necessary by authors who try to game the review system in the stupidest possible way.
If there is a solution to these problems it's likely some kind of reputation system, but as near as I can tell no one--not Amazon, not GoodReads, not TripAdvisor, not Yelp, not anyone--is even thinking along those lines, which suggests there is no money in building a site that provides honest peer-to-peer feedback. This is a shame, because the Web should be enabling us to help each other, not increasing our distrust of each other (we're plenty good enough at that already).
Slashdot has had a basically functional reputation system for well over a decade, so it's not as though there's any real mystery about how to do this. I wonder if there might be some B2B model where users sign up with a third-party reputation system that then sells reputation information (which would exist across all sites that use it, as Disqus does for comments) to review sites. Without something like that there seems to be very little hope of getting much long-term value from online reviews of any kind.
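The core idea of a reputation system like Slashdot's is simple enough to sketch: weight each review by the reviewer's earned standing rather than counting all opinions equally. The following is a minimal, hypothetical illustration, not any real site's algorithm; all names (`Review`, `reviewer_karma`, `weighted_score`) are invented for this sketch, and a real system would also need a mechanism for *earning* karma (e.g. other users marking past reviews helpful).

```python
from dataclasses import dataclass

@dataclass
class Review:
    stars: int             # 1-5 rating
    reviewer_karma: float  # reputation earned from past contributions

def weighted_score(reviews):
    """Average star rating, weighting each review by its reviewer's karma.

    Reviewers with no track record (karma 0) contribute nothing, which
    blunts both drive-by trolls and sock-puppet five-star campaigns.
    Returns None when there is no trusted signal at all.
    """
    total_weight = sum(r.reviewer_karma for r in reviews)
    if total_weight == 0:
        return None
    return sum(r.stars * r.reviewer_karma for r in reviews) / total_weight

# A thoughtful reader with history outweighs a fresh, zero-karma account:
reviews = [
    Review(stars=3, reviewer_karma=8.0),
    Review(stars=2, reviewer_karma=0.0),
]
print(weighted_score(reviews))  # -> 3.0
```

The point of the sketch is that the hard part isn't the arithmetic; it's bootstrapping and maintaining the karma values honestly, which is exactly what a cross-site, Disqus-style third party could amortize over many review sites.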