1) I read the bad reviews first to see how many of the bad reviewers are idiots. For example, I purchased a product that adapted VGA to video. There was a switch on the side of the box for NTSC/PAL. A number of N. American consumers indicated the box didn't work and that the picture was monochrome (black & white). Well, those folks obviously didn't flip the switch from PAL (as shipped) to NTSC, because a monochrome picture is a classic symptom of a video format mismatch (simplifying a bit to illustrate the point). So I discount 1* reviews when the complaint itself tells me the product was used incorrectly.
2) I look at the proportion of 5* and 1* reviews. If they're about equal, that's a danger sign. If the counts trail off in a nice power-law-type curve, so much the better. For those who mention IMDB being seeded with a lot of 10* reviews: beware, because the studios know how much word of mouth matters today, so they plant reviews that make Baywatch sound like The Godfather. Any "reviewer" who only gives 1* or 10* reviews is suspect in my book.
3) I look for reviews that closely match my use case.
4) I look for reviews that are clearly fake, so I can discount them.
5) Any review that is 1* due to price I completely ignore (prices fluctuate, and my price point may be very different from yours). Review the actual product, not the price. *I* decide whether the price is worth it.
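Point 2 is mechanical enough to automate. Here's a minimal sketch, assuming you have ratings as plain lists of integers on a 1-5 scale; the function names and the 25% tolerance threshold are my own choices, not anything the stores publish:

```python
from collections import Counter

def rating_histogram(ratings):
    """Count how many reviews landed on each star value (1-5)."""
    counts = Counter(ratings)
    return {star: counts.get(star, 0) for star in range(1, 6)}

def looks_polarized(ratings, tolerance=0.25):
    """Danger sign: the 5* and 1* buckets are roughly the same size."""
    hist = rating_histogram(ratings)
    top, bottom = hist[5], hist[1]
    if top == 0 or bottom == 0:
        return False
    return abs(top - bottom) / max(top, bottom) <= tolerance

def extremes_only(reviewer_ratings):
    """Suspect: a reviewer whose every rating is the minimum or maximum."""
    return bool(reviewer_ratings) and all(r in (1, 5) for r in reviewer_ratings)
```

A healthy product tends to fail the `looks_polarized` check (ratings trail off from one end) while a review-bombed or astroturfed one passes it.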
Unfortunately, Google Play's app store doesn't let you filter to the negative reviews only. Very annoying. Maybe Google should look at this research and understand that low-star reviews serve a purpose.