The difference between "92% accurate" and "accurate enough for my task" is profound.
If you were using this kind of analytics to bill your customers, 92% accuracy would be hideously inadequate. You'd face lawsuits daily, and you wouldn't survive a month in business. So the easy answer is, "this would be the wrong tool for billing."
But if you're advertising, you know the rates at which people bite on your message. Perhaps only 0.1% of random people will respond, while 5.0% of genuinely interested people will. Given the choice between sending the message to 10,000 random people or to 217 targeted people (only 92% of whom, about 200, are actually your target audience), both groups deliver the same expected 10 hits. At a cost of $10.00 per thousand views, the first wave of advertising costs you $100; the second costs you $2.17. Big Data, with all of its inaccuracies, still improves your results by a wide margin.
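The arithmetic above can be sketched in a few lines. All of the rates and costs here are the illustrative assumptions from the example (0.1% random response, 5% interested response, 92% targeting precision, $10 per thousand views), not real campaign data:

```python
# Back-of-the-envelope comparison: blast advertising vs. imperfectly targeted advertising.
RANDOM_RATE = 0.001        # 0.1% of random recipients respond
INTERESTED_RATE = 0.05     # 5.0% of genuinely interested recipients respond
PRECISION = 0.92           # 92% of the targeted list is actually interested
COST_PER_MESSAGE = 10.00 / 1000  # $10.00 per thousand views

def expected_hits_random(n):
    """Expected responses from sending to n random people."""
    return n * RANDOM_RATE

def expected_hits_targeted(n):
    """Expected responses from a 92%-accurate targeted list of n people."""
    interested = n * PRECISION
    uninterested = n - interested
    return interested * INTERESTED_RATE + uninterested * RANDOM_RATE

blast, targeted = 10_000, 217
print(f"Blast:    {expected_hits_random(blast):.1f} hits, ${blast * COST_PER_MESSAGE:.2f}")
print(f"Targeted: {expected_hits_targeted(targeted):.1f} hits, ${targeted * COST_PER_MESSAGE:.2f}")
```

Both strategies yield roughly 10 expected hits, but the targeted send costs $2.17 instead of $100, which is the whole point: the 8% misclassification barely dents the outcome.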
Far too often, people point out that perfection is impossible and presume that "because it's not perfect, it's useless." The answer is not always to chase more accuracy, but to choose the right tool for the job and to learn to recognize when a tool is good enough to be usable. At that point you learn to cope with the inaccuracy and derive the maximum benefit possible from what you have.