The conclusion presented in the article, however, is not that; that one I could agree with (even if it's obvious). Their conclusion is that people stop believing in the problem. That's probably not true, and probably not even the right conclusion to draw from the data presented. If you present a "fact" and then *a* solution that may or may not address the problem, may or may not be optimal, and may reveal you as having a controversial political bias, then you are basically asking for people to screw up your experiment.
First, once I decide you have a political bias that opposes mine, I immediately doubt or question your "facts". I may still think there's a problem, but I am going to assume you are lying or being intentionally deceptive. Rather than internalize your facts, I will replace them with my own (possibly wrong) facts.
Second, I assume the purpose of the test is to lie, mislead, or misrepresent the truth in service of a goal I may not agree with.
Third, even if I believe your facts, and even if I don't care about your motives, if I believe the solution causes more problems than it solves, I will choose to let the problem continue instead. If I have chicken pox and the cure will definitely heal me but might cause cancer 10% of the time, I would just suffer in silence. That case is a no-brainer, but the issues behind politics are far more complicated, with much more significant impacts and unintended consequences, not to mention that different people will weigh the solutions differently.
Maybe the paper was smarter than all that, but I'm not going to pay good money to read soft science. I've spent enough time around academics to distrust the paper factory even in the hard sciences; in the soft sciences it's utter bullshit to the core, and the article more or less confirms it.