Let them make what they can of the data. If they can't come up with anything substantial, then that validates the data and your theory. If they come up with something you can't counter, then they've found a serious problem with your theory or your methodology that needs to be addressed.
Definitely, let them keep at it. But it's already been a few decades and very little evidence seems to be falling on their side. We can't and shouldn't wait until every single one of them is convinced. 97% of people actively researching in this area are convinced humans are causing global warming*. How certain do we have to be before we start to act?
* From wikipedia:
A 2010 paper in the Proceedings of the National Academy of Sciences of the United States (PNAS) (http://en.wikipedia.org/wiki/Proceedings_of_the_National_Academy_of_Sciences) reviewed publication and citation data for 1,372 climate researchers and drew the following two conclusions:
(i) 97–98% of the climate researchers most actively publishing in the field support the tenets of ACC (Anthropogenic Climate Change) outlined by the Intergovernmental Panel on Climate Change, and (ii) the relative climate expertise and scientific prominence of the researchers unconvinced of ACC are substantially below that of the convinced researchers.
There's a big difference between the two: a PNG or WebM library may contain exploitable bugs, but they are difficult to exploit because those formats are fundamentally data. GLSL is not; it is executable code which not only has to be run, it has to be run as fast as possible. This means it's compiled to native code (if you're on an open source OS and not using blob drivers, odds are that it's compiled using code that I worked on). It takes very little in terms of bugs for this to be exploitable, and that's not helped by the fact that the target - the GPU - is typically a horrible design from a security standpoint. This is why 3D was one of the last things for VMs to support, and why it's still recommended that you don't enable it if you care about security.
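To make the contrast concrete, here is a minimal (purely illustrative) GLSL fragment shader. A browser hands this source string to the GPU driver, which runs it through a full optimizing compiler down to native machine code at runtime - so unlike a PNG, which is only ever parsed, any bug in that compiler or in the code it emits is reachable from a web page:

```glsl
// Illustrative fragment shader, not taken from any real exploit.
// The driver compiles this source to native GPU machine code before
// it runs - even trivial arithmetic goes through the whole compiler.
precision mediump float;

void main() {
    // Paint every fragment solid red.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

The attack surface is the compiler itself plus the generated code running on hardware that was never designed with hostile inputs in mind.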
Just as with any native code (like a DirectX game, for instance) there is no way to ensure "safety"...although I'd think almost any other attack vector would be easier than WebGL.
I do wonder. Of course it would mean targeting specific GPU vendors, and perhaps specific driver versions as well. But imagine what you could do if you were able to play with DMA... bye bye to any OS security.
I'd rather just believe that it's done by little elves running around.