This BS *again*?!?!
GPU shaders != running code on the CPU.
WebGL allowing shader usage is pretty much a non-issue, security-wise. GLSL shaders are *extremely* limited in scope: they can only read the vertex data, uniforms, and textures that the host program explicitly hands them. GLSL is a very domain-specific language with no pointers, no system calls, and no way whatsoever to touch anything outside the GPU's purview.
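Just to illustrate how narrow that scope is, here's a sketch of a complete WebGL 1 fragment shader, held as a string the way a page would actually ship it (the `uTexture`/`vTexCoord` names are my own, purely for illustration). Everything it can do is visible right here: sample an input the host bound, emit a color.

```javascript
// A complete GLSL ES 1.0 fragment shader, as the JS string a WebGL page
// would pass to the API. Its only inputs are what the host program bound
// (one texture, one interpolated coordinate); its only output is a color.
// No pointers, no file or network access, no memory outside the GPU.
const fragmentShaderSource = `
  precision mediump float;
  uniform sampler2D uTexture;  // texture unit chosen by the host page
  varying vec2 vTexCoord;      // interpolated from the vertex shader

  void main() {
    // The shader's sole "output" is this fragment's color.
    gl_FragColor = texture2D(uTexture, vTexCoord);
  }
`;
```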
Furthermore, shaders aren't pre-compiled (aside from some vendor-specific binary-shader extensions on *OpenGL ES*, and even those only compile to bytecode IIRC), so WebGL can at least attempt some shader validation. OpenGL and WebGL programs literally hand the GLSL source code to the driver, which is responsible for compiling it. That actually turns out to be good for performance, since future compiler improvements in the driver can make the same shader run faster on the same hardware. It also means a WebGL implementation can validate shaders before handing them to the driver, keeping an eye out for any obvious attempts to do something bad.
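That hand-off looks like this in practice. This is just a sketch of the standard WebGL compile pattern (the `compileShader` helper name is mine; it assumes `gl` is an existing `WebGLRenderingContext`) — note that what the driver receives is plain GLSL text, which is exactly why the browser gets a chance to inspect or rewrite it first:

```javascript
// Hand raw GLSL *source text* to the driver's compiler. A WebGL
// implementation can validate or rewrite the source before this point.
// Assumes `gl` is an existing WebGLRenderingContext.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);  // gl.VERTEX_SHADER or gl.FRAGMENT_SHADER
  gl.shaderSource(shader, source);       // driver receives plain GLSL text
  gl.compileShader(shader);              // driver's own compiler runs here

  // Reject anything the driver's compiler choked on.
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("Shader compile failed: " + log);
  }
  return shader;
}
```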
And when it comes to malicious shaders, only two attacks have actually been demonstrated: a denial of service that hangs the GPU with an extremely expensive shader, and reading pixels from other pages via what looks like an implementation flaw in WebGL/HTML5 Canvas.
The first is easy to mitigate. In fact, it *shouldn't* be possible at all on Windows, where Timeout Detection and Recovery is supposed to reset the display driver when the GPU hangs; if it can't, that's a *Windows* bug.
The second is a little harder to fix, but, again, it looks like an *implementation* flaw, not a fundamental flaw in WebGL or shaders or anything like that.
Face facts: modern GPUs don't offer the old fixed-function pipeline anymore. It isn't anywhere on the silicon; modern drivers merely emulate it for old OpenGL programs. Without shader support, WebGL would be completely useless.