My initial reaction was that the engineer sounded like he was losing his mind, and that this must be a nightmare for the company. I didn't read the post, but it sounds more like an "evil users can make evil drawings" kind of thing, which is true of pencil and paper too.
But then I remembered an experience of my own. At the risk of a "think of the children" moment, I think it raises a valid point, though not one that should be legally enforced.
My niece, about 7 years old, loves owls, and I generated owl images with her using one of the AI image generation sites (DALL-E or something similar that lets you type in a phrase and then displays some images). Everything was going fine until she demanded to be able to type herself. What could go wrong, I thought. She immediately pounded the keyboard with a mischievous grin, producing a string of maybe 50 characters that looked like one long nonsense word, heavy on consonants.
The resulting image was the complete opposite of anything wholesome: a severely cut up, damaged, bleeding corpse that made her shriek and duck her head. Basically nightmare fuel. It made me wonder what caused it. Was there an underlying algorithmic design issue that left abnormal images hidden in regions reachable only by nonsense strings? Maybe a string far from any proper input ended up gathering lots of negative descriptors?
I think people should be aware that such an input can produce this kind of output, and I'd recommend that children not use these tools without supervision. I don't know whether the problem still exists, since this was a year or two ago, but it has probably happened to other people. Even so, it doesn't warrant pulling the code / service off the web.