While I believe that people who are less sensitive tend to thrive more than others, I don't agree that "thicker skin" is a workable solution. Too many people are in fragile emotional states and simply don't have the psychological capacity to shrug off the hate and insults that are common online. There have been high-profile suicides among teens who were attacked online, and who knows how many people withdraw from public discussion because of the hate they've received? For safety reasons alone, I don't think society should cede the forums entirely to the trolls.
Does that mean some people are overly sensitive? Sure. But just as we shouldn't velour-line the internet to cater to absolutely every person with a psychological disorder, we also don't have to tolerate the diarrhea that spews forth from the trolls. Nor do we have to draw a hard-and-fast line and declare "these words are always 100% bad in 100% of situations." Instead, we should welcome humans in the loop, asking them to pass judgment when needed. That gets us to a more fluid state than full automation does. It also lets the user choose: don't like the judgment process on Slashdot? Don't hang out on Slashdot.
I know fully automated filtering is the holy grail of internet forum moderation, but as soon as you deploy a filter it becomes a pass/fail test for the trolls, who quickly learn to adapt and evade it. Human judges can adapt too, and they're about the only thing that can, but there are simply too few of them for the volume of trolls out there. A tool like this might help them scale that effort to YouTube volumes.