There need to be acceptable-nudity policies. These should require users who upload photos containing nudity to tag them as such, including whether the nudity is sexual or non-sexual (the 'napalm girl' photo is clearly non-sexual), or even pornographic (on any service that allows pornographic images). The rule is that the uploader must apply the appropriate tags (e.g. 'non-sexual nudity'). Users then have settings for whether to block such tags, and if they see untagged images that should have been tagged, and that would have been blocked under their settings, they can report them through the 'inappropriate image' system. Where sexual nudity is permitted at all, there should be additional checks on the users uploading it, and AI can flag images that may be missing tags. This really ought to be well within what Facebook can do. In addition, for sensitive material (like revenge porn), the terms and conditions should state that blatant cases, like the one that European lawsuit is about, can lead to the uploader's details being sent to the police or to the victim's lawyers.
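The tagging-and-blocking flow above can be sketched as code. This is a minimal illustration only; the tag names, function names, and reporting rule are all hypothetical, not any real platform's API:

```python
# Hypothetical sketch of tag-based image filtering and reporting.
from enum import Enum

class Tag(Enum):
    NON_SEXUAL_NUDITY = "non-sexual nudity"
    SEXUAL_NUDITY = "sexual nudity"
    PORNOGRAPHIC = "pornographic"

def visible(image_tags, blocked_tags):
    """Show an image only if none of its tags is on the viewer's block list."""
    return not (set(image_tags) & set(blocked_tags))

def should_report(uploader_tags, detected_tags, blocked_tags):
    """Flag via the 'inappropriate image' system when a tag the uploader
    omitted (but a viewer or an AI classifier detected) would have hidden
    the image under this viewer's settings."""
    missing = set(detected_tags) - set(uploader_tags)
    return bool(missing & set(blocked_tags))
```

The key point the sketch captures is that a report is only warranted when the missing tag actually matters to that viewer: an untagged pornographic image seen by someone blocking pornography triggers a report, while a correctly tagged non-sexual nude does not.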
The mistake is trying too hard to create an idiot-proof, one-size-fits-all acceptable-image policy.