Like many people, I am very much in the dislike camp. In my case, I have recently acquired a Samsung device (you can take me at my word, or don't) and am learning how to use it; in due course I will start backing up folders and files directly to my NAS.
With that said, I am curious as to why Apple is so insistent that this won't and can't be expanded to other use cases, as well as why those in favour of the concept aren't in favour of expansion.
Let's start by saying that child sex abuse is a heinous crime. I am in favour of criminalising those who abuse children and subjecting them to the full force of the law. However, child sex abuse is not the only crime that causes harm, and detecting images of such acts after they have taken place only serves to criminalise the viewers; if the images exist then the child has already been abused.
I'd like to explore the ways in which the two technologies which Apple is deploying could be used to great benefit. I may well stray into the use of technologies which aren't in scope of this initiative, but which either already exist or will in the near future.
Revenge Porn
The act of sharing nudes or explicit videos after the end of a relationship affects both children and adults. There are also mobile apps such as Snapchat where people share suggestive or nude pictures with the expectation that these will 'self-destruct'.
With the neural hashing technology, this process could become more secure and cases of revenge porn could be eliminated. When a relationship ends, or when a self-destructing message is shared, a hash of the 'cancelled' images could be added to iOS devices worldwide, preventing these private images from being shared or viewed.
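The matching step described above can be sketched in a few lines. This is a hypothetical illustration only: real systems such as Apple's NeuralHash derive the hash from a neural network, whereas here a toy "average hash" over an 8x8 grayscale grid stands in for it, and all function names and thresholds are my own assumptions.

```python
# Toy perceptual-hash matching: a stand-in for neural hashing.
# All names and the threshold are illustrative assumptions.

def average_hash(pixels):
    """Hash an 8x8 grid of grayscale values (0-255) into a 64-bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether that pixel is brighter than the average.
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches_blocklist(image_hash, blocklist, threshold=5):
    """Flag an image whose hash is within `threshold` bits of a 'cancelled' hash."""
    return any(hamming(image_hash, h) <= threshold for h in blocklist)
```

The point of a perceptual hash, unlike a cryptographic one, is that a slightly edited or recompressed copy of a 'cancelled' image still lands within a few bits of the original hash and is therefore still caught.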
The same principle could be used for images which present a national security concern. The exact definition would vary for each country, but expected results could well include photos of prisoners in orange jump suits, videos of bombs hitting hospitals or even photographs of tanks parading in city squares.
Missing Children
Child abduction is not a new thing. We have seen photos on milk cartons, AMBER alerts and other such initiatives, with varying rates of success.
Apple says that there are one billion iPhones in use worldwide, so let's imagine a modern take on what the search for missing children could look like.
We know that the technology to scan for particular faces exists in iOS because the photo gallery helpfully groups all photos of a person together. We also know that iMessage will acquire this new feature which can scan images for particular content types.
So let's marry the two together: in the minutes after a child abduction, the parent can upload as many photos of the victim as they have. Apple will then push an alert to all the iOS devices around the world. Hopefully someone, somewhere has taken (or will take) a photo where the missing child happens to be in the background and boom: we get a time and location for the police to act on.
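The scenario above boils down to comparing face embeddings from new photos against embeddings pushed with the alert. The sketch below is purely illustrative: real systems (such as the face grouping in the iOS photo gallery) produce embeddings from a neural network, whereas here they are plain lists of floats, and the function names and similarity threshold are my own assumptions.

```python
# Hypothetical on-device check: does any face in a new photo resemble
# a person from a pushed alert? Embeddings, names and the threshold
# are illustrative assumptions, not any real API.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def flag_photo(face_embeddings, alert_embeddings, threshold=0.9):
    """Return True if any face in the photo matches any alerted person."""
    return any(cosine_similarity(face, target) >= threshold
               for face in face_embeddings
               for target in alert_embeddings)
```

If a photo is flagged, the device could then attach its timestamp and location data to the report, which is what makes the "time and location" part of the scenario possible.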
Fugitives
The same principle as outlined for missing children could apply here, except this time with images of criminals uploaded by law enforcement.
Naturally, we would need to trust law enforcement to use this feature correctly, but if fugitives could quickly be identified in the background of images, we could get a location for any and all of them, including those suspected or convicted of violent crimes, or even those who committed a minor crime such as shoplifting or drug use.
Unlawful Protests
The concept of an unlawful protest has started to make its way to the western world. Even the UK, which purports to be a western democracy, has introduced laws around curbing protests and there is even the concept of needing to apply to the police if you wish to march.
The concept of face identification, which does exist today, could be used alongside the technology deployed to the one billion iOS devices out there in the world. Those who dare to protest unlawfully could easily be identified and, through the use of location data, captured.
As an extra step, when photos of the unlawful protest or march start to appear on social media, these could be added to the 'blacklist' of unlawful photos, with the aim of disincentivising others from protesting unlawfully. After all, why protest if the authorities can make sure no-one hears about it?
Unlawful Gatherings
Over the last 18 months, due to coronavirus, there have been varying restrictions on the number of people who can meet together. At one point (in the UK), there was a rule which said you could only meet one other person outside and no-one inside.
People in breach of these rules were seen as potentially contributing to the risk of coronavirus spread.
Again, the technology we have already discussed could start looking for photos taken or sent which include more people than allowed. Automatic reporting could trigger a fine for those concerned (taken from the card on file in iTunes) or a police visit for repeat offenders.
In summary...
Child sex abuse triggers an emotional response. It's natural to be against it and it's also natural to want to criminalise those who make it or view it. That's why the "think of the children" argument works so well.
By creating technology which can be useful in its detection, as well as the secondary technology which detects nudity in iMessage, we create opportunities which could bring much greater benefit to the world and its inhabitants. The small number of examples given above are one view of how we could progress from CSAM detection to other serious matters such as revenge porn, finding missing children and fugitives, and general crime detection and prevention.
It is illogical to create this technology and insist that it will only be used for one narrow field, as that would waste its potential.
Some people might call that a slippery slope. I encourage other readers, who are also against this technology, to take their friends, family and co-workers down their own slippery slope - take them on a similar journey to the one I have described above.