It is the inflection, the pacing, etc., none of which AI is very good at (or at least it wasn't the last time I checked).
Exactly, for now. With enough data, that will change. If we're going to be enriching and improving these models, they should at least be FOSS.
I would like to see these credited as "[Living Actor] in the voice of [Dead Actor]"
Interesting; it will be worth watching how this gets handled.
because it ignores free will, and the fact that people actually have moral boundaries they won't cross
If people are poor enough, moral boundaries shift. I would dare say that a lot of people also use that as an excuse, e.g. "well, I've got to make a living somehow," says someone who works on making apps more addictive using well-known dark patterns. There are plenty of those. And given our ever-repeating cycle of wars that disrupt supply chains, the post-COVID economic hit, and the latest threat to jobs from AI (though you might consider that disguised staff reduction driven by other factors), more people will end up poor enough for that shift to happen.
It is never necessary or useful to prevent reasonable behavior on slippery-slope grounds, i.e. on the premise that someone who comes to accept a reasonable behavior might go on to accept some related unreasonable one.
Forced prevention does not work. Raising awareness is good, though, and somebody has to educate the kids (who will become adults) who are malleable targets for devices and software that prey on them.