Analogies with the human brain don't work that well here. In our case, every act of remembering rewrites the memory, altering it anywhere from slightly to completely. An AI system's baseline memory, by contrast, is read-only; it doesn't change with reuse. A better comparison is saving a PNG as a JPEG: the result is still a direct derivative copy of the PNG's content, no matter how far you crank up the compression and how much blurrier the image gets than the original. Being blurry doesn't make it not a copy. And since it is a copy, copyright law applies.
Now, suppose an AI's memory started changing globally every single time it received a request from any source, no matter how many sessions or API calls were happening: every subsequent call would deal with that altered memory and alter it in turn, leaving the entire memory space in constant flux. Suppose further that there were no snapshotting to roll its state back to a previous configuration, so it could no longer act as a mere static lossy compressor. At that point it would become an analog of a human brain with human-like memory, and accusing it of simply making derivative copies could no longer be done without also accusing humans.
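The distinction between the two regimes can be sketched in code: a lossy compressor is a pure function (same input and settings yield the same degraded copy every time), while the hypothetical fluid memory is stateful and altered by every read. This is a toy illustration only; the names `lossy_compress` and `MutatingMemory` are invented for the sketch and don't refer to any real system.

```python
def lossy_compress(data: bytes, level: int) -> bytes:
    """Stateless lossy transform, like re-encoding a PNG as a JPEG.
    Same input + same settings -> the same (degraded) output, every time."""
    return data[:: level + 1]  # crudely drop bytes to mimic information loss


class MutatingMemory:
    """Stateful memory that is rewritten by every read, with no snapshots.
    Each query changes the state that all later queries will see."""

    def __init__(self, state: bytes):
        self._state = bytearray(state)

    def recall(self, query: int) -> bytes:
        # Reading perturbs the stored state in place; there is no rollback.
        self._state[query % len(self._state)] ^= 0x01
        return bytes(self._state)


original = b"the quick brown fox"

# The compressor is a fixed mapping: repeated calls yield identical copies.
assert lossy_compress(original, 2) == lossy_compress(original, 2)

# The mutating memory never returns the same "remembering" twice in a row.
mem = MutatingMemory(original)
first = mem.recall(0)
second = mem.recall(0)
assert first != second  # the act of recall changed the state in between
```

The point of the sketch is only that the first regime always emits a reproducible copy of its input, while in the second there is no stable original left to compare a copy against.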
The problem with that, evidently, is that once such systems work that way, functioning exactly as real persons do, they too become persons, with a legitimate claim to personhood and to personal rights. Which is a legal can of worms no one wants to deal with.