It's also a bit rich to say, "This is publicly available, you can read this, but you're forbidden from learning anything from it."
Bear in mind, large language models are often trained in a single pass these days. This means the model saw each document once and updated its weights imperceptibly. If it can repeat something verbatim after that, it's only because its model of the world was already so good, from all its other training data, that the Britannica document was extremely predictable. In other words, there was almost no additional information there.
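To make the intuition concrete, here's a toy numerical sketch (plain logistic regression, not an actual language model; the numbers and learning rate are invented for illustration). When the model already predicts an example with high confidence, the gradient is nearly zero, so a single pass over that example moves the weights by a negligible amount:

```python
import numpy as np

def sgd_step(w, x, y, lr=1e-4):
    """One SGD step for logistic regression; gradient is (p - y) * x."""
    p = 1.0 / (1.0 + np.exp(-(w @ x)))   # predicted probability of y = 1
    return w - lr * (p - y) * x, p

x = np.array([1.0, 2.0, -1.0])  # one "training example"
y = 1.0

# A model that already finds this example predictable (w @ x = 8):
w_good = np.array([2.0, 2.0, -2.0])
w_after, p = sgd_step(w_good, x, y)

print(p)                              # ~0.9997: example was predictable
print(-np.log(p))                     # surprisal ~3e-4 nats: near-zero new information
print(np.linalg.norm(w_after - w_good))  # weight change ~8e-8: imperceptible
```

The surprisal term is the same quantity the training loss minimizes: a document the model already predicts contributes almost nothing to the loss, hence almost nothing to the update.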
And that's as expected from an encyclopedia! If there's anything surprising in an encyclopedia, that's a bad sign.