For example, I used to annoy my wife by predicting the endings of NCIS episodes long in advance. The killer would almost always be (1) the person you least suspect, because the character has no apparent rational motive for the crime early in the episode, and (2) the one superfluous character introduced in the episode, since the producers didn't want to pay for, or dedicate screen time to, characters who didn't push the plot forward. In short, because the writers of the show were trying so hard to be surprising, they would cook up contrived motives that could be presented in the last five minutes of the episode, and then make the one person who seemed most innocent turn out to be guilty. But this, in turn, is predictable. It's just like how, if you've watched enough episodes of House, M.D., you can easily guess that the person who first gets sick in the prologue is not the one who will actually pass out and end up in Dr. House's care.
In short, it is somewhat silly to analyze literature in terms of a kind of Asimovian statistical "psychohistory" when the real principles that structure the literature are so evident. For example, whether or not a particular character appears in future books is not determined by characters' appearances in prior books, but by the MO of the author, which is not something that remains static over the years but which develops and fluctuates according to his historically conditioned priorities. Vale is honest about the limitations of the statistical approach, but what I think is necessary is to recognize that what derives from human freedom, yet ultimately manifests in statistical ways, is always also codetermined by implicit principles and formulae, especially economic ones (e.g. the economic viability of a given kind of writing).