I'm not- you're doing that for me.
You think I don't see you trying to snipe people out of spite? I'm not fucking stalking you, you self-important buffoon. I'm making sure you don't come into my spaces to talk bullshit about other stuff you also don't fucking understand because the chip on your shoulder is so damn big...
Yeah, you are.
Thank you for again explaining that the words you're saying here come from marketing by a company that stands to make money off you believing them. The fact that you're relying on nobody involved understanding the contents of the page you posted was already taken into consideration.
What in the fuck are you talking about?
From the page:
A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present values) depends only upon the present state; that is, given the present, the future does not depend on the past.
Emphasis mine.
A Markov system's evolution depends only on the current state.
i.e.,
If the last token was A, the next token's probability distribution depends only on that fact.
If that is not a true statement, then your system is not Markovian.
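To make that concrete, here is a minimal sketch of a genuinely Markovian (bigram) token predictor, using a hypothetical toy corpus. The point is that the sampling function consults only the current token, never any earlier history:

```python
import random

# Hypothetical toy corpus for illustration.
corpus = "the cat sat on the mat the cat ran".split()

# Build transition counts: current token -> list of observed next tokens.
transitions = {}
for cur, nxt in zip(corpus, corpus[1:]):
    transitions.setdefault(cur, []).append(nxt)

def next_token(current):
    # The distribution depends ONLY on `current` -- no earlier history
    # is consulted. That is the Markov property in action.
    return random.choice(transitions[current])

print(next_token("the"))  # "cat" or "mat", no matter what preceded "the"
```

If you can delete everything before the current token and the predictor's behavior is unchanged, it's Markovian. An LLM fails that test the moment its context window holds more than one token.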
To demonstrate the difference: a Markovian token predictor has a next-state probability table of size V^n, where V is the vocabulary size and n is the order of your n-gram. A non-Markovian LLM has a next-state probability table of size V^c × V, where c is the context length, so V^c is the number of possible context configurations; even a small model has many googols of probability space for its next token. They're literally not comparable.
One of those is simple and predictable. The other has a state space that's astronomical.
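The arithmetic above is easy to check. The vocabulary size and context length below are illustrative assumptions (roughly a small open model), not any specific model's real figures:

```python
from math import log10

V = 32_000   # assumed vocabulary size
n = 3        # a trigram autocomplete
c = 2_048    # assumed context length for a "small" LLM

# Transition table of an n-gram predictor: V^n entries.
ngram_digits = n * log10(V)

# An LLM conditioning on the full context: V^c * V entries.
llm_digits = (c + 1) * log10(V)

print(f"n-gram table:      ~10^{ngram_digits:.0f} entries")
print(f"LLM context space: ~10^{llm_digits:.0f} entries")
# A googol is 10^100; the LLM's space is thousands of orders
# of magnitude beyond that.
```

Under these assumptions the trigram table has on the order of 10^14 entries, while the LLM's context space runs to roughly 10^9000, which is the "multiple googols" gap being described.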
You can, if you really want to abuse the definition, say that an LLM is Markovian, in the same way that the universe is.
Again, feel free to educate yourself.
That is why it is stupid to compare an autocomplete with an LLM.