ChatGPT et al. are nowhere near ready to do any "heavy lifting" on Wikipedia. But give them a few years and they will be.
The first "productive" use will be high-quality author/editors using AI to assist with the grunt work of writing a high-quality draft. Things like finding possible references come to mind. That may already be happening without anyone knowing it, because the only "difference" AI is making is that established author/editors with reputations for producing high-quality content are more productive than they were before.
Next, you will see AI doing routine things like proofreading existing articles and generating lists of recommended changes: flagging possible misspellings, possible incorrect links, possible inconsistencies, possible "not supported by the cited reference" issues, and the like. The output of this work won't ever result in an "AI edit."
You may also see AI replacing conventional machine translation once AI gets to the point that it's consistently better at the job.
Eventually AI will be as good at generating new content as a middle-of-the-road author/editor. At that point, either Wikipedia will have to allow AI article creation, or some other organization will create a new "AI-written encyclopedia" full of quickly produced, high-quality content (nearly zero hallucinations, etc.) that people will flock to. Or maybe the entire idea of an encyclopedia will be replaced by "on-demand content generation" similar to the "AI results" we see today in some search engines, but of better quality than what we have now.
I remember someone a few years ago predicting the "end of Wikipedia as we know it": Wikipedia may still exist, but with far fewer human visitors, since people searching for information will get it from search engines that summarize Wikipedia content for them. If that day isn't here yet, it will be within a few short years.