No, they cannot. LLMs do not perform logical reasoning, and that is a requirement for any kind of fact-checking or verification. All they model are statistical correlations in their training data, and that is not enough.
While non-LLM AI or non-AI tools could in principle do the fact-checking (at least in simple cases like the one here), this overlooks that essentially the only advantage of LLMs is that they are comparatively cheap to create. Building the "other tools" that would be needed here would be far more expensive and far more work than training a general-purpose LLM. And hence, for most use cases, these "other tools" simply do not exist.
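To make concrete what such an "other tool" might look like, here is a minimal sketch (all names and data hypothetical) of a deterministic fact check against a small curated knowledge base of (subject, relation, object) triples. Unlike an LLM, it either finds a supporting or contradicting fact or explicitly reports that it cannot verify the claim; it never guesses from correlations:

```python
# Tiny hand-curated knowledge base of known facts as triples.
KNOWLEDGE_BASE = {
    ("water", "boils_at_celsius", "100"),
    ("paris", "capital_of", "france"),
}

def check_claim(subject: str, relation: str, obj: str) -> str:
    """Return a verdict for a structured claim.

    Three outcomes: the claim matches a stored fact, it contradicts
    a stored fact (same subject and relation, different object), or
    the knowledge base is silent and no verdict is given.
    """
    if (subject, relation, obj) in KNOWLEDGE_BASE:
        return "supported"
    if any(s == subject and r == relation for (s, r, _) in KNOWLEDGE_BASE):
        return "contradicted"
    return "unverifiable"

print(check_claim("paris", "capital_of", "france"))    # supported
print(check_claim("paris", "capital_of", "germany"))   # contradicted
print(check_claim("berlin", "capital_of", "germany"))  # unverifiable
```

The toy logic itself is trivial; the expensive part, as noted above, is curating a knowledge base large enough to be useful and reliably extracting structured claims from free text in the first place, which is exactly why such tools rarely get built.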