I argue that truth matters. Religions are more than just stories that make people happy. They tell people what to do and how to live, and sometimes those things are harmful. "Abraham, I want you to kill your son for me." Now, if that's really God saying that, and there's an afterlife and a heaven that they're both going to, then fine. Abraham's son will feel momentary pain, Abraham will miss him for the rest of his life, and then they'll be reunited in heaven.
What if it's not true, though?
This happens today. Some nutjob hears voices and kills their kids because "God told me to" or "they had the devil in them".
Some religions tell their followers to kill the infidels; Christians, during the Crusades, did the very same thing. In both cases, because their religion said so.
That's the most extreme example, but religions are not merely about going to a place of worship and being nice to one another, so it matters very much whether they're true. That truth underpins their authority to tell you how to live, and the correctness of the choice to live that way.