The problem physics faces is that it relies on mathematical methods whose foundations assume things that are physically implausible, and it must do so under conditions of incomplete knowledge. I shall illustrate the issues using metaphors that anybody with half an ounce of computer science common sense should get. (For reference, my area of doctoral study was models of PA (Peano arithmetic), in the region of mathematics that gave birth to modern computing.)

Consider modern hashing. If I know the correct input, I get the correct output. If the input is 128 bits and I am off by one bit, but do not know which bit, then guessing which bit to flip gives me a 1/128 chance of recovering the correct output. If I am off by two bits, the chance is 1 in (128 choose 2), i.e. 1/8128, and as the number of bit errors increases, this probability rapidly approaches zero. Quantum mechanical effects occur when the number of bits of entropy gets small, so that this probability becomes experimentally distinguishable from zero. Something like that.
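If you want to check the arithmetic, a few lines of Python do it. (The function name here is mine, purely for illustration of the combinatorics, not part of any physical claim.)

```python
from math import comb

def guess_probability(n_bits: int, n_flipped: int) -> float:
    """Probability of recovering the correct input by guessing which
    of the n_bits positions were flipped, when exactly n_flipped
    (unknown) bits differ from the true input."""
    return 1 / comb(n_bits, n_flipped)

print(guess_probability(128, 1))  # 1/128 = 0.0078125
print(guess_probability(128, 2))  # 1/8128
print(guess_probability(128, 8))  # already vanishingly small
```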

Now consider that energy and mass are equivalent via Einstein's famous equation E = mc². Neglect the complex stuff for now. The current theoretical best idea is that matter consists of vibrations in strings. For now I will just take a conceptually simple version to illustrate. A short vibration in a long string takes time to travel: if its speed is c (lightspeed) and the string is long and coiled, the vibration will take time to reach a place where one particular observer can see it. Likewise photons have to reach us before they can register. And of course interactions between matter through the electromagnetic field happen via photons.
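To make the delay idea concrete: whatever the underlying structure, a disturbance travelling at c along a path of length L takes L/c to arrive. A trivial sketch (the example path length is arbitrary, chosen only to give a feel for the scale):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_seconds(path_length_m: float) -> float:
    """Time for a disturbance travelling at c to traverse the path."""
    return path_length_m / C

# A 1 m coiled path delays detection by about 3.34 nanoseconds.
print(delay_seconds(1.0))  # ≈ 3.34e-9 s
```

The point is only that a long, coiled path converts distance into observational delay; the numbers themselves carry no physical claim about strings.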

The obvious explanation is that there is some hidden delay in the underlying physics, so that only, say, 5% of the energy in the universe is visible to an observer at any time. What this '5%' actually is would follow from the underlying structure, but quite possibly it cannot be probed by conventional experimental means, since the part of the universe experimented on would need to be held constant, which precludes conventional experiments using physical objects. Again, this is a sketch idea to be pondered, not a claimed 'final theory'.

The thing is, if energy is invisible due to delay but still contributes to the overall mass inside our universe, these 'dark energy' type sum mismatches might be the only evidence that it is there at all.

But getting this right means getting the mathematical framework right, and mainstream theoretical physicists are still mostly using methods that were already beginning to come unstuck in the late 19th century. Issues with calculus gave rise to analysis founded on limits, and limits in turn were founded on arithmetic and set theory. But these last two assume an infinitude of distinct objects with which to perform computations, and it is now known that this is physically implausible. Thus one needs to use more strictly bounded arithmetics, and recursive constructions with precisely accounted computational resources, to form foundational models that can correspond to physically plausible structures. By studying such structures, and taking limits towards the ultimate capacity of the physical universe (think Bekenstein bound here), we would be better placed to sort out this theoretical mess. Current mathematical methods are simply not up to the task.
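The Bekenstein bound mentioned above is a concrete, standard formula: a sphere of radius R containing energy E can hold at most I = 2πRE/(ħc ln 2) bits. A quick sketch to get a feel for the numbers (the example radius and mass are arbitrary illustrations):

```python
from math import pi, log

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s
C = 299_792_458.0         # speed of light, m/s

def bekenstein_bound_bits(radius_m: float, energy_j: float) -> float:
    """Upper bound on information content (in bits) of a sphere of
    given radius containing given energy: I <= 2*pi*R*E / (hbar*c*ln 2)."""
    return 2 * pi * radius_m * energy_j / (HBAR * C * log(2))

# Example: 1 kg of mass-energy (E = m c^2) inside a 10 cm sphere.
E = 1.0 * C**2
print(f"{bekenstein_bound_bits(0.1, E):.3e} bits")  # ≈ 2.6e42 bits
```

Any foundational framework built from "precisely accounted computational resources" would have to respect bounds of roughly this character, which is why finite, resource-bounded arithmetics seem the natural setting.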

(Google "John Allsup Mathematical Genealogy" and see where I fit in the Ph.D. tree to get an idea of the area I was trained in: life circumstances rendered a conventional career infeasible, which is why I have no academic reputation, but I have kept an eye on progress and have kept my logical reasoning skills sharp, just in case.)