So the flaw, then, at least potentially, is that without a proper theory of mind that makes that connection... and even with a validated theory of mind, we may discover that there is a fundamental LIMIT to the level of intelligence possible. We may discover that humans already represent the maximum possible level of intelligence in our universe.
The mediocrity principle would argue against us being at or near the maximum intelligence possible. It is more plausible that we are near the limit on naturally evolved intelligence, however. The reasoning is that once a species reaches the level required to start building technology, the game is over. For significantly higher intelligence to evolve, species would have to keep encountering obstacles to tool use that even higher intelligence cannot work around.
The singularity might still be impossible if it turns out to be impossible for an intelligence to design an intelligence greater than itself. We see hints of this in the last AI winter and the current summer. The last round got stuck because no one could purposefully design intelligence. We still can't, but we've made progress by loosening the controls and letting emergent structures that evolve from training do the heavy lifting.