The very reason physicists build quantum computers is *because* they suspect or propose this. In fact, the observation about computational complexity was what led to the idea of QC.
I have worked on QC (experimentally), and as an experimentalist I understand that the existence of Schroedinger-cat-like states is a prerequisite for the generation of e-bits, which are what a subsequent computation needs for the NP speedup.
So his section 3 is titled wrongly, because it implies that arbitrarily large quantum states can be generated (since he uses the word "explained" and not "equivalent"). However, such states have not been observed for *arbitrarily large systems*. I have observed such states experimentally, and as a matter of fact we were busy observing the decay into a classical state, which is a standard technique in all experimental groups working in this field.
So iff NP != P then QC makes sense, and a prerequisite for QC is the generation of systems with many e-bits (an entanglement measure). Even a large system undergoing quantum dynamics (e.g. the cooled MEMS systems) is not sufficient for claiming (or thinking) that much entanglement exists in the computational sense.
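To make the e-bit count concrete, here is a minimal sketch (my own illustration, not anything from the paper under discussion) of the standard definition: the entanglement entropy of a bipartite pure state, i.e. the von Neumann entropy of the reduced density matrix, measured in bits. A Bell pair carries exactly 1 e-bit; a product state carries 0, no matter how "large" the system is.

```python
import numpy as np

def ebits(psi, dim_a, dim_b):
    """Entanglement entropy (in bits) of a pure state |psi> on A (x) B."""
    m = psi.reshape(dim_a, dim_b)           # amplitudes as a dim_a x dim_b matrix
    s = np.linalg.svd(m, compute_uv=False)  # Schmidt coefficients
    p = s**2                                # spectrum of the reduced density matrix
    p = p[p > 1e-12]                        # drop numerical zeros
    return float(-np.sum(p * np.log2(p)) + 0.0)  # "+ 0.0" normalizes -0.0

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
product = np.array([1, 0, 0, 0])             # |00>
print(ebits(bell, 2, 2))     # 1.0 e-bit
print(ebits(product, 2, 2))  # 0.0 e-bits
```

This is exactly the point about cooled MEMS systems: quantum dynamics of a big object tells you nothing by itself; you have to exhibit a state whose Schmidt spectrum actually yields many e-bits.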
I am sick and tired of mentally short-circuited papers like this one, which restate the obvious and ignore recent developments. I am sick of theorists who dream of being great philosophers while being utterly ignorant of the hard work many people have done over the last 20 years.

The citation pattern in the paper screams "shit". I see no reference to the previous literature on entanglement measures. He talks about the "measurement" problem as if it had not received any attention in the last 80 years (as a matter of fact it did, theoretically and experimentally). The abstract does not state a clear goal, the paper contains a quantum-mechanics-for-beginners lesson, and the paper does not have a "summary" but "final remarks".
Looking at previous work by the same author, an incredibly weird comment (http://arxiv.org/pdf/1401.1747v4.pdf) can be found in which he gives his personal definition of what is falsifiable. His central idea does not hold, of course, if I can do one or more of the following:
* Apply trace operations before comparing with the observation, and at the same time reduce the complexity of the theoretical calculation.
* Do postselection and compare relative probabilities of experimental outcomes, where the ratio verifies or falsifies the theory.
Both are valid, standard operations for verifying (i.e. not falsifying) quantum theory.
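To illustrate the second operation, here is a hypothetical sketch (all counts and the "predicted" value are invented for illustration): postselect on a herald outcome by discarding part of the record, then compare the *ratio* of two conditional outcome frequencies against the theoretical prediction.

```python
counts = {                     # (herald, signal) -> number of events
    ("click", "H"): 4980,
    ("click", "V"): 1690,
    ("no_click", "H"): 2010,
    ("no_click", "V"): 1320,
}

# Postselection: keep only heralded events, discard the rest of the record.
heralded = {sig: n for (her, sig), n in counts.items() if her == "click"}

ratio = heralded["H"] / heralded["V"]      # relative probability P(H)/P(V)
predicted = 3.0                            # hypothetical theory value

# The ratio, not the absolute rates, is what verifies or falsifies the model.
print(f"measured {ratio:.2f} vs predicted {predicted:.2f}")
```

Nothing about this procedure is exotic: heralded photon-pair experiments compare exactly such conditional ratios, and the theory stands or falls on them.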
He seems to be a medical-data-evaluation guy, has no significant publications as first author (and too few impact points for his role), and, as much as I appreciate people from other disciplines getting interested in physics, I would expect that we distinguish a nice college-level summary from serious research.