Comment Re:Summary's accuracy seems questionable (Score 4, Interesting) 222
It's a bit of both. Some of the facts of the matter were known, but it was assumed that this was just "the way it was". That is, no one considered it an open problem. For instance, we view the inability to divide by zero as just a fact of mathematics, not a flaw. Likewise, this was not known to be a flaw; it was just assumed that this was the way things worked.
If you need to point to a definitive flaw, it was in our understanding of how the notation was supposed to work - the relationship between our understanding and the notation. Once *that* flaw was discovered, the actual notation just spilled right out. That is, the flaw was that people were *not* treating dy/dx *sufficiently* as a fraction, owing to the 19th-century bias against infinitesimals. Once you realize that dy/dx really is a fraction, and has to be treated accordingly, everything automatically works.
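To see what "dy/dx really is a fraction" buys you at the first-derivative level, here's a quick sanity check (a sketch using sympy, assuming you have it installed; the curve y = sin(t), x = t^2 + 1 is just an arbitrary example). The chain rule for a parametric curve is exactly the fraction-cancellation rule: dy/dx = (dy/dt)/(dx/dt).

```python
import sympy as sp

# Parametric curve: x(t) = t^2 + 1, y(t) = sin(t), on the t > 0 branch
t = sp.symbols('t', positive=True)
x = t**2 + 1
y = sp.sin(t)

# "Fraction" computation: cancel the dt's, dy/dx = (dy/dt) / (dx/dt)
dy_dx_fraction = sp.simplify(sp.diff(y, t) / sp.diff(x, t))

# Direct computation: invert x(t) to get t = sqrt(x - 1), write y as a
# function of x, and differentiate with respect to x
xs = sp.symbols('x', positive=True)
y_of_x = sp.sin(sp.sqrt(xs - 1))
dy_dx_direct = sp.diff(y_of_x, xs)

# Substitute x = t^2 + 1 back in; the two answers should agree
check = sp.simplify(dy_dx_direct.subs(xs, x) - dy_dx_fraction)
print(check)  # 0
```

The same fraction-like behavior gives the inverse-function rule, dx/dy = 1/(dy/dx). The subtlety the article is about is what happens when you push this past first derivatives, where the naive notation stops cooperating unless you take the fraction view seriously.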
It's almost humorous because there was no real advanced work to do. Literally everything needed is available in intro calculus. The problem was (a) the mathematics community had a habit of *not* treating dy/dx as a fraction, and (b) new students who didn't know better were simply taught *what* to do, not *why* to do it, and continued to repeat the mistake for over a century.