Trying to estimate the rate of technological development across decades is a phenomenally tricky business. Looking back, ideas that were incremental can be lumped together and thought of as revolutionary, and revolutionary ideas can be re-imagined as merely incremental. The telephone was developed in the 1870s. But the telegraph, a perfectly good way of sending binary data down a wire, had already existed for decades. Telephony brought analog signals into the mix. It wasn't until the 20th century that engineers figured out how to multiplex multiple (analog) signals on a single wire. Or developed feedback amplifiers that permitted signals to be sent across North America over a reasonably sized conductor. Or developed automated switches to replace the human operators who physically connected your circuit to the party you wished to call. Or realized that, rather than multiplexing analog signals, it was more efficient and reliable to digitize the signal and use packet switching on a digital communications network (back to digital data, like the telegraph!). Or set up networks of locally operating radio towers (cells) that give a mobile telephone seamless coverage as it travels from one place to another.
Which of these is simply incremental? Which is revolutionary? Is the 1870s telephone itself the major revolution? Note that some buildings and ships already had speaking tubes designed into them for communicating (i.e., shouting) between rooms. The telephone replaced the tube with wires, borrowing the idea from telegraphy, to achieve the same purpose. Were the innovations that followed (multiplexing analog signals, feedback amplifiers, automated switching, packet switching, cell networks) incremental or revolutionary? It's difficult, I'd actually suggest impossible, to make a definitive claim.
My claim is simply that measuring the rate of technological progress is a subtle and tricky business. I believe the best we can really hope for is to find metrics with narrow domains: for example, the number of transistors in an IC (Moore's law), the annual output of technical papers, or the speed at which DNA can be sequenced. These metrics only truly measure what they say they measure (transistors, papers, and sequencing speed). We may try to infer progress rates from such metrics, but logical errors arise when a metric is used to predict progress outside its own domain (e.g., using transistor counts to predict artificial intelligence). The author of the article makes estimation errors similar to those of the singularity folks when discussing technological progress, just biased in the opposite direction.
I am not a believer in the singularity. But it is fun to think about the future, so I cut the singularity folks some slack on their over-the-top predictions. I just don't take them seriously. Still, I believe futurologists do some good in encouraging people to think about possible futures and what it would take to achieve (or avoid) them. If the aim of the article is to remind us not to take the singularity folks too seriously, then I agree. But as it is written, it sounds more like "Get off my lawn!"