The conclusions are bogus. Their numbers only examine public posting, because the data on private posting is inaccessible to them, and yet they draw conclusions about overall activity from that. Most Google+ activity is private and/or takes place within groups.
One of the people involved stated that "just 9% of Google+'s 2.2 billion users *actively post* content" (emphasis added), and from that figure the article concludes that no one uses it.
They also picked the first 18 days of the year as their analysis window; for most people, 7-14 of those days fall during prime vacation time.
His distribution assumptions are not evidence-based: they are flat assumptions of uniformity, and they are all drawn from a single sitemap file of 45K profiles, which is the same thing as saying "if you want a straight-line fit, only collect a single data point".
It would have been much more useful if he had verified the uniformity assumption against an analysis of other sitemap files, and better still if he had just spun up an EC2 instance and looked at *all* of them.
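To illustrate why a single sitemap file can't validate a uniformity assumption, here is a minimal simulation. All numbers and names are hypothetical (a made-up model where activity drifts across files, e.g. because profiles are ordered by signup date); it is not a reconstruction of the article's actual data, just a sketch of the sampling pitfall:

```python
import random

random.seed(0)

# Hypothetical model: 1,000 sitemap files of 45K profiles each, where the
# fraction of actively-posting profiles drifts across files. These figures
# are illustrative assumptions, not taken from the article.
NUM_FILES = 1000
PROFILES_PER_FILE = 45_000

def active_fraction(file_idx):
    # Assumed drift: activity declines from ~20% in early files to ~1% in late ones.
    return 0.20 - 0.19 * file_idx / (NUM_FILES - 1)

def sample_active_rate(file_indices):
    # Estimate the overall active rate from only the chosen sitemap files.
    active = sum(
        sum(random.random() < active_fraction(i) for _ in range(PROFILES_PER_FILE))
        for i in file_indices
    )
    return active / (len(file_indices) * PROFILES_PER_FILE)

one_file = sample_active_rate([0])                     # a single file, as in the article
spread = sample_active_rate(range(0, NUM_FILES, 100))  # files spread across the whole set
true_rate = sum(active_fraction(i) for i in range(NUM_FILES)) / NUM_FILES

print(f"single-file estimate:  {one_file:.3f}")
print(f"spread-sample estimate: {spread:.3f}")
print(f"true overall rate:      {true_rate:.3f}")
```

Under this (assumed) drift, the single-file estimate lands near 20% while the true rate is closer to 10%: sampling one file gives you a precise estimate of the wrong thing. Spreading the sample across files catches the non-uniformity; looking at all of them settles it.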
But I'm sure he got a lot of clicks out of this.