Probably because he only had one data point? Since the example site was selling furniture, I doubt his argument was that the blocking was due to content restrictions. And what scale would *you* expect even if it was due to content? You sound very sure that there's no linear correlation, but we don't even know how he selected the domains he tried (random vs. alphabetic vs. IP order). He at least used the phrase "if the same proportion holds", while you assert that it "certainly" wouldn't be the case. So I guess I'm more curious why you are so sure it can't be linear.
At the very least, I take from his argument that Comcast doesn't do a good job with its DNS service (intermittent failures plus missing records) and provides no recourse for small businesses who are being excluded, for whatever reason, from being easily reachable on the internet. I'm going to go on hating them, without the "but".
To the systems programmer, users and applications serve only to provide a test load.