As pointed out in the article, the assumption that 20/20 is average vision is absolutely wrong; it is based on a self-reported sample of people from 120 years ago. Mean visual acuity for adults aged 18-29 is 20/15 or better.
I had a whole bunch of good information to post here, but someone in the article's comments said it even better.
So, shamelessly quoted from there:
40. Daniel Says:
June 10th, 2010 at 9:25 am
It is clearly an exaggerated claim. 20/20 Snellen visual acuity is a reference standard used as a cut-off for the lowest level of normal vision -- not as an average visual acuity for the human population.
Elliott, Yang and Whitaker (1995) published Visual Acuity Changes Throughout Adulthood in Normal Healthy Eyes. In that paper they reported that the mean VA for ages 18-24 was 20/15 (6/4.5 metric). That's the MEAN visual acuity for young adults, so a significant number likely had better VA than 20/15. The 25-29 mean actually _improved_ to ~20/13 (6/4 metric). From that point on, mean VA declined with age (approaching 20/20, or decimal 1.0), only falling back to 20/20 (6/6 metric) in the 75-year-old group!
20/20 is the wrong VA to use for average human visual acuity. In addition, R. N. Clark at Clarkvision.com reports that people up to age 50 can reliably tell the difference between 300 ppi and 600 ppi printouts.
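To put rough numbers on those claims, here is a quick Python sketch. It converts Snellen fractions to decimal acuity and to the minimum angle of resolution (MAR, where 20/20 corresponds to 1 arcminute), then compares that to the angle a single pixel subtends at 300 and 600 ppi. The 12-inch reading distance is my own assumption, not something from the article or the paper.

import math

ARCMIN_PER_RAD = 60 * 180 / math.pi  # ~3437.75 arcminutes per radian

def snellen_to_decimal(numerator, denominator):
    # Decimal acuity: 20/20 -> 1.0, 20/15 -> 1.33, 6/4.5 -> 1.33
    return numerator / denominator

def mar_arcmin(decimal_acuity):
    # Minimum angle of resolution; decimal 1.0 corresponds to 1 arcminute
    return 1.0 / decimal_acuity

def pixel_subtense_arcmin(ppi, viewing_distance_inches):
    # Angle subtended by one pixel of a printout held at the given distance
    pitch_inches = 1.0 / ppi
    return math.atan(pitch_inches / viewing_distance_inches) * ARCMIN_PER_RAD

for label, num, den in [("20/20", 20, 20), ("20/15", 20, 15), ("~20/13 (6/4)", 6, 4)]:
    dec = snellen_to_decimal(num, den)
    print(f"{label}: decimal {dec:.2f}, MAR {mar_arcmin(dec):.2f} arcmin")

# Assumed 12-inch (30 cm) reading distance -- not stated in the article
for ppi in (300, 600):
    print(f"one pixel at {ppi} ppi, 12 in away: {pixel_subtense_arcmin(ppi, 12):.2f} arcmin")

With 20/15 vision the MAR is about 0.75 arcmin, so a 300 ppi pixel (roughly 0.95 arcmin at 12 inches) is still above the resolution limit, while a 600 ppi pixel (roughly 0.48 arcmin) is below it, which is at least consistent with Clark's claim that people can tell the two prints apart.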