I agree with the parent post. The likelihood that this result comes from people manually making up numbers is nil, IMHO. Suggesting otherwise makes it seem like Nate Silver has an ax to grind. That doesn't mean there isn't a bias -- but the bias is probably an artifact of the survey designs and polling methodologies, not something outright nefarious.
Nate stresses that there is a lot of data -- "well over 100 polls, each of which asked an average of about 15-20 questions." However, the amount of data alone only matters if the observations are independent with respect to what is being measured (here, the frequency of the last digit of a response). That is not necessarily the case, since many of the polls ask sets of very similar questions that generate very similar answers, e.g.:
From
http://strategicvision.biz/political/georgia_poll_092309.htm (800 likely voters in Georgia, aged 18+, and conducted September 18-20, 2009 by telephone)
4. Do you approve or disapprove of President Barack Obama's overall job performance?
Approve 35%
Disapprove 58%
Undecided 7%
5. Do you approve or disapprove of President Obama's handling of the economy?
Approve 34%
Disapprove 58%
Undecided 8%
6. Do you approve or disapprove of President Obama's handling of health care?
Approve 33%
Disapprove 58%
Undecided 9%
In this example, what looks like a preponderance of 8's (the last digit of 58%), which would support Nate's bias argument, is effectively a single 8, repeated twice more in essentially identical questions. The three answers carry roughly one question's worth of information, not three.
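To make the point concrete, here is a toy sketch (my own illustration, not Nate's actual test) of a chi-square statistic on last digits. Starting from a perfectly uniform set of digits, merely repeating one correlated answer (the 58% above) twice inflates the statistic, even though no new information was added:

```python
from collections import Counter

def last_digit_chisq(values):
    """Chi-square statistic for uniformity of last digits
    (expected frequency: 10% per digit)."""
    counts = Counter(v % 10 for v in values)
    n = len(values)
    expected = n / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

# Ten independent answers whose last digits happen to be perfectly uniform.
independent = list(range(10))

# The same 58% repeated in two near-identical follow-up questions:
# the digit 8 now appears three times, but only one answer is "new".
correlated = independent + [58, 58]

print(last_digit_chisq(independent))  # 0.0 -- uniform
print(last_digit_chisq(correlated))   # inflated purely by repetition
```

The statistic jumps from 0 to about 3.0 solely because of the duplicated answer, which is the sense in which "over 100 polls x 15-20 questions" overstates the effective sample size for a last-digit test.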