It doesn't matter, on average. Say we know thermometers are off by up to +/- 1 degree. Maybe because of the instrument itself, maybe because of human error eyeballing the mercury, whatever. However, assuming a reading is as likely to be over as under (and there's been extensive research on that as well), the bad readings will more or less cancel each other out.
If you only had one reading, for instance, you could really only say what the temperature was to within +/- 1 degree. If you have a million, you can pin it down almost exactly (just like the more times you flip a coin, the closer your heads/tails split is likely to get to 50/50). The reality is somewhere in between, so there are error bars, but they're known quantities. And as a previous poster mentioned, the thermometers of the time were surprisingly accurate anyway.
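To put numbers on the coin-flip analogy, here's a quick simulation. The error model is just an illustration (each reading off by a uniform random amount in the +/- 1 degree band, equally likely high or low), not a claim about how real thermometers fail:

```python
import random
import statistics

def estimate_temp(true_temp, n_readings, max_error=1.0, seed=0):
    """Average n_readings noisy measurements of true_temp.

    Hypothetical error model: each reading is off by a uniform
    error in [-max_error, +max_error], symmetric around zero.
    """
    rng = random.Random(seed)
    readings = [true_temp + rng.uniform(-max_error, max_error)
                for _ in range(n_readings)]
    return statistics.mean(readings)

true_temp = 20.0
for n in (1, 100, 1_000_000):
    est = estimate_temp(true_temp, n)
    # The average error shrinks roughly like 1/sqrt(n): the
    # over-readings and under-readings cancel each other out.
    print(f"n = {n:>9}: estimate off by {abs(est - true_temp):.4f} degrees")
```

With one reading you can be off by the full degree; with a million, the averaged estimate lands within a tiny fraction of a degree, which is why the error bars on large datasets are so much narrower than the error of any single thermometer.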