They are probably using the mean redshift to measure the distance. The collective light of all the stars in the galaxy is redshifted by a certain amount, and because of the expanding universe that shift is a direct indication of how far away the galaxy is: the farther away the galaxy, the greater the redshift. Velocities can be measured from redshift very accurately, down to tens of meters per second. The main error bar is that a faraway galaxy also has its own local (peculiar) motion, which is unknown, but if the galaxy is part of a cluster, that contribution can be estimated and accounted for.
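As a rough sketch of how redshift converts to distance, here is the low-redshift form of Hubble's law, v = H0 · d, with an assumed Hubble constant of 70 km/s/Mpc (the exact value is still debated, and this linear approximation only holds for z much less than 1):

```python
# Sketch: distance from redshift via Hubble's law (low-z approximation).
# H0 = 70 km/s/Mpc is an assumed value; valid only for z << 1.
C_KM_S = 299_792.458  # speed of light in km/s
H0 = 70.0             # Hubble constant in km/s/Mpc (assumed)

def distance_mpc(z: float) -> float:
    """Approximate distance in megaparsecs for a small redshift z."""
    v = C_KM_S * z    # recession velocity in km/s (v ≈ c·z for small z)
    return v / H0     # Hubble's law: v = H0 * d

# e.g. z = 0.023 (roughly the Coma cluster's redshift) gives ~98.5 Mpc
print(round(distance_mpc(0.023), 1))
```

For larger redshifts the relation becomes nonlinear and depends on the cosmological model, so real surveys use a full cosmology calculation rather than this simple formula.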
This is the most common and simplest way of measuring distances to galaxies, and it goes back to Edwin Hubble's work in the late 1920s.
The second most common method uses Type Ia supernovae as standard candles, but one actually has to go off in the galaxy to be usable, and that's a matter of luck. They are more commonly used to calibrate and cross-check the redshift-based measurements.