Reducing a sight in modern celestial navigation usually involves calculating the expected altitude of a target celestial body at a particular time and place on the surface, noting the body's actual observed altitude, then using the difference to work out your distance from the reference position. To do that you need a time standard that stays in sync with the heavens, or you need to correct for the drift in one that doesn't. UTC is convenient because leap seconds keep it within 0.9 seconds of UT1, and with the published DUT1 correction you're good to about a tenth of a second, as close as you could hope to measure by hand, without having to track the continual drift of UT1 itself.
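The "expected altitude" step above is the standard spherical-trig sight-reduction formula, sin(Hc) = sin(lat)·sin(dec) + cos(lat)·cos(dec)·cos(LHA). A minimal sketch, with hypothetical input values:

```python
import math

def computed_altitude(lat_deg, dec_deg, lha_deg):
    """Altitude Hc of a body from an assumed latitude, the body's
    declination, and its local hour angle, via the standard formula:
    sin(Hc) = sin(lat)*sin(dec) + cos(lat)*cos(dec)*cos(LHA)."""
    lat, dec, lha = (math.radians(x) for x in (lat_deg, dec_deg, lha_deg))
    sin_hc = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(lha))
    return math.degrees(math.asin(sin_hc))

# The intercept is the observed minus computed altitude; one arcminute
# of intercept corresponds to one nautical mile toward/away from the body.
hc = computed_altitude(45.0, 20.0, 30.0)  # assumed position (hypothetical numbers)
ho = hc + 5.0 / 60.0                      # observed altitude 5' higher (hypothetical)
intercept_nm = (ho - hc) * 60.0           # 5 nmi toward the body
```

The arcminute-to-nautical-mile equivalence is what makes the whole scheme practical: the difference between where the body is and where you expected it to be reads off directly as a distance.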
An error of one second in reducing a sight translates to a positional error of around a quarter of a nautical mile, or about half a kilometre, because the Earth rotates through 15 arcseconds of longitude per second of time. As discussed here a while ago, celestial navigation is still an important backup for military and commercial shipping, as well as blue-water private sailors who aren't idiots. A kilometre or two one way or the other often doesn't matter, but sometimes it does. GPS, of course, has to make the same sort of corrections, although it uses finer-grained ones than UTC's one-second leap steps.
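The quarter-mile figure is just the Earth's rotation rate; a quick back-of-the-envelope check:

```python
# Earth turns 360 degrees in 86400 seconds (ignoring the ~4 min/day
# sidereal difference, which doesn't matter at this precision).
deg_per_sec = 360.0 / 86400.0        # ~0.00417 degrees per second of time
arcmin_per_sec = deg_per_sec * 60.0  # 0.25 arcminutes per second

# One arcminute along a great circle is one nautical mile (near enough),
# so a 1-second clock error shifts the position line by:
error_nm = arcmin_per_sec * 1.0      # 0.25 nautical miles
error_km = error_nm * 1.852          # ~0.46 km, i.e. about half a kilometre
```

The error is worst for bodies observed near the equator and shrinks with the cosine of latitude, but a quarter mile per second is the right rule of thumb.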
If you're the OP: something I don't understand about your original post. If you're an astronomer recording the timing of astronomical events, why are you using UTC instead of Terrestrial Time, "a modern astronomical time standard defined by the International Astronomical Union, primarily for time-measurements of astronomical observations made from the surface of Earth"?
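For context, TT sits at a fixed offset from TAI, while UTC lags TAI by the accumulated leap seconds, so converting between them is simple bookkeeping. A sketch, with the leap-second count current as of the 2017 leap second (check the IERS bulletins for the latest value):

```python
# TT = TAI + 32.184 s by definition; UTC = TAI - (accumulated leap seconds).
TT_MINUS_TAI = 32.184   # seconds, fixed by definition
TAI_MINUS_UTC = 37.0    # seconds since 2017-01-01; grows with each leap second

def utc_to_tt_offset():
    """Seconds to add to a UTC timestamp to get Terrestrial Time."""
    return TT_MINUS_TAI + TAI_MINUS_UTC

# As of the 2017 leap second, TT - UTC = 69.184 s.
```

The point being that TT ticks uniformly and never jumps, which is exactly the property you want when timing events, whereas UTC exists to keep wall clocks roughly aligned with the Sun.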