Are you really claiming that "roughly coordinated" means well under 1 second of resolution? If I understand correctly, the maximum discrepancy would be half a second, occurring at the instant the leap second is applied, plus whatever normal deviation the time systems already have. For most applications that is acceptable. Anything that fails to fly or explodes over that level of time difference has bigger problems than slightly skewed time servers and is an accident waiting to happen. Such a sensitive application should be able to detect and adjust for a skewed time from an external source regardless of the cause.
Further, the bugs introduced into the timing logic of the many applications that don't care much about precise timing, only that their clock agrees with the rest of the world, make smearing the right answer for most applications. For what it's worth, it sounds like the right answer is to run both smeared and non-smeared servers, with all smeared servers using the same smear period. If you need precise time, make sure your system can handle a 61-second minute and use the special-case non-smeared servers.
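To make the half-second figure concrete, here is a minimal sketch of a linear leap smear. It assumes a 24-hour smear window centered on the leap instant (similar in spirit to Google's published smear, though the exact window and timestamp here are illustrative, not a real NTP implementation):

```python
# Illustrative linear leap smear. Assumptions: a 24-hour window centered
# on the leap second, and the 2017-01-01 leap instant as an example.
LEAP = 1483228800          # 2017-01-01T00:00:00 UTC (example leap second)
WINDOW = 24 * 3600         # total smear duration in seconds

def smear_offset(t):
    """Fraction of the inserted leap second a smeared clock has
    absorbed at Unix time t.

    The smeared clock runs slightly slow across the window, so the
    extra second is spread out linearly instead of appearing as a
    61-second minute.
    """
    start = LEAP - WINDOW // 2
    end = LEAP + WINDOW // 2
    if t <= start:
        return 0.0
    if t >= end:
        return 1.0  # the full leap second has been absorbed
    return (t - start) / WINDOW

# At the leap instant itself, a centered smear has absorbed exactly half
# the second, which is where the maximum half-second discrepancy between
# smeared and non-smeared clocks comes from.
```

Because every smeared server in this sketch uses the same window, they all disagree with a non-smeared server by at most half a second, and agree with each other throughout.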