True, though the article does cover both aspects of the question.
That said, what's cool about 56 microseconds/day is that it works out to roughly 1/50th of a second per year -- large enough that it really is something we'd have to take into account at times, but small enough that the vast majority of applications wouldn't care.
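A quick sanity check of that arithmetic, taking the 56 microseconds/day figure as given (the true relativistic rate varies slightly over the Moon's orbit):

```python
# Assumes the quoted 56 microseconds/day rate; everything else is multiplication.
drift_per_day = 56e-6                      # seconds gained per Earth day
drift_per_year = drift_per_day * 365.25    # ~0.0205 s, i.e. about 1/50 s
print(f"drift per year: {drift_per_year * 1000:.1f} ms")                # ~20.5 ms
print(f"years until a full second of drift: {1 / drift_per_year:.0f}")  # ~49
```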
Ultimately, what will probably come out of this is *two* time scales -- trying to do lunar time zones doesn't make a lot of sense, so they'll probably just end up with Lunar time and Earth time, where the former has a very slightly shorter second than the latter (since clocks on the Moon tick slightly faster).
So there will probably be some "epoch" time picked where the two time scales match up, and then they will diverge, very slowly. Anything that requires extremely high precision on the lunar surface will use lunar time (and will be explicit about this), and anything that doesn't won't care. Maybe after 50 years or so -- once the ~1/50th of a second per year has accumulated to a full second -- they could do a leap second for lunar time?
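Here's a toy sketch of that epoch idea. The epoch date and the constant rate offset are assumptions for illustration, not anything from a real standard:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: pick a moment where the two scales agree, then let
# lunar time pull ahead at ~56 us per Earth day. Both constants are made up.
EPOCH = datetime(2030, 1, 1, tzinfo=timezone.utc)  # hypothetical shared epoch
RATE = 56e-6 / 86400                               # fractional rate offset

def earth_to_lunar(t_earth: datetime) -> datetime:
    """Convert an Earth (UTC) timestamp to this toy lunar scale."""
    elapsed = (t_earth - EPOCH).total_seconds()
    return t_earth + timedelta(seconds=elapsed * RATE)

# After ~49 years the offset reaches a full second -- the point where a
# leap-second-style correction could pull the scales back together.
t = datetime(2079, 1, 1, tzinfo=timezone.utc)
print(earth_to_lunar(t) - t)   # ~0:00:01.00
```

In practice the rate isn't constant, so a real standard would need a more careful relativistic model, but the bookkeeping would look roughly like this.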
As for time zones: a lunar base might adopt a particular Earth time zone if its crew is mostly working with people on Earth in that zone, and would probably just default to UTC otherwise.