Comment Re:A Bit More To It... (Score 1) 95
Something that does not actually work, and has no prayer of actually working, does not "reasonably ensure" anything. It's kind of the opposite of "ensure".
Sometimes you're told to ensure things that either can't be ensured or in fact won't be. That can happen because the people who create the standards are stupid, or because they don't understand the obstacles created by how things actually work, or because their understanding is out of date, or because in their hearts they believe that if they just wish hard enough it'll all come true. On the more "positive" side, it can be because they are thinking in terms of "what would be objectively reasonable" instead of "what an organization can actually be beaten into doing", or because they make the mistake of pretending that everybody will comply in good faith and actually accept meaningful restrictions on their scope of action, instead of finding ways to claim to comply while doing exactly what they would have done anyway.
When that happens, a culture tends to get built up around accepting certain ritualistic actions as "reasonable compliance", in spite of the fact that everybody knows they're ineffective.
Auditors (and people in basically similar roles) are usually big players in this phase. They find themselves told to enforce standards that cannot in fact be met at all, or that can't be met in any way they actually have enough power to demand. Even if it would, in some sense, be objectively reasonable to do whatever it took to meet the standard, that doesn't mean most auditors can really say something like "You can't do work-at-home because you can't protect customer data, and don't give me any cheesy BS about cameras." At most, they might hope to expend a lot of political capital to get some exec to spend 15 seconds signing off on a "risk acceptance". Yet auditors also aren't really allowed to admit when the standards are unreasonable, and they sure aren't allowed to admit that they can't actually enforce effective compliance with a standard that is reasonable.
So they sort of converge among themselves on which specific rituals count as "best practices" and are therefore "reasonable". The legal system then plays along, because, first, lawyers and judges are not usually domain experts and may not even notice the flaws, and second, they are under the same pressures as auditors. Yes, judges too.
This is of course easier for everybody if the "best practice" isn't completely ineffective. You can then exaggerate that and tiptoe around the fact that it's almost completely ineffective. These cameras, for example, will probably catch some small number of the dumber criminals, which can then be used as justification for entrenching them as standard.
This whole dance benefits people by letting them avoid getting into conflicts that will hurt them personally, and/or that truly can't be reconciled at all. But it does not advance the supposed, object-level goal. The Emperor has no clothes.
If you want to actually prevent data theft/abuse/whatever by these CSAs, as opposed to putting on a performance to signal your abstract endorsement of prevention as a good thing, then you either have to find a way to do it while eliminating "clean desk" as a requirement, or you have to provide an environment where you can actually ensure "clean desk", and demand that people come there. Which of course involves paying more, or making some structural change that hurts careers or demands extra work somewhere, or worst of all forcing people to think a thought or two that falls outside their comfort zones. Which is why huge numbers of structures around you will resist you and undermine your ability to successfully enforce it.