The thing is, I don't think it would be a stated business strategy.
The nature of most moral hazards isn't that they're obvious conspiracies to do the wrong thing, but a set of biases and bad incentives that lend themselves to creating a situation where bad choices get made.
As an example, drug addiction is a moral hazard for doctors. Doctors know that drugs can be habit-forming. We expect doctors to be experts in administering them and to have ready access to them so they can treat patients as best they can. A doctor believes his own expertise will keep him from getting addicted. But expertise, plus overconfidence in that expertise, plus easy access results in a ton of doctors getting hooked on drugs.
Taser, for the most part, sells stuff to cops. Taser would like to keep cops happy so they keep buying cop stuff. Taser "knows its market" and understands what it wants. At some point, the desire to make money selling stuff to cops, combined with knowing what cops want, lends itself to creating holes in accountability. Not because some executive said "they're good guys and good customers, they shouldn't get dragged down because some douchebag criminal got a good attorney," but because they want to please their market, for reasons that are independently all completely normal and reasonable.
With automated systems, it's much easier to argue that a convenient failure wasn't deliberate:
"When asked why the body camera video of the police beating didn't exist, despite the system supposedly being automated to upload them to remote secure storage, officials noted that 'network limitations' caused by 'budget constraints' prevented the video from being immediately uploaded as originally designed. Police data networks were overwhelmed when the system was first rolled out and the vendor, Taser, Inc, added an on-site caching feature that uploaded the videos in a slower and more controlled fashion to prevent network overload. A problem with the caching server at Police HQ caused 'only a handful' of videos to be lost and Taser officials said this risk will be fixed in a new version available sometime next year."
Desire to sell your product + pleasing your customer = exploitable hole, even though nobody actually *conspired* to create it and the stated design goal was the opposite. Had a vendor been selected whose first concern was guaranteeing data integrity, rather than accommodating the end user's specific desires, the hazard could have been avoided. But that only happens if the vendor's allegiance can be to someone other than the cops, like some kind of oversight board whose principal interest is data integrity.
That way the vendor's goals align with a purchaser whose goal is preserving the evidence, and the hazard is avoided.
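For what it's worth, the difference between the two designs isn't subtle engineering, either. A minimal sketch of the integrity-first version, where a cached video is deleted only after remote storage acknowledges it with a matching checksum, so unconfirmed videos are retained rather than silently dropped. All names and the checksum-acknowledgment protocol here are my own illustrative assumptions, not Taser's actual system:

```python
import hashlib

class RemoteStore:
    """Stand-in for remote secure storage; made flaky on purpose for the demo."""
    def __init__(self, fail_first_n=0):
        self.blobs = {}
        self._failures_left = fail_first_n

    def upload(self, name, data):
        if self._failures_left > 0:
            self._failures_left -= 1
            raise ConnectionError("network overloaded")
        self.blobs[name] = data
        # Acknowledge with a checksum so the client can verify what was stored.
        return hashlib.sha256(data).hexdigest()

def drain_cache(cache, store, max_attempts=3):
    """Upload cached videos; drop a local copy ONLY after the remote
    acknowledges with a matching checksum. Anything unconfirmed stays
    cached for the next attempt instead of being lost."""
    still_cached = {}
    for name, data in cache.items():
        confirmed = False
        for _ in range(max_attempts):
            try:
                ack = store.upload(name, data)
            except ConnectionError:
                continue  # network trouble: retry, never discard
            if ack == hashlib.sha256(data).hexdigest():
                confirmed = True
                break
        if not confirmed:
            still_cached[name] = data  # retained, not "lost"
    return still_cached

store = RemoteStore(fail_first_n=4)
cache = {"video_a": b"frames-a", "video_b": b"frames-b"}
remaining = drain_cache(cache, store)
# video_b got through; video_a stays in the cache until it's confirmed.
```

An accountability-first vendor ships the retain-until-confirmed loop; a please-the-customer vendor ships the version where the `except` branch quietly gives up, and nobody has to conspire for that difference to exist.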