Yep. Software freedom is software freedom. Even if it lets those smelly other people who dare to have different opinions use the software ...
No, full stop.
The problem with software freedom, at least when it comes to sufficiently powerful entities like governments, is that they have access to treasure troves of data that the general public does not have. They have the power to massively abuse privacy in ways that, once done, can never truly be undone. They have the ability to strap AI technology onto a drone with machine guns and use it to assassinate random people anywhere in the world from miles away.
Absolute software freedom can, in a very real sense, result in the deprivation of other types of freedom, up to and including life itself.
When it comes to software that has massive potential for abuse, such as AI, there are good reasons for blanket licenses to disallow certain categories of use. It's not that those uses should necessarily be prohibited outright, because as the summary says, some uses in those areas can promote equality and freedom. Rather, carving out restricted areas in a blanket license lets the creator ensure that only the beneficial uses happen, while withholding permission for uses that do the opposite.
The existence of such a license does not preclude use of the software in those areas if the licensor agrees to make a specific exception on a case-by-case basis, based on who the entity is and how the technology will be used. Make the exceptions A. time-limited, B. scope-limited, and C. carefully monitored for abuse. For example, you might require an oversight body of AI ethicists who don't work for the company/government in question to review the project once a month, with full access to everything it is doing with the AI tech. Allow that group to call for a shutdown at any time, for any reason, including getting even a whiff of evidence that information is being hidden from them.
For any project that comes within a nuclear bomb's range of a usage area with high potential for abuse, it makes sense for some technologies to require careful scrutiny, rather than being blanket-licensed to anyone who says "I want to connect your AI to a missile launch controller" or whatever.
You get the point.