Given the choice between "security through obscurity" and "security through thorough code review", I'd much prefer the latter. See also: Heartbleed.
You ask the programmer because it's the programmer's job to implement the design. There's no bias involved in doing one's job, unless you consider it biased to want to produce both safe and secure code.
What's the cost of the tradeoff between saving money and risking security? That's the first question you need to be asking.
Everyone's excited that they CAN put something on the Internet, and no one's stopping to think whether they SHOULD.
John Barnes, author of several programming texts, clearly outlines the concepts of "safe" and "secure" software. For software to be considered "safe", it must not harm the world, and for software to be "secure", the world must not harm it. Given the tacit invitation for attack which is issued any time anything is connected to the Internet, such control systems MUST be developed with those two concepts not only in mind, but integrated into the core design.
I invite dissenting commentary.
Why not just keep the management system OFF the network? Make it local-only?
Just because something CAN be hooked to the Internet, it doesn't necessarily follow that it SHOULD be hooked to the Internet.
Just my 2p worth. Save up the change for a cup of coffee or something.
Not to belabor the obvious, but that takes the "Open" out of the equation, doesn't it?
AdaCore has a perfectly good implementation of a high-security Ada compiler, which produces executables for multiple platforms. There's nothing difficult about finding such tools. What's difficult is finding programmers and developers who are willing to take the time to actually develop their code to take advantage of the strict typing which is one of Ada's strengths.
John Barnes, author of one of the most-used Ada texts, outlined the meanings of "safe" and "secure" software in a very straightforward manner:
If software is "safe", it cannot harm the world;
if software is "secure", the world cannot harm it.
From what I've seen, C and its derivatives lack the intrinsic mechanisms to make software written in them either "safe" or "secure". It's too easy to break both safety and security in C, because the compiler will happily let a programmer cast between unrelated types, lean on implicit conversions from one type to another, and commit similar logical faux pas, which means most such bugs hide until run time. Not so with Ada: strict type checking means conversions must be explicit and mismatched types are rejected outright at compile time, while a value that strays outside its subtype's range raises Constraint_Error at run time instead of silently corrupting data.
As the poster on my wall says, "[i]f builders built buildings the way programmers write programs, the first woodpecker to come along would destroy civilization."
I see three problems with automatically presuming that any meaningful code for a compiled program will include a graphical interface.
- Adding code to produce a graphical interface immediately quadruples the amount of code required for a given program, and at least triples the size of the executable.
- Trying to design a usable and logical graphical interface for a program is frequently more complicated than devising the base program itself.
- Ask any two programmers to choose a standard library for developing a graphical interface, and you'll likely get three answers.
That said, the programs I write for my workplace run entirely in the background, doing all their work on large data files. What needs to be graphical about that? Would it really be that useful to quadruple the size of the code base just to add selector boxes for input and output files, and a status bar?
That's as far as I go.
The Principle of Least Privilege is also one of the core emphases in the Department of Defense's security-clearance program. This appears, to me, to be yet another case of one hand not knowing what the other is doing...or, possibly worse, not caring.
With that said, I see no reason to live in fear. If others choose to do so, it's their choice, and I have no say over their choices; only over my own.
...to the Principle of Least Privilege? What was the oathbreaker (I refuse to speak of him by his given name, and if that makes me a troll, so be it) working on that would give him copy access to that many files? Was he preparing some sort of comparative concordance with the WikiLeaks files?
The Principle of Least Privilege is one of the core emphases for the CompTIA Security+ certification exam. One would hope that an organization that goes by the moniker of "National Security Agency" would grok what's on that certification exam, at the very least.
Just my 2p worth. Save up the change for a root beer or something.
Recall Plato's admonishment.
In the Chinese, bing (病) translates as "illness" or "disease."
I'm just sayin'.
This is exactly what someone would expect them to say.
(See what I did there?)