Navy Now Mandated To Consider FOSS As an Option 205
lisah writes "In a memorandum handed down from Department of the Navy CIO John Carey this week, the Navy is now mandated to consider open source solutions when making new software acquisitions. According to John Weathersby, executive director of the Open Source Software Institute, this is the first in a series of documents that will also address 'development and distribution issues regarding open source within Navy IT environments.'"
Re:Is the tide turning? (Score:4, Informative)
In short, I think you are wrong.
No surprise (Score:3, Informative)
More paperwork? (Score:3, Informative)
A long while back I worked for USGS. We were hampered in hiring people, acquiring new software, hardware, etc. because of all the paperwork. For every decision we had to consider 50 different laws and regulations. Individually, they were great ideas. Put together, they were paralyzing. This is the reason we were stuck with Data General for so long: no one wanted to do the paperwork to change vendors.
Re:Cool!! (Score:1, Informative)
N M C I (No More Computing Inhouse) (Score:3, Informative)
In the last few years the Navy has saddled us with the hideous NMCI IT contract that dictates operating systems, software applications, and hardware. When NMCI was conceived, in the womb of ignorance and shortsightedness, they were thinking of providing a common monocultural solution that might work if the only thing the Navy did was send email and make PowerPoint presentations.
In a research environment you need flexibility in order to match solutions to problems. NMCI forbids the installation of "unapproved" software or hardware. This includes software drivers and communication applications for special-purpose hardware such as serial/USB/PCI devices. You cannot connect any web-enabled devices like cameras, 1-wire controllers, power control devices, UPS devices, weather stations, data acquisition systems, etc.
So what happens at the Navy Labs is that there are two networks - the NMCI network and the "Legacy Network" where the work gets done.
In the spirit of reducing cost, we have to maintain two networks and two computers on each desktop, and we have two exposed flanks to the outside world! It is wasteful, dangerous and inefficient.
Oh, did I mention NMCI is slow and near useless? I have an NMCI laptop. I would rather have a 286 with two floppy drives and a sharp stick. The other day I needed to access a JPEG image that was on the NMCI network and edit it with CorelDRAW (the application they felt I should be using instead of the more useful, efficient and cheaper PSP). I timed the process from pushing the "On" button through loading the remote desktop, mapping the network file system, logging on, clicking through all the various dialog windows, loading the bloated application and loading the file - it took over 27 minutes.
Re:Cool!! (Score:5, Informative)
Of interest would be the clause about internal use - if one government agency modifies the software, can any other agency use it without requiring a broader release of the source? In theory the DON, as long as the program stays within the US Government, would be under no obligation to release any modifications, since they have not distributed it; all they have done is install and run it on machines they own.
FOSS in the Navy (Score:2, Informative)
Believe me, I HATE the Windows 2003 environment I am forced to administer. And the SPAWAR-mandated environment on top of that, which compounds the issues. I'd thank God for reliable servers and workstations, but I don't foresee this ever occurring. Alas, I have to do my time and move to a sector that does. Nothing to see here. *shoos away readers in MiB suit*
COTS = (Score:2, Informative)
Re:Great! This is what you have to do (Score:5, Informative)
Although NT4 was certainly used for secret material, I am pretty sure that only B-rated operating systems were entitled to hold secret and some top secret information. A-rated systems could be used for anything. Only one truly general-purpose A-rated OS (Genesis) was ever developed and officially rated - many other A-rated OSes existed, but they were all special-purpose. C-rated systems were only supposed to be used for unclassified and commercially sensitive material, if I remember the scheme correctly.
Trusted Solaris was rated B1, which meant it was as good as you could get without some very stringent formal proofs of correctness and formal design methodologies. The big difference between B1 and A1 is that a B1 system is bulletproof only according to the tests and evaluations performed on it, and those tests aren't guaranteed to be comprehensive. With an A1 system, you also know that the implementation exactly matches the design and that there is no obvious flaw in the design.
However, the criteria have shifted over time. Under the Common Criteria, Trusted Solaris and Solaris 9 "only" rate EAL-4+ (out of a maximum of 7), with PR/SM and XTS-400 being the only ones to rate EAL-5. Bear in mind that RHEL4 update 1 is also classed as EAL-4+, as are Windows Server 2003 and Windows XP. The difference in security between Windows 2003 and Trusted Solaris is so vast as to be laughable, and the idea that a highly specialized, highly secure system like XTS-400 is less than a single unit of trustworthiness better than XP is a complete joke. Clearly the method used in the Common Criteria is flawed to the point of not being useful as a measure of trust.
Mind you, the Orange Book was not perfect. Trusted Irix was rated B3, MULTICS was rated B2. The Multicians (a group of surviving kernel developers for MULTICS) let me know that there was no formal API specification - and you can't test whether an API is safe if there is no specification to test against. This makes testing for code safety difficult at best - you've nothing to tell you what's meant by safe. I'm prepared to believe MULTICS was brilliant, in fact I do believe that, but I have a hard time believing that the level of trust you could place in it was somewhere between that of Trusted Irix and Trusted Solaris. That may well be the case, but it feels more likely somehow that the evaluation criteria were too narrow and too minimalistic.
(I'd develop my own criteria, but having friends and karma on Slashdot doesn't equate to being taken more seriously by industry leaders on security issues than defense industry specialists. In fact, even being on Slashdot is probably a big minus in the eyes of places like BAE or Sun Microsystems. Which, of course, is stupid - everyone here knows Slashdot readers are the crème de la crème of the industry.)
Re:FOSS in the Navy (Score:2, Informative)
Later (late '90s) I worked for a company that specializes in air traffic control systems. The development environment was Linux and the production environment was AIX.
Government agencies have accepted *nix flavors for a long time. "Never going to happen" is an incredibly strong term, and the fact that you've already got Linux boxes poking their head in leads me to believe that "Never say Never" is an appropriate response.
-CF
Same thing in Canada (Score:3, Informative)
* On average, commercial off-the-shelf software (COTS) tended to be slightly cheaper for life cycles in the mid-term range, which seemed to be 5-12 years or so. Shorter than that, FOSS was best because of the low up-front costs, while on the longer term the lack of vendor support for COTS was a concern. The number that was thrown out was COTS being about 15% cheaper for the mid-term, although there were cases where FOSS was still better.
* To avoid finger-pointing between the OS and application manufacturers during bug hunts, it was desirable for a single company/consultant group to take responsibility for all software. They weren't inclined to wait in a war zone while tech guys played telephone tag over a bug fix. The ideal would be to purchase hardware from a given supplier and have one contact point for all software.
* Long-term software support was a concern for both COTS and FOSS, but the ability to either maintain the software yourself (least desirable) or form a consortium with other like-minded entities was an advantage for FOSS.
* Licensing was identified as a major hassle. The speaker noted that computer types are very highly trained from a technical perspective, but not from a legal standpoint, so navigating through licensing conditions was a problem. They were hoping our Treasury Board could handle government-wide licensing issues.
* There was definite interest in shifting the computer systems on board our latest warships from HP-UX to Linux-based systems to avoid the vendor end-of-lifing the systems.
The talk continued on to discuss issues related to hardening systems against attacks, but I didn't stay for the whole thing. Just before I left, the speaker was bemoaning that while FOSS gave great tools to the good guys, it also empowered foreign script kiddies, so it was a two-edged sword.
Re:First OPSEC gets the axe... (Score:3, Informative)
Also, having the source code of secure systems publicly available doesn't hurt. In fact it can actively be of benefit: the more people look at it, the more loopholes get found and fixed. PGP source code has been freely available for years, but the algorithms the code implements are still widely understood to be among the most secure encryption methods out there.
Re:Inconceivable! (Score:3, Informative)
Sun, IBM, Novell, Oracle, Red Hat, UTS, SCO, HP, etc, etc, etc...
Re:Great! This is what you have to do (Score:5, Informative)
NSA trusted computing (Score:3, Informative)
What law require(s|ed) evaluation according to the NSA "rainbow books" before a system can be used for government work? Where I work, even systems which process classified information are not required to have trusted system software. You have to protect the system, but that's most often accomplished by far less sophisticated means. It is what is called "system high" or "dedicated" operation: you treat everything as classified, lock everything up, and only let cleared people near it. The OS is not part of the safeguarding. Hell, eight years ago there were plenty of Windows 95 and Windows 98 systems processing classified information.
The more sophisticated measures -- an OS supporting multi-level security -- are only required if you want to let people who are not cleared for the information access some other part of the system. In other words, if you want to have Joe Blow without a clearance store his order for janitorial supplies on the same system that holds SECRET data.