An excellent question -- and not one I have an answer to.
I think that perhaps they should get Bruce Schneier to help design their systems for them.
| They should also assume that some of their own employees are moles.
I did mention that they should assume that.
They aren't getting *nearly* paranoid enough. They should be encrypting the data on disk and on network connections between machines in the *same* data center, not just between centers. In fact, the data should remain encrypted at all times unless it is absolutely necessary to have it in clear-text to process it -- and that clear-text should never leave the CPU. It should stay in clear-text only for the absolute minimum time required.
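To make that concrete, here's a minimal sketch of the "encrypted except while being processed" discipline, using AES-GCM from the Python cryptography library. The key handling is hand-waved -- the hard part in real life -- so assume the key actually comes from an HSM or KMS, not a local variable:

    # Sketch: keep data encrypted at rest; hold clear-text only for the
    # minimal window needed to process it. Key management is hand-waved.
    from contextlib import contextmanager
    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # in reality: from an HSM/KMS

    def seal(plaintext: bytes) -> bytes:
        """Encrypt with a fresh nonce; store the nonce alongside the ciphertext."""
        nonce = os.urandom(12)
        return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

    @contextmanager
    def unsealed(blob: bytes):
        """Yield clear-text only inside the with-block, then drop it."""
        plaintext = AESGCM(key).decrypt(blob[:12], blob[12:], None)
        try:
            yield plaintext
        finally:
            del plaintext  # best effort; Python has no guaranteed memory wipe

    record = seal(b"data that should never sit around in clear-text")
    with unsealed(record) as data:
        print(len(data))  # process here; the clear-text dies at block exit

Note that Python can't actually guarantee the clear-text is scrubbed from memory afterwards -- which is exactly the point about clear-text never leaving the CPU.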
They should assume that hostile agencies (foreign *and* domestic) have tapped every last network link they own, as well as most routers and processing machines. They should also assume that some small percentage of their workforce is working on behalf of one of these adversaries. Given those assumptions, they should design a system that remains as secure as possible under those circumstances.
Merely encrypting the network links between their data centers is not nearly enough to thwart the likes of the NSA, CSEC, GCHQ or other nameless agencies.
The problem with this is that it sounds like raving paranoia. And even if it is paranoia and currently untrue, it's technically just a software update away from becoming true. And as a theory, it's not really falsifiable.
I certainly won't be buying one of these things.
Pedant.
How about comparing on the most recently available hardware...
My point is that, while open source drivers are a good thing, they are of limited usefulness unless they are competitive with closed source ones for performance, stability and completeness of functionality.
How do the stability and performance compare to their drivers on Windows for the same hardware?
Functional parity (GL version and extensions) would also be nice.
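As a rough way to check that parity, here's a sketch that pulls the GL version string and an approximate extension count out of glxinfo (assumes a Linux box with mesa-utils installed; the parsing is deliberately crude):

    # Sketch: summarize the GL version and extension count reported by
    # glxinfo so two driver stacks can be compared side by side.
    import subprocess

    out = subprocess.run(["glxinfo"], capture_output=True, text=True,
                         check=True).stdout

    version, extensions = "", 0
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("OpenGL version string:"):
            version = line.split(":", 1)[1].strip()
        elif line.startswith("GL_"):
            extensions += line.count("GL_")  # comma-separated GL_* tokens

    print(f"{version} -- ~{extensions} extensions")

Run it once under each driver and diff the output.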
The most dangerous threat to security is a disgruntled employee.
If your regulations increase the likelihood of annoying your employees, they are actively counter-productive to security.
Speaking as someone who, without realizing it, has become one of those old fart programmers:
The key to not appearing selfish is not being selfish.
(I'll also let you in on my secret of weight loss -- *whispering silently* eat less, exercise more.)
I'm more concerned about this trend of soldering RAM onto boards (Apple, I'm looking at you here). RAM goes bad over time -- a shockingly short time. (Google the papers, by Google, on RAM failure rates and what happens after 18 months.) After a couple of years, error rates go up -- way up. (ECC would very definitely be your friend here, but Intel only makes it available on Xeon-series chips -- the circuitry is there but fused off in consumer-grade chips.)
My experience has been that after 24 months you should just toss the RAM DIMMs in the trash and start with new ones -- and you might as well max out the RAM at that point. Otherwise the machine starts getting flaky as soft and uncorrected errors happen with increasing frequency.
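If you'd rather have data than superstition, Linux exposes the memory controller's error counters through the kernel's EDAC subsystem in sysfs. A minimal sketch, assuming an EDAC-capable machine (i.e. one that actually has ECC):

    # Sketch: read corrected (ce) / uncorrected (ue) memory-error counters
    # from the kernel's EDAC subsystem. Needs ECC RAM and EDAC support.
    from pathlib import Path

    for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*")):
        ce = (mc / "ce_count").read_text().strip()
        ue = (mc / "ue_count").read_text().strip()
        print(f"{mc.name}: {ce} corrected, {ue} uncorrected errors")

On a non-ECC consumer box there's nothing under that path -- the errors still happen, you just never hear about them.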
For God's sake, stop researching for a while and begin to think!