Yes, that's a fair characterization. For companies below the line, i.e. those that would lose a lawsuit easily, this is helpful.
A server not connected to a network in a physically secure location was the situation for the computer Bradley Manning stole data from.
CENTCOM is interested in starting to build a government infrastructure for defense. They agree this needs more collective action and government assistance. Right now, however, the public is pulling in the opposite direction.
I agree there are terrible auditors who don't understand what they are doing. But in most companies you can push back against that; it is just that the burden then switches to you. You have to verify and certify that alternative approach X is better than industry-standard approach Y.
As far as the rest, the purpose of an insurance company is to pool risk. The person being insured generally shouldn't want to file a claim, because that means something bad happened. The company doesn't want to give nothing in return, because then there would be no need for their product.
Industry handles this in other areas, and in security as well, by having auditing firms engage in a "best practices" audit. "Best practices" doesn't actually mean the best possible practice, but rather not doing stupid or dangerous stuff. The audit is how that gets determined.
They ran into financial trouble as Mandrake. The distribution was moderately profitable but they lost their shirt on an educational software venture that failed.
I liked them in the late 1990s (as Mandrake); they were my favorite distribution because so many things "just worked" and their configurations were often more sensible. You started off far closer to a working system.
I didn't try the server product much, though I did use it once for a RAID build and it did a great job with the RAID defaults.
Linux driver support definitely is a bit crappier, but it's a lot better than it was even say 5 years ago.
My experience is that it has gotten worse. Five years ago I could pretty much run an arbitrary Linux distribution on an arbitrary one-year-old laptop and have, say, an 80% chance of few if any problems. Today most interesting laptops have whole swaths of features not covered and many drivers not included. I think hardware got more interesting and the Linux community got less focused on the desktop (understandably), and the result has been a huge downgrade in terms of compatibility.
LAN Manager was multiuser. The client wasn't, but that doesn't make much difference, as the non-multiuser smartphones using apps and websites today prove quite well.
I see turbidostato below made the same point.
OS/2 had networking (really good networking) and multitasking. LAN Manager (based on OS/2), as well as Novell (which worked with OS/2), had file permissions. So they produced a product with those three facets.
Apple today only sells to the "high end" of the market. They mostly sell at $500+, have some share in the $400-500 range, and no share below that. Their phone is already niche in the way you claim it will eventually be.
Why does Apple feel the compulsion to plow money into an inferior map service?
1) They don't want to be held hostage
2) They can provide a higher degree of integration and more services on their own mapping service than they could using Google's offering.
It only benefits their iPhone niche until they can't sustain a lower-end iPhone market.
I don't understand this. If anything, they will be moving down market, not up market, given they own the entire up market. Why wouldn't they be able to sustain a lower-end iPhone market?
RDBMS engines are designed to take routines of in-memory, row-by-row or group-by-group statistical operations and figure out good (often optimal) disk/memory organizations. That's one of the things they are very, very good at.
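As a toy illustration of that point (a sketch in Python's built-in sqlite3, with made-up table and column names), the same aggregate written declaratively leaves the scan strategy, memory use, and indexing entirely up to the engine, while the hand-written loop dictates the access pattern:

```python
import sqlite3

# Hypothetical per-server byte counts, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (server TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("a", 10), ("a", 20), ("b", 5)])

# Row-by-row in application code: we dictate the access pattern.
totals = {}
for server, nbytes in conn.execute("SELECT server, bytes FROM events"):
    totals[server] = totals.get(server, 0) + nbytes

# Declarative: the engine decides how to scan, group, and lay out memory.
sql_totals = dict(conn.execute(
    "SELECT server, SUM(bytes) FROM events GROUP BY server"))

assert totals == sql_totals  # same answer, different control of the plan
```

Same result both ways; the difference is who gets to pick the physical organization.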
Check your math on that, please. 8 * 3600 * 3 MB = 86,400 MB ≈ 84.375 GB, not 85.5 PB.
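Spelling that arithmetic out (assuming an 8-hour day at 3 MB/s per user, and binary gigabytes):

```python
# 8 hours * 3600 seconds/hour * 3 MB/s
mb = 8 * 3600 * 3   # total megabytes in one 8-hour session
gb = mb / 1024      # binary gigabytes
print(mb, gb)       # 86400 MB -> 84.375 GB
```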
You are correct. Sorry.
And if you're tracking 3 MB per second per user, you're tracking bots, not users.
Absolutely. You are mostly tracking network security events: computers talking to other computers. What you are generally looking for is unusual activity. Server 2047 never talks to Asia; all of a sudden it is talking to Vietnam regularly. But to do that you need to know who is talking to what across the network.
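A minimal sketch of that kind of check (the log format and country tags here are hypothetical, not from the thread): flag any server contacting a country it has never contacted before.

```python
from collections import defaultdict

# Hypothetical connection log: (server, destination_country) pairs.
log = [
    ("server2047", "US"), ("server2047", "US"),
    ("server2047", "VN"),   # new destination for this server
    ("mailhost", "DE"), ("mailhost", "DE"),
]

seen = defaultdict(set)
alerts = []
for server, country in log:
    # Alert only once the server has an established baseline.
    if seen[server] and country not in seen[server]:
        alerts.append((server, country))
    seen[server].add(country)

print(alerts)  # [('server2047', 'VN')]
```

A real system would baseline over a window and score by rarity, but the core idea is the same: you can only notice "never talks to Asia" if you record who talks to what.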
SQL can handle all of it if you design your database sanely
Yes and no. Obviously, if you knew in advance every type of message, designed good ways of getting it in there, and designed good aggregates, then an RDBMS would be better. But with formats of data poorly understood, a bad understanding of the types of data, complex matches, and unclear rules about how to normalize... SQL Server's engine won't hold up. Of course you can just throw it in a table, but then you can't do much with it at reasonable performance. That's what Big Data engines are for. Once (if ever) you do understand the data well enough to get it into an RDBMS, of course you would rather use an RDBMS.
You pay a penalty for your poor design, sure, but everything works.
No, it doesn't. RDBMSs don't scale as well as Big Data systems. As the number of CPUs, total memory, and total disk increases (particularly in cluster configurations), their performance does not increase linearly or even nearly linearly. You can't just pay the penalty and solve the problem with hardware.