Does it really make sense to spend money on CS education while importing cheap H1B labor?
Yes, it does. Unless you do a job that requires direct person-to-person interaction (medicine, nursing), one tied to regulation by necessity (law), or one that requires hands-on work (utilities), you are going to compete with H1B and global workforces no matter what.
Deal with it. That has been the norm for, what now, 15 years? For 15 years I've been told that my career is going to go poof because of H1B labor or because some guy in Bangalore makes 1/5 of what I make, as if software/IT work can be directly compared to picking fruit or something. In my first 5 years of work, I doubled my salary, and in the 15 years that followed, I doubled it again.
And I've also been laid off a couple of times, one time 6 days before my first child was born. Tough shit, such is life. You adapt, you fight, you learn, you re-learn, you borrow Teddy Roosevelt's advice ("Whenever you are asked if you can do a job, tell 'em, 'Certainly I can!' Then get busy and find out how to do it.")
We have to compete against H1B workers and a global workforce? Yes. End of the world? Yes if you suck.
To compete, you need to build your network, and you need to have specialized skills that are in demand. And that requires a baseline education (CS education or something comparable) or related experience.
This has been a fact since, like, forever. H1B workers and globalization are just a new constant in the polynomial.
The hard part is indeed establishing what the right level of security is and how to evaluate companies against it. At least over here, the exclusions for burglary are pretty clear-cut: leaving your door or a window open, and for insuring more valuable stuff there are often extra provisions like requiring "x"-star locks and bolts, a class "y" safe, a class "z" alarm system, and so on. With IT security, it's not just about what stuff you have installed and what systems you have left open or not; IT security is about people and process, as much as or more than it is about systems.
I would disagree with you on this (somewhat). There are well-established practices for building secure systems on each major development platform (JEE, …).
Any organization, big or small, needs to be able to come up with scenarios and questions for things that need care, and for which it might need to provide evidence of attention. The important thing is to exercise due diligence when it comes to defending your business against attacks, and to be able to provide evidence of that due diligence.
If we are in e-business or are bound by PCI, HIPAA and/or SOX compliance, the following questions would come to mind (just an example):
- Are we addressing the top 10 risks identified by OWASP?
- If so, can we quickly identify how we address them?
- What other risks identified by OWASP do we address and how?
- How do we address CERT alerts and advisories?
- Are we on top of security patches?
- Are the underlying systems security patches up to date?
- If so, can we quickly provide evidence of this?
- If we are bound by HIPAA and/or SOX how do we address security concerns that might stem from these regulations?
- How do we quickly provide evidence (evidence of process and assurance)?
- Do we have a multi-tiered architecture, or do we run everything co-located?
- Are back-end databases on their own machines, in their own subnets outside of a DMZ?
- Are "mid-tier" services on their own machines, separated from databases?
- Are they in a DMZ? Are they proxied by an HTTP server on different machines?
- Do we have firewalls? If so, do we keep an inventory of their rules?
- Are we up to date with patches for network assets (firewalls, SSL appliances, etc)?
- Are we still on SSL 3.0 or older versions of TLS?
- Do we specifically disable anonymous ciphers?
- If we use LDAP, do we disable anonymous binds?
- Do we use IPSec to secure all communication channels (even those internally, a requirement for banking in several countries)?
- If not, why not? How do we compensate?
- If we are in E-Commerce, how do we demonstrate that we are PCI-compliant?
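A few of the network-facing checks above (SSL 3.0, anonymous ciphers, anonymous LDAP binds) can be spot-checked from the command line with standard tools. This is just a sketch: the hostnames are placeholders, and the exact flags available depend on your OpenSSL and OpenLDAP builds.

```shell
# Attempt an SSLv3 handshake; a patched, modern server should refuse it.
# (Requires an OpenSSL build that still supports the -ssl3 option.)
openssl s_client -connect www.example.com:443 -ssl3 < /dev/null

# List the anonymous (unauthenticated) cipher suites OpenSSL knows about;
# your servers should be configured to offer none of these.
openssl ciphers 'aNULL'

# Try an anonymous (unauthenticated) LDAP bind with the -x simple-bind flag;
# a directory with anonymous binds disabled should reject this query.
ldapsearch -x -H ldap://ldap.example.com -b '' -s base
```

None of this replaces a proper scan or audit, but quick probes like these make it easy to produce the kind of evidence the checklist asks for.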
In my opinion and experience, these questions are the starting point for a framework to determine the right level of security in a system. More should obviously be piled onto this list, but anything less would leave a system open to preventable vulnerabilities.
And that is the thing. The right level of security is the one that helps you deal with preventable vulnerabilities that you (the generic you) should know about well in advance, vulnerabilities that are well documented. How costly the prevention is, that is a different topic, and any business will be hard-pressed to justify to an insurer that it chose not to deal with a vulnerability because doing so was too expensive.
Answers to those questions, and evidence to back them, would constitute proof that an organization exercised reasonable due diligence in establishing the right level of security. Moreover, they give it a much greater chance of disarming an insurer trying to find a way to avoid covering damages.
Notwithstanding the ongoing abuses in the insurance business, insurers have rights too. My general health and life insurance is not going to pay out to my family if I kill myself while BASE jumping with blood alcohol levels up the wazoo.
Uhh... wut? Just because they looked like overgrown lizards in Jurassic Park doesn't mean they're related to lizards.
Well, some of them actually do look like lizards.
Tuataras are neither dinosaurs (clade Archosauromorpha) nor lizards (order Squamata). They are Rhynchocephalia, distantly related to Squamata, both orders being lepidosaurs. It is almost like comparing marsupials with eutherians.
Enough RAM to run without swap file thrashing. Price was high as well
These two are related. OS/2 needed 16MB of RAM to be usable back when I had a 386 that couldn't take more than 5MB (1MB soldered onto the board, 4x1MB matched SIMMs). Windows NT had the same problem: NT4 needed 32MB as an absolute minimum when Windows 95 could happily run in 16 and unhappily run in 8 (and allegedly run in 4MB, but I tried that once and it really wasn't a good idea). The advantage that Windows NT had was that it used pretty much the same APIs as Windows 95 (except DirectX, until later), so the kinds of users who were willing to pay the extra costs could still run the same programs as the ones who weren't.