
The Failure of Information Security

Posted by ScuttleMonkey
from the everyone-is-happy-until-something-breaks dept.
Noam Eppel writes to share a recent editorial regarding the current state of information security. From the article: "It is time to admit what many security professionals already know: We as security professionals are drastically failing ourselves, our community and the people we are meant to protect. Too many of our security layers of defense are broken. Security professionals are enjoying a surge in business and growing salaries, and that is why we tolerate the dismal situation we are facing. Yet it is our mandate, first and foremost, to protect."
  • "It is time to admit what many security professional already know: We as security professional are drastically failing ourselves, our community and the people we are meant to protect. Too many of our security layers of defense are broken. Security professionals are enjoying a surge in business and growing salaries and that is why we tolerate the dismal situation we are facing. Yet it is our mandate, first and foremost, to protect."
    Bollocks - this implies that there's more that security professionals could do, but that they choose not to, in order to drum up business.

    The sad reality of the matter is that the vast majority of the threats they mention - spyware, phishing, Trojans, viruses, worms, rootkits, spam, web app vulnerabilities and DDoS attacks - are enabled by the existence of botnets (to stage attacks from, send spam, provide anonymity, host phishing webservers, etc.).

    The source of (the vast majority of) botnets is Microsoft's security failures in the late '90s and early '00s. How are security professionals supposed to combat something that happened in the past, at another company?

    Furthermore, the list of data losses
    Credit Card Breach Exposes 40 Million Accounts [com.com]
    Bank Of America Loses A Million Customer Records [com.com]
    Pentagon Hacker Compromises Personal Data [military.com]
    Online Attack Puts 1.4 Million Records At Risk [com.com]
    Hacker Faces Extradition Over 'Biggest Military Computer Hack Of All Time' [spamdailynews.com]
    Laptop Theft Puts Data Of 98,000 At Risk [com.com]
    Medical Group: Data On 185,000 People Stolen [com.com]
    Hackers Grab LexisNexis Info on 32000 People [pcworld.com]
    ChoicePoint Data Theft Widens To 145,000 People [com.com]
    PIN Scandal 'Worst Hack Ever'; Citibank Only The Start [csoonline.com]
    ID Theft Hit 3.6 Million In U.S.
    Georgia Technology Authority Hack Exposes Confidential Information of 570,000 Members [itworldcanada.com]
    Scammers Access Data On 35,000 Californians [com.com]
    Payroll Firm Pulls Web Services Citing Data Leak [com.com]
    Hacker Steals Air Force Officers' Personal Information [washingtonpost.com]
    Undisclosed Number of Verizon Employees at Risk of Identity Theft [com.com]
    can be blamed on companies who have failed to follow their security team's advice. Not on the security team itself.

    The story makes some good points, but blames the wrong people.
  • A real failure! (Score:5, Insightful)

    by VincenzoRomano (881055) on Wednesday May 10, 2006 @05:48AM (#15299638) Homepage Journal
    Information security is also failing because information needs to be managed and handled by non-technical people, also known as "normal people".
    Techniques like phishing or social engineering, as well as a good dose of stupidity [slashdot.org] and ignorance, can make security technologies useless!
    Like writing PINs and passwords down on slips of paper, or communicating them via email.

  • The Human Factor (Score:5, Insightful)

    by CortoMaltese (828267) on Wednesday May 10, 2006 @05:57AM (#15299654)
    I think TFA pretty much ignores the fact that for the average user, security is just a warm fuzzy feeling they get after they've installed a virus scanner and a firewall and checked that there's an image of a closed yellow lock somewhere. For security professionals and the like (including myself) it's usually much easier to tackle the technical threats, while it's all too easy to ignore the user, who is typically the weakest link in any security-critical system.

    I know I am stating the obvious here, but I still think the human factor is almost always greatly underestimated.

  • by BorgDrone (64343) on Wednesday May 10, 2006 @06:10AM (#15299674) Homepage
    Furthermore, the list of data losses (...) can be blamed on companies who have failed to follow their security team's advice. Not on the security team itself.
    Not entirely correct. Yes, users are morons, and yes, they often fail to follow the advice of the security team. However, it's the security team's responsibility to get proper behaviour into the users' stupid little heads.

    Security is not just the technical part; educating your users is a huge part of it, and if users fail to follow advice, the security team has failed in that part of its job. You can whine about how stupid users are, but that doesn't change reality; it's the security team's responsibility to make them less stupid.
  • by rolfwind (528248) on Wednesday May 10, 2006 @06:13AM (#15299680)
    It must be someone's fault that it's not perfect. Okay, I don't want to live in a tomb; I want to be able to interact with the outside world, so I still want doors and windows. But I think the contractors are secretly conspiring together and failing us security-wise, because there should be completely unbreakable windows and unpickable locks on the market. WAAAAH!
  • Failing (Score:2, Insightful)

    by mulhall (301406) on Wednesday May 10, 2006 @06:23AM (#15299692)
    "We as security professional are drastically failing ourselves, our community and the people we are meant to protect"

    BS

    You cannot solve cultural problems with technology:

    http://news.bbc.co.uk/2/hi/technology/3639679.stm [bbc.co.uk]
  • Hmmm... (Score:3, Insightful)

    by Mostly a lurker (634878) on Wednesday May 10, 2006 @06:25AM (#15299698)
    Microsoft has had over two billion downloads of its malicious software removal tool in the last year, which tells us something about the overall size of the malicious software problem.
    Yep: it tells us exactly nothing about the overall size of the malicious software problem. It does, however, indicate that users are using Windows Update (either automatically or manually). [The malicious software removal tool is a critical update.] It is good news that Microsoft has persuaded users to keep up to date on critical updates, I guess.
  • An Important Note (Score:3, Insightful)

    by Effugas (2378) * on Wednesday May 10, 2006 @06:31AM (#15299715) Homepage
    In the Summer of 2003, the Internet suffered three major worms: Blaster, Nachi, and SoBig.

    We haven't had a worm since. There have been no systemic outbreaks in over three years. Sure, we've had mild rashes, but Zotob vs. Nachi isn't even a comparison, nor is Blaster vs. WMF.

    IE attacks are deeply problematic -- they're wonderfully targetable, among other things. But there's really no replacement for zero-interaction, receive-a-packet-and-you're-owned style vulnerabilities. SP2 put a firewall on every desktop that cared. Since then, no worms.
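
    As a rough illustration of why that firewall mattered (my own sketch, not from the comment): worms like Blaster spread by probing a listening port, such as TCP 135 for RPC/DCOM, and exploiting it with no user interaction; a host firewall that drops unsolicited inbound connections removes that propagation path. The address below is a placeholder.

    ```python
    import socket

    def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port can be established."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # A worm's scanner only "succeeds" where the port is exposed; with a host
    # firewall dropping unsolicited inbound traffic, the probe simply fails and
    # the receive-a-packet-and-you're-owned exploit never gets delivered.
    if __name__ == "__main__":
        target = "192.0.2.10"  # placeholder address (TEST-NET), illustration only
        print("RPC endpoint reachable:", port_reachable(target, 135))
    ```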

    That's not to say we're not fighting a painful battle. Really, every day we get to still bank online is another day I'm surprised. But the fact that SP2 was written, was free, and was actually deployed enough to matter is one hell of a win.
  • by Bacon Bits (926911) on Wednesday May 10, 2006 @06:38AM (#15299734)
    I don't think that's what he's saying. That is, users are not to blame. The decision makers are.

    Let's say, as an IS professional, you explain to management the need to restrict user accounts with Administrator rights, the need to implement an intrusion detection device, the need to eliminate spam, the need to make the network infrastructure fault tolerant, the need to update the antivirus client to something that can detect modern threats, and the need to educate users on how to operate their systems securely. Management declines to budget for these things on the basis that they are not necessary, and would you please increase the maximum mailbox size again?

    If the company is unwilling to do what is necessary to secure the environment, then as an IS professional you are largely helpless.

  • by symbolic (11752) on Wednesday May 10, 2006 @06:42AM (#15299744)
    That all depends...many organizations have positions that are characterized by "all of the responsibility but none of the authority". This means that as a security professional, you may be able to recommend certain practices, but unless one has the authority to see to it that these recommendations are implemented, there really isn't a whole lot more that can be done.
  • by Linker3000 (626634) on Wednesday May 10, 2006 @06:46AM (#15299753) Journal
    Bad perspective.

    If you consider the users to be morons and know that they will fail to follow security advice, then you plan for this. You can implement training to 'un-moron' them to a degree, but it is not wise to assume that the post-training person will do what they have been told all of the time.

    *ANYONE* in any support or consultancy role who starts to say to themselves (about the users) "You'd think that they would/wouldn't..." (e.g. "You'd think that they would know not to log in as someone else") is totally missing the point about human behaviour and is not approaching the problem or their role in the correct way.

  • by Mr_Tulip (639140) on Wednesday May 10, 2006 @06:47AM (#15299756) Homepage
    As someone who is responsible in part for network security where I work, I would disagree that we are not doing 'enough'.

    The sad reality is that information security is rather hard to achieve in an imperfect environment and without unlimited resources.

    To make a bad analogy, it is hard to physically protect your client/employer if they insist on partaking in high-risk pursuits and the environment is harsh and dangerous. Email header spoofing, botnets, vulnerabilities in third-party software - these are not under the control of the admin, at least not if you are committed to the Microsoft platform.

    The same could be said of a doctor, who cannot be held responsible for their patient's health if the patient is a chain-smoking, alcoholic BASE jumper who rides a monocycle down the freeway at 100 km/h.

  • by mikehilly (653401) on Wednesday May 10, 2006 @07:36AM (#15299877)
    I do a lot of side work helping people with computers, both at home and in the office....

    You and your wife spent some time preparing and getting some type of defense up AND maintaining it. The great majority of people I deal with think that they can run Windows Update once and they will be good. Or my favorite: "I have XP (Windows), so I don't know what could have gone wrong." People click where they shouldn't click, go where they shouldn't go and do things without thinking.

    The only good analogy to help people understand the importance of security updates is vaccines for children. They may have to go back to the doctor periodically to make sure all their shots are up to date. And if you think of the web as a disease-ridden place, then it would make sense to wear some type of protection when you muck through it.

    You hit the nail on the head here. Three things are needed for a mostly safe computer experience:

    1: Some basic user education (could be the hardest one)

    2: Tools like Firefox, AdAware, Windows Update, a firewall. Get 'em, use 'em.

    3: UPDATES!!! What good is a vaccine if it is out of date? Get regular updates for Windows, Firefox, and other tools (see the sketch after this list).

    Most people are clueless when it comes to all three.
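
    A minimal sketch of point 3, purely as an illustration (the file paths and the 14-day threshold are hypothetical, not from the comment): a small script that nags when something has not been updated recently.

    ```python
    import os
    import time

    STALE_AFTER_DAYS = 14

    CHECKS = {
        # Hypothetical examples of "things that should stay fresh".
        "Antivirus definitions": r"C:\ProgramData\ExampleAV\defs.dat",
        "Browser executable":    r"C:\Program Files\Mozilla Firefox\firefox.exe",
    }

    def days_since_modified(path: str) -> float:
        """Days elapsed since the file was last modified."""
        return (time.time() - os.path.getmtime(path)) / 86400

    for name, path in CHECKS.items():
        try:
            age = days_since_modified(path)
            status = "STALE" if age > STALE_AFTER_DAYS else "ok"
            print(f"{name}: last updated {age:.0f} days ago [{status}]")
        except OSError:
            print(f"{name}: not found at {path}")
    ```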

  • by ManyLostPackets (646646) on Wednesday May 10, 2006 @07:42AM (#15299890)
    I've specifically decided not to go for any security certs because of the hoo-haw attitudes demonstrated in articles like this. As a regular sysadmin, no one listens to my recommendations in the first place; why ratchet up the accountability by becoming a certified scapegoat?

    This article reads like a riot act, the equivalent of calling on doctors to take accountability for people who run with scissors.
  • by Anonymous Coward on Wednesday May 10, 2006 @08:12AM (#15299967)
    Don't have kids, do you?

    Most security problems do not enter the company through the company firewall/mail gateway. They are *carried* into the building on employees' (surprisingly often: managers') laptops. Laptops that are used at home for the kids to play with, to browse the web, or whatever. Or for the employees' own entertainment.

    I don't have kids, but a while ago I had a friend visit me, together with her 12-year-old daughter. We kinda lost track of her whereabouts and found her behind my company laptop (in my study), on MSN or something like that. I run Linux and was logged in as myself, not as root, so the damage she could have done to the OS was minor, but she got told off anyway. She now knows that next time she'll have to ask, and she's got her own account on my private desktop. But how many people will happily let their own or other people's kids use a company laptop while being logged in as Administrator?

    Another poster suggested that all laptops should be on a separate network, and I presume he also meant that this network should be firewalled off from the rest of the company network in such a fashion that only the standard applications/protocols are allowed. (Better yet: firewall each laptop off from the other laptops.) Unfortunately, in large companies with a mixed desktop/laptop environment, this is incredibly difficult to achieve.
  • by LanMan04 (790429) on Wednesday May 10, 2006 @08:16AM (#15299979)
    If you don't have any anti-virus software installed, or at least a scanner, how would you know whether your computer is infected or not? If your machine belongs to a botnet, you probably don't know about it.

    To put it another way: Just because you have no symptoms doesn't mean you don't have cancer.

    Is this little traffic light on your router blinking 24/7? :)
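
    As a hedged illustration of that point (my own sketch, not from the comment): one crude way to spot a quietly misbehaving machine is to list its established outbound connections and eyeball anything unfamiliar. This assumes the third-party psutil package; the "watch" ports are an arbitrary example.

    ```python
    import psutil  # third-party package; assumed available for this sketch

    # Classic IRC ports, once a popular choice for bot command-and-control.
    WATCH_PORTS = {6667, 6668, 6669}

    # Listing connections may require elevated privileges on some systems.
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            flag = "  <-- unexpected?" if conn.raddr.port in WATCH_PORTS else ""
            print(f"pid={conn.pid} {conn.laddr.ip}:{conn.laddr.port} -> "
                  f"{conn.raddr.ip}:{conn.raddr.port}{flag}")
    ```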
  • by Anonymous Coward on Wednesday May 10, 2006 @08:20AM (#15299991)
    Especially when they're senior management types? You can bitch all you want to anybody you can find who'll listen to you, but at the end of the day most companies place senior management and their desires ahead of those of the IT department: if Company Director X declines to follow IT department guidelines on security procedures, there is nothing IT can do about him and his activities which won't result in the IT guys being fired.

    So some Top Dog asshat opens a gaping hole into the company's system and there's not a damn thing IT can realistically do about it, because in most cases they are too far down the pecking order to get their way, but they will still be blamed for the breaches and disasters that follow anyway.
  • by kfg (145172) on Wednesday May 10, 2006 @08:26AM (#15300017)
    Your users are so stupid they put DVDs in their CD-ROM drives, then complain that the drive won't play their movie.

    Your users are so stupid they tried to plug their new phone into an Ethernet port.


    This is ignorance, not stupidity. The people who wrote the jokes were too stupid to know the difference.

    I like LBJ's line about stupidity:

    "They couldn't pour piss out of a boot if the instructions were written on the heel."

    KFG
  • by Maximum Prophet (716608) on Wednesday May 10, 2006 @08:43AM (#15300103)
    What many computer professionals don't realize is that a certain amount of loss due to crime is inevitable at any medium to large business. Stores like Walmart and Target have huge "shrinkage" problems, many times due to the employees themselves. Banks are constantly the victims of their own people, all the way up to the VP level. Because of this, businesses are forced to calculate how much security will save versus how much will be lost to crime. If you want military-level security, you can buy it, but even the military has had to deal with stolen information.

    The trick is getting a better crystal ball and figuring out how much a break-in will cost. Since IT people often can't properly estimate the cost of normal projects, predicting the cost of a hypothetical crime will be less accurate than predicting the weather. Perhaps institutes like SANS could put dollar-figure formulas on each threat type. Even though the formulas would require too many assumptions to be accurate, management types could plug in what they think and have the OMG moment w.r.t. security, or the lack thereof.
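
    A toy sketch of the kind of "dollar-figure formula" meant here, using the standard annualized loss expectancy idea (ALE = single loss expectancy x annual rate of occurrence); all threat names and figures below are made up for illustration.

    ```python
    # Hypothetical inputs: (single loss expectancy in $, expected occurrences/year)
    threats = {
        "laptop theft":        (25_000, 2.0),
        "phishing compromise": (80_000, 0.5),
        "worm outbreak":      (150_000, 0.2),
    }

    for name, (sle, aro) in threats.items():
        ale = sle * aro  # annualized loss expectancy for this threat
        print(f"{name:20s} ALE = ${ale:,.0f}/year")

    total = sum(sle * aro for sle, aro in threats.values())
    print(f"{'total expected loss':20s} = ${total:,.0f}/year")
    ```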
  • by Anonymous Coward on Wednesday May 10, 2006 @08:44AM (#15300110)
    Information Risk Managers didn't fail; their profession matured to the point that they realized there is no such thing as "Security", and that attempting to secure information from all comers is doomed to failure. The goal of our profession is "Risk Management", which involves (a rough sketch follows at the end of this comment):

    Identifying what is at risk.
    Identifying the threats to the assets.
    Assisting the business in assigning value to those assets. (Yes, the business, and not the security professional, is the final decider on value.)
    Analyzing the risk to those assets from identified threats.
    Assisting the business with a risk assessment.

    Information risks are just that: risks. Businesses have been making decisions around business risks for ages, and the successful ones stay in business. Nothing new here; move along, move along.

    If you still think you can provide "Security" then you are indeed a failure; however, with some new training and a slight ego reduction, you can start over as an Information Risk Manager.
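
    A minimal sketch of the steps listed above as a tiny risk register (my own illustration; the assets, values and likelihoods are hypothetical and would come from the business, not from the security professional).

    ```python
    from dataclasses import dataclass

    @dataclass
    class Risk:
        asset: str          # what is at risk
        threat: str         # identified threat to the asset
        value: float        # business-assigned value of the asset, in $
        likelihood: float   # estimated probability of loss per year (0..1)

        @property
        def exposure(self) -> float:
            """Expected annual exposure, used to rank risks in the assessment."""
            return self.value * self.likelihood

    register = [
        Risk("customer database", "SQL injection", 500_000, 0.10),
        Risk("payroll records",   "laptop theft",  120_000, 0.30),
        Risk("public web site",   "DDoS outage",    60_000, 0.50),
    ]

    for r in sorted(register, key=lambda r: r.exposure, reverse=True):
        print(f"{r.asset:18s} {r.threat:15s} exposure = ${r.exposure:,.0f}/year")
    ```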
  • by stinky wizzleteats (552063) on Wednesday May 10, 2006 @08:53AM (#15300163) Homepage Journal
    If you ask a building design engineer to tell you the most important part of a building, they'll say the foundation. If you ask a historian to tell you the most important part of the U.S. government, they'll say the Constitution. Aircraft - airframe. Car - chassis. And so on.

    When you build anything, you make certain fundamental underlying decisions that affect how the rest of the system works - forever. If something is fundamentally broken about any of these core decisions, the structure will be irreparably and irrecoverably broken. It is universally understood that you can't really fix a building with a flawed foundation or a ship with a broken keel. If those parts aren't right, nothing else matters.

    In the 1990s, the world decided to base virtually all computer systems upon an operating system designed by Microsoft. Systems were changing radically over the span of months. Millions of dollars in computer investment could be rendered completely useless if the computer world changed direction. The panic led to sort of a terrified groupthink - we had to make sure we were on the garden path to computer goodness as soon as possible. We didn't choose Microsoft because it was better, or because it was secure, but because in 1992, it looked like the only thing that would work. Now, in 2006, we know (as will be attested by the numerous Microsoft astroturfers who will undoubtedly respond to this posting) that you really can use any operating system to get the job done. The fear of total obsolescence has turned out to be unfounded. We had more of a choice in 1992 than we really thought.

    The question is not whether or not we made the right choice. It is rather how far the fragments of the ship have to sink before we decide to abandon it. How much of the building has to collapse before we evacuate it? How many wheels have to fall off the car before we pull over and call for a tow truck? The thing we most feared back in the '90s, total system failure from making the wrong crucial underlying choices, is happening every single day. When will we wake up and respond accordingly?
  • by Maximum Prophet (716608) on Wednesday May 10, 2006 @09:10AM (#15300281)
    Gack... That's because those worms were simply malicious. The newer cybercriminal is getting paid for his work, so he's more likely to lie low. Once he's compromised a machine, he doesn't want to get caught by interfering with the owner. Formatting the hard drive, or deleting files is sure to get you noticed. Most of the time these days, users don't know anything is wrong until they have multiple bots on their machine whose combined impact makes their machine impossibly slow.
  • by SonOfThor (684052) on Wednesday May 10, 2006 @10:52AM (#15300976) Homepage
    That all depends...many organizations have positions that are characterized by "all of the responsibility but none of the authority". This means that as a security professional, you may be able to recommend certain practices, but unless one has the authority to see to it that these recommendations are implemented, there really isn't a whole lot more that can be done.


    This is one of the reasons I refuse to ever work as anything less than Chief Information Security Officer - I have seen SO many directors, administrators, etc. who are "responsible" for information security but have little or no authority to implement the changes that they feel are required to achieve their security goals. I prefer to work as a consultant, often on the side of those with limited authority but maximum responsibility, to give credibility and support to their cause. It seems to me that management is more willing to listen to a highly paid third party's recommendations, even when their own guys may have been screaming the same thing for years!

"Kill the Wabbit, Kill the Wabbit, Kill the Wabbit!" -- Looney Tunes, "What's Opera Doc?" (1957, Chuck Jones)

Working...