Schneier on Economic Insights to IT Security

Scyld_Scefing writes "In his June 29, 2006 Wired News article, 'It's the Economy, Stupid,' Bruce Schneier covers the content of the 2006 Workshop on the Economics of Information Security. Schneier says that economic analysis of IT security issues is relatively new, and links to one of the significant earlier papers in the field, the 2001 'Why Information Security Is Hard -- An Economic Perspective' (.pdf). That paper states: 'According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved. In this note, I put forward a contrary view: information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons.'"
This discussion has been archived. No new comments can be posted.
  • by yagu ( 721525 ) * <{yayagu} {at} {gmail.com}> on Saturday July 01, 2006 @11:58AM (#15642662) Journal

    One of the hardest things about security is knowing whether you really have security. It's kind of like knowing your software doesn't have a bug: it's easy to know when you do have a bug; it's virtually impossible to know that you don't.

    I think security suffers the same or similar perception, rightly so. So, no matter how much you invest, how strict your policies, you really never know you have security. Couple that with how expensive it is to apply and enforce the more draconian policies... who wants to spend a fortune and find out they've been compromised anyway?

    And, extreme security makes computing far less transparent, often to the exclusion of any reasonable work flow for day-to-day tasks. If security could be transparent (not sure it can), that would help.... no business likes fielding support issues for an entire corporation just because their network uses PKI (ever administer Sun's version?).

    (I once worked at a place that had a thirteen-rule requirement for setting new passwords... it was so intrusive, I kept a printout of the rules on my monitor to try and avoid a twenty-minute guessing game session for setting new passwords. What was really funny was at one point the "rules" conflicted with one of our systems, so you couldn't define a qualified password that the system could use. Hilarious.)

    On top of all of that, no matter how diligent you've been, all it takes is one disgruntled (ex-)employee with a modicum of social-engineering savvy to render the investment worthless. It's no wonder security is a tough nut to crack.

    (As an aside opinion... I think the press gives too much attention to things like the recently stolen laptop with all of the info on it -- it was a stolen laptop, probably nothing more -- they get stolen all of the time, and people have no idea what they've gotten other than a "free" computer.)

    • by ScrewMaster ( 602015 ) on Saturday July 01, 2006 @12:15PM (#15642699)
      I had a similar experience many years ago. I did some consulting for a major hospital, and as it happened one contract I received was to reverse-engineer a multi-drop mainframe terminal protocol. The idea was to use regular PCs as terminals instead of the mainframe vendor's overpriced equipment. In any event, I was working with one of the hospital's programmers on the job, and I asked about getting a logon so I could start analyzing the protocol. He said, "Here, watch this." It turned out that Arthur Andersen (yes, that AA) had performed a security audit on the hospital and discovered that, as you would expect, the hospital's security was woefully inadequate. So they required that a triple-password scheme be implemented (yes, typing in three successive passwords to log in to the mainframe) in order to improve security and pass the audit. Well, as it happens this was back when "smart terminals" were getting popular, and this was a floor full of programmers, so it took about eight seconds after the last auditor left for the coders to agree on "F12" as a common macro key to spit out the required three passwords and log in. Everybody programmed their passwords into their own terminals so anybody could log in any time. Pretty funny, really, but it does go to show that what you're saying is correct: if security interferes too much with productivity there will be problems. Prior to that audit, everybody had a private password and used it. Afterwards ... productivity was unimpaired while security simply disappeared.
      • by BVis ( 267028 ) on Saturday July 01, 2006 @01:15PM (#15642864)
        Well, as it happens this was back when "smart terminals" were getting popular, and this was a floor full of programmers, so it took about eight seconds after the last auditor left for the coders to agree on "F12" as a common macro key to spit out the required three passwords and log in.
        Two problems here: Ignorant overpaid "consultants" who think a splint is a good remedy for food poisoning and a floor full of programmers who should be escorted to the door by (physical) security personnel.

        Just because a security policy is retarded is no reason to justify ignoring it. I don't care if the password policy is that you must dance a particular sequence on a DDR pad for access, if that's the security policy, you follow it until a better policy can be put in place.
        • No, the problem was ignorant, overpaid "consultants" who thought a bludgeon was a good replacement for actually analyzing the situation and solving the problem. The idea was to make their own jobs easier so they could leave the site having "increased security," thereby justifying their rather hefty fees. Those consultants were paid serious money to come up with a solution that would balance the customer's stated security requirements with the need for workers to actually, well, work. The consultants failed,
        • Just because a security policy is retarded is no reason to justify ignoring it.
          That sounds like a good reason to me! You should follow rules that serve practical and ethical purposes, but you are morally obligated to circumvent the useless cock snot coughed out by some process consultants.
          • You should follow rules that serve practical and ethical purposes, but you are morally obligated to circumvent the useless cock snot coughed out by some process consultants.

            By that logic I should be able to plant a dozen pot plants in my back yard and drag my idiot Governor from his car and beat him with a lead pipe. You can't pick which rules to follow and which ones not to. If the rule is bad, change the rule. If everyone chooses to ignore security policy you may as well not have one. ANY security p

            • You can't pick which rules to follow and which ones not to.

              You can, and do. In your hypothetical example (violent assault of a public official), you made the wrong choice because you hurt another person and cheated the democratic process, not because you violated any law. (The law, in this case, exists so that we have a fair process for figuring out how to convict, jail, and execute you.)

              Rules serve to protect the more intangible exchanges of human nature. To make things fairer. To gain efficiencies by s

              • You're missing my point. If the rule is bad, change the rule. But in the meantime, the bad rule is STILL the rule.

                We're not talking about ethics or morals here. We're talking about computer security. Security policy must be enforced at all times; if it isn't, and people are allowed to get away with breaking it, when the rule IS changed to not suck so much, people still won't follow it.
        • Just because a security policy is retarded is no reason to justify ignoring it.

          Well, that depends... I've seen cases where the employees definitely should have followed the policy even though it seemed retarded to them, but I've also seen instances where the business would come to a screeching halt if the policies were actually followed. I think it's part of the blame distribution process - when shit hits the fan senior management can point to the security regs and say this is against protocol, isolated inci
          • Like in this case, it sounds really good on paper to have triple passwords.
            Yes, indeed, because neither upper management nor the Arthur Andersen hacks were required to use them, and both of those groups were well-enough paid that they should have had some inkling that this was a bad idea.

            I guess hiring an accounting firm to perform a security audit wasn't all that bright either, now that I think of it.
    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Saturday July 01, 2006 @12:27PM (#15642735)
      Just to make this clear, "security" is not an end item. You cannot "have" security. My definition is: The process of identifying and evaluating threats and reducing their effectiveness.

      As Bruce says, when there isn't an economic incentive, that process is not maintained.

      But, suppose you are maintaining it. How do you know how good your security is?

      Bruce also wrote about "attack trees".
      http://www.schneier.com/paper-attacktrees-ddj-ft.html [schneier.com]

      Identifying and evaluating the different avenues of attack is part of evaluating the threats. Once you've identified one, don't think about how you can "prove" it is "secure". Think about how you would go about showing that it is NOT secure. Make your statements about your security "falsifiable". Just like in the scientific method.

      Then experiment, on an ongoing basis, to see if you can demonstrate that your security can be broken. This takes time and effort on your part, as you have to continually read about the latest advances and theories.

      Which gets back to the economic issue. If the organization does not see an economic incentive for you to perform that research/work, then you will be assigned to other tasks and the process will not be followed. If you are not following the process, there is no "security".
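
      To make the attack-tree idea concrete, here is a minimal sketch (this is not Schneier's notation or tooling; the goal, the leaves, and the dollar costs are all invented for illustration). OR nodes mean the attacker can pick any one child, AND nodes mean the attacker needs all of them, and the cheapest route through the tree is your weakest link:

          # Minimal attack-tree sketch (illustrative only; goal, leaves and
          # costs are made up).  OR = attacker picks the cheapest child,
          # AND = attacker must accomplish every child.
          def cheapest_attack(node):
              """Return (cost, steps) for the cheapest way to reach the goal."""
              if node["type"] == "leaf":
                  return node["cost"], [node["name"]]
              results = [cheapest_attack(c) for c in node["children"]]
              if node["type"] == "OR":
                  return min(results, key=lambda r: r[0])
              cost = sum(r[0] for r in results)               # AND node
              steps = [s for r in results for s in r[1]]
              return cost, steps

          goal = {"type": "OR", "children": [
              {"type": "leaf", "name": "break the VPN crypto", "cost": 100000},
              {"type": "AND", "children": [
                  {"type": "leaf", "name": "phone the helpdesk", "cost": 50},
                  {"type": "leaf", "name": "guess a weak password", "cost": 200},
              ]},
          ]}

          print(cheapest_attack(goal))   # (250, ['phone the helpdesk', 'guess a weak password'])

      Re-running something like this whenever your estimates change is one cheap, falsifiable way of asking "what is the easiest way in right now?", which is the question the attacker is asking too.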
    • It's easy to know when you do have a bug

      Since this is about security, a bit of nitpicking is in order.

      There are at least two meanings.

      First reading: it's easy to know when you do have a bug, because you certainly do have bugs; you just have no idea what, where, how, etc. You can even use statistics to draw confidence intervals on the number and severity of the bugs.

      Second reading: it's easy to know when you do have a bug, assuming that if you have a bug you'd know it. This one is false, very false. It is quite possible for a bug to exist and to not be demonstrable under any circumstances.
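
      On the statistics point, here is a minimal sketch of how such a confidence bound can be drawn, using the so-called rule of three (the test count is invented, and the bound only covers inputs drawn the same way your test inputs were):

          # Rule-of-three sketch (illustrative): after n independent, representative
          # test runs with zero failures, a ~95% upper confidence bound on the
          # per-run failure probability is about 3/n.
          import math

          def failure_rate_upper_bound(n_tests, n_failures=0, confidence=0.95):
              if n_failures == 0:
                  # Exact bound: solve (1 - p)^n = 1 - confidence for p
                  return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)
              # Crude normal approximation once failures have been seen
              p = n_failures / n_tests
              z = 1.645   # one-sided 95%
              return p + z * math.sqrt(p * (1.0 - p) / n_tests)

          print(failure_rate_upper_bound(1000))   # ~0.003, i.e. roughly 3/1000

      So "zero failures observed" never means "zero bugs"; it only caps how often bugs can bite, and only for inputs that look like the ones you tested.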
      • ... if you have a bug you'd know it. This one is false, very false. It is quite possible for a bug to exist and to not be demonstrable under any circumstances.

        Example: RC4. The keystream was supposedly indistinguishable from random data. People believed this for the better part of a decade, but they were wrong.

        There's also that ssh1 key parsing bug that was found a few years ago.
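
        For the curious, here is a rough sketch of the kind of distinguisher that killed that belief: RC4's second output byte is 0 about twice as often as a truly random stream's would be (the Mantin-Shamir broadcast bias). The key length and trial count below are arbitrary:

            # Empirical check of the Mantin-Shamir bias: Pr[second RC4 output
            # byte == 0] is ~2/256 instead of the ~1/256 a random stream gives.
            import os

            def rc4_keystream(key, n):
                s = list(range(256))
                j = 0
                for i in range(256):                       # key scheduling (KSA)
                    j = (j + s[i] + key[i % len(key)]) % 256
                    s[i], s[j] = s[j], s[i]
                i = j = 0
                out = []
                for _ in range(n):                         # keystream generation (PRGA)
                    i = (i + 1) % 256
                    j = (j + s[i]) % 256
                    s[i], s[j] = s[j], s[i]
                    out.append(s[(s[i] + s[j]) % 256])
                return out

            trials = 100000
            zeros = sum(rc4_keystream(list(os.urandom(16)), 2)[1] == 0 for _ in range(trials))
            print(zeros / trials)   # ~0.0078 for RC4 vs ~0.0039 for a random stream

        Exactly the kind of bug that can sit there for years without being demonstrable by casual testing.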

    • And, extreme security makes computing far less transparent, often to the exclusion of any reasonable work flow for day-to-day tasks. If security could be transparent (not sure it can), that would help.... no business likes fielding support issues for an entire corporation just because their network uses PKI (ever administer Sun's version?).
      Methinks the reality is that losing transparency means losing security.

      (I once worked at a place that had a thirteen-rule requirement for setting new passwords... it was
  • by Anonymous Coward
    Since you can sue to death anyone breaching security, you only need to put a cheap fence around the company assets and invoke the DMCA.
    • You make fun of it, but for some of us it is a problem we have to deal with daily.

      I myself work at a multi-national, multi-location site with a mixed environment of mainframes, servers, terminals, Windows workstations, and Mac workstations.
      I have to implement the Sarbanes-Oxley policy on their Macintosh computers. They never got anyone to do it decently, so currently it's a mixed environment of G4s and G5s with all different software, licensed and unlicensed versions of Mac OS, Office, random p
  • Still too limited (Score:4, Interesting)

    by Beryllium Sphere(tm) ( 193358 ) on Saturday July 01, 2006 @12:12PM (#15642692) Journal
    Put the incentives in the right place and there's still the issue of implementation. Nobody benefited from Chernobyl blowing, but it did anyway, and investigators think part of the reason is that there were no reactor engineers on duty. Security, just like industrial safety, depends on having trained and informed people at critical decision-making points.

    Making security usable is another implementation issue. Everyone wanted airplanes to land safely, especially the pilots who were inside them, but there was one crash after another due to "pilot error" until the aerospace world began laying out controls and instruments to meet the needs of the pilots who used them.

    True, incentives do come first. But even then they need to be carefully chosen. Bad publicity and the threat of job loss didn't make the VA careful: instead those incentives fueled a search for scapegoats, a search which ended with the analyst who, on three occasions, had been issued written permission to take the data home with him.
    • We've had this discussion before. For those folks where security is paramount there will be trade-offs in usability. If you want more security then you have to jump through more hoops. The end users often (unfortunately) have the final say in usability and therefore the extent of security. Where users value security more than the annoyance of jumping through hoops, security is better implemented. Where you don't want to be is caught between usability issues versus "how secure I thought I was". The VA
      • This VA situation also appears to be yet another case of IT being given responsibility without any of the required authority. Had you asked anyone in IT whether this was a good idea or not, unless they've all been lobotomized, they'd say "absolutely not". But, since they have no authority, the users know they can basically do whatever the fuck they want, since IT will catch the heat for anything they do wrong anyway.

        It's kind of like telling a police officer, "OK guard this prisoner, but you can't watch h
      • The end users often (unfortunately) have the final say in usability and therefore the extent of security.

        THERE'S your problem.
        The end users have the final say on security. Really.
        It's like the bit about physical security.
        Security is not about the hardest way in (IT and management controlled) but the easiest way in (user controlled).
        Now it is completely feasible for management and IT to delude each other about the state of security. I assume that is the normal state of affairs.

        If stuff in an office needs to
        • In real life that's how it happens. I've seen suggestions for improvement in security shot down because of the impact it would have on the end users. I'm sure others have too. Physical and IT security are the same in some ways. If everyone in the office has to suddenly unlock three deadbolt locks on their office door, plus unlock the doorknob, when they used to keep it unlocked, then they will freak. Same thing with security for IT. Try to force the end users, especially those that are the "powers tha
          • They WILL NOT fire that senior partner who is bringing in the big bucks because he did something stupid on the computer

            Hmmm.
            senior partner who is bringing in the big bucks
            computer

            Basic security. You don't risk valuable resources (senior partner) to preserve cheap resources (computer).
  • by Anonymous Coward
    http://www.ecampus.com/bk_detail.asp?isbn=0521605210&referrer=frgl [ecampus.com]

    Cheapest place a quick froogle revealed. I read this book a few months ago and found it pretty interesting, though perhaps best in its role as summarising further papers for reading.
  • by CodeBuster ( 516420 ) on Saturday July 01, 2006 @12:29PM (#15642738)
    It should not be surprising to people that economics provides the basis for explaining many interesting situations that occur in the real world in relation to computer security. Recall that economics is the study of how humans react to scarcity, or more bluntly how we behave in light of the fact that we cannot simply snap our fingers and have anything we want immediately placed in front of us all of the time (with the possible exception of Bill Gates and a few others, but they are not representative). It is precisely the ability of economics to insightfully solve common conundrums with deliciously counterintuitive explanations that seems to fascinate so many people, as evidenced by the recent success of books such as Naked Economics: Undressing the Dismal Science [amazon.com] and Freakonomics [amazon.com], despite the generally boring ways in which the subject is presented by our schools. If it involves human interactions and human nature then, ultimately, it involves economics.
    • It has a profound effect on our society.

      Take for example the debt-based money system we have now. The government has the ability to print money (well, borrow) as it likes. Well, when you have that power, it's pretty damned difficult not to use it. After all, raising taxes is about as popular as a fart in a lift and all politicians want to be re-elected. So borrow some money from the central bank to pay for your pet oil liberation project. This has a number of implications:

      1: We've increased the amount of money
      • Inflation. Though it's perceived to be a general increase in prices, it's essentially a tax on the currency-holding population.

        The effect on debtors and creditors should far outweigh the effect on holders of currency. If you loaned money to someone to buy a house, inflation is very bad for you and very good for the person to whom you lent money.

        The more you expand, the smaller the debt is in proportion, so you must expand. Which basically means there must be a continual increase in the exploitation of resour

        • The effect on the holders of the currency is whatever the inflation rate is; 5% or 10% would be quite a big effect, particularly on the poor. And if you don't adjust the interest rate on the loans you provide for inflation in a timely fashion, then yes, it will be very bad for you. If the inflation rate gets very high it's very bad indeed for the debtor. High interest rates, payments out of control, house lost and all that.

          "Economic expansion does not require the exploitation of resources. If exploitation of resources wa
          • If the inflation rate gets very high it's very bad indeed for the debtor. High interest rates, payments out of control, house lost and all that.

            No, inflation is very good for any debtor. If you owe $1000 at 5%, and inflation is 10%, then the debtor actually makes money in the transaction. Even ARM (adjustable rate mortgages) are typically fixed for years, and even when they do adjust they are not likely to change more than inflation. And that's only talking about mortgages, there are many other types of loans, and many don't adjust at all.
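
            To put rough numbers on that claim, here is a one-year toy calculation (figures invented, taxes ignored, and it assumes your income keeps pace with prices):

                # Toy example: fixed-rate debt under inflation (all numbers invented).
                principal = 1000.00        # borrowed today
                nominal_rate = 0.05        # fixed 5% interest
                inflation = 0.10           # 10% price inflation over the year

                repayment = principal * (1 + nominal_rate)            # $1050 paid back next year
                repayment_today_money = repayment / (1 + inflation)   # ~$954.55 in today's purchasing power
                real_rate = (1 + nominal_rate) / (1 + inflation) - 1  # ~ -4.5% real interest

                print(round(repayment_today_money, 2), round(real_rate, 4))   # 954.55 -0.0455

            In purchasing-power terms the borrower hands back less than was lent; the replies below about variable-rate loans and stagnant wages are about the cases where those assumptions break down.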

            • $1000 at 5%, and inflation is 10%, then the debtor actually makes money in the transaction. Even ARM (adjustable rate mortgages) are typically fixed for years, and even when they do adjust they are not likely to change more than inflation. And that's only talking about mortgages, there are many other types of loans, and many don't adjust at all.

              Um, not here in the UK; most mortgages are variable rate, i.e. set at the central bank base interest rate plus a couple of percent. The central bank increases/decrea

              • The closer to the supply of money, the larger is the differential, the larger is the benefit.


                I think I see what you're getting at, but could you please provide a source? I am not trying to disagree, but I am not entirely convinced and I would be interested to read a more thorough explanation.
            • No, inflation is very good for any debtor. If you owe $1000 at 5%, and inflation is 10%, then the debtor actually makes money in the transaction.

              Not true. It depends on how rich you are. Let me illustrate:

              Let's say I make $1000 per month, constantly, and need to spend $600 for rent, food, etc. I have a monthly obligation to the bank of $300. Inflation hits. I still have to pay the $300, while my daily life gets more expensive because of the inflation. For most of the working population, that will probably

              • Not true. It depends on how rich you are. Let me illustrate:

                In your example, your salary is declining, because the value in dollars is constant while the value of a dollar is declining (inflation). Generally, as your skills and experience increase, your salary will follow. A person's salary will decrease if the market value of their job decreases, or if it was higher than market value to begin with (for instance, in the case of minimum wage).

                But yes, "the rich" or middle class sometimes benefit (in the shor

                • In your example, your salary is declining, because the value in dollars is constant while the value of a dollar is declining (inflation). Generally, as your skills and experience increase, your salary will follow.

                  Not at all. Actually, virtually everyone I talk to these days is complaining about the decline in salaries. Taxes have been going up significantly for years now (at least here in Germany, YMMV), salary rises are mostly unheard of, and those that make it into the media are in the 1-3% range, by far not enough to outwe

                  • virtually everyone I talk to these days is complaining about the decline in salaries

                    I don't know much about the German economy (I have a friend that moved there a few years ago, but he doesn't talk much about the economy), but the US economy is actually doing quite well. We had a recession (technically, it wasn't even a recession according to the definition, but it's generally recognized as a recession). Now the recession is over and we are recovering (not that it was very bad anyway).

                    To me, the problems in

  • by Anonymous Coward
    'Why Information Security Is Hard -- An Economic Perspective' (.pdf). This article states: 'According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved. In this note, I put forward a contrary view: information insecurity is at least as much due
  • Insurance risk (Score:5, Interesting)

    by stox ( 131684 ) on Saturday July 01, 2006 @12:43PM (#15642770) Homepage
    We will not see real security until insurance companies start to really evaluate the risks involved. Once premiums skyrocket due to poor security, then people will pay attention.
  • by Dadoo ( 899435 ) on Saturday July 01, 2006 @01:37PM (#15642914) Journal
    I've been telling my co-workers for a long time - while hackers who break into companies' networks should be punished, the companies themselves should be punished more. The very first paragraph of this essay (the one comparing the European banks to the American banks) would seem to agree with me.

    Let's face it: if your corporate network can't stand up to some high-school kid in his basement, it certainly isn't going to stand up to a well-funded foreign power trying to attack us.
    • Looks like several of 'em in the same general space.
      Other than specific references to Windows 2000, seems relevant regardless of epoch.

      [4] R. J. Anderson, "Why Cryptosystems Fail," Communications of the ACM, vol. 37, no. 11 (November 1994), pp. 32-40.

      [1] G. A. Akerlof, "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism," Quarterly Journal of Economics, vol. 84 (August 1970), pp. 488-500.

      From the paper,

      The theory of asymmetric information gives us an explanation of one of the mechanisms. Consider a used car marke
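
      To make the lemons argument concrete as it applies to security products, here is a toy round of such a market (all values invented): buyers cannot tell a genuinely secure product from a shoddy one, so they only offer the average value, and the vendor of the good product cannot recoup its higher cost:

          # Toy "market for lemons" round applied to security products (numbers invented).
          products = [
              {"quality": "strong", "value_to_buyer": 100, "cost_to_build": 80},
              {"quality": "weak",   "value_to_buyer": 50,  "cost_to_build": 30},
          ]

          # Buyers can't verify security claims, so they pay only the average value.
          offer = sum(p["value_to_buyer"] for p in products) / len(products)   # 75.0
          survivors = [p["quality"] for p in products if offer >= p["cost_to_build"]]

          print(offer, survivors)   # 75.0 ['weak'] -- only the insecure product stays on sale

      Which is the point the paper is driving at: when buyers can't verify security, the market rewards the vendor who skimps on it.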

    • Mod parent up: here is the IEEE citation [ieee.org] from 2001.
  • It occurs to me that this is similar to what I encountered when I was a sysadmin. The boss has no idea how many problems the company didn't have because you're good at your job. In fact, an admin who's always fighting fires can be highly valued for all of the work they put in.

    With security, the only measure is imagining the cost of outages and security breaches, maybe for other companies if you're good enough or lucky enough to prevent them. Otherwise, the bean counters will only look at what you want to sp
  • The next workshop on economics & info security will be held in October. So if you have strongly held views in this area (and who on slashdot lacks strongly held views), then think about submitting. You don't have to be an academic to submit a paper, although arguments should be carefully constructed and well organized.

    The Workshop on the Economics of Securing the Information Infrastructure (WESII) [econinfosec.org]

    • Workshop: October 23-24, 2006, Washington DC
    • Papers due: August 6, 2006

    Suggested topics (not in

  • Ross Anderson made an interesting presentation on the Economics of Dependability and Security at Networkshop this year which provides a good overview of the subject. The video and slides are linked from:
    http://www.ja.net/services/events/networkshop/Networkshop34/webprog.html [ja.net]
