Busting People for Pointing Out Security Flaws 350

gsch writes "'In 2004, Bret McDanel was convicted of violating section 1030 when he e-mailed truthful information about a security problem to the customers of his former employer. The prosecution argued that McDanel had accessed the company e-mail server by sending the messages, and that the access was unauthorized within the meaning of the law because the company didn't want this information distributed. They even claimed the integrity of the system was impaired because a lot more people (customers) now knew that the system was insecure. Notwithstanding the First Amendment's free speech guarantees, the trial judge convicted and sentenced McDanel to 16 months in prison. I represented him on appeal, and argued that reporting on security flaws doesn't impair the integrity of computer systems. In an extremely unusual turn of events, the prosecution did not defend its actions, but voluntarily moved to vacate the conviction.'"
This discussion has been archived. No new comments can be posted.

  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Wednesday May 10, 2006 @09:09AM (#15300268) Journal
    If I were a customer of a company with the mentality "anyone who helped develop the code is a threat to its security," then I would find another vendor--and fast!

    There are practices and standards for developing secure code. If your programmers follow these, then even their knowledge of the source shouldn't matter if they go rogue or want to have fun in their free time. Look at Linux: an operating system used by millions, and every hacker in the world can get their hands on the source code. Why don't we see many viruses for Linux? Because it was implemented well. Perhaps companies should start to realize that if they produce code for Win32 applications, they're going to have to resort to the same tactics that Microsoft uses: don't let the source code out, or its true flaws will be revealed and exploited!

    For the consumers of these companies: be aware that the product is only as secure as the company's relationship with its developers--kind of scary considering they're keeping those developers quiet via threat of lawsuit.
  • C'mon.... (Score:5, Insightful)

    by Otter ( 3800 ) on Wednesday May 10, 2006 @09:17AM (#15300326) Journal
    Jail time for McDanel is almost certainly excessive, but that doesn't mean that accessing (or hax0ring -- it's not clear what he did) your ex-employer's email server to write to all their customers isn't a stupid idea, let alone that it's a protected First Amendment matter.

    And as long as we're slinging around prissy "Will they ever learn?"s, the other poor victim of persecution, McCarty (what's up with all these Celts?) is a real case of failure to learn. Has it not sunk in yet that you simply can't intrude on systems or files without permission, however helpful your intentions? How freaking difficult is that for people to grasp?

  • by QuantumG ( 50515 ) <qg@biodome.org> on Wednesday May 10, 2006 @09:19AM (#15300340) Homepage Journal
    Meh. If you don't demand source, you should expect security flaws.
  • Re:Understandable (Score:2, Insightful)

    by ArsenneLupin ( 766289 ) on Wednesday May 10, 2006 @09:19AM (#15300342)
    Prosecutors, at least in my neck of the woods, don't give two shits about justice or truth. They just want convictions.

    Well, that's their fucking job! They represent the accusation, after all.

    I'd be more concerned if the judge just wanted convictions. That's the guy who is supposed to be impartial, not the prosecution.

  • by Saint Fnordius ( 456567 ) on Wednesday May 10, 2006 @09:29AM (#15300391) Homepage Journal
    The image a prosecutor wants to project is one of infallibility: if the prosecutor weren't sure himself that the suspect is guilty, he wouldn't go to trial. The image a prosecutor wants to have is that of a guy who is fair and doesn't waste time or money prosecuting innocents.

    That said, I think I ought to reiterate that I'm talking about image, not whether the prosecutor is actually fair. Far too many prosecutors are willing to tar innocents rather than admit they nabbed the wrong guy.

    Still, it may be that this prosecutor actually learned something, and decided to cut his losses rather than look like a bully working for the company (instead of the public interest). This was a criminal case after all, not a civil lawsuit.
  • Solution? (Score:2, Insightful)

    by Uncle Rummy ( 943608 ) on Wednesday May 10, 2006 @09:30AM (#15300393)
    FTA:

    A third [solution] might be to define unlawful access as the circumvention of some kind of security measure.

    I'm not so sure about this one. After all, we're talking specifically about criminal liability for researchers who demonstrate that the security of a system is broken. Criminalizing the circumvention of security is exactly the problem many people have with laws such as the DMCA.
  • by Technician ( 215283 ) on Wednesday May 10, 2006 @09:33AM (#15300417)
    The thing that may have raised eyebrows is that he found a fault and sent the information to a third party, who then contacted the owner. The owner then checked logs to find out who breached the system.

    If he had found the problem and contacted them directly, they might have been more willing to patch it and say thanks.
  • by Anonymous Coward on Wednesday May 10, 2006 @09:34AM (#15300426)
    After reading tfa it seems that the McDanel case is different from the other two in one very important way: intent.

    - McCarty notified security professionals about the issue.

    - Puffer notified the system owner/operator of the security issues.

    - McDanel notified the customers of his former employer.

    TFA does not go into detail as to why McDanel was no longer employed by the company, but it's not a huge leap to assume that he did not leave willingly. Was he really concerned about the information security of the customers he contacted, or was he more interested in causing damage to his former employer? Did he notify his company of the security issues before he left?

  • by geoffspear ( 692508 ) on Wednesday May 10, 2006 @09:36AM (#15300435) Homepage
    The case was a criminal prosecution.

    That said, I wouldn't want to hire a lawyer who thinks that the 1st Amendment is likely to be interpreted by any court as protecting speech that reveals "secret" information, especially if it's done by breaking into a computer system in the process.

    The fact that the charges were later vacated by the prosecution might indicate that they didn't really have a case, but I don't think the 1st Amendment is likely to be the reason why.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Wednesday May 10, 2006 @09:39AM (#15300454)
    Comment removed based on user account deletion
  • Re:and? (Score:3, Insightful)

    by Anonymous Coward on Wednesday May 10, 2006 @09:40AM (#15300458)
    "My friend used to work for an airline, and he had made comments about .. how easy it would be for someone on the inside to disrupt air traffic .."

    I don't suppose you will corroborate this fictional anecdote with the name of the airport and the name and manufacturer of the security system.

    Surely in your country this is cause for a massive class action against the airport.
  • by Irish_Samurai ( 224931 ) on Wednesday May 10, 2006 @09:44AM (#15300486)
    Why don't we see many viruses for Linux?

    While I think that implementation may have a little to do with it, I think the driving factor is that Linux has nowhere close to the user base that Windows does.

    The purpose of many of these viruses is to create a large botnet. That's a lot easier to do when you target an OS aimed at the everyman computer user who lacks a sophisticated understanding of his box and how to maintain it. Linux, on the other hand, has nowhere close to that user base, and it's spread across so many different releases and distros that creating a virus for Linux is probably done just to prove a point. The numbers just don't warrant the attention yet.
  • by Black Parrot ( 19622 ) on Wednesday May 10, 2006 @09:52AM (#15300544)
    of Shoot the Messenger.

    That seems to be the only solution businesses and politicians can come up with for their self-caused problems anymore.
  • by slashname3 ( 739398 ) on Wednesday May 10, 2006 @10:04AM (#15300633)
    It is partially a numbers game. However, if Linux systems (or any Unix system) had easily exploited security flaws, then there would be huge numbers of worms and viruses targeting them. If nothing else, they would be excellent platforms from which to launch attacks on the huge numbers of Windows systems.

    The real reason you don't see many viruses or worms directed at Linux systems is that the concept of least privilege was implemented from the start. Unlike most Windows systems, where users run with administrator privileges that allow a virus to do whatever it wants once it executes, Linux users typically don't run everyday applications with admin or root privileges. As such, it is much more difficult for code executed on a Linux system to gain complete control of the system.

    There are exceptions to all this: some Windows users have locked down their systems, and some Linux users run as root all the time. Both are relatively small groups.

    And with the introduction of SELinux, security is getting even better on Linux systems. But no matter how good the available security tools are, nothing can prevent a bad administrator from setting up an insecure system. The last few compromised Linux systems I heard of were all owned because users chose very poor passwords. Maybe someday when we can get rid of the users we can have real security. :)
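    To make the least-privilege idea above concrete, here is a minimal Python sketch of a process giving up root rights once its privileged setup is done; the "nobody" account and the helper name are illustrative assumptions, not anything from the comment itself.

        import os
        import pwd

        def drop_privileges(username="nobody"):
            # Give up root after any privileged setup (e.g. binding a low port).
            # Minimal sketch of least privilege: even if code that runs later is
            # compromised, the process no longer holds root.
            if os.geteuid() != 0:
                return  # already unprivileged; nothing to drop
            pw = pwd.getpwnam(username)
            os.setgroups([])      # drop supplementary groups first
            os.setgid(pw.pw_gid)  # then the group ID
            os.setuid(pw.pw_uid)  # finally the user ID; this is irreversible

        # ... privileged setup would happen here ...
        drop_privileges()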
  • by Akoma The Immortal ( 36474 ) <pascal@NOsPam.abessolo.com> on Wednesday May 10, 2006 @10:05AM (#15300645) Homepage
    Right. So all those web servers running Apache on Linux account for how much of the web? 60, 65, 70 percent? I don't know; check Netcraft.

    Imagine the botnet you could have if you managed to compromise all of them, silently sending data, doing damage.

    Numbers, numbers you said.

    Try again.
  • by Splab ( 574204 ) on Wednesday May 10, 2006 @10:09AM (#15300670)
    Since the customer is always right, the customer has to know what security problems mean - and why he/she should care.

    In my experience, moving a piece of graphics one pixel has way more priority for a customer than fixing an SQL injection problem, and since the company developing the software gets money for moving the graphics around, but not for fixing the bug - guess what I'm being told to do...
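    For concreteness, here is a minimal sketch of the kind of SQL injection fix being deprioritized above, using Python's sqlite3 module; the table, column, and input values are illustrative assumptions.

        import sqlite3

        conn = sqlite3.connect(":memory:")           # throwaway example database
        cur = conn.cursor()
        cur.execute("CREATE TABLE users (name TEXT)")

        user_input = "alice' OR '1'='1"              # attacker-controlled value

        # Vulnerable: splicing user input into the SQL string lets the input
        # change the structure of the query itself.
        #   cur.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

        # Fixed: a parameterized query keeps the input as data, never as SQL.
        cur.execute("SELECT * FROM users WHERE name = ?", (user_input,))
        print(cur.fetchall())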
  • by Irish_Samurai ( 224931 ) on Wednesday May 10, 2006 @10:20AM (#15300746)
    Well, I hardly think that the people maintaining web servers are technical idiots. So a set of systems constantly monitored and maintained by people who are generally neurotic about it isn't exactly the most vulnerable target for building botnets, is it? The home users are.

    Thanks for playing.
  • by deanj ( 519759 ) on Wednesday May 10, 2006 @10:44AM (#15300931)
    The summary was written by the lawyer representing this guy (as others in this thread have pointed out), so there's obvious spin going on. The real kicker of all this is his lame "Free Speech Rights" claim.

    The government didn't do a freaking thing to limit his "free speech". The guy did something vindictive against his former employer, got caught at it, and they went after him.

    It's stupid statements like that which don't put this guy (or the lawyer) in a very good light. It sounds like he's grasping at straws, looking for some way to vindicate his client for doing something really stupid.
  • by PPGMD ( 679725 ) on Wednesday May 10, 2006 @10:53AM (#15300989) Journal
    Numbers are one factor; the administrator is another.

    The average home PC is administered by someone who has no clue about security, while the average Apache admin knows how to lock down a system and doesn't use it for everyday stuff like reading e-mail and running programs randomly downloaded off the Internet.

    If we gave Linux machines to the same idiots who run Windows XP machines, you would have botnets. There might not be as many, but they would still be there, because many viruses spread via social engineering, not via operating-system tricks. The dumb user is not something Linux can fix.

  • Re:and? (Score:2, Insightful)

    by Hoch ( 603322 ) <hochhechNO@SPAMyahoo.com> on Wednesday May 10, 2006 @10:55AM (#15300999)
    And surely in yours, it is cause for massive terrorism against it.
  • by Y2 ( 733949 ) on Wednesday May 10, 2006 @10:55AM (#15301003)
    The real reason you don't see that many viruses or worms directed at linux systems is that the concept of least privilege was implemented at the start.

    No it wasn't. And it still hasn't been.

    Certainly it has a concept of "less than full privilege," and that was there from the start, having been copied from earlier systems. Windows has this concept also, but it's perhaps more honored in the breach than the observance. However, my email client, my video player, and my web browser still run with the full privilege of my user account, when something less would be sufficient. Any protection I have from malicious content is due either to efforts within the application rather than the OS, or to my choosing a bare-bones application which is as dumb as a box of rocks.

  • by dedeman ( 726830 ) <dedeman1&yahoo,com> on Wednesday May 10, 2006 @10:59AM (#15301041)
    I would say that prosecution of this guy is warranted only if the parties responsible for security administration at the company are also subject to prosecution for letting security flaws go.

    For a private sector company, who would you first inform of system vulnerabilities? The company itself, I would imagine. After that (assuming no action is taken)? Not really my call to make, but there must be some amount of culpability laid at the feet of those responsible for security, particularly if they are made aware of vulnerabilities.

    Until there are laws regarding the fixing of flawed security, there should be relaxations of rules for those who, in good faith and effort, inform the possible victims of software vulnerabilities, particularly when the system is engaged in online commerce (makes for a big target).

    Not being a lawyer, I still believe in what I'll call "fairness". Given two examples:

    #1 Sysadmin/former sysadmin informs customers of possible vulnerabilities or exploitation of personal/financial/medical information = possible jail term

    #2 Sysadmin/company is aware of vulnerabilities, but either can not or will not inform customers/fix problems/make anyone outside the company aware of problem = unhappy customer base

    I see a disparity here. One example risks the welfare of the company; the other, its user base.
  • by HTH NE1 ( 675604 ) on Wednesday May 10, 2006 @11:28AM (#15301244)
    He said, "If... you don't".

    But I'll say, if you do demand source you should be able to find and fix any security flaws yourself and report them for the benefit of those who can't and/or don't.

    Fixing flaws will always be faster for open source because users can do it for themselves, and flaws will be found faster too, since more users are proactively looking for and fixing them than a closed-source company will ever assign (which treats that as a waste of manpower better tasked to adding new features and enhancements, i.e. future profits).
  • by Irish_Samurai ( 224931 ) on Wednesday May 10, 2006 @11:32AM (#15301273)
    Man, this is something I sit up at night and try to figure out. How do you create a means of educating an ignorant end user to a satisfactory point of sophistication, all the while making the barrier to entry nonexistent?

    The problem is also compounded by the fact that the tech behind the scenes is getting more complex by the minute as the concepts build on each other.

    I think a cool idea would be to create some sort of setting or application that runs on your Windows box and proactively explains things when they come up. Somewhat like what ESPN had going about three years ago with hockey games: once a week a game was chosen to be the "learning" game, and whenever a penalty was called, the announcers would briefly explain and illustrate what the penalty was, how it occurred, why it was a penalty, and the price to be paid.

    I know they have a help file now, but no one is going to go out of their way to learn something like this. Maybe a little more comprehensive tool tip text type of thing would do the trick.

    Just as long as it isn't animated and doesn't make noise.
  • by Akoma The Immortal ( 36474 ) <pascal@NOsPam.abessolo.com> on Wednesday May 10, 2006 @11:33AM (#15301279) Homepage
    Yes. You are right.

    But (you saw that BUT coming, didn't you? :-P), when a socially engineered mail bomb or trojan uses a flaw in the OS to propagate itself, is it the fault of the user, or of bad OS design?

    Like when Sasser, or Slammer (so many names, I am mixing them up), was running wild on the Internet, I had a dozen e-mails containing the trojan payload and I opened them! That's right, I opened them and nothing happened. Why? Because I was smart? No, I wished to make a point to my friend. I used Mozilla on Linux: nothing happened. I used Mozilla on Windows: same result, nada. Did I dare use Outlook? Not in a million years. In fact, my wife, who is a computer newbie, uses Windows XP as her OS with full admin rights (because, you know, some programs just run better that way) and has no problem surfing wherever she wants and reading e-mails from friends, even infected ones. She doesn't use Outlook or IE; that is all I ask of her.

    Anyway, all this is to say that no matter how competent you are, when your tools are broken, you will be broken. Period.

    Numbers are a factor. Competent users are another factor, and the platform is one more factor to consider.

    P.S.: Sorry for my English mistakes. I am a Canadian-born French African.

  • by jawz101 ( 944009 ) on Wednesday May 10, 2006 @11:45AM (#15301394)
    Your argument has nothing to do with the fact that the employee e-mailed EVERYONE - all of his former employer's customers - about the vulnerability. And using Linux as an answer is not productive.
  • by A.Gideon ( 136581 ) on Wednesday May 10, 2006 @11:56AM (#15301510) Homepage
    However, my email client, my video player, and my web browser still run with the full privilege of my user account, when something less would be sufficient.

    This is important, as many forms of malware (including that needed to build a 'bot) can be implemented w/o the requirement of root/superuser access. While the OS protecting itself is a Good Thing, this doesn't do anything to protect the computer itself against abuse (or to protect the Internet against abuse of this computer).

    This is a fact too often missed during these discussions. And it's why we do need "least privilege", sandboxing, etc. for applications which execute untrusted content.
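    As a rough illustration of the least-privilege/sandboxing point above, here is a Python sketch that runs a child process under resource limits before it touches untrusted content; the viewer command and file name are hypothetical, and real sandboxes (seccomp, AppArmor, containers) go much further than this.

        import resource
        import subprocess

        def limit_child():
            # Cap CPU time and address space for the child before it executes;
            # a crude approximation of least privilege for an application that
            # will parse untrusted content.
            resource.setrlimit(resource.RLIMIT_CPU, (5, 5))
            mem = 256 * 1024 * 1024
            resource.setrlimit(resource.RLIMIT_AS, (mem, mem))

        # Hypothetical viewer opening an untrusted file under those limits.
        subprocess.run(["some-viewer", "untrusted-file"], preexec_fn=limit_child)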
  • by Nom du Keyboard ( 633989 ) on Wednesday May 10, 2006 @12:14PM (#15301631)
    Not revealing security holes should be the crime, and not the reverse. Only a well-informed consumer has a realistic chance of protecting themselves.
  • by lordkuri ( 514498 ) on Wednesday May 10, 2006 @12:26PM (#15301752)
    Well, I hardly think that the people maintaining web servers are technical idiots.

    I've been in the webhosting industry for about 6 years... you have it quite backwards. Browse through the discussion threads on WebHostingTalk, and you'll see exactly what I mean.

    Granted, a lot of us are very on top of things, but there's also a swarm of 15-year-olds who go get a dedicated server and start up a hosting company with absolutely no clue what an SSH shell even is, let alone how to do anything but click links in cPanel/Plesk/etc.
  • by plague3106 ( 71849 ) on Wednesday May 10, 2006 @12:42PM (#15301881)
    First ActiveX exploit released: 1993. Latest ActiveX exploit: in the wild currently and unpatched. That's 13 years that Microsoft has ignored your security and refused to correct a huge, gaping security hole.

    Care to give details on the latest one? ActiveX (in a browser, I have to assume that's what you're talking about) gives security prompts on any attempt to install software. If you click No or Don't Install or whatever, it doesn't install.

    We won't even talk about the RPC processes (accessible through ports left open by default) that have traditionally been running in Windows (up until just a few months ago), with full Admin privileges, every time you log in, no matter how you log in.

    Windows Server 2003 ships with RPC network access disabled by default. XP SP2 has network access to RPC shut off by default (indeed, it will just disable it, even if you wanted it open)... and that was released almost two years ago. Not sure how you get 'up until just a few months ago.'

    The real reason Windows has more security problems: the head-in-the-sand, we'll-bend-over-and-take-more-of-this-same-old-crap attitude of Microsoft customers.

    I think a lot of security problems stem from needing to support DOS for so long. It wasn't until XP that home users had access to the NT kernel, which is much more secure.

    More to the point, though, MS was doing what its customers wanted, and they weren't saying they wanted security. They wanted backward compatibility and more ease of use. It wasn't until relatively recently that they wanted security. And MS is responding; Server 2003 comes out of the box pretty secure: a firewall that is on by default, minimal services installed by default.

    But here, I'll let the Microsoft folks themselves tell you:
    "Our products just aren't engineered for security," said Brian Valentine, Microsoft senior vice president for Windows development. Another Microsoft executive recently explained they never paid attention to security "Because customers wouldn't pay for it until recently."

    Article (2003) quote from http://archive.corporatewatch.org/profiles/microsoft/microsoft1.htm#Crapsoftware [corporatewatch.org]


    Wow, way to quote a three-year-old article. But it proves my point: are you, as a company, going to go with the vendor that gives you what you want, or something you didn't ask for? Again, I'd also like to point out that Server 2003 is pretty secure by default, and it wasn't long until SP2 for XP came out, which fixed a bunch of security issues and added other enhancements.
  • Re:and? (Score:3, Insightful)

    by pant ( 814786 ) on Wednesday May 10, 2006 @12:43PM (#15301891)
    I don't think it is all that silly. The classic limit on the First Amendment is that it does not allow you to yell "FIRE!!!" in a crowded movie theater. This seems a little like the opposite, where there really is a fire in the movie theater and their lawyers sued you because you didn't keep your mouth shut.

    True, this is an analogy that may not fit, but if it comes down to one group being able to continue to make money at the expense of many other groups due to sheer negligence (gee, hope nobody finds out!), then they should be called to task.

    To me, this sounds like someone reinterpreting the First Amendment to exclude whatever the hell they don't want said at any given time.
  • Re:Same here (Score:2, Insightful)

    by couchslug ( 175151 ) on Wednesday May 10, 2006 @12:52PM (#15301944)
    The moral is don't be a "good kid". Look like one, keep your head down, and don't trust authority figures. If you have information whose release might get you punished, release it anonymously or not at all.
    This has never been different, by the way.
  • by Irish_Samurai ( 224931 ) on Wednesday May 10, 2006 @01:08PM (#15302098)
    While I agree that there are plenty of people in the hosting business who are ignorant of how to do it properly, I would also argue that these people at least have a technical proficiency above and beyond the average user.

    I'm not disagreeing with you, and many others here have made very valid points about other factors to viruses and the systems they run on - but I am only really qualified to make statements regarding end user proficiency.

    Taking your statement as true, I still believe that the number of clueless users far outweighs the number of clueless web hosts. I would also be willing to bet a clueless web host has enough technical knowledge to "know what he doesn't know," hence the number of elementary questions asked on boards such as the one you pointed out.

    I don't believe the average end user has the knowledge to evaluate what exactly is the problem with their computer that they need to address. They just know it's "broken." This tendency alone gives even a clueless web host a leg up.

    Once again, I'm not trying to say that there isn't a sizeable number of clueless web hosts out there who are getting their boxes compromised. I just think there is a larger, slower-moving target of home users that gets the main focus.
  • by Fareq ( 688769 ) on Wednesday May 10, 2006 @01:18PM (#15302213)
    That sounds very good; however, you might want to think about these two facts, and how they interact:

    1: All software has some number of bugs.

    2: A VM is a piece of software

    --

    Also realize that in order to be effective, each such piece of software would have to execute inside its own VM in complete isolation from other applications... no IPC, no shared memory, no networking -- after all, a bug in one application could be exploited by a "properly" invalid network request... While highly secure, this is not the most useful of configurations...
  • by ScentCone ( 795499 ) on Wednesday May 10, 2006 @01:25PM (#15302281)
    kind of scary considering they're keeping them quiet via threat of lawsuit

    But isn't this how a bank keeps its employees quiet about private data, or how a manufacturer keeps its trade secrets (spaghetti sauce recipe, engine tuning secrets, freight routing AI, etc)?

    And why do they have to? Because relying on personal integrity routinely fails. Don't even start with "if they'd only treat employees fairly, by paying every 21-year-old new hire mid six-figures, a corner office, two months off their first year and free food all day, they wouldn't ever have to worry about anyone ever compromising anything!" That's total BS. There are broken people out there, people with totally twisted senses of propriety, and people who simply can't be made happy because they have a fundamental inability to form rational expectations (or who live beyond their means, or develop expensive drug/gambling habits, whatever).

    Without some actually meaningful way to make both parties (employer and employee) abide by the actual terms of their agreement - especially the terms that govern the end of the relationship - there's no point for either party to even sign such an agreement, and no ability for a lot of companies to engage in anything like high-stakes business development, research, and more.

    How would YOU keep quiet someone who has an axe to grind and had previously been trusted with your trade secrets? Just asking nicely, over and over again? And if your business is ruined, or your customers are lost? Or if a vulnerability that you're in the middle of fixing, and which is unknown to the outside world, is disclosed before your patch is out, and your customers get hacked... well, that's just the price a small tech company has to pay for not making a product that's absolutely perfect in every way? Clue: very few tolerably priced customized, niche-market products would ever come into existence if absolute perfection were the only defense against someone with inside knowledge bent on causing your customers trouble. Note that I'm not commenting on the case in question, but on your notion that civil legal consequences are somehow inappropriate.
  • by lamber45 ( 658956 ) <lamber45@msu.edu> on Wednesday May 10, 2006 @06:20PM (#15304604) Homepage Journal
    Did we do anything about it? Nope. We ignored it. I didn't even bring it up to our managers. Why? Because in documenting the issue we would have most certainly violated the licensing agreement,

    While the incident appears to have been some time ago, I think you ought to at least have documented the issue internally, sending reports as high as the officers of your company. That documentation, of course, would have been proprietary and confidential. What the other company didn't know couldn't have been used against you. Even if you couldn't have made the ASP fix their product, your HR department would have known not to rely on it for confidential communications.

  • by tekrat ( 242117 ) on Wednesday May 10, 2006 @10:12PM (#15305765) Homepage Journal
    So, if we apply your logic: what, then, gives telemarketers the right to call you? Your number is publicly accessible, and no password is needed to call your number and make the phone at your end ring, because the phone lines go right into your house. In short, there's NO SECURITY between you and the telemarketer.

    However, that doesn't mean they have the right to invade your privacy and call you. And yet they do. How is it that your logic applies to a security firm breaking into your house, but ignores a telemarketer who does essentially the same thing? They call on a regular basis, and really, that's as much "breaking in" as any other computer analogy.

    Now, we all hate the telemarketers, and laws have been enacted to stop them from harassing us; but technically it *IS* legal for someone to "break in" to your house via the telephone, so I cannot say that your logic is flawless.

    TTYL
