Busting People for Pointing Out Security Flaws

gsch writes "'In 2004, Bret McDanel was convicted of violating section 1030 when he e-mailed truthful information about a security problem to the customers of his former employer. The prosecution argued that McDanel had accessed the company e-mail server by sending the messages, and that the access was unauthorized within the meaning of the law because the company didn't want this information distributed. They even claimed the integrity of the system was impaired because a lot more people (customers) now knew that the system was insecure. Notwithstanding the First Amendment's free speech guarantees, the trial judge convicted and sentenced McDanel to 16 months in prison. I represented him on appeal, and argued that reporting on security flaws doesn't impair the integrity of computer systems. In an extremely unusual turn of events, the prosecution did not defend its actions, but voluntarily moved to vacate the conviction.'"
  • and? (Score:5, Interesting)

    by schnits0r ( 633893 ) * <nathannd&sasktel,net> on Wednesday May 10, 2006 @09:12AM (#15300290) Homepage Journal
    This happens a lot. A friend of mine used to work for an airline, and he made comments to his coworkers and boss about weak airline security, saying he was concerned about how easy it would be for someone on the inside to disrupt air traffic. They called the transport authority, which basically blacklisted him from airports and told him he was lucky they didn't press charges.
  • Understandable (Score:5, Interesting)

    by BenEnglishAtHome ( 449670 ) * on Wednesday May 10, 2006 @09:14AM (#15300307)

    The first impression is that this is really weird. Prosecutors, at least in my neck of the woods, don't give two shits about justice or truth. They just want convictions. Do we actually have a prosecutor somewhere with integrity? How many times has hell frozen over this month?

    Take a minute to think about it, though, and things change. Prosecutors still just want convictions that stand on appeal. In this case, the conviction was eventually going to get tossed, so the prosecution gets to look like a hero by bailing out early.

    As usual, what at first blush appears to be a noble action by a public servant turns out to be self-serving. There is still no chance of a prosecutor having integrity. All is, again, right with the world.

  • Vacation vs. Repeal (Score:5, Interesting)

    by Gallenod ( 84385 ) on Wednesday May 10, 2006 @09:16AM (#15300318)
    Vacating the conviction doesn't challenge the law, just the individual action. Looks like the company wanted the publicity from the conviction to reinforce their non-disclosure agreement but didn't want to take the risk that the law would be rolled back later on appeal.

    (IANAL, but my uncle is.)
  • by fabs64 ( 657132 ) <beaufabry+slashdot,org&gmail,com> on Wednesday May 10, 2006 @09:16AM (#15300319)
    It is a fact that programs get released with known bugs; it's actually an economic certainty for commercial programs.
    It is a SAD fact that some of these known bugs are security vulnerabilities. One would hope that security bugs would top the priority list, but they don't; usability most often comes first.
  • by Mobster ( 306973 ) on Wednesday May 10, 2006 @09:18AM (#15300337) Homepage
    This kind of trend is only gonna end when something catastrophic happens and it's traced back to someone who could have said something but didn't, out of fear of losing their job or being prosecuted. It wouldn't surprise me if the whole FEMA/Katrina fiasco was this kind of situation.

    Can a federal law be passed to correct this? Does Congress even care?
  • ISAGN (Score:2, Interesting)

    by MOtisBeard ( 693145 ) <atomdebris@NoSpAm.gmail.com> on Wednesday May 10, 2006 @09:19AM (#15300341)
    New technologies often require changes in the law and in the legal system itself, and computer technology is far from being an exception to that. As a society, we really need to have more specific legal definitions of what is and what is not black-hat hacking, defined by people who truly understand the technology... namely, white-hat hackers. Until this happens, we will continue to see people unjustly prosecuted for pointing out their local emperor's nudity, and we will continue to see nonsensical bills bouncing around Washington, D.C., written by and debated by people who don't understand them and who have no clue what stand to take on them. Senatards and Congresscritters simply are not qualified to make these decisions for us, but they will continue to do so until the ubergeeks get organized into a Congressional subcommittee or something, and take the reins.
  • Congrats! (Score:3, Interesting)

    by DamienMcKenna ( 181101 ) <damien@@@mc-kenna...com> on Wednesday May 10, 2006 @09:27AM (#15300382)
    Just a quick word of congratulations to Mr. McDanel and yourself. Finally, some common sense rears its head in this case.
  • by cdrudge ( 68377 ) on Wednesday May 10, 2006 @09:31AM (#15300401) Homepage
    No publicity is bad publicity... or something like that. However, if I were a company executive, I'm not sure I would like my company being in the news because I went after a former employee for pointing out a security flaw in my software. It draws attention to the fact that my software had a flaw in it, that our policies aren't keeping confidential information confidential, etc.
  • Of two minds (Score:4, Interesting)

    by Billosaur ( 927319 ) * <<wgrother> <at> <optonline.net>> on Wednesday May 10, 2006 @09:31AM (#15300403) Journal

    The McCarty prosecution, brought by the same office that so egregiously mishandled the McDanel incident, is in the same vein. As with Puffer and McDanel, the government will have to prove not only that McCarty accessed the school system without authorization, but also that he had some kind of criminal intent.

    Likely, they will point to the fact that McCarty copied some applicant records. "It wasn't that he could access the database and showed that it could be bypassed," Michael Zweiback, an assistant attorney for the Department of Justice's cybercrime and intellectual property crimes section, told the SecurityFocus reporter. "He went beyond that and gained additional information regarding the personal records of the applicant."

    But if he wanted to reveal USC's security gaffe, it's not clear what else he could have done. He had to get a sampling of the exposed records to prove that his claims were true. SecurityFocus reported that USC administrators initially claimed that only two database records were exposed, and only acknowledged that the entire database was threatened after additional records were shown to them.

    Ok, so there are two ways to look at this:

    1. He did commit a crime. He broke their security, using a known flaw. This happens all the time to anyone running Windows when some virus or Trojan uses a known exploit to mess around with data on your PC. They're guilty, mainly for then using your PC for other nefarious purposes. This argument is weak because all he did was reveal the information to a reporter, and while that's a dubious move at best, it really ended up causing little harm.
    2. He didn't commit a crime. He exposed a major college's security lapse and did something with that knowledge that allowed the problem to be solved. I don't agree with his methods -- it would have been far easier to simply go to USC, tell them of the flaw, and then leave them to their own devices. Knowing USC, they would have hemmed and hawed until some enterprising hacker, out for a little fun, discovered the flaw and did more than steal the records of seven people. He probably felt that this needed to be publicized to force USC's hand, but I still think that smacks of a lack of common sense.

    I doubt a jury will convict him, though. This being mainly a technical argument about a computer crime, any jury they seat is bound to wind up confused, and the best the prosecution can hope for is that someone on the jury will have enough savvy to explain it to the others. Or they may convict him for being a wily young whippersnapper. Who knows?

  • Re:C'mon.... (Score:4, Interesting)

    by goldspider ( 445116 ) on Wednesday May 10, 2006 @09:32AM (#15300412) Homepage
    "...however helpful your intentions?"

    I think you misspelled "vindictive".

    After all, we're talking about a former employee, and considering how far things were taken, it doesn't sound like it was an amicable separation.
  • Re:C'mon.... (Score:3, Interesting)

    by russellh ( 547685 ) on Wednesday May 10, 2006 @09:40AM (#15300460) Homepage
    Well, as the article points out, it is the murky definition of "access" that is troublesome, such as the case where emailing a company was ruled to be "unauthorized access" - not only to the company's email server, but to all the computers on the route. This is fear based on ignorance. The trouble is that there are no good analogies to the real world - it's all hidden, it's all geek magic. And of course the juries are composed mostly of regular joes with spyware-ridden computers who hate the IT guy. And the lawyers, lobbyists, politicians, and corporate executives were the ones who stuffed the geeks in the lockers back in school. There is not a lot of money to be made in just letting people do what they want. So there is a bright future for convictions.

    Has it not sunk in yet that you simply can't intrude on systems or files without permission, however helpful your intentions? How freaking difficult is that for people to grasp?

    I admire your idealism. But you had better keep your head up and pay attention to the motives of the people we are reading about. It has little to do with whether you are doing right or wrong, or "accessing" with or without "permission".
  • First Amendment.? (Score:4, Interesting)

    by Frankie70 ( 803801 ) on Wednesday May 10, 2006 @09:41AM (#15300470)
    Notwithstanding the First Amendment's free speech guarantees, the trial judge convicted and sentenced McDanel to 16 months in prison. I represented him on appeal

    Thank god the prosecution did not defend the action on appeal, because the defendant seems to have been represented by someone who doesn't know that the First Amendment isn't relevant here.
  • Re:and? (Score:2, Interesting)

    by justthinkit ( 954982 ) <floyd@just-think-it.com> on Wednesday May 10, 2006 @09:55AM (#15300582) Homepage Journal
    I worked on the Canadian commercial and military Automated Air Traffic Systems (CAATS & MAATS). A co-worker who tested software tracked one particular bug daily to see if it had been fixed yet -- it never was in the year I was there. The major network design problem I inherited and verified was totally denied during my entire stint, but I heard later they switched things to the way that I had advocated. I also heard later that the biggest advocate of the flawed design was married to the top person on the project.

    It is quite an unforgettable experience to be the "Junior Barnes" in a room full of high level types working for a 100,000 person corporation who turn on you like a pack of dogs when you state that the design won't work. The most senior person in the room said just one thing, "Why wasn't I told of this earlier?" [I had been invited to this meeting almost on a whim, to help explain something if my boss floundered.]

  • by Asklepius M.D. ( 877835 ) on Wednesday May 10, 2006 @10:04AM (#15300640)
    was somebody's pride. This "form over function" thing is starting to get out of hand in both the gov't and the private sector. True story: I once took a military medical course that was teaching information many years out of date. Using the appropriate forms, I submitted detailed critiques complete with sources and references. Rather than fix the problem, I was called on the carpet and ordered to stop submitting critiques because they "questioned the integrity of the course." This strikes me as very similar to "They even claimed the integrity of the system was impaired..." Yes Virginia, that's exactly what we're doing! You can't fix it if you don't admit it's broken.
  • Point taken... (Score:3, Interesting)

    by BenEnglishAtHome ( 449670 ) * on Wednesday May 10, 2006 @10:06AM (#15300652)

    ...but not completely. There's a saying where I live that the County Prosecutor can get a grand jury to indict a ham sandwich. Any grand jury that doesn't do exactly what the prosecutor wants will find itself the subject of a carefully orchestrated smear campaign, complete with local news stories (planted by guess who) investigating the problem of "runaway grand juries."

    My point is that prosecutors have a lot of power and any public servant with lots of power should always be willing to step outside the game and do what's right before they start punishing people. And yes, prosecutors punish people long before trials happen before supposedly impartial judges. Just being indicted for a serious crime, something the prosecution does essentially without oversight, is usually a life-wrecking event no matter how innocent the accused. Normally, prosecutors who exercise their power with an eye toward justice, declining to prosecute marginal cases or cases where a bad law could be enforced, wind up simultaneously serving two goals: they serve their public mandate and they don't wind up looking like idiots in the end.

    In this case, the prosecution actually did something that was right and sacrificed a little of the "We're perfect" vibe they normally work so hard to maintain. I simply choose to think less of them for being so slow to reach the conclusion that it was the right thing to do. By being so slow to act, they punished someone who ought not to have been punished.

  • by elronxenu ( 117773 ) on Wednesday May 10, 2006 @10:07AM (#15300660) Homepage
    Without taking any sides on the matter of full disclosure, there are interesting parallels with the quoted cases.

    Full disclosure: if I find a bug in, say, Windows, should I

    • Report it to Microsoft?
    • Announce it to the world?
    • Report it to CERT?
    • Send details to Oracle?

    If I find a bug in USC's website, should I

    • Report it to the USC administrators?
    • Announce it to the world?
    • Report it to SecurityFocus?
    • Send it to MIT?

    If I find a bug in my employer's systems, should I

    • Report it to my employer?
    • Announce it to the world?
    • Report it to CERT?
    • Send it to my employer's competitors?

    Enquiring minds wish to know ...

  • by joshv ( 13017 ) on Wednesday May 10, 2006 @10:21AM (#15300757)
    When working for a company I shall not name, we used an ASP for our recruiting software, from a vendor I will also decline to name. This software had document upload functionality that allowed clients to upload offer letters and such. In troubleshooting an issue with our company's uploads, we found it was quite easy to browse to other clients' uploads by changing a client ID in a URL. Granted, you had to log in to the system to be able to access this URL, but once logged in, there were apparently no security restrictions across clients. We had free access to other companies' offer letters, job applications, and any document having to do with the recruiting and hiring process - some of them very big names.

    Did we do anything about it? Nope. We ignored it. I didn't even bring it up to our managers. Why? Because in documenting the issue we would almost certainly have violated the licensing agreement, and a good argument could be made (especially in light of judgments like the one in the article) that we were committing criminal computer trespass by changing the URL to knowingly access another client's repository. As stupid as that sounds, I was not willing to risk my job, or prison time, when I knew there were probably 15 other such security issues in the product, and my blowing the whistle on this one wasn't going to fix what was essentially a very crappy product.
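
    What's described here is what's now usually called an insecure direct object reference: the server trusts the client ID in the URL instead of checking it against the identity of the logged-in session. Below is a minimal sketch, in Python, of the kind of server-side ownership check that appears to have been missing. All names (Session, DOCUMENTS, fetch_document) are hypothetical; the actual product isn't identified in the comment.

        # Minimal sketch of a per-client authorization check (hypothetical names).
        DOCUMENTS = {
            ("client_a", "offer_letter_1.pdf"): b"client A's letter",
            ("client_b", "offer_letter_7.pdf"): b"client B's letter",
        }

        class Session:
            def __init__(self, client_id):
                self.client_id = client_id  # established at login, server-side

        def fetch_document(session, client_id_from_url, doc_name):
            # The reported flaw: serving whatever client ID appears in the URL.
            # The fix: compare the URL's client ID against the session's identity.
            if client_id_from_url != session.client_id:
                raise PermissionError("document belongs to another client")
            return DOCUMENTS[(session.client_id, doc_name)]

        # A logged-in client_a user who edits the URL to point at client_b is denied:
        s = Session("client_a")
        try:
            fetch_document(s, "client_b", "offer_letter_7.pdf")
        except PermissionError as e:
            print("denied:", e)
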
  • by Anonymous Coward on Wednesday May 10, 2006 @10:25AM (#15300781)
    There is no code of ethics.

    You have kids trying to "make a name" by breaking things. You have companies [idefense.com] paying these kids to find vulnerabilities; I've heard that there is a six-figure bounty on certain specific vulnerabilities. At the same time you have big corporations taking a beating in the media because vulnerabilities are disclosed before they have time to react; you also have big corporations being told about problems (whether or not it is through proper channels remains to be seen; I don't expect that the new Windows bug is going to get fixed when you tell MS Sales about it). You have security companies like eEye publishing every vulnerability they can find to give their company some "street cred." You have companies like Foundstone (now Symantec) pirating software [governmentsecurity.org] to search for holes in it. There is this whole rationalization in the "hacker community" that they are somehow doing the software vendors favors by finding this stuff; so just randomly port-scanning hosts is really "research," huh? Despite your lack of any publishing, education, or any agreements with anybody you're "researching" on? You have frauds like Steve Gibson saying that big corporations are putting backdoors into code on purpose [thisweekintech.com]. You have open source tools changing their license [nessus.org] and closing their source because of companies that simply package their work and charge a lot of money for it; who can blame them? There are companies [immunitysec.com] that now sell exploits [gleg.net] and "0days." You have a whole OS [openbsd.org] "designed" around security, yet they cannot publish the changes they've actually made and explain why they have made them (come on guys, this would be a best-seller of a book: just lists of code, this is the bug, this is why it's a bug, this is how we fixed it...). At the same time, I don't want Apple and MS pushing out patches minutes after they hear about things; I want the code QAed.

    Now the lawyers are getting involved. We need to check ourselves as an industry. We are a stone's throw away from developers being held responsible for damages caused by software; there are already people in favor of that. Just stop and think about that. There is no union, there is no protection for the worker here, and we're held in contempt at a lot of places because of the highly paid prima donnas jerking around writing shitty code. It will only get worse from here.

    It's a hot area right now; the feds are spending money. You can't be involved with software or networking and not have some kind of concern for security. This may sound old-fashioned, but to get a cert - whatever certs the security world wants to embrace - there should be an oath that encourages security always, encourages openness, discourages black-market tactics for trading viruses and exploits, discourages this whole notion of "black magic," and discourages profiting from secrecy regarding security. I'd even go one better and add to the oath that there should be a definite and accepted public disclosure process for when a vulnerability is found in a network or application: the owner is told, and then after 90 days the whole world is told, all of the time. I know of companies that have found problems in networks and then extorted money for information regarding them. That's just wrong and should be criminal.

    There are no security best practices, not in any formal sense. You can pull 100 consultants or CISSPs off the street and you'll get 100 different sets of things you should and shouldn't do. We need to formalize the discipline. We need to encourage practices for security during the writing of software and the construction of networks.

  • Re:First amendment? (Score:5, Interesting)

    by cdrguru ( 88047 ) on Wednesday May 10, 2006 @10:26AM (#15300790) Homepage
    The First Amendment refers to the government's ability to pass laws to restrict speech. It has limited effect on states, cities, villages and other municipalities.

    It has no effect on companies, contract law, or anything else.

    There is no "first amendment right to access the system". Period. You do not have any rights at all - you have privileges that the operator of the system gives you. And these can be revoked at any time. Without cause or explanation.

    Yes, that means AOL can cancel your account without telling you why.

    Yes, that means when your employer says not to do something and you do it anyway you are exposing yourself to consequences. Sometimes legal consequences in addition to just getting fired.
  • Same here (Score:5, Interesting)

    by GmAz ( 916505 ) on Wednesday May 10, 2006 @10:54AM (#15300995) Journal
    The school district where I work used to have its entire network wide open. Anyone could access everything: e-mail, grades, permanent records. You name it, they had it. They just had to browse to it through the Network Neighborhood icon. One student saw this and told the assistant principal several times, and he was ignored. He finally printed off a bunch of student grades and gave them to the assistant principal, showing him it was a real risk and that something should be done. He was a legitimately good kid trying to help. Instead, he was expelled from the district and given probation (he was a minor). After that, the district REALLY tightened up its security. I feel that kid shouldn't have gotten anything other than a huge thank you.
  • This is nothing new. (Score:3, Interesting)

    by Optifark ( 973981 ) on Wednesday May 10, 2006 @11:03AM (#15301064)
    I worked for an Army contractor in the '80s. I found flaws weekly. I caught flak for each one I pointed out. In the end they made me data security manager so I would just fix them and stop pointing them out to the customer. I was told I would go to jail more than once. You have to do what is right for the customer; in this case the customer was the US Army. Any company should see this is the only way to fix holes: see them, report them, fix them. -Steve
  • Real Fear (Score:5, Interesting)

    by Anonymous Coward on Wednesday May 10, 2006 @11:09AM (#15301100)
    Sprint runs a 9-1-1 service for hundreds of jurisdictions around the United States. The heart of their system includes a Windows server that is left virtually wide open on the internet. This server is the repository of all the 9-1-1 data from telephone companies around the country. It would be trivial to add, delete, or alter the 9-1-1 data on that server and wreak havoc. The system does not even require a password.

    This has been reported to Sprint and various local 9-1-1 officials several times. Sprint denies it is vulnerable; local authorities are uninterested in investigating. Nobody will put any attention on this until the day a malicious party cripples 9-1-1 systems throughout the U.S. Then there will be screams for congressional investigations and finger-pointing galore.

    But the well-meaning party that performs a proof-of-concept exploit to make a point would be butchered as the terrorist they are trying to prevent.

    For now, there are people who know that the 9-1-1 system is extremely vulnerable, and they fear the day it gets exploited. But they are more afraid of ruining their lives and their families' lives by speaking out.
  • by hullabalucination ( 886901 ) * on Wednesday May 10, 2006 @11:27AM (#15301242) Journal
    I'm pretty sure that that gigantic market share of Windows is the main reason that it's got so many viruses.

    Right. The fact that Gates, Ballmer & Company decided to ignore practically every reputable security expert on the planet and release ActiveX, a completely unsandboxed tool for crackers, had nothing to do with it. Right-o, Matey.

    First ActiveX exploit released: 1993. Latest ActiveX exploit: in the wild currently and unpatched. That's 13 years that Microsoft has ignored your security and refused to correct a huge, gaping security hole.

    We won't even talk about the RPC processes (accessible through ports left open by default) that have traditionally been running in Windows (up until just a few months ago), with full Admin privileges, every time you log in, no matter how you log in.

    The real reason Windows has more security problems: the head-in-the-sand, we'll-bend-over-and-take-more-of-this-same-old-crap attitude of Microsoft customers.

    But here, I'll let the Microsoft folks themselves tell you:
    "Our products just aren't engineered for security," said Brian Valentine, Microsoft senior vice president for Windows development. Another Microsoft executive recently explained they never paid attention to security "Because customers wouldn't pay for it until recently."

    Article (2003) quote from http://archive.corporatewatch.org/profiles/microsoft/microsoft1.htm#Crapsoftware [corporatewatch.org]

  • by Retric ( 704075 ) on Wednesday May 10, 2006 @11:34AM (#15301296)
    You can easily produce software that does not cause security vulnerabilities. Just run the software in a VM and keep it the hell away from the host system.

    Granted there will always be software bugs, but there is no reason why running software should introduce security holes into the host system.
  • turn it around (Score:2, Interesting)

    by Anonymous Coward on Wednesday May 10, 2006 @11:52AM (#15301459)
    If a vendor gets notification of a security breach and doesn't fix it within x number of days, you should be allowed to sue them if you are a customer and must use that insecure software. Not the other way around, where they get to sue you or the other guy who found out about it, or the state prosecutes. That's what the case in this article was about. Bogus. The guy who did it could have been a little smoother in how he went about it, but really...

    Yes, that should apply to operating systems and applications as well.

    That would slow down code bloat and new features in favor of writing secure code and having secure access.

    I work on cars sometimes. If I notice a defect that looks like it could be a serious design flaw, and notify Acme Motors, and they still keep shipping cars with that defect, and people get hurt... well, they get nailed in court, and the law falls pretty well on the side of the customers and the people who found out about it. That's with a car I have access to. If I have to break into their factory to find this out, that's another story.

    I think the difference is normal access as opposed to extraordinary access. If it is normal access, I see no problems; the other gets to be a tricky call when it comes to code. We need a legal definition of what access is. If it is a web-facing page, and no hacks are involved in accessing it, then I say there should be no threat to the accesser, whether they are looking for security breaches or anything else. If a glitch is found that seems to offer the potential to elevate access permissions, I think a proper response is some way to give verified notification to the vendor (we need a legally verifiable way to do this - a public bulletin board recognized by industry, something like the notices in your local classified paper; nothing like that exists in the software world that I am aware of), then x days later publish it publicly, whether it is fixed or not. X days does not have to be a long time either; a few days to a week should be sufficient. And there's no way the poor guy should be charged with anything for doing that.

    We have very little accountability for software now - none, basically - to the people who use it or to the people it is sold to in order to "make money." Vendors offer a product; it should have a warranty. It is that simple. All other products out there come with warranties "suitable for purpose and free from defects that would allow significant harm." All other products out there still have some defects; our laws identify the BAD ones that cause harm.

    Until we get software warranties, to balance all the patent and other legal protections vendors have for their "products" in order to transfer cash from your wallet to theirs, security will remain dismal, and the abusers of and profiteers from bad code will remain reluctant to develop or deploy code that is greatly enhanced and audited for security.

    This is 2006; I think it is safe to point out that this is the case with the vast majority of code out there now, and has been for a long, long time, until it has become the industry mantra and mindset that "it can't be done." I say rubbish. Before we had legally enforced warranties for tangible products, "the industry" claimed the same thing, that "it couldn't be done." We have proven it is possible to reduce the defect rate to a point where all other industries manage to survive, yes?

    Software companies *don't give a crap* because they aren't LIABLE for any bad code, no matter what happens to YOU if you use it. That's because they have no legally enforced warranties. End. Stop.

    There is no stick to go with the carrot in this situation, unlike with the vast majority of other products and services. Software has gotten a completely free ride for too long now.
  • by blincoln ( 592401 ) on Wednesday May 10, 2006 @11:55AM (#15301489) Homepage Journal
    It is a fact that programs get released with known bugs, it's actually an economic certainty for commercial programs.

    Bugs are going to happen. Incompetent design doesn't have to.

    There is an expensive (~$3000 license per machine) "enterprise" product that we use throughout the company. It needs to store usernames and passwords with reversible encryption. In the first version we deployed, the encryption was a substitution cipher - literally the level of "security" you'd get from a cereal-box spy ring. We complained to the vendor. The next version used a "one-time" pad that was in fact the same for every password on every machine where the software was installed, anywhere in the world. I wrote a script that generated a decoding table in a few hours, and I'm not even a cryptography geek. We complained again, and they changed it to something that at least *appears* reasonably secure; I haven't had time to look into it.

    Even assuming it is decent this time, why did it take them so long? Encryption isn't a new field. There were plenty of algorithms they could have used from the beginning instead of reinventing ciphers from centuries ago.
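
    For anyone wondering why a reused XOR pad is so easy to break: one known plaintext/stored-form pair gives the pad away, and with it every other password. A minimal sketch in Python follows; the pad and passwords below are invented for illustration, not the vendor's actual scheme.

        # Why a fixed XOR "pad" reused for every password is not encryption.
        # The pad and passwords here are made up for illustration.
        PAD = bytes([0x5A, 0x13, 0x7F, 0x21, 0x08, 0x99, 0x42, 0xC3, 0x6D, 0x30])

        def xor_with_pad(data: bytes, pad: bytes = PAD) -> bytes:
            return bytes(b ^ k for b, k in zip(data, pad))

        # An attacker who can set one password and read its stored form recovers the pad...
        known_plain = b"hunter2"
        known_stored = xor_with_pad(known_plain)
        recovered_pad = bytes(p ^ c for p, c in zip(known_plain, known_stored))

        # ...and can then decode every other password "protected" by the same pad.
        victim_stored = xor_with_pad(b"s3cret!")
        print(xor_with_pad(victim_stored, recovered_pad))  # prints b's3cret!'
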
  • Re:A weak analogy... (Score:3, Interesting)

    by ajs318 ( 655362 ) <sd_resp2@earthsh ... .co.uk minus bsd> on Wednesday May 10, 2006 @12:09PM (#15301600)
    It's a defence to any crime that you only carried it out in order to prevent a greater crime. Like the old "dog in distress" scenario: it's perfectly OK to force entry into a vehicle or building in order to rescue a trapped animal in serious distress. By committing criminal damage {a crime against property} you have stopped an act of cruelty to animals {a crime against living things, therefore by definition a much greater offence}.

    If analogies from outside the computing world applied within the computing world, then it would be a valid defence for McDanel to say that his {fairly minor} offence of sending an e-mail to employees of a company was done in order to prevent a much greater crime involving exploiting a security flaw in that company's products. As things stand today, however, non-computer analogies don't translate well to computerised situations.
  • by slugstone ( 307678 ) on Wednesday May 10, 2006 @12:23PM (#15301727) Journal
    In the wild, a tiger, lion, or bear will go after the easiest prey, not the most abundant.
  • by Anonymous Coward on Wednesday May 10, 2006 @12:29PM (#15301785)
    Speaking of stupid moves, should we talk about PHP? How much LAMP crap is out there, with the big P making things absolutely insecure by default? How many clueless users do you have out there installing phpWhatever without bothering to change a few lines of /etc/php.ini?

    Linux is insanely popular on the server end of things. The majority of people running servers these days aren't hardcore system administrators, they're the average Joes of the Windows crowd. ("I'm l33t! I installed phpBB! I'm l33t! I have a blog! With an xmlrpc vulnerability that I don't know about and won't upgrade for! Upgrading is hard!*") Same crowd, same unwillingness to learn, same problem - it doesn't matter one bit what the operating system is.

    (* You'd be surprised how much of a bitch it is to upgrade a number of content management systems.)

    Wanna guess how many compromised Linux boxxen I've seen, running anything from Viagra spam lists to PayPal phishing schemes?

    And while you're ragging on Microsoft for not designing things to be secure by default, but for money... where are the tomatoes being thrown at open source developers, who generally tack on security as an afterthought as well? Writing secure code isn't sexy, don't ya know?

    Thus far, there are only two projects I've seen that have security written into their designs from the start: GPG (duh), and the good folks at OpenBSD.
  • Re:and? (Score:5, Interesting)

    by Fulcrum of Evil ( 560260 ) on Wednesday May 10, 2006 @12:57PM (#15301984)

    Yep, and the submitter's remark, "Notwithstanding the First Amendment's free speech guarantees," is silly because the First Amendment doesn't guarantee 100% free speech in all situations.

    How do you get from there to criminal prosecution for pointing out security flaws?

  • FreeMcCarty.com (Score:5, Interesting)

    by OneByteOff ( 817710 ) on Wednesday May 10, 2006 @01:03PM (#15302036)
    Since it seems this article is primarily about me, I felt it was necessary to post here. My name is Eric McCarty and you can read up on the case from my perspective on my website :

    http://www.freemccarty.com/ [freemccarty.com]

    I am not a malicious hacker; I am not even a hacker. I am a security researcher who wanted to go to USC to get my degree, nothing more, nothing less. If you think about it, I am one person. If I go to prison for the offense I am accused of committing, then I can still look in the mirror and know that because of my actions over 200,000 people won't be victims of identity theft.

    That's the whole point of security research in my opinion: making the internet safer, not for notoriety, not for fame, not for money. Please take a look at my website and feel free to contact me directly with any comments or suggestions, or if you are willing to assist my case.

    Thanks,

    Eric C. McCarty
    admin@freemccarty.com
    http://www.freemccarty.com/ [freemccarty.com]
  • by Mistshadow2k4 ( 748958 ) on Wednesday May 10, 2006 @01:21PM (#15302240) Journal

    "ActiveX (in a browser, I have to assume thats what you're talking about) gives security prompts on any attempt to install software. If you click No or do not install or whatever, it doesn't."

    Spyware vendors got past that years ago.

    "Wow, way to quote a 3 year old article."

    You say that as if three years were a long time or things had changed at Microsoft. Three years isn't that long at all, especially as Microsoft hasn't yet produced another OS or browser (Vista and IE 7 are in beta), nor has there been a large turnover in key employees, especially the executives who make the decisions about these things.

  • The other side (Score:5, Interesting)

    by geekyMD ( 812672 ) on Wednesday May 10, 2006 @02:00PM (#15302563)
    FTFA:

    That means the law frequently rests on the definition of "authorization." Many cases suggest that if the owner doesn't want you to use the system, for whatever reason, your use is unauthorized. In one case I took on appeal, the trial court had held that searching for airline fares on a publicly available, unprotected website was unauthorized access because the airline had asked the searcher to stop.


    If a shop owner tells you to get out of his store, then you must comply or the police will be called. Why? Because if you do not comply with the wishes of the owner, it's called trespass. On the other side, the shop owner must notify the customer that they need to leave before calling the cops; otherwise it's harassment.

    Just because you know something about computer systems doesn't give you the right to invade them and show the owner what you found. How would you like a home security firm to break into your house and then publish in the local paper that you keep a key under the doormat? Yes, my house is 'publicly available' given that it's not behind any gates or walls, but that is not an invitation for everyone to come in.

    What needs to happen is for security professionals, as an industry, to have savvier contracts with the companies they consult for, with clauses stating that the consultant will be free from prosecution if a) they notify the company and give it time to respond, b) the company doesn't take action and the risk to the public or the company's clients is great, and therefore c) the consultant has the right to go public with the information.

    Of course there are more clauses you might want to add, but it seems like a lot of this could be solved in the contracting steps of taking the job. If you can't get a good contract, don't take the job.

    Vigilante justice is illegal. Robin Hood was a good guy, as were the American Revolutionaries, but from a criminal-law perspective they were all guilty of many crimes. They chose to break the law because of their personal convictions, but they also more or less accepted the risks of doing so.

    What happened to whistleblower protection laws? Wouldn't those apply in these situations?

  • Re:and? (Score:3, Interesting)

    by duerra ( 684053 ) * on Wednesday May 10, 2006 @02:02PM (#15302577) Homepage
    It protects you from the government censoring your opinion, but when your speech begins to infringe on the rights of others (harassment, libel, revealing of trade secrets, etc.)
    Oddly enough, I hold my First Amendment-guaranteed right to free speech in a lot higher regard than any trade secret.

    Come to think of it, I don't know that the constitution guarantees me the right to trade secrets. Hmm.
  • Re:Same here (Score:3, Interesting)

    by koshatul ( 198070 ) on Wednesday May 10, 2006 @02:18PM (#15302716) Homepage
    Back when I was at school, I lost my subject captaincy and almost got expelled over realising that the system administrator had used a simple formula to turn all our student numbers into our passwords.

    When I came forward with it, they called in my parents and threatened me with expulsion if I didn't tell them how I hacked the password list, because "figuring out there was a formula by noticing a pattern in my own and my friends' passwords" was considered impossible.

    We'll never live in a society where the people who enforce rules know about the systems that operate on them.
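
    A toy illustration of the kind of formula-derived password scheme being described - purely hypothetical, not the actual formula from the poster's school - and why spotting the pattern in a couple of known passwords gives away everyone's:

        # Hypothetical "formula" password scheme: each password is derived
        # deterministically from the student number (NOT the real formula).
        def password_for(student_number: str) -> str:
            # e.g. reverse the digits and append a fixed suffix
            return student_number[::-1] + "x"

        # Anyone who notices the pattern in their own and a friend's password
        # can now produce the password for any student number:
        print(password_for("200412345"))  # prints '543214002x'
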
  • by blincoln ( 592401 ) on Wednesday May 10, 2006 @09:50PM (#15305672) Homepage Journal
    I hear you. I really tried to get some of the upper management to care about the issue, but it didn't work. Even some of the other engineers basically said "it's difficult to get access to the file that stores the 'encrypted' passwords, so this is less of a security concern than some others that are outstanding."

    The company has a substantial investment in this particular product (on the order of half a million dollars in licensing), so they wouldn't consider replacing it.

    I am a little more confident in the latest revision of the 'encryption' because it doesn't have any obvious patterns. The previous two were obviously weak because patterns started emerging after seeing what a handful of passwords 'encrypted' to. I also did some preliminary research to see if e.g. they had taken the XOR pad to the next level and had it change based on the line number in the text file as well as the character position on each line. I still don't think it's a strong mechanism, but at least it's not the awful joke it started out as.

    At the time, I had also gotten my hand slapped by the security department for sending my cracking script to anyone other than them (I cc'd the vendor and the management above my group), so I pretty much left it alone until their staff changed.

    In relation to TFA, this isn't even a matter of poking through things where you don't belong; if you can crack your own password, that's enough of a concern that someone else could too.

    I agree. They might have been able to make a flimsy legal case against me though because the crack would work for the passwords on any machine in the world running the software - the pad had no salt of any kind.
  • I know how it is... (Score:2, Interesting)

    by ronz0o ( 889697 ) on Wednesday May 10, 2006 @11:54PM (#15306081) Homepage
    The same sort of thing happened to me. I was wardriving one day and came across a hot spot. After connecting to it and not being able to browse the internet, I did a little more investigation. It turns out that I had discovered an unsecured POS terminal. Not just any POS terminal; this was part of a nationwide store chain. Any monkey with the slightest computer knowledge would have been able to sniff credit card numbers, account numbers, etc. with little to no problem. The odds of being caught were also slim to none. I made all the contacts I needed to, and received a phone call a half hour later. "Why did you breach my computer system? You DO know what you did is illegal, right?" "Look sir, it could have been me, or it could have been a person sniffing credit card numbers. I am helping you." And yes, there are still honest people in the world...
