The 2006 Underhanded C Contest Begins 232

Xcott Craver writes "The second annual Underhanded C Code Contest is live as of April 4th, and runs until July 4th. The object is to write malicious C code that looks perfectly readable and innocent under informal inspection of the source."
  • I Win (Score:5, Funny)

    by ExE122 ( 954104 ) * on Wednesday April 05, 2006 @09:17AM (#15065557) Homepage Journal

    In this contest you must write code that is as readable, clear, innocent and straightforward as possible, and yet it must fail to perform at its apparent function. To be more specific, it should do something subtly evil.

    system("c:\Program Files\Internet Explorer\iexplore.exe");

    Where's my prize?

    --
    "Man Bites Dog
    Then Bites Self"
    • Re:I Win (Score:2, Funny)

      by Anonymous Coward
      system("c:\Program Files\Internet Explorer\iexplore.exe");

      Where's my prize?


      I don't think you read the task description very well; it said:
        "it should do something subtly evil"
    • Re:I Win (Score:2, Informative)

      by Kjella ( 173770 )
      Well, if you ran it on this machine you'd get a "File not found". On a related note, everyone who hardcodes paths like "C:\Program Files", "C:\Windows", or "My Documents" should suffer. Likewise those who completely ignore regional settings (no, my decimal point and thousands separator are not the same as yours). Variations include those who can't handle non-ASCII letters or sorting (heard of æøå?).
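
      A tiny illustration of the regional-settings point (hypothetical sketch; "de_DE.UTF-8" is just an example locale name and has to be installed for setlocale() to succeed):

      /* The decimal separator printed by printf() is locale-dependent. */
      #include <locale.h>
      #include <stdio.h>

      int main(void)
      {
          printf("default \"C\" locale: %.2f\n", 1234.56);    /* prints 1234.56 */
          if (setlocale(LC_NUMERIC, "de_DE.UTF-8") != NULL)
              printf("German locale:      %.2f\n", 1234.56);  /* prints 1234,56 */
          return 0;
      }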
    • system("c:\Program Files\Internet Explorer\iexplore.exe");

      I'm not sure what the \P, \I, and \i escape characters do, but I think you were looking for this:

      system("c:\\Program Files\\Internet Explorer\\iexplore.exe");

    • Re:I Win (Score:4, Funny)

      by darkmeridian ( 119044 ) <william.chuangNO@SPAMgmail.com> on Wednesday April 05, 2006 @10:18AM (#15066138) Homepage
      In this contest you must write code that is as readable, clear, innocent and straightforward as possible ...


      Read the conditions of the contest clearly. You obviously lose.
    • That looks plain evil.
    • They wanted "subtly evil"!
    • Problem: You didn't write it. 'Least, I presume not.

      Also, IE is not SUBTLY evil.
  • by sgant ( 178166 ) on Wednesday April 05, 2006 @09:21AM (#15065588) Homepage Journal
    Why is this a good thing? I'm not a programmer, so I don't really understand how writing code that appears to be innocent, yet is really evil, helps the community.

    I understand that making source code available helps in a secure system, but what if that code has evil code... made to look innocent upon inspection... written into it?

    I know that showing how to crack into a system, or how to write a virus actually helps in the long run as it exposes weaknesses that can and should be patched and closed. But what does having people practice hiding malicious code do for us?

    Just wondering. I find this stuff fascinating....though not fascinating enough to actually learn how to do it!
    • by chrismcdirty ( 677039 ) on Wednesday April 05, 2006 @09:22AM (#15065593) Homepage
      1. It teaches you not to take all code at face value, and actually read into it.
      2. It's fun.
    • It provides a method to enumerate the techniques used by those with less than pure intentions.

      IOW, it helps folks learn to spot these 'bugs' more readily.
    • by Xcott Craver ( 615642 ) on Wednesday April 05, 2006 @09:26AM (#15065629)
      Well, ask yourself how the Obfuscated C Code contest "helps the community." To some extent, it's just a contest, and not meant to bring about world peace.

      On the other hand, I think it does teach us a thing or two about what to look for when reviewing code. I know I've learned a lot about sneaky coding practices since it started. I learned C in the 1980s and thought I was pretty knowledgeable by now, but I actually didn't know about ASCII trigraphs until last year. X
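
      For the curious, here's a minimal sketch of the trigraph trick (hypothetical code, not a contest entry): "??/" is the trigraph for a backslash, so with trigraph processing enabled (e.g. gcc -std=c99 or -trigraphs) the // comment below ends in a line continuation and silently swallows the statement on the next line.

      #include <stdio.h>

      int main(void)
      {
          int safety_on = 0;
          // Make sure the safety interlock is engaged??/
          safety_on = 1;
          printf("safety_on = %d\n", safety_on);  /* 0 with trigraphs, 1 without */
          return 0;
      }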

      • by kimvette ( 919543 ) on Wednesday April 05, 2006 @11:34AM (#15066911) Homepage Journal
        Perhaps this "contest" is sponsored behind the scenes by Sony, in their search for more stealtht rootkit implementation methodologies in their next Anti-Fair-Use software release. They're counting on some smartass or two submitting really clever malicious code, I just know they are!

        This has been the crackpot conspiracy theory of the day.

        (Why yes, I'm bored! Why do you ask?)
    • by tmjr3353 ( 925558 ) <tmackintosh&gmail,com> on Wednesday April 05, 2006 @09:26AM (#15065637)

      I understand that making source code available helps in a secure system, but what if that code has evil code... made to look innocent upon inspection... written into it?

      I think you've highlighted the point right there. By getting the community to find ways to write code of this fashion, you're simultaneously getting them to learn to read code better (or at least that would be my hope). If I know how to write code in a fashion that looks innocent but brings with it not-so-innocent consequences, then hopefully I know how to tell when someone else is doing the same thing.

    • by l2718 ( 514756 ) on Wednesday April 05, 2006 @09:33AM (#15065677)

      This problem arises whenever you need to use software for an application that must be secure. One famous case of tampering was by the CIA; control software for a Soviet oil pipeline purchased in the West was modified to fail upon a remote command [msn.com] causing a massive explosion.

      One hypothetical scenario: Diebold [diebold.com] decides to act on their CEO's promise to deliver the election to the Republican party by making a small modification to their voting machines [diebold.com]. If they used the techniques this contest is looking for, they could write the code so that it would escape scrutiny even by an outside agency (say, the government).

      In general, the idea of the contest is to showcase ways of breaking security and therefore perhaps ways to overcome them.

      • For a (past) contest targeting the specific scenario I described above, see the Obfuscated V contest [stanford.edu], which was the inspiration for Xcott's contest. The winning entry manages to show its bias only on election day itself, not before, so that it can survive serious testing.
      • Are you sure you meant "hypothetical"?

        In Diebold's case, I'm not willing to ascribe to incompetence that which can be explained by malice.

        • In Diebold's case, I'm not willing to ascribe to incompetence that which can be explained by malice.
          Personally, I'd go Occam's Razor on that issue and say that there's been a few hundred more years of mechanical voting fraud to fall back on rather than messing with software. Why do you think the dead rise again every few years to vote?
      • Actually that pipeline case is probably disinformation.

        The Trans-Sib pipeline control system was developed by a UK company. It used MC6800s and was written in assembler. The stuff was so unstable anyway, due to the hand-coded networking, that deliberate interference would have been picked up during the shakedown (the code was continually being rewritten and EPROMs reblown).

        • Actually that pipeline case is probably disinformation.

          Most likely a case of a typical megalomaniac ex-intelligence blowhard trying to take credit for the Sun rising in the West, in his highly unbelievable "memoir".

          If any of that crap were true, Russia would be suing for damages, which under international law they would be entitled to, since they actually bought that stuff for their pipeline legitimately, and they would be using that idiot's book as Exhibit A.

          On an unrelated note, I wonder when will some more e

      • One famous case of tampering was by the CIA; control software for a Soviet oil pipeline purchased in the West was modified to fail upon a remote command causing a massive explosion.


        Shouldn't the CIA be held responsible for criminal behavior like this?

    • by Anonymous Brave Guy ( 457657 ) on Wednesday April 05, 2006 @09:34AM (#15065683)
      I understand that making source code available helps in a secure system, but what if that code has evil code... made to look innocent upon inspection... written into it?

      The "many eyes" theory can only work in practice if there are indeed many eyes reviewing the source code and those eyes can see any problems. That doesn't just mean accidental bugs, or portability/future-proofing concerns, or a poor choice of data structures and algorithms leading to a performance hit. It also means spotting the devious and subtle attacks.

      Just imagine what would happen if a major OSS project like Apache or Linux accepted a "useful" patch that contained a backdoor that wasn't identified, and this then got distributed worldwide. A significant number of people believe, erroneously, that using OSS inherently makes them safer because of the many eyes theory. These people will happily download and build the updated code, or install prebuilt binaries with correct checksums, completely oblivious to the fact that they just stuck a major security hole in their system.

      Thus it's important for those who review submissions to software development projects - OSS, commercial or otherwise - to be very aware of these possibilities, and likewise for anyone else who contributes to them so they can spot a problem if they come across it.

    • From the FAQ: Why?
      We were initially inspired by Daniel Horn's Obfuscated V contest in the fall of 2004. I was greatly impressed to see how even a short program to simply count characters in a text file can be made to fail, and fail only on one specific day.
      The longer answer is that my research interests are in covert behavior: detecting it, and getting past people who try to detect it.


      The prize is $100.00; it should be more, IMHO.
    • The contest will show code auditors what they should look for: what kinds of underhanded practices exist and what patterns they employ.

      In a fun and harmless way, this makes public techniques that until now were used only by people with malice. It's really the same as showing how to crack a system, except that here the "system" is the code auditors; just as a cracked system gets its holes closed and its bugs fixed, the auditors will also improve.

      (this is partly a summary of other comments, in what I think is more focused an

    • It's like a wet t-shirt contest, allowing you to show off your rack in a controlled environment. Except in this case it's the metaphorical rack nerds develop from years of programming prowess, and not the actual rack they develop from scarfing down doritos and Mountain Dew during the same interval.
    • Well, here's the alternative.

      Put your hands over your ears and sing this song:

      "La la la! All C code is secure! Strcpy is perfectly safe! if (uid = 0) is a harmless typo! La la la!"

      This isn't about _practising_ how to write evil code; it's about getting to know what kind of evil is possible so that you can recognize it when you see it later. The best possible outcome of this sort of event is for one of the observers to say "Ohh... I never knew that you could do _that_" during the contest and then, a few
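
      For anyone who hasn't seen it, here's a minimal sketch of the "if (uid = 0)" pattern above (names invented; the shape roughly matches the 2003 attempt to slip a backdoor into the Linux kernel's CVS mirror):

      #include <stdio.h>

      static int uid = 1000;                 /* pretend this is the caller's credential */

      static int do_privileged_thing(int flags)
      {
          if ((flags == 0x7f) && (uid = 0))  /* looks like a root check; actually assigns 0 */
              return -1;                     /* "reject" path that can never be taken */
          return 0;
      }

      int main(void)
      {
          do_privileged_thing(0x7f);
          printf("uid is now %d\n", uid);    /* prints 0: silently escalated to root */
          return 0;
      }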

    • Because it helps teach people to recognize malicious code?

      Think about it this way. You're the head dev on a big software project and are in charge of committing changes that the other programmers have made to the code base into the repository. You screen all the pieces of code to make sure they are reasonable before they get merged, right? Well, if you don't know anything about clever techniques that can be used to hide backdoors and other malicious code, one could sneak by you. Nobody finds the bad code un
    • Simple. It'll kill the language off completely. Time to put it out of its misery, I guess.

      The process has already started with the latest versions of C++: virtually all the standard functions are being deprecated and replaced with ones that include target buffer size limits, forcing you either to ignore massive lists of compiler warnings, to turn off the warnings, or to macro-replace the functions with ones that will likely hardcode the limits to possibly inappropriate values, thereby making them behave

    • I understand that making source code available helps in a secure system, but what if that code has evil code... made to look innocent upon inspection... written into it?
      So this contest bothers you because you think it encourages people to write malicious code? Trust me, nobody needs encouragement. And doing it in a contest where the results are published helps educate people who need to review potentially malicious code.
  • I know... (Score:5, Funny)

    by scolby ( 838499 ) on Wednesday April 05, 2006 @09:26AM (#15065630) Journal
    ...I'll design a media player that appears to be playing a CD when it's actually installing a rootkit that creates an easy back door for malware.

    And then I'll get sued by Sony for copyright infringement.
  • by Ihlosi ( 895663 ) on Wednesday April 05, 2006 @09:26AM (#15065636)
    I really liked last year's task, but this year's, um ...

    It depends far more on things like the compiler being used, the optimization level, the actual hardware (how do they compare program run-time if the two OSes in question run on very different CPUs?), and so on, than on the actual C.
    • But that's the point... you want to create code that either uses a native feature found on certain CPUs (maybe something the PowerPC architecture is optimal for, compared to the Pentium architecture), or else something that you KNOW causes bad behaviour under certain compilers. As a long-time embedded software designer, I can tell you that with embedded hardware, OS and compiler suites, there is a BIG difference from one system to the next in the level of optimization. About 10 years ago we were doing c
      It depends far more on things like the compiler being used, the optimization level, the actual hardware (how do they compare program run-time if the two OSes in question run on very different CPUs?), and so on, than on the actual C.


      If you do it right, it's entirely dependent on the actual C code (and its interaction with the OS/CPU). There's no compiler flag in the world that can turn an O(n!) program into an O(n) one.
  • Any code (Score:2, Insightful)

    by Anonymous Coward
    Any code that includes a patented idea could win this contest.

    Looks innocent, is malicious.
  • by MT628496 ( 959515 )
    Isn't it likely that encouraging people to design programs in this way would lead to companies using these techniques in their own software? Say someone has a contract with Microsoft: the Linux version, while being fully functional, could be made to be slower. Then someone would go and demonstrate how poor Linux performance is, yadda yadda.
    • by plover ( 150551 ) *
      That's probably part of the point of the contest -- to point out that malicious code such as they're suggesting already exists in the world.

      Saying that this "helps the bad guys" (not that you did) misses the point. We know there are bad guys out there. This becomes an awareness campaign.

      There are several documented cases of stuff like this happening. Both ATI and nVidia (the graphics card companies) added code to their drivers to cheat [extremetech.com] -- take "shortcuts" when certain benchmark programs were running

    • You mean like how Intel constructed their C++ compiler to produce slower code for AMD chips than their own Pentium chips?
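
      A sketch of the kind of vendor-string dispatch being alluded to (GCC on x86 only; an illustration of the idea, not Intel's actual code):

      #include <stdio.h>
      #include <string.h>
      #include <cpuid.h>

      int main(void)
      {
          unsigned eax, ebx, ecx, edx;
          char vendor[13] = {0};
          if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
              memcpy(vendor + 0, &ebx, 4);   /* CPUID leaf 0 returns the vendor */
              memcpy(vendor + 4, &edx, 4);   /* string in EBX, EDX, ECX order   */
              memcpy(vendor + 8, &ecx, 4);
          }
          if (strcmp(vendor, "GenuineIntel") == 0)
              puts("vendor check passed: take the optimized code path");
          else
              puts("unknown vendor: fall back to the slow generic path");
          return 0;
      }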
  • void main(int argc, char **argv, char **envp) {
    }
  • Lucid Programming? (Score:3, Interesting)

    by frantzdb ( 22281 ) on Wednesday April 05, 2006 @10:42AM (#15066377) Homepage
    I see a lot of utility in a contest like this. As much fun as an obfuscated programming contest is, in a day and age when our critical infrastructure, including voting machines, is running on software, it is important that we be aware of just how difficult it is to assure that code does what it should.

    A related contest I would like to see is a lucid programming contest. Given some small but insidiously tricky task, write a program in the language of your choice which solves the problem correctly and which is easy for someone else to understand. It would be interesting to discover which languages excel at this task and what sorts of patterns emerge when emphasis is placed on clarity.
  • My entry! (Score:4, Funny)

    by radiumhahn ( 631215 ) on Wednesday April 05, 2006 @10:48AM (#15066430)
    #include <stdio.h>

    main() {
        /* Rob a bank! */
        /* Steal Stuff! */
        printf("hello, world\n");
        /* Use Drugs! */
        /* Kill, Kill, Kill! */
    }

  • by PeeAitchPee ( 712652 ) on Wednesday April 05, 2006 @10:51AM (#15066462)

    An oldie but goodie . . .

    while (1)
    {
        status = GetRadarInfo();
        if (status = 1)
            LaunchMissiles();
    }
  • Can I beta test these things? =P
  • write malicious C code that looks perfectly readable and innocent under informal inspection of the source.

    Just another item on my list of reasons I hate the C language. And I first started using C in 1977.

  • by Ashtead ( 654610 ) on Wednesday April 05, 2006 @12:09PM (#15067363) Journal
    #include <stdio.h>

    main()
    {
      char stuf[80];

      while(1)
      {
        fputs("Enter something: ", stdout);
        fflush(stdout);
        gets(stuf);
        fputs("You have entered ", stdout);
        printf(stuf);
      }
    }

    Silly (and it looks innocent enough), but closer inspection will reveal nastiness...
    • by whitenaga ( 886892 ) on Wednesday April 05, 2006 @01:11PM (#15068110)

      Your code is dangerous, but it has to be exploited by a knowledgeable user. I think what they're looking for in the Underhanded C Contest is code that exploits itself. But for the purpose of being pedantic, I'll bite (a safer rewrite is sketched after the list)... =)

      • You're using gets(), which is notorious for buffer overrun problems.
      • You mix fputs() and printf(), right next to each other. And you use printf() just like fputs(), and that looks suspicious.
      • printf(stuf); is practically asking for exploitation. If stuf contained the proper combination of "(filler) %junk %junk %n", the %n specifier would write to an attacker-chosen location, for example overwriting a return address.
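
      A minimal "boring" rewrite, for contrast (assuming line-at-a-time echo is all we want): fgets() bounds the read and fputs() keeps user input out of the format string.

      #include <stdio.h>

      int main(void)
      {
          char stuf[80];

          for (;;)
          {
              fputs("Enter something: ", stdout);
              fflush(stdout);
              if (fgets(stuf, sizeof stuf, stdin) == NULL)
                  break;                      /* EOF or read error */
              fputs("You have entered ", stdout);
              fputs(stuf, stdout);            /* never printf(stuf)! */
          }
          return 0;
      }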
  • #include <stdio.h>

    int main(void)
    {
        printf("Goodbye, world!");
    }
  • by patio11 ( 857072 ) on Wednesday April 05, 2006 @01:07PM (#15068067)
    Here's what I'm thinking: take a data structure which is well-understood, easy to implement, and boring as mud. Like, say, a hash table with collisions resolved by linking. Everybody saw that back in sophomore CS, right? And everybody knows, with even a cursory inspection, that a hash table offers constant-time performance on lookups and O(maximum size of table) time on reading out, right? Except when it doesn't. Malicious choice of the data fed into a hash table can severely degrade performance, and we wouldn't want that, so we're going to be extraordinarily conscientious engineers and salt our hash function so that a malicious user can't force our program into worst-case performance.

    I think, with creative use of bad programming, you could corrupt either the salt or the calculation of the hash function in such a way as to guarantee that on a target OS the hash-table performance degrades into the worst case. So if you took your borked hash table and used it to implement an associative array, the fairly trivial read-stdin, increment-fields-in-the-associative-array, sort-the-array code could be made to run at average time complexity on non-targeted OSes and worst-case time complexity on your target OS. Assuming you pick an O(n log n) sort algorithm, if you manage to "accidentally" make each of those n's polynomial (heck, even n^2), the computer should essentially blow up on non-trivial data sets. It's late in the evening and I haven't thought this through very much, but one way would be to use utsname's sysname field as part of the "random data" for the salt. That sounds a little obvious, though. Maybe there's some obscure function somewhere for getting dates or times whose return format I could exploit to reveal the difference between OSes, as that would be a lot harder to detect ("oh, seeding a hash function with a date and some magic numbers, nothing wrong with that").

    Anybody got any ideas or corrections to share? It's been a while since I've taken data structures, and I've got essentially no ideas for obscure functions revealing system differences to exploit (C isn't my bag).
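
    A toy skeleton of the utsname idea (hypothetical, POSIX-only, and deliberately not underhanded; the place where an entry would hide the OS-dependent collapse is only marked with a comment). The salt is derived from uname()'s sysname, so every key's bucket quietly depends on the operating system.

    #include <stdio.h>
    #include <sys/utsname.h>

    #define NBUCKETS 101

    static unsigned salt_from_os(void)
    {
        struct utsname u;
        unsigned salt = 0x9e3779b9u;               /* innocuous-looking "magic" seed */
        if (uname(&u) == 0)
            for (const char *p = u.sysname; *p; ++p)
                salt = salt * 31u + (unsigned char)*p;
        /* An underhanded entry would arrange, right here, for the salt to
         * collapse to a constant only when sysname matches the target OS,
         * sending every key into one bucket (O(1) lookups become O(n)). */
        return salt;
    }

    static unsigned bucket(const char *key, unsigned salt)
    {
        unsigned h = salt;
        while (*key)
            h = h * 131u + (unsigned char)*key++;
        return h % NBUCKETS;
    }

    int main(void)
    {
        unsigned salt = salt_from_os();
        printf("salt=%u  bucket(\"alice\")=%u  bucket(\"bob\")=%u\n",
               salt, bucket("alice", salt), bucket("bob", salt));
        return 0;
    }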

    • The only problem with this approach is that the difference between O(n log n) and O(n^2), or O(1) and O(n) would likely require a fairly large data set before you see serious practical performance degradation, and I'm not sure you'll get that with their test data. You'd likely need a boneheaded data structure that goes from O(poly) to O(exp) time to see a significant time difference, which I think they're shooting for.
    • Sounds like an Algorithmic Complexity Attack [rice.edu]. According to the paper, such vulnerabilities are "extremely widespread", found in software such as:
      Mozilla 1.3.1
      DJBDNS 1.05
      TCL 8.4.3
      GLIB 2.2.1
      Python 2.3b1
      Perl 5.6.1
      Perl 5.8.0
      Linux 2.4.20 directory cache (dcache)
      Squid 2.5STABLE1
      Bro IDS 0.8a20
  • There is a good way to measure the real difference between different distributions!
  • It's the basics of benchmarking. Producing a benchmark that performs a given task and whose results show one system to be inferior to another is REALLY easy. Too bad I don't have one of the old Mac minis to show my Athlon 64 the superiority of the RISC architecture. We all know very well that RISC is 1000 times faster than the CISC dinosaur.
  • Since I don't plan to spend any time on this one, I thought I'd start a thread on ways to attack OS-specific issues, for people who do want to try the hard version (not hardware-dependent; architecture is easy: endianness, pipeline, unaligned memory copies, etc.).

    Since you can't rely on architecture, and can't attack stuff like endianness, you need to hit the nuances of the OS. One way I can think of is to exploit size differences in things like wchar_t, since it's 4 bytes on most newer flavors of BSD (e.
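
    A quick illustration of the wchar_t point (hypothetical sketch; it only prints sizes): wchar_t is 2 bytes on Windows but 4 bytes on Linux and most BSDs, so byte-based capacity math silently changes meaning between platforms.

    #include <stdio.h>
    #include <wchar.h>

    int main(void)
    {
        wchar_t buf[16];
        /* "Capacity" computed as bytes/2 is right where wchar_t is 2 bytes,
         * but claims twice the real number of slots where it is 4 bytes;
         * code that trusted it would have room to hide an overflow. */
        size_t claimed = sizeof(buf) / 2;
        size_t real    = sizeof(buf) / sizeof(buf[0]);
        printf("sizeof(wchar_t) = %zu, claimed slots = %zu, real slots = %zu\n",
               sizeof(wchar_t), claimed, real);
        return 0;
    }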
