The Biology of Network Security 85

Posted by ScuttleMonkey
from the bionic-firewall dept.
Bob Brown writes "A University of New Mexico researcher is taking lessons from biology and using them to try to stymie hackers and viruses. Projects such as RISE attempt to secure computers and networks by promoting application diversity." From the article: "Diversity of systems and applications can play a key role in safeguarding computers and networks from malicious attacks, Forrest said. Her team published a paper last year on a system dubbed RISE (Randomized Instruction Set Emulation) (PDF) that randomizes an application's machine code to stymie would-be attacks, such as those launched via binary code injection."
  • Gee, ya think? (Score:2, Insightful)

    by Otter (3800)
    She said this idea didn't fly very well with hardware engineers at Intel with whom she spoke last year, as they envisioned having to build different chips around all these different instruction sets.

    Gee, ya think?

    Forrest's team got around this issue by building its technology atop virtual machine software dubbed Valgrind, which she said provided flexibility because it is open source, but which is not as efficient as she would have liked.

    Gee, ya think?

    Forrest acknowledged that the RISE system is unwieldy

    • Correct me if I am wrong, but what kind of cop-out is this, making yet another closed VM? Of course that will make software that runs on it impenetrable to viruses... but then what about the VM itself?
      • You foolish young man, it's VMs all the way down!
      • Correct me if I am wrong, but what kind of cop-out is this, making yet another closed VM?

        Unless there was a coincidence in naming, I think valgrind [valgrind.org] refers to the (really quite awesome) open source debugging tool for Linux. Valgrind's primary purpose is to let you run your x86 Linux executables in an emulation environment where any memory-access errors can be detected and reported; it makes debugging much easier than in the "real world", where an error might only cause a crash or other visible symptom 1% of the

        • I suspect this researcher merely "appropriated" the valgrind source code as a test harness for her ideas. It's certainly much cheaper to do it that way than to build your own x86 fab...

          That's what struck me as funny -- she bothered to talk to someone at Intel about her scheme to implement the hardware counterpart to Gentoo. And the more realistic fallback plan was to run everything in a debugger!

          • Changing hardware to do ISR isn't that difficult; you essentially include another register that holds a secret key that has transformed the binary. During runtime, as an instruction is fetched, it is decoded with the key and then passed to the normal execution machinery.

            Since not every university has their own chip fabrication facility, the next most logical choice is to run things in an emulation or binary translation environment. Valgrind itself isn't a debugger, although its most popular tools (Memchec
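The fetch-and-decode scheme described in the comment above can be sketched in a few lines. This is a toy illustration, not RISE's actual implementation: it assumes a one-byte XOR key standing in for the "secret key register", and the instruction bytes are just examples.

```python
# Toy instruction-set randomization (ISR): a per-process key scrambles
# the binary at load time, and the fetch stage decodes each byte with
# the key before handing it to the execution machinery.
import secrets

def scramble(code: bytes, key: int) -> bytes:
    """Load-time transformation: XOR every byte with the secret key."""
    return bytes(b ^ key for b in code)

def fetch_decode(scrambled: bytes, key: int) -> bytes:
    """Fetch stage: decode each instruction byte using the key register."""
    return bytes(b ^ key for b in scrambled)

original = bytes([0x90, 0x90, 0xC3])     # e.g. nop, nop, ret on IA-32
key = secrets.randbelow(255) + 1         # random non-zero one-byte key

image = scramble(original, key)               # what actually lives in memory
assert fetch_decode(image, key) == original   # legitimate code runs unchanged

# Injected code never went through scramble(), so decoding it with the
# key turns the attacker's payload into different (garbled) bytes.
payload = bytes([0xCD, 0x80])            # attacker-supplied bytes
assert fetch_decode(payload, key) != payload
```

The same idea works whether the decode step is done by an emulator (as in RISE, via Valgrind) or by hypothetical extra hardware in the fetch path.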

    • So, isn't RISE (Randomized Instruction Set Emulation) similar in concept to PIC (Position Independent Code)?

      If you want to secure computers via the Linux route, then Hardened Gentoo [gentoo.org] is a good way to do it (follow the Resources links in section 6).

      PaX [gentoo.org] is a hardened Linux kernel patch that uses ASLR (Address Space Layout Randomization) to support applications built as PIEs (Position Independent Executables) and to provide non-executable memory (NX).
      PaX home [grsecurity.net].

      PIE/SSP (Position Independent Executable)/(Stack Smas
  • Extinction? (Score:3, Insightful)

    by MECC (8478) * on Tuesday April 25, 2006 @02:01PM (#15199075)
    Would that include extinction of species with inadequate immune systems?

    • Re:Extinction? (Score:4, Insightful)

      by Opportunist (166417) on Tuesday April 25, 2006 @02:04PM (#15199097)
      Unfortunately, no. The "new" kind of infectors don't aim at killing the host. They just want to "milk" it. They want its processing power, its connection speed, its information and its user's credit card number.
      • Re:Extinction? (Score:3, Informative)

        by LordKazan (558383)
        So you mean they're parasites, since we're using biological terminology.
      • Yeah, that would make them parasites... which try to get resources without being noticed.

        Interesting ideas, but I don't know how well the biological maps to the commercial. After all, in biology, you have a population of genetically different individuals. The idea being that, among this population, some will have the functional capacity to avoid/survive whatever impending disaster/predation/disease/parasitism comes up. That's all well and good. What doesn't work so well for commerce is the corollary tha
        • That made me think... and I think the idea of security as an immune-system response is quite a good analogy.

          In an immune system, once you catch a virus, your body will produce antibodies to fight it off, and then remember the virus so it'll be easily taken care of if it reappears (hence we inoculate ourselves with a harmless attack).

          In a security system, once an attack is noticed, the system is fixed/patched/configured to prevent the attack, and you (as the sysadmin) remember what you did so next time th
        • Well, MS is trying to sell us a winning solution in DRM. Personally, I consider it a lot of fluff, more hype and all wrapped up in marketing speech, in an attempt to lure clueless managers into buying into that crap.

          To ride the biological analogy a bit further, computers are essentially "clones" of each other. Yes, they may have different makeup, they may have different graphics cards and so on, but then again, drivers nullify that difference. The task of drivers is essentially to make "clones" out of d
    • Only if they are unmaintained. One cannot drastically change a biological species' immune system (yet), but one can improve a network's security measures.
    • Would that include extinction of species with inadequate immune systems?


      That depends on whether the weakest creature happens to have a monopoly stranglehold on the PC desktop market, and a proven interest in manipulating the political system to keep it that way.
    • Cockroaches don't fight off infections. Their systems are designed/evolved to work in spite of infection. This makes them dangerous for creatures that do fight off infections. This also seems to be the direction Microsoft Security is headed. And no, Palladium doesn't stop infection via security flaws. It only stops infection via idiot users.
      • Cockroaches don't fight off infections. Their systems are designed/evolved to work in spite of infection.

        No, B. germanica, like other arthropods, has two primary active immunocytes, namely the granulocytes and the plasmatocytes. The former are particularly cool in the cockroach -- their granulocytes (GRs) discover, encapsulate, and phagocytize foreign substances. In fact, unlike in other arthropods, cockroach GRs are particularly active in terms of encapsulation; they flatten and increase the number of m

  • by Opportunist (166417) on Tuesday April 25, 2006 @02:02PM (#15199077)
    "We already have malicious code that can replicate and spread itself. The only thing we're missing in terms of real Darwinian evolution is mutation,"

    Nope. Polymorphic viruses are hardly unknown. Right now, as we speak, they are making a comeback.
    • Every once in a while, networks will introduce an error into a file. Most often, this will be fatal to an executable, like a zero length file. But, there is a non-zero probability that someday an error introduced into a computer virus will change it from non-malicious to malicious. There's also the probability that an error will change a regular program into a virus. (or change a virus into a useful utility)

      Also, the environment that the viruses live in is changing. It's possible that a security "fix
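The single-bit transmission error imagined above is easy to model: the smallest possible mutation is one flipped bit somewhere in the file. A hypothetical sketch (the byte values and seed are arbitrary):

```python
# Flip exactly one randomly chosen bit in a byte string -- the minimal
# "error introduced into a file" a network could cause.
import random

def flip_random_bit(data: bytes, rng: random.Random) -> bytes:
    pos = rng.randrange(len(data) * 8)      # pick one bit out of the whole file
    out = bytearray(data)
    out[pos // 8] ^= 1 << (pos % 8)         # toggle just that bit
    return bytes(out)

original = b"\x90\x90\xc3"
mutated = flip_random_bit(original, random.Random(42))

assert mutated != original
# Hamming distance between original and mutated copy is exactly 1 bit:
assert sum(bin(a ^ b).count("1") for a, b in zip(original, mutated)) == 1
```

For an executable, of course, almost every such flip is fatal rather than interesting, which is the commenter's point.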
  • Diversity is the key (Score:3, Informative)

    by mtenhagen (450608) on Tuesday April 25, 2006 @02:03PM (#15199094) Homepage
    The key point in network security is diversity and multiple layers of security. When there is a fault (due to whatever cause) in one of the layers, only that layer will be compromised, with no real severe damage done.

    Of course it is important that those layers are created and maintained by separate entities.

    A simple example:
    - Have your network guys maintain your firewalls
    - Have all traffic go through an application gateway maintained by a third party
    - Have system administrators secure the systems

    Of course, adding layers increases both costs and security.
    • What you're talking about isn't akin to biological diversity.

      If you took biological diversity to the nth degree, what you are describing is designing systems with the goal that SOME systems will survive a given threat being realised. Hence the species survives.

      The "biological diversity in IT security" people are saying that we should use every flavour of operating system, application system, etc...

      The problem is we (humans) are not really interested in "some systems surviving."

      We are interested in "ALL systems b
  • Intel not so happy (Score:4, Interesting)

    by TubeSteak (669689) on Tuesday April 25, 2006 @02:06PM (#15199110) Journal
    She said this idea didn't fly very well with hardware engineers at Intel with whom she spoke last year, as they envisioned having to build different chips around all these different instruction sets. Forrest's team got around this issue by building its technology atop virtual machine software dubbed Valgrind, which she said provided flexibility because it is open source, but which is not as efficient as she would have liked.
    I imagine that Palladium-style code checking wouldn't be too happy with programs that did funny things like this. I could be wrong, but off the top of my head, it seems plausible.

    As for mutation, aka polymorphism (she talks about this at the end of TFA), doesn't she know that viruses already have built-in mutators? And metamorphic code does almost exactly the same thing she's talking about in RISE.
  • by digitaldc (879047) * on Tuesday April 25, 2006 @02:06PM (#15199112)
    "This is a little tricky because we don't want to make everyone write their own operating system or e-mail reader from scratch or even learn a new interface," Forrest said.

    Speak for yourself, this is a lifelong obsession.

    A wise man once said - 'Never connect to the internet and your troubles will be few.'
  • by weetabix (320476) <weety&nucleus,com> on Tuesday April 25, 2006 @02:06PM (#15199113) Homepage
    So, what happens when someone finds a way to either a) run code right on the hardware and bypass the virtualization, or b) finds some small snippet of code (a binary prion, perhaps?) that plays hell with this RISE? I mean.... Mad Cow Disease is a prion.... Mad Computer Disease next?
    • According to p. 27 of the RISE paper ( PDF [unm.edu]),

      RISE is resilient against brute force attacks because the attacker's work is exponential in the shortest code sequence that will make an externally detectable difference if it is unscrambled properly. We can be optimistic because most IA32 attack codes are at least dozens of bytes long, but if a software flaw existed that was exploitable with, say, a single one-byte opcode, then RISE would be vulnerable, although the process of guessing even a one byte representat
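The "exponential work" claim quoted from the paper is just key-space arithmetic. Assuming, as a simplification, that each byte of the scrambled code must be guessed independently (256 possibilities per byte), the numbers look like this:

```python
# Brute-force cost of guessing the randomized representation of an
# n-byte attack payload: 256 possibilities per byte position.
def expected_guesses(n_bytes: int) -> int:
    return 256 ** n_bytes

assert expected_guesses(1) == 256             # a single one-byte opcode: trivial
assert expected_guesses(4) == 4_294_967_296   # four bytes: already ~4 billion
assert expected_guesses(24) > 10 ** 57        # "dozens of bytes": infeasible
```

Which is exactly why the paper flags the hypothetical one-byte-opcode exploit as the weak spot: 256 guesses is nothing.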

  • Sure, in biology, differences help make the species stronger. Not true in IT. Which is harder to maintain: a shop full of [InsertOSHere] standard PCs, or a mixed environment with different hardware, different OSs, and different applications? Sure, it might lessen the potential vulnerability to various viruses and other automated attacks, but at what cost? Suddenly, instead of needing one or two specialized skill sets, you need lots. Not to mention the fact that the more environments you support, the more l
    • Sure, in biology, differences help make the species stronger. Not true in IT.

      Depends on how big the differences are.

      Take for example address space randomization [redhat.com] (part of execshield). I'll quote redhat's explanation of it (as it's quite good):
      The idea behind Address Space Randomization is to put program code at a different address each time it starts. This way, an exploit can't know where the return address pointer should point to.
      Protects against many buffer overflow attacks (regardless of the hardware), with no cost to your 'standardized environment'.

      Pity windows & macOS don't have something similar.
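For anyone curious, address space randomization is easy to observe. The following sketch (using Python's ctypes purely for illustration) prints the address of a freshly allocated buffer; on a system with ASLR enabled it will typically differ from one run of the program to the next:

```python
# Print the address of a native buffer. Run this script twice on a
# Linux box with ASLR/execshield enabled and the addresses will
# (almost always) differ, because the heap base is randomized.
import ctypes

buf = ctypes.create_string_buffer(16)   # a real, addressable C buffer
addr = ctypes.addressof(buf)
print(hex(addr))   # varies between runs when ASLR is on
assert addr > 0
```

An exploit that hard-codes a return address has to guess this value, which is the whole point of the technique.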
    • It may be easier to maintain a network of homogeneous PCs, but once I've broken into one of your computers, I've broken into them all. That's something that management should consider, as well as the supposed "ease of maintenance" a homogeneous network would bring. What's easier - fixing one compromised machine, or an entire network of them?

  • The biology of network security... is that when the lead batteries in UPSes go bad, spring a leak, and make the surrounding area smell like an open sewer for a few days before people realize it's not a sewer problem from a nearby restroom?

    Or would that be when the air conditioning guys pump coolant fluid through a garden hose in the false ceiling space until the hose exploded and sent all this green goo crashing down on the sys admin's brand 19" monitor and nearly nailing the sys admin?

    Does that ma
  • How about this lesson from biology: animals need to reproduce.

    So the solution to stop crackers breaking into things?

    Mandatory sexy girls for all geeks!
  • But this is exactly the kind of thing large companies are trying to get away from. FTA:

    Making each computer unique would make life a lot tougher on attackers, she said.

    This is costly for companies with large networks as it requires too much overhead to manage this kind of a diverse network.

    "This is a little tricky because we don't want to make everyone write their own operating system or e-mail reader from scratch or even learn a new interface," Forrest said. "The look and feel of the program and
  • Wouldn't allowing each app to have its own instruction set create yet another class of programming bugs, and make debugging really hard?
  • Her team published a paper last year on a system dubbed RISE (Randomized Instruction Set Emulation)

    My Windows machine already performs plenty of "Random Instructions", thank you very much.

  • by Anonymous Coward
    Marcus Ranum's opinion
    -----------------------
    Monoculture Hype Alert!
    NSF Grants Two Universities $750,000 to Study Computer Monocultures (25 November 2003)
    With the help of a $750,000 National Science Foundation grant, Carnegie Mellon University and the University of New Mexico will study computer "monocultures" and the benefits of diverse computing environments. "The researchers intend to create an application that could generate diversity in key aspects of software programs, thus making the same vulnerabili
  • It's a novel concept, but I can't picture how it would work outside of Open Source software.
    To run a program on such a chipset, it must be specifically compiled for that chipset. So for commercial applications, you either need a separate version for every possible chipset, or a method for the user to compile it for their computer. The latter isn't rational - all it takes is a single unscrupulous user to leak the code, and the program gets out of your control. As for the former, I can picture going to a st
  • How are companies supposed to distribute copies of their closed, binary-only applications? I cannot see Microsoft being willing to let users compile their own copies of Windows, Office, Exchange, Visual Studio, etc. to match their architecture. I cannot see Microsoft compiling binaries to match a user's given architecture. Even less can I see the average person being able to successfully do this on their own. Imagine introducing the nearest lay person you know to Gentoo and telling them to get a system oper
      How are companies supposed to distribute copies of their closed, binary-only applications?


      I'd say, the same way they do now, except that the executable would contain enough information so that the installer process can swizzle the code around in a random fashion. To the user there would be no visible difference, but to a virus that was relying on the code or data being laid out in memory in a certain way, it would be completely different from what the virus was "expecting".

    • As the sibling poster says, there is no need to include source or ask people to recompile. The binary can undergo a reversible transformation at installation (or even at load time). Then, during execution, each instruction is essentially decrypted/decoded with the appropriate secret key.

      One of the major benefits of instruction set randomization is that you *don't* need the source code or to recompile to get the security benefits.

      The only *real* downside is the performance hit (and the fact that it doesn

      • I believe both of you missed the point of the article. Code obfuscation wasn't the goal. That exists now. If I want to move around bytes in some form of reversible transformation, I can download one of a plethora of existing tools on the internet. The article suggested having the processor actually use a different instruction set. The binaries would be specific to the computer on which the code is run. It's not a polymorphism idea at all. It is creating unique executables for unique computers. The diffe
        • I wasn't suggesting "moving bytes around." The binary undergoes a reversible transformation (like XOR with some key). This creates a "new" binary based on a "new" instruction set specific to the key used in the transformation. Then the binary is decoded at runtime with the key. Anything injected into the binary causes an exception (either invalid opcode or invalid memory reference, etc.) The idea is the same whether or not a software system does the execution (an emulator) or the hardware does it (special n
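The "injected code causes an exception" behaviour described in this thread can be sketched with a toy emulator. The opcode values and key here are invented for illustration; a real CPU or emulator would raise an invalid-opcode trap instead of a Python exception:

```python
# Toy emulator with a three-opcode "instruction set". Properly scrambled
# code decodes back into legal opcodes; injected (unscrambled) bytes
# decode into junk and trigger the invalid-opcode trap.
VALID_OPCODES = {0x01, 0x02, 0x03}   # hypothetical tiny instruction set

def execute(scrambled: bytes, key: int) -> None:
    for b in scrambled:
        op = b ^ key                 # the per-process decode step
        if op not in VALID_OPCODES:
            raise RuntimeError(f"invalid opcode {op:#x}")  # the trap

key = 0x5A
program = bytes(op ^ key for op in [0x01, 0x03, 0x02])   # scrambled at "install"
execute(program, key)               # legitimate code runs without incident

injected = bytes([0x01, 0x03])      # attacker bytes, never scrambled
try:
    execute(injected, key)
    trapped = False
except RuntimeError:
    trapped = True
assert trapped                      # the injected payload was caught
```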
  • This would appear to be an attempt to increase security by hiding the instruction set. Security through obscurity is not effective for long and anyone interested in hardening their system would be much better advised to use defence in depth.

    In the tradition of Slashdot, I have not RTFM but I imagine that this technique would not help with non-binary code injection (e.g. SQL).

    However, increasing the diversity is a valid weapon against scripted attacks (including those real-world, RNA scripted viruses). Perha
    • This would appear to be an attempt to increase security by hiding the instruction set. Security through obscurity is not effective for long and anyone interested in hardening their system would be much better advised to use defence in depth.

      The idea is to protect against automated attacks that currently rely on undefined behaviour that is the same for all targets. Example: Currently, if you can figure out how to fool Internet Explorer into munging memory at the right spot, you can use that knowledge to in

    • "In the tradition of Slashdot, I have not RTFM but..." here's my opinion anyway.

      Sort of a microcosm of the world at large, don't you think?
    • In fact, the general concept of "instruction set" randomization, where instruction set is loosely defined can be broadly applied. In particular, this paper looks at SQL randomization:

      http://www1.cs.columbia.edu/~angelos/Papers/sqlrand.pdf [columbia.edu]

      and this paper also looks at instruction set randomization, and randomizing Perl:

      http://www1.cs.columbia.edu/~angelos/Papers/instructionrandomization.pdf [columbia.edu]

    • So, what you're saying is that the idea is useless because it can't protect a given machine from an attacker that knows its key?

      Isn't that like saying an immune system is no good to you because it doesn't stop your neighbor from running you down with his car?

      Or staying closer to the original analogy, it would be like saying you shouldn't get a booster shot, because someone can always create a virus hand-tailored to exploit your genetic makeup.
        • What I'm saying is that most exploits start with a port scan to determine the operating system and vulnerable services on the victim machine (ironically, it's the diversity of the responses to the scan that reveals the information). Then the attack generally involves supplying data to a service program to gain entry to the machine. Since most of these attacks do not rely on injecting binary code, this attack would work regardless of the instruction set of the machine. Once on the machine, determining its key w
        • I think you're wrong, in that most attacks do rely on the injection of binary code. Every buffer overflow exploit I'm aware of involves overwriting the stack, replacing the OS's instructions with the exploiter's code, then having that code execute. That's the step you identify as "sending data to the service", which is correct. But it needs to be the right data: the code that will make the machine do what you want.

          While the plaintext of the executable may be known (sometimes you get custom compile jobs),
          • I'm fairly sure that you're right about most of that too! You're definitely right about buffer overflows but canary values and randomised positioning of the stack (diversity again) are making those harder and more vulnerabilities seem to be non-binary attacks. I've no hard figures to support this though.

            However, once on a machine using a non-binary exploit, I can use the executables on it to transfer a sample of known programs to my machine, where I can crack the code using standard techniques (in effect, i
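The known-plaintext attack sketched above is trivial if the transformation really is a repeating XOR (an assumption; RISE's actual keying may differ): the key falls out immediately as plaintext XOR ciphertext.

```python
# Known-plaintext key recovery against a repeating XOR "randomization":
# with a stretch of the scrambled binary and the matching bytes of a
# known standard executable, each key byte is plain[i] ^ cipher[i].
def recover_key(known_plain: bytes, scrambled: bytes, key_len: int) -> bytes:
    return bytes(p ^ c for p, c in
                 zip(known_plain[:key_len], scrambled[:key_len]))

key = b"\x13\x37"                    # the victim's "secret" key
plain = b"\x90\x90\xc3\xc3"          # known bytes of a standard binary
cipher = bytes(p ^ key[i % len(key)] for i, p in enumerate(plain))

assert recover_key(plain, cipher, len(key)) == key   # key recovered
```

This is why the scheme's security rests on the attacker never getting to compare scrambled and unscrambled copies of the same code.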
  • Tech support for large companies is tough enough as it is - throw deliberate diversity into the mix and support would become a nightmare.
  • "But honey, with too many layers of data protection, I can't feel the Internet properly!"

    ....sorry.

  • In theory this might work to provide slower-spreading infections; in practice it will cause more problems than it solves.

    As a security practitioner for more than ten years, I can tell you that this type of diversity makes security management more difficult. Can you imagine trying to troubleshoot a problem when you don't know what the code is supposed to look like this time, or where it loads this time, or how it interacts with other components this time?

    I can also say that pretty muc

    • My experience has been quite the opposite. We have had many incidents in the last three or four years where we had to have IT staff go around to every computer of a specific type and do a particular procedure to handle a security issue. For example, a while back we had to go around and manually remove the PNP worm from every machine running Windows 2000 on our network. This was before the patch came out on Windows Update. It took about three days to get to every machine and it would have been a lot wors
  • We already have malicious code that can replicate and spread itself. The only thing we're missing in terms of real Darwinian evolution is mutation

    Actually there is code that does just that [wikipedia.org], but as far as I am aware genetic programming hasn't been used to make viruses.

  • how is this different from code obfuscators?
  • A computer virus that followed Darwinian evolution (as I understand it) would make copies of itself, each with a small change, and execute them. Eventually (infinite monkeys at infinite typewriters), it will create a better virus. Repeat infinite times.
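The copy-mutate-select loop described above is the classic "weasel"-style demonstration. A toy, non-viral sketch (target string, alphabet, and seed are arbitrary):

```python
# Mutate one random character per generation; keep the copy only if it
# matches the target at least as well as the parent. Selection turns
# random typing into rapid convergence.
import random

rng = random.Random(0)
TARGET = "replicate"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(s: str) -> int:
    return sum(a == b for a, b in zip(s, TARGET))

current = "".join(rng.choice(ALPHABET) for _ in TARGET)
for _ in range(100_000):
    i = rng.randrange(len(TARGET))
    child = current[:i] + rng.choice(ALPHABET) + current[i + 1:]
    if fitness(child) >= fitness(current):   # the selection step
        current = child
    if current == TARGET:
        break

assert current == TARGET   # converges in a few hundred generations
```

Without the selection step ("infinite monkeys" alone), the expected wait is astronomically longer, which is the difference between random mutation and Darwinian evolution.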
  • Seems to me Gentoo is the ideal candidate for this type of thinking. With the variety of hardware out there, the combinations of assembly boggle the mind.
  • A 2005 paper by David Evans, "Where's the FEEB? The Effectiveness of Instruction Set Randomization" [virginia.edu], demonstrates how to remotely determine the key for this protection scheme in under 6 minutes. The paper goes on to examine diversity defenses more broadly, looking for schemes that might be resistant to such attacks. The author also gave an interesting talk at the USENIX Security Symposium on What Biology Can (and Can't) Teach Us About Security [virginia.edu], which is probably a better paper for this article to point to.
  • Nature presents "tests" for advancement, here is today's email from postings@ic.fbi.gov

    " The following Federal Bureau of Investigation job was just posted at https://jobs1.quickhire.com/scripts/fbi.exe [quickhire.com] "

    Job # HO-2006-0045 (0080 Security Specialist) $108,145.00

    Is this really just a test of whether a real IT person would:
    1. Click a link from inside an Outlook variant?
    2. Navigate to a folder called "scripts" using a Microsoft product?
    3. Start an immediate download of a Windows EXEcutable?

    Submitted f

  • It's easy to make a binary that is less subject to unknown attacks than the factory version. I've been doing this for years and it's not too hard. Start by building everything from source. Find the link order and change that around. Look at the build options, since you may not need that -O2. There are programs that will rearrange the order of the variables, which changes the stack order, and some will even rearrange the calling order. You can even add filler as well. If you're going to rebuild an entire OS
  • This idea is an implementation of Automated Diversity, presented in 1977 (!!!); furthermore, the RISE method is described here in a paper from 1995.
  • How about just fixing the Memory Management Unit [wikipedia.org] so that it doesn't get buffer overflows, etc.? And don't say it ain't possible.

    As for the above, I recall reading something similar about scrambling the microcode table and the opcodes in the actual program residing on disk. Since each processor would have its own unique instruction set, viruses/trojans would be stopped in their tracks. And what's more, you don't have to learn Calculus [unm.edu]
