Cache Servers Keeping Exploit Code Alive 68

1960's architecture writes, "At last some evidence that exploit code is hiding on servers used to cache website content. According to Techworld, Israeli outfit Finjan has come up with evidence that real exploits have hidden on cache servers used by large search engines, effectively extending their life for periods of weeks after the original website had been taken down. The exploits detailed are from 2003-2004, but the principle would still apply to any exploit website around today, and any cache servers used by any one of the three unnamed search engines. It's almost literally malware 'life after death.'"
This discussion has been archived. No new comments can be posted.

  • by A beautiful mind ( 821714 ) on Thursday October 12, 2006 @03:16PM (#16412247)
    The brilliant study says: "content available as cache, even after the original source is not there, for some time"?

    Bravo! Bravo! Revolutionary thought!
    • Exploits from 2003 and 2004? You've had 2 years to patch your systems. Don't cry.
      • by Threni ( 635302 )
        Exactly. It's as if they're claiming you can still use old, fixed exploits because copies sit in caches somewhere. A cache is like a photocopy. Is anyone surprised that the photocopy exists after the original is lost? Isn't that the whole point?
      • Re: (Score:2, Insightful)

        by Anonymous Coward
        Here's a long-view perspective though. In my research (chemistry) I use a 486 almost daily. The computer is infected with an old innocuous boot-sector virus, and I simply don't remember enough DOS/486 era stuff to put on a proper antivirus solution without seriously diverting my research in the short term. Luckily, my modern-era computer is solid vs. this old school virus - this is the other reason I haven't bothered fixing the old one. If this were a nastier virus, and my AV protection didn't go back f
    • It sounds like you're missing their point: These "caching" sites are storing the data from the original site! This has got to be stopped immediately!
    • People running a web cache *ought* to scan the cache directory periodically with a virus-scanner.

      For a specific example, I use Squid + ClamAV both for at work and at a number of client sites for which I provide sysadmin support; every so often, the scan of the squid cache files finds an exploit being cached, and I can look that specific file up against the Squid logs, and identify which client machine was responsible for accessing the malware.

      The next steps are to check the client machine and see whether it h
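      That workflow can be sketched roughly as follows. This is a minimal sketch, not a drop-in tool: the cache path, the clamscan output format, and the native Squid access.log field order are assumptions to check against your own installation, and real correlation of a cache file back to a URL also involves Squid's store.log / swap metadata.

      ```python
      # Sketch: correlate clamscan hits on a Squid cache with access.log entries.
      # Assumed: `clamscan --recursive --infected` prints '<path>: <sig> FOUND'
      # per hit, and access.log is in Squid's native format:
      #   timestamp elapsed client code/status bytes method URL rfc931 peer type

      def parse_clamscan(output):
          """Yield the file paths flagged by clamscan --infected."""
          for line in output.splitlines():
              if line.endswith(" FOUND"):
                  yield line.rsplit(":", 1)[0]

      def clients_for_url(access_log_text, url):
          """Yield client addresses that fetched `url` (field 3 of a
          native-format access.log line; the URL is field 7)."""
          for line in access_log_text.splitlines():
              fields = line.split()
              if len(fields) >= 7 and fields[6] == url:
                  yield fields[2]

      # Demo on canned data; in practice, run clamscan over the cache
      # directory and read the real access.log.
      scan = "/var/spool/squid/00/0A/000001F3: Exploit.HTML.IFrame FOUND"
      log = ("1160668800.123    95 10.0.0.42 TCP_MISS/200 5120 GET "
             "http://bad.example/x.html - DIRECT/1.2.3.4 text/html")
      print(list(parse_clamscan(scan)))   # ['/var/spool/squid/00/0A/000001F3']
      print(list(clients_for_url(log, "http://bad.example/x.html")))  # ['10.0.0.42']
      ```

      From there, the flagged path tells you which cached object to purge and the log tells you which client machine to go inspect.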
  • What's the use of relying on a site having been taken down?

    You should patch your software in any case, otherwise the exploit still works if it is put somewhere else.

    • Exactly. The people behind this "discovery" seem to think that the best way to combat security holes is to go after the exploit demonstration code, rather than, say, actually fixing the problem.

      That's what's really frightening: there are exploits that have been in the wild and in the hands of the black hats for three years which still have not been patched.

      Those "exploit sites" are not the enemy here. If anything, they're a powerful tool that lets the 'good guys' be on equal footing, or near equal footing, with the bad guys, who are probably trading exploits around in IRC channels regardless of whether they're on the WWW or cached or not.
      • I didn't RTFA so this may be covered or off topic, but...

        IMO there is a big difference between posting information and posting exploits. If I write a convenient tool to hack something, then publish it for script kiddies everywhere, does that improve or reduce security for everyone?

        If I discover a new way of breaking into a car and tell everyone, isn't that different than selling the tools to do so?

        I agree that spreading the information is valuable. I don't believe that spreading the cracks to use sai
  • Hey sucka, gimme your cache!
  • by jZnat ( 793348 ) * on Thursday October 12, 2006 @03:24PM (#16412389) Homepage Journal
    How about fixing the problem that's being exploited rather than trying to hide the problem's existence in the first place?
  • by nickheart ( 557603 ) <[moc.liamg] [ta] [namtrah.j.kcin]> on Thursday October 12, 2006 @03:28PM (#16412467)
    ... and think of all those old hard disks with exploits on them. We need to go to the dump and degauss all of them, NOW! C'mon people, this is a security issue.

    gimme a break, a cache is a cache, it's supposed to have old information, even if that information is wrong, or destructive.

    • by hurfy ( 735314 )
      hehe, not as funny as it sounds when you infect your own system because you are trying to rebuild your XT ;) Apparently i had more infected disks than i thought way back when.

      Luckily even the 15 year-old 386 i was using as a go-between recognized Michelangelo :)

      Great i have viruses old-enough to drive now.....
  • i guess i'm going to show my complete ignorance of web development and teh intarweb at large, but here goes:

    why on earth would something get cached if it is malware infected/contains exploits without being cleaned at some future time when said malware or exploits are discovered?

    i know the caching is an automated process, but the caches themselves aren't scanned for malware/code exploits like the live sites?
    • by geoffspear ( 692508 ) on Thursday October 12, 2006 @03:31PM (#16412519) Homepage
      If by "like the live sites" you mean "not at all", then yes, they're scanned exactly the same.
      • point taken. i guess i have this naive idea that larger sites at least are somewhat regularly reviewed for this kind of stuff. i suppose not. and now with google &c. caching everything, every tom, dick and harry who puts up a site that gets hacked/infected/is poorly written in the first place winds up being preserved for posterity in a cache somewheres. i really ought to learn more about the internets. i mean, i'm mostly a hardware geek. but i should still be able to understand it, what with the
    • Re: (Score:1, Funny)

      by Anonymous Coward

      i know the caching is an automated process, but the caches themselves aren't scanned for malware/code exploits like the live sites?

      Ours are. We have an army of pixies and an ostrich called Sam who painstakingly audit and review everything we store on our web caches. We chose pixies because they're quite small and we can pack them tightly to get the density up. Real world IT solutions rarely scale up to enterprise performance without squashing a few little folk and sometimes it can be fun to squash a few an

      • this is why i love /.. i can comment on things i don't know much about and get funny sarcastic replies that are still more-or-less good-natured.

        that's three hyphens out of the last 26 characters i typed. not bad.

  • by jschottm ( 317343 ) on Thursday October 12, 2006 @03:35PM (#16412571)
    Blah [yahoo.com]

    Yahoo's cache can be addressed at rds.yahoo.com (unlike Google's cache, which uses IP addresses with no associated hostnames). Thus, message boards that follow the Slashdot convention of displaying a link's domain name will show yahoo.com even if the cached page is serving up an IE exploit originally hosted at mynastystuff.ru, increasing the chances of click-through. MSN uses a resolvable name for its cache as well, but it's at least identifiable as msncache.com rather than just msn.com.
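    The click-through point can be demonstrated with the standard library. The cached URL below is an illustrative stand-in, not the cache's real URL scheme:

    ```python
    from urllib.parse import urlparse

    # Illustrative URLs: a cached copy served from the Yahoo cache host
    # mentioned above, and the original malicious host.
    cached = "http://rds.yahoo.com/_ylt/abc123/**http://mynastystuff.ru/exploit.html"
    direct = "http://mynastystuff.ru/exploit.html"

    # A Slashdot-style link label shows only the hostname of the link
    # target, so the cached copy gets labelled with the trusted-looking
    # cache domain rather than the attacker's domain.
    print(urlparse(cached).hostname)  # rds.yahoo.com
    print(urlparse(direct).hostname)  # mynastystuff.ru
    ```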
  • Nothing for you to see here.

    Just us trojans invisibly taking over your system.
  • Excerpts from Vernor Vinge's [wikipedia.org] A Fire Upon The Deep [amazon.com]

    How to explain? How to describe? Even the omniscient viewpoint quails.

    A singleton star, reddish and dim. A ragtag of asteroids, and a single planet, more like a moon. In this era the star hung near the galactic plane, just beyond the Beyond. The structures on the surface were gone from normal view, pulverized into regolith across a span of aeons. The treasure was far underground, beneath a network of passages, in a single room filled with blac

  • Think Microsoft has patched them yet?
  • I thought that if an exploit was discovered, systems that could be infected were patched, rather than worrying too much about the virus itself staying in the wild.

    Sure, a lot of caches can keep very old content (the Wayback Machine www.archive.org would be a good example). But spread infection is mainly prevented by immunising systems, not by removing all known traces of the virus / trojan / etc. Bacteria and viruses can live in harsh conditions (relative to those that they require to thrive) but immunisat

  • <META NAME="ROBOTS" CONTENT="NOARCHIVE">
    <META NAME="msnbot" CONTENT="noarchive">

    Done.
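    And a quick self-check, using only the standard library, that a page actually carries the noarchive directive (the sample page is hypothetical):

    ```python
    from html.parser import HTMLParser

    class NoArchiveCheck(HTMLParser):
        """Sets .noarchive when a <meta ... content="...noarchive..."> is seen.
        html.parser lowercases tag and attribute names for us."""
        def __init__(self):
            super().__init__()
            self.noarchive = False
        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                d = dict(attrs)
                if "noarchive" in d.get("content", "").lower():
                    self.noarchive = True

    page = '<html><head><META NAME="ROBOTS" CONTENT="NOARCHIVE"></head></html>'
    p = NoArchiveCheck()
    p.feed(page)
    print(p.noarchive)  # True
    ```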

  • This is more than just a theoretical danger.

    Yeah, if you're running your vulnerable server code out of the same cache. ;-)

    "What our latest report shows is that current processes to remove such malicious content from the Web are simply not going far enough to combat this very serious and growing threat."

    That's because removing the content doesn't combat the threat at all. Fixing the bugs that allow malicious code to work is the only way to combat the threat.

    It is useless to try to put genies back into

  • Whenever there's an article about MySpace or Xanga, there are always people talking about how once you've published something to the web, you should assume it will always be available to anyone who wants it, even if you decide later you want to take it down.

    A kid may write on their xanga about how drunk they got thursday night, then decide to take it down saturday, but it's always possible a future employer could come up with it anyway. Likewise, developers should assume that any exploits that have ever be

  • So, does the Wayback machine keep exploits forever?
  • by Goldenhawk ( 242867 ) on Thursday October 12, 2006 @04:44PM (#16413469) Homepage
    This article has (here on /.) already raised the question "Why can't we stamp out the viral code from archives?" Well, let's take a lesson here from biology.

    The human race took two different solutions to polio and malaria. (I'm not a doctor, so forgive any minor inaccuracies.)

    With malaria, we took the "stamp out the viral archive" approach. We tried to kill the carriers - the mosquitoes. If we can eliminate all the mosquitoes that carry the infection (like eliminating old internet caches), nobody will have to worry about getting infected. Well, guess what - it didn't work. Malaria is a HUGE problem in many third-world countries, routinely killing a million Africans a year and costing $12 BILLION annually in Africa alone (see last week's WashPost Magazine article for details; registration required: http://www.washingtonpost.com/wp-dyn/content/article/2006/10/04/AR2006100400127.html [washingtonpost.com]). The problem? You simply can't squash all the bugs. Only recently has attention turned to developing an artificial method of immunity from the disease, so that the bugs won't matter (at least, from that perspective).

    With polio, we took the approach that preventing infection was the key. We inoculated EVERYONE, so that even if the virus surfaced, it wouldn't cause infections. It's proven to be a largely effective solution, with only a few periodic pockets of infection occurring in remote parts of Africa where the youngest are not inoculated afresh. And that problem is fairly easy to control.

    Same thing here. Forget the archives. That's naive. Instead, focus on better immunity.
    • by mgblst ( 80109 )
      Wow, ingenious. I guess they should have inoculated everyone against Malaria in Africa with the non-existent serum. Why didn't anyone think of that before?

      And to say that people have just started trying to create inoculations against Malaria is a truly stupid statement.
    • Of course, the fact that polio, unlike malaria, spreads from one human to another quite easily, making a strategy of killing potential carriers not particularly attractive, combined with the fact that a vaccine is easily made by killing the virus and then injecting it into people might have something to do with the different approaches too.

      Next, can you explain how emphasizing condom use instead of just giving everyone an AIDS vaccine shows that doctors today are incredibly stupid?
  • So what? I find exploit code all the time, weeks, months, years after the fact. It's called Packet Storm Security [packetstormsecurity.org] or elsewhere.

    Hell, google.com cache pages are great for shit like this.
  • by tobiasly ( 524456 ) on Thursday October 12, 2006 @04:51PM (#16413589) Homepage
    It's almost literally malware 'life after death.'

    But is it almost literally, or literally almost? What would make it true life after death? (Literally)

    • It's almost literally malware 'life after death.'
      But is it almost literally, or literally almost? What would make it true life after death? (Literally)
      True life after death?

      If the 'fixed' page reverted to the malwared page 3 days after being nailed to the cross^W^W^W^Wcached

      /ducks

  • To the tune of a speech by a famous U.S. General --

    "Old (exploits) never die, they just (hide) away (in proxy caches...)"
  • Does anyone else remember when if you wanted to be sure something would remain available for a few weeks, you just posted it to usenet?
  • by Lord Kano ( 13027 ) on Thursday October 12, 2006 @05:51PM (#16414559) Homepage Journal
    Trying to get something off of the internet is like trying to get pee out of a pool.

    Why not just patch the vulnerabilities? If publishers would fix their shortcomings then it wouldn't be an issue.

    LK
    • by mgblst ( 80109 )
      So to continue the analogy, patching the internet is forcing everyone in the pool to wear full wetsuits and diving gear, so that the pee doesn't affect them. Just ensure that no part of them is exposed. Brillant!
  • You don't fix security holes by trying to track down all the code that exploits it on the web, you fix security holes by fixing the software containing the security hole. So, it doesn't matter how long this stuff stays in anybody's cache.
