Virus Trackers Find Malware With Google
Casper the Angry Ghost writes "Malware hunters have figured out a way to use the freely available Google SOAP Search API, as well as WSDL, to find dangerous .exe files sitting on thousands of Web servers around the world. Queries can be written to examine the internals of web-accessible binaries, allowing the hunters to identify malicious code from across the internet." From the article: "We're finding literally thousands of sites with malicious code executables. From hacker forums, newsgroups to mailing list archives, they're all full of executables that Google is indexing. About 15 percent of the results came back from legitimate Web sites hijacked by malicious hackers and seeded with executables."
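The article doesn't show the actual queries, but the trick of "examining the internals" of binaries presumably relies on strings that appear in every Win32 executable and get indexed along with the file. A minimal sketch of the query-building side, assuming the SOAP API's `doGoogleSearch` method takes a plain query string (the helper name and keyword are made up for illustration):

```python
# Strings found near the top of nearly every Win32 PE executable;
# a hit on these means Google indexed the raw bytes of a binary,
# not a page that merely talks about one.
PE_SIGNATURES = [
    '"This program cannot be run in DOS mode"',
    '"MZ" "PE"',
]

def build_malware_queries(keyword: str) -> list[str]:
    """Combine a suspicious keyword with PE-header signatures so the
    results are indexed executables rather than articles about them."""
    return [f"{sig} {keyword}" for sig in PE_SIGNATURES]

# Each resulting string would be handed to doGoogleSearch as the query.
for q in build_malware_queries("keylogger"):
    print(q)
```

The signature strings themselves are real PE artifacts; everything else here is a guess at how the hunters' tooling might be structured.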
do no evil, rat out evil (Score:5, Interesting)
This raises Google's "do no evil" equity significantly. Any mechanism to sniff out, identify, and proactively take measures against the evil on the web and its sinister demographic is a good thing.
So Google takes "do no evil" a step further and calls evil out.
There is a quote from the article I don't quite understand: is there some potential badness in Google indexing binary file content? What might that be?
Re:do no evil, rat out evil (Score:5, Interesting)
In any case, the only thing I can figure about the quote is that Google indexing these sites helps to spread the malware around. Somebody could type in "l337 hax0rs hax" and end up at a malware site.
Web Site Contact (Score:3, Interesting)
They could also build a list of these sites to periodically check them to make sure the malware files have been removed.
And it would be nice if they exposed a search facility so a Firefox/SeaMonkey plugin could check whether the site you are about to visit hosts malware.
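The check such a plugin would need is simple if the service can hand back a blocklist of flagged hostnames. A minimal sketch, assuming a locally cached list (the hostnames below are invented for illustration; a real extension would refresh the set from the service):

```python
from urllib.parse import urlparse

# Hypothetical cached blocklist of hosts the search service flagged.
INFECTED_HOSTS = {"evil.example.com", "hax.example.net"}

def is_flagged(url: str) -> bool:
    """Return True if the URL's hostname is on the malware list."""
    host = urlparse(url).hostname or ""
    return host.lower() in INFECTED_HOSTS

print(is_flagged("http://evil.example.com/dl/trojan.exe"))  # True
print(is_flagged("https://slashdot.org/"))                  # False
```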
Securing the Search Engine? (Score:5, Interesting)
The 15% of results that are reputable sites hijacked by attackers are the biggest threat. These are probably websites people visit often, and people should be warned. Perhaps web browsers such as Firefox and IE could incorporate the API into a toolbar and warn users before a dangerous site loads.
My only question is how long it takes the API to verify the potential threat of a web server. Is it fast enough for these applications to be feasible? No one wants to wait for their pages to load.
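The latency concern above is usually handled by caching verdicts locally, so the browser only pays for the slow remote lookup once per host per time window. A sketch of that idea (`check_site` is a hypothetical stand-in for the real API call):

```python
import time

# host -> (verdict, timestamp of last check)
CACHE: dict[str, tuple[bool, float]] = {}
TTL = 3600.0  # re-check a host at most once an hour

def check_site(host: str) -> bool:
    """Hypothetical stand-in for the slow remote API lookup."""
    return False  # False = not flagged as dangerous

def is_dangerous(host: str) -> bool:
    cached = CACHE.get(host)
    now = time.monotonic()
    if cached and now - cached[1] < TTL:
        return cached[0]            # cache hit: no network round-trip
    verdict = check_site(host)      # slow path, taken once per TTL
    CACHE[host] = (verdict, now)
    return verdict
```

With a warm cache the toolbar check is a dictionary lookup, so page loads aren't blocked on the API.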
How to (Score:5, Interesting)
Then, click View HTML
Well, malware *writers* can do the same (Score:3, Interesting)
Indexing these MAY be exploitable (Score:5, Interesting)
The idea is to put useful content on the web site along with the exploit. Google will index it, and when the target searches Google, the code will be injected into the search results.
Of course, this takes work: both figuring out what Google will allow in the indexed content, and finding a browser vulnerability that can be triggered from a search-result snippet.
Just sayin...
Your point of trust (as the target) is your browser, which means ONLY open source browsers should be used. Those, at least, let you inspect and control how they behave when they're delivered content.
Ratboy
Re:Web Site Contact (Score:4, Interesting)
Or maybe a system for automatic DNS overrides (on my own DNS client) to stop lookups from resolving to the real (infected) site.
Once sites realize that large parts of their user base are cutting them off preemptively, they'll take notice and get rid of the crap so they can win users back.
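You don't even need to touch a resolver's cache for the parent's idea; the same effect comes from generating hosts-file entries that sinkhole flagged domains to 0.0.0.0 so lookups never reach the infected site. A sketch (the domain list is hypothetical; a real one would come from the scanning service):

```python
def hosts_entries(domains):
    """Return /etc/hosts lines that null-route each flagged domain."""
    return [f"0.0.0.0 {d}" for d in sorted(set(domains))]

# Append these lines to /etc/hosts (or the Windows hosts file) to
# block the domains machine-wide.
for line in hosts_entries(["evil.example.com", "hax.example.net"]):
    print(line)
```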
Re:Web Site Contact (Score:2, Interesting)
Re:Web Site Contact (Score:3, Interesting)
Which doesn't help the rest of us. And why should a site owner get all bent out of shape if you tell them something they didn't happen to know? They must not be in direct control of the site, or are pretty lazy, if they are letting this malware pile up. And they won't stay popular for long once people catch on that the site is infecting them.