
Comment Re:Link pyramids (Score 1) 38

The RescueTheWeb article is a high-level discussion of link architectures that currently exist in the wild. The article wasn't trying to show samples, since disclosing which websites have been breached is against RescueTheWeb's privacy policy. These are private websites that have been breached by others and used to build these various structures, so publishing their web addresses would reveal whose websites were breached. I can tell you that an example 'constellation' Google look-alike search engine consists of some 26 domains of this pattern: http://googpill_.com/ where the '_' is a letter from 'a' to 'z'. When you visit these sites directly they say 'Under Construction', but when you visit them from a hacked site you get the Google look-alike. (Not all of the lettered domains appear to be working.) Follow this link to see an example: http://googpillc.com/zgyllgiaahkeiryy_idknxqkbi.py This constellation example is different from the pyramid example from seoblackhat. The goal of this constellation is to trick the user into thinking they are on Google, not to increase page rank.
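For anyone who wants to observe the cloaking themselves, a rough approach is to request one of the googpill_ domains twice, once with no Referer header and once pretending to arrive from a compromised page, and compare the two responses. Below is a minimal Python sketch of that check; the 'compromised site' referrer URL is a made-up placeholder, and the campaign may key its cloaking on something other than the referrer.

```python
import urllib.request
from string import ascii_lowercase

# Hypothetical referrer; the real campaign may trigger on something else entirely.
FAKE_REFERER = "http://compromised-site.example/hacked-page.html"

def fetch(url, referer=None):
    """Return the first 2 KB of the response body, or an error string."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    if referer:
        req.add_header("Referer", referer)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read(2048).decode("utf-8", errors="replace")
    except Exception as exc:
        return f"<error: {exc}>"

for letter in ascii_lowercase:
    url = f"http://googpill{letter}.com/"
    direct = fetch(url)                          # plain visit: 'Under Construction'
    via_hack = fetch(url, referer=FAKE_REFERER)  # visit 'from' a hacked site
    cloaked = "Under Construction" in direct and "Under Construction" not in via_hack
    print(url, "cloaking suspected" if cloaked else "same content both ways")
```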
Spam

Submission + - Several Link Spam Architectures Revealed (blogspot.com)

workie writes: Using data derived from website infections, RescueTheWeb.org has found several interesting link spam architectures. In one architecture, concentric layers of hijacked websites are used to increase the page rank and breadth of reach (within search engine results) of scam sites. The outer layers link to the inner layers, eventually linking to a site that redirects the user to the scam site. Another architecture involves hijacked sites that redirect the user to fake copies of Google, giving the appearance that the visitor is still within Google when in reality they are on a Google look-alike that contains only nefarious links.
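The layered structure described above also lends itself to a mechanical probe: start at a page believed to sit in an outer layer, follow links inward a few hops, and flag any hop that ends in an HTTP redirect to a different domain (presumably the final scam site). Here is a rough Python sketch of that idea; the seed URL is hypothetical, and the href regex is a crude stand-in for a real HTML parser.

```python
import re
import urllib.parse
import urllib.request

SEED = "http://hijacked-outer-layer.example/"  # hypothetical outer-layer page
MAX_HOPS = 4

def links_and_final_url(url):
    """Return (outgoing links, final URL after any redirects) for one page."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        final_url = resp.geturl()  # where redirects, if any, ended up
        html = resp.read(65536).decode("utf-8", errors="replace")
    return re.findall(r'href=["\'](http[^"\']+)["\']', html), final_url

frontier, seen = [SEED], set()
for hop in range(MAX_HOPS):
    next_frontier = []
    for url in frontier:
        if url in seen:
            continue
        seen.add(url)
        try:
            hrefs, final_url = links_and_final_url(url)
        except Exception:
            continue  # dead or uncooperative page; skip it
        if urllib.parse.urlparse(final_url).netloc != urllib.parse.urlparse(url).netloc:
            print(f"hop {hop}: {url} redirects off-site to {final_url}")
        next_frontier.extend(hrefs)
    frontier = next_frontier
```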

Comment Many people are working on app insecurity. (Score 2, Insightful) 65

I just wanted to point out that many organizations and people are trying to resolve the global web-insecurity problem, which is caused by many things, including application insecurity. Google is just one participant in this effort. What is frustrating is that when Google talks, people call it news; when these other organizations make contributions, nothing is heard.
Google

Submission + - Google Asks US For WTO Block on China Censorship (businessweek.com)

An anonymous reader writes: Google is asking the US Government to petition the WTO to recognize China's censorship as an unfair barrier to trade. The US Trade Representative is reviewing Google's petition to see whether it can prove that China's rules discriminate against foreign competition. At least it's something worthwhile for the US Trade Reps to do, rather than secretly negotiating ACTA.
Science

Submission + - Scientists Develop Financial Turing Test (technologyreview.com)

KentuckyFC writes: Various economists argue that the efficiency of a market ought to be clearly evident in the returns it produces. They say that the more efficient it is, the more random its returns will be and a perfect market should be completely random. That would appear to give the lie to the widespread belief that humans are unable to tell the difference between financial market returns and, say, a sequence of coin tosses. However, there is good evidence that financial markets are not random (although they do not appear to be predictable either). Now a group of scientists have developed a financial Turing test to find out whether humans can distinguish real financial data from the same data randomly rearranged. Anybody can take the test and the results indicate that humans are actually rather good at this kind of pattern recognition.
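For a sense of what such a test can pick up on: randomly rearranging a return series preserves the distribution of the individual returns but destroys temporal structure such as volatility clustering, and that structure is exactly the kind of cue a human (or a simple statistic) can latch onto. The sketch below is not the test from the article; it uses a simulated volatility-clustering series as a stand-in for real market data and compares the lag-1 autocorrelation of squared returns before and after shuffling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a real return series: a simple GARCH(1,1)-style simulation,
# used only because actual market data isn't bundled with this sketch.
n, omega, alpha, beta = 5000, 1e-6, 0.10, 0.85
returns = np.empty(n)
var = omega / (1 - alpha - beta)  # start at the unconditional variance
for t in range(n):
    returns[t] = np.sqrt(var) * rng.standard_normal()
    var = omega + alpha * returns[t] ** 2 + beta * var

shuffled = rng.permutation(returns)  # same values, temporal order destroyed

def acf1_of_squares(x):
    """Lag-1 autocorrelation of squared returns, a volatility-clustering signature."""
    s = x ** 2
    s = s - s.mean()
    return float(np.dot(s[:-1], s[1:]) / np.dot(s, s))

print("original series:", round(acf1_of_squares(returns), 3))   # noticeably positive
print("shuffled series:", round(acf1_of_squares(shuffled), 3))  # near zero
```

On the simulated series the original shows a noticeably positive statistic while the shuffled copy sits near zero, which is plausibly the kind of difference the human subjects are picking up on.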
IBM

Submission + - IBM Researcher Develops Energy Efficient Algorithm (yahoo.com)

jitendraharlalka writes: IBM Research today unveiled a breakthrough method, based on a mathematical algorithm, that reduces the computational complexity, cost, and energy usage of analyzing the quality of massive amounts of data by two orders of magnitude. The new method will greatly help enterprises extract and use data more quickly and efficiently to develop more accurate predictive models.

In a record-breaking experiment, IBM researchers used the fourth most powerful supercomputer in the world — a Blue Gene/P system at the Forschungszentrum Jülich in Germany — to validate nine terabytes of data (nine million million bytes, a figure with 12 zeros) in less than 20 minutes, without compromising accuracy. Ordinarily, using the same system, this would take more than a day. Additionally, the process used just one percent of the energy that would typically be required.
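Taken at face value, the quoted figures hang together with the 'two orders of magnitude' claim; a quick back-of-the-envelope check (treating the 'more than a day' baseline as roughly 24 hours, which is an assumption):

```python
# Back-of-the-envelope check of the figures quoted above.
data_tb = 9           # terabytes validated
new_min = 20          # "less than 20 minutes" with the new method
old_min = 24 * 60     # "more than a day" baseline, assumed here to be ~24 hours

print(f"speedup:    ~{old_min / new_min:.0f}x")        # ~72x, pushing two orders of magnitude
print(f"throughput: ~{data_tb / new_min:.2f} TB/min")  # ~0.45 TB/min on the Blue Gene/P run
print(f"energy:     ~{1 / 0.01:.0f}x less")            # 1% of the energy -> a 100x reduction
```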

The breakthrough will be presented today at the Society for Industrial and Applied Mathematics conference in Seattle.

"In a world with already one billion transistors per human and growing daily, data is exploding at an unprecedented pace," said Dr. Alessandro Curioni, manager of the Computational Sciences team at IBM Research – Zurich. "Analyzing these vast volumes of continuously accumulating data is a huge computational challenge in numerous applications of science, engineering and business. This breakthrough greatly extends the ability to analyze the quality of large volumes of data at rapid speeds."
