ericspinder's Journal: "Google" search engine improvement idea

From a comment I made about search engines (correcting a small but important error)...

I've even seen pages that return one set of content if your user-agent is "Googlebot", and totally different content (a dialer, etc.) if your user-agent is anything else.

This is probably Google's biggest problem. What they need to do is make a second pass at specific pages on a recently crawled site, using a more typical user-agent, to see if there are significant differences. It wouldn't have to hit every page. The second crawler could also check what is actually "visible" to the user. When pages fail the second crawler, lower (or eliminate) their ranking. If the secondary crawler has enough problems with a domain, they can lower all of the rankings on it. If it has enough problems with an IP address (say, over 35% of known domains), they can do the same for that address, and ditto for subnets. Basically, use automated "scammer checks" to see whether a particular area of the web is less "reputable", and then rank that area (directory, domain, IP address, subnet, netblock) lower. These scammer checks may change based on the flavor of the month.
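To make the idea concrete, here is a minimal sketch of such a second-pass cloaking check in Python. It assumes the requests library is available; the user-agent strings, the similarity threshold, and the 35% domain cutoff are illustrative stand-ins taken from the suggestion above, not anything Google actually uses.

import difflib
import requests

# Illustrative user-agent strings; a real system would use its own.
GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a page body using the given User-Agent header."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.text

def looks_cloaked(url, min_similarity=0.8):
    """Return True if the content served to a typical browser differs
    significantly from the content served to the crawler's user-agent."""
    bot_body = fetch(url, GOOGLEBOT_UA)
    user_body = fetch(url, BROWSER_UA)
    similarity = difflib.SequenceMatcher(None, bot_body, user_body).ratio()
    return similarity < min_similarity

def domain_penalty(sample_urls, max_cloaked_fraction=0.35):
    """Spot-check a sample of pages from a domain; flag the whole domain
    if too many of them fail the second-crawler check."""
    cloaked = sum(1 for url in sample_urls if looks_cloaked(url))
    return (cloaked / len(sample_urls)) > max_cloaked_fraction

A plain text diff like this only catches the crudest cloaking; checking what is actually "visible" to a user, as suggested above, would require rendering the page (scripts, styles, redirects), which this sketch deliberately leaves out.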
