Comment Re:Doesn't intent matter... (Score 1) 101

By registering these domains, he prevented the senders from receiving a message that the domain in the address they were sending to did not exist. Presumably he also configured the catch-all for the typoed domains so it wouldn't report an error. If he hadn't registered these domains, the senders would have received automated bounce messages informing them their emails weren't delivered. While he didn't violate the law by stopping these emails from bouncing with errors, his behavior certainly wasn't ethical, and it did disrupt the intended communications.

Comment Re:Paywall sites are going to be hit pretty hard (Score 1) 345

The classic example is JavaScript-rendered dynamic content. This tends not to work so well when you're dealing with search engines. However, if you can serve them a static page that contains the text of the page minus all the rendering, they can index the content without choking on the JavaScript. I'm not sure how important this is these days, but it certainly was a problem at one time.

That's what the noscript tag is for.
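
A minimal sketch of that approach (the page title and content here are invented for illustration): the scripted version renders client-side, while the noscript block carries the same text for crawlers and browsers that don't execute JavaScript.

<!DOCTYPE html>
<html>
<head>
  <title>Widget Catalog</title>
</head>
<body>
  <!-- Dynamic version: scripted browsers render the content client-side -->
  <div id="content"></div>
  <script>
    document.getElementById('content').innerHTML =
      '<h1>Widgets</h1><p>Browse our full widget lineup.</p>';
  </script>
  <!-- Static fallback: crawlers and script-less browsers get the same text -->
  <noscript>
    <h1>Widgets</h1>
    <p>Browse our full widget lineup.</p>
  </noscript>
</body>
</html>

Since the noscript block is part of the page itself, the engine indexes the text without executing any JavaScript, and you don't have to sniff user agents or serve a separate copy.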

It's also useful to serve modified versions to search engines so that searches for content within your site return more relevant results. For example, you might insert certain keywords that describe the content of the page using terms that don't actually appear in it. Case in point: your page talks about AirPort, but you serve a copy to Google that inserts the terms 802.11 and Wi-Fi.

That's what the meta name=keywords tag is for.
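
Taking the grandparent's AirPort example, a quick sketch of what that looks like (title, body text, and keyword list all made up for illustration):

<!DOCTYPE html>
<html>
<head>
  <title>Setting Up Your AirPort Base Station</title>
  <!-- Synonyms that describe the page but never appear in its body text -->
  <meta name="keywords" content="802.11, Wi-Fi, wireless networking, WLAN">
</head>
<body>
  <h1>Setting Up Your AirPort Base Station</h1>
  <p>Plug in the base station and run the setup assistant.</p>
</body>
</html>

The tag lives in the head, so you get the extra search terms without altering what visitors see, and without maintaining a separate copy of the page for crawlers.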

Finally, there's the question of bandwidth and CPU overhead. If your site changes a lot, Google beats on your servers rather frequently. You can reduce the bandwidth hit by stripping JavaScript, CSS, images, etc. from your content before serving it to Google. This won't significantly change the searchability of the content, but will reduce the bandwidth overhead. And, of course, if there are static versions of content that you can serve instead of a server-side-dynamic version, this also saves on CPU overhead.

Google spiders text, not images. It also doesn't spider the text of CSS or JavaScript files. Also, I question how effective it is to dynamically decide to serve a static page based on the user agent, as opposed to merely serving everyone the dynamic page.
