Data Storage

Journal: Life of CDs for archiving data

One of the science fair entries I judged today was on the use of CDs to archive data and their expected lifetime. The students did accelerated lifetime testing at 80 °C to determine the failure rate of the cyanine dye the data is written on. They didn't have enough time at 80 °C to detect any failures. Of more interest to me and /. folks are the conversations they had with people at NIST and the Library of Congress. The students learned that CD lifetimes have greatly improved d
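
For readers curious how oven hours translate into shelf years, accelerated aging results like these are usually extrapolated with an Arrhenius model. A minimal sketch follows, assuming an illustrative activation energy of 0.8 eV and 25 °C storage; neither value comes from the entry.

    # Arrhenius acceleration-factor sketch for accelerated aging.
    # Assumed, illustrative values: activation energy Ea = 0.8 eV, storage at 25 C.
    import math

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def acceleration_factor(ea_ev, t_stress_c, t_use_c):
        """Arrhenius acceleration factor between a stress and a use temperature."""
        t_stress_k = t_stress_c + 273.15
        t_use_k = t_use_c + 273.15
        return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

    af = acceleration_factor(ea_ev=0.8, t_stress_c=80.0, t_use_c=25.0)
    print(f"1 hour at 80 C is roughly {af:.0f} hours at 25 C under these assumptions")

Under those assumed numbers each oven hour stands in for roughly 130 hours of room-temperature storage; the real factor depends entirely on the activation energy of the dye chemistry being tested.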
PHP

Submission + - Fake OS project attempts to copyright Pligg code

xeit writes: "Pligg, the open source Digg-like CMS, has been sent a C&D by a project called SuperGu, another open source project run by a passionate little man named James Phelps. He claims he owns their CSS and other aspects of the project, including RSS and the RSS icons (all of them). It looks like people haven't learned their lesson about claiming ownership of things that are clearly not theirs. The issues in question can be found here."
The Matrix

Submission + - Fundamental particles not so fundamental

SpinyNorman writes: A new "string-net" theory of matter by researchers Xiao-Gang Wen and Michael Levin, initially created to explain the fractional quantum Hall effect (FQHE), has been shown to give rise to fundamental particles such as quarks and gluons, and also naturally yields Maxwell's equations of light. The new theory predicts a new state of matter that appears to have been experimentally verified, and oddly enough also occurs naturally in a gemstone, Herbertsmithite, that was discovered over 30 years ago. The new theory builds on the work of Nobel physicist Robert Laughlin, and according to the New Scientist report it has already attracted the attention of luminaries such as Fields medallist Michael Freedman, who describes it as beautiful.
Google

Submission + - Super Mega Zoom Hack for Google Maps

Criticker writes: "Select a location and switch to satellite view. Zoom in as far as you can, and click 'link to this page' at the top right. Now replace the 'z' parameter in the URL with a higher value, e.g. 20, 22, or 23, and wait. Some locations will now show more detailed imagery.

More details, specific URLs, and images: Link"
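
The URL edit the submitter describes can also be scripted. A minimal Python sketch follows; the example coordinates are hypothetical, and whether a higher zoom level actually returns imagery depends on the location.

    # Rewrite the 'z' (zoom) parameter in a Google Maps "link to this page" URL.
    # Pure string manipulation; the example URL below is hypothetical.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def bump_zoom(maps_url, zoom=22):
        parts = urlsplit(maps_url)
        query = dict(parse_qsl(parts.query))
        query["z"] = str(zoom)  # e.g. 20, 22 or 23, as in the submission
        return urlunsplit(parts._replace(query=urlencode(query)))

    print(bump_zoom("http://maps.google.com/?ll=48.8583,2.2945&t=k&z=18", zoom=22))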
Operating Systems

Submission + - Version control for config files, scripts, etc.

TokyoCrusaders92 writes: Like a lot of other organizations (800 staff, 5,000 students), we have a mix of Windows, Novell, and Linux (primarily Linux) for our IT infrastructure. We now have a multitude of config files, firewall rule bases, shell scripts, etc. that are managed by multiple people and groups. Recently we started using RCS for version control of the firewall rulebase, but this doesn't seem like it would scale up to larger groups of users. What are other people using to manage their config files? Nice features would include version control, logging, multiple users, secure authentication, and integrity checking.
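
One common pattern that scales a bit beyond per-file RCS is to keep the whole config tree in a local git repository and record who changed what and why on every edit. The sketch below is hypothetical: the paths, commit-message format, and the choice of git are illustrative assumptions, not anything from the submission; a real deployment would add hooks, access control, and remote backups.

    # Hypothetical wrapper: commit a changed config file into a local git repo,
    # recording who changed it and why. Paths and message format are invented.
    import subprocess

    def commit_config(repo_dir, rel_path, editor, reason):
        subprocess.run(["git", "add", rel_path], cwd=repo_dir, check=True)
        subprocess.run(
            ["git", "commit", "-m", f"{rel_path}: {reason} (changed by {editor})"],
            cwd=repo_dir, check=True,
        )

    # Example call (paths are illustrative):
    # commit_config("/etc", "firewall/rules.v4", "jdoe", "open port 8080 for app server")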
Linux Business

Submission + - SCO Admitting the End May Be Near?

inetsee writes: "According to Groklaw, SCO has admitted in a 10-K filing that if the court grants any or all of IBM's six motions for summary judgment, 'We can not guarantee whether our claims against IBM or Novell will be heard by a jury.'"
Google

Submission + - Things to know before Google indexes your website

Etherfast writes: "Just as you clean up your house before your guests arrive, you should get your website ready for Google's crawler, as this is one of the most important guests you will ever have. With that in mind, here are 10 things you should double-check before submitting your website to the index. If you like, you can view this article as a correction of the top 10 mistakes made by webmasters.
1. If you have a splash page on your website, make sure you have a text link that allows visitors to get past it. I've seen many websites with a fancy Flash introduction on the index page and no other way to navigate around it. Google can't read your Flash page, and therefore it cannot bypass it. All you have to do is put a text link through to the page behind the splash screen, and the deed is done.
2. Make sure you have no broken links. I know this is kind of obvious, but you'd be surprised how many errors the Google crawler runs into every day because of broken links. Therefore, you'd better check and double-check every internal link on your site before submission. Don't forget that your links are also your visitors' paths to your content. It's not all about Google, you know :)
3. Check the TITLE tags. Since you can search within title tags on Google, and since the title tag is displayed at the top of your browser window, I'd say this is an important aspect you need to check. That doesn't mean you have to cram a 20+ keyword list in there. Instead, make it a readable sentence, since it's seen by both crawlers and surfers.
4. Check the META tags. Rumors that Google doesn't care about META tags are not 100% correct. Google relies on these tags to describe a site when there's a lot of navigation code that wouldn't make sense to a human searcher, so why not make sure everything is in order and set up a valid KEYWORDS list and a valid DESCRIPTION. You never know.
5. Check your ALT tags. ALT tags are probably the most neglected aspect of a website, since nobody bothers to keep them in order. It's definitely a plus if you do, so the Google spider can get a clue about all of your graphics. However, don't go to extremes and start explaining in an ALT tag that a list bullet is a list bullet.
6. Check your frames. If you use frames on your website, you might not be indexed 100%. Google actually recommends reading Danny Sullivan's article on search engines and frames. You have to make sure that either Google can read your frames, or that it has an alternative, defined via the NOFRAMES tag.
7. Do you have dynamically generated pages? The web has evolved a lot lately, and more and more websites based on dynamic scripting languages (PHP, ASP, etc.) are coming out every second, but Google has said it limits the number of dynamic webpages it indexes. It's not too late to consider a compromise and include some static content in your pages. It helps.
8. Update your content regularly. This is an important aspect to consider, since Google indexes pages that are updated regularly more quickly. You will notice that the number of pages indexed by the search engine increases day by day if you keep updating, but stagnates or decreases if you don't bring in anything new. I suggest setting up a META option in the header that tells Google how frequently it should come back for reindexing.
9. The robots.txt file. This file is a powerful resource if used properly. It gives you the chance to filter out the bots that crawl your website, and to restrict access to certain URLs that should not be indexed (login pages, admin backends, etc.); a quick way to sanity-check those rules is sketched after this list.
10. To cache or not to cache? Google caches some webpages for quick access, but some webmasters do not like that. Opting out is quite simple: all you have to do is write one line of code between your HEAD tags. <meta name="robots" content="noarchive"> should be enough to stop all robots from caching and archiving the page where it is embedded. With all that said, you can now submit your website to Google's index."
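
As a quick aid for item 9, Python's standard urllib.robotparser module can check which paths a given crawler is allowed to fetch before you submit the site. A minimal sketch follows; example.com and the paths are placeholders, not anything from the article.

    # Sanity-check robots.txt rules before submission (see item 9 above).
    # example.com and the paths below are placeholders, not from the article.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    for path in ("/", "/admin/", "/login"):
        allowed = rp.can_fetch("Googlebot", "http://example.com" + path)
        print(path, "allowed" if allowed else "blocked", "for Googlebot")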
Announcements

Journal: PDF to become an open ISO standard

This is great news: "Adobe Systems Inc. on Jan. 29 announced that it has released the full PDF (Portable Document Format) 1.7 specification to AIIM, the Association for Information and Image Management. AIIM, in turn, will start working on making PDF an ISO standard."
Now I won't have to get into endless discussions with people who dislike PDF because it is 'proprietary', an argument that IMHO made no sense because Adobe has alwa

Spam

Submission + - Catching spam by looking at traffic, not content

AngryDad writes: HexView has proposed a method for dealing with spam without scanning message bodies. Instead, the method is based solely on traffic analysis. They call it STP (source trust prediction). An RBL-like server collects SMTP session source and destination addresses from participating MTAs and applies statistics to identify spam-like traffic patterns. A credibility score is returned to the MTA, so it can throttle or drop possibly unwanted traffic. While I find it questionable, the method might be useful when combined with traditional keyword analysis.
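
HexView's actual STP scoring isn't described in the submission, so the following is only a toy illustration of the general idea of content-free, traffic-based scoring: track how many distinct recipient domains each source IP hits within a time window and map unusually broad fan-out to a lower credibility score. The window length and threshold are invented for the example.

    # Toy traffic-only scorer, NOT HexView's actual STP algorithm: per source IP,
    # count distinct recipient domains seen within a time window and map broad
    # fan-out to a lower credibility score. Window and threshold are invented.
    import time
    from collections import defaultdict

    WINDOW_SECONDS = 3600
    FANOUT_THRESHOLD = 50  # distinct recipient domains per hour

    observations = defaultdict(list)  # source IP -> list of (timestamp, domain)

    def record(source_ip, recipient_domain, now=None):
        observations[source_ip].append((now or time.time(), recipient_domain))

    def credibility(source_ip, now=None):
        """Return a 0..1 score; lower means more spam-like fan-out."""
        now = now or time.time()
        recent = {d for t, d in observations[source_ip] if now - t <= WINDOW_SECONDS}
        return max(0.0, 1.0 - len(recent) / FANOUT_THRESHOLD)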
Wireless Networking

Submission + - Is static actually good for wireless networks?

BobB writes: "The National Science Foundation has awarded a 5-year, $400,000 grant to explore ways of using radio interference to actually improve wireless communications. Radio interference "is not just noise: it's structure — it's a communication going on between a pair of devices," a University of Illinois researcher says. http://www.networkworld.com/news/2007/012407-radio-interference.html"
