Microsoft

Submission + - Microsoft pays companies to use Live Search

tsa writes: "On John Battelle's Searchblog there is a piece about how Microsoft pays companies not only to use MS's Live Search engine, but also to hand over information about the number of users and PCs in the company. Companies are paid a fee per PC, and the size of the fee depends on how much they use Live Search. Use of IE7 is mandatory, because MS wants the companies to install a plugin for that browser so it can track users' search habits. If you ask me, this is bribery."
Security

Submission + - Monthly OS Security Scorecard and answer!

Flosse writes: "As mentioned on OSNews, Jeff Jones of Microsoft is publishing a monthly OS security scorecard here. This is fine and dandy, but the numbers are a bit weird, since they take into account only how many 'fixes' each company has issued per month. In response I have made a similar security scorecard, adding a few more factors over the same time period, and the picture is quite different, I have to say. Though his graphs are nice, without the major factors you can spin this 'scorecard' in any direction you want; funnily enough, his favors Microsoft."
Quickies

Submission + - High School Student Builds Fusion Reactor

deblau writes: "In 2006 Thiago Olson joined the extremely sparse ranks of amateurs worldwide who have achieved nuclear fusion with a home apparatus. In other words, he built the business end of a hydrogen bomb in his basement. A bright plasma "star in a jar" demonstrated his success. "The temperature of the plasma is around 200 million degrees," Olson says modestly, "several times hotter than the core of the sun.""
Data Storage

Journal Journal: Life of CDs for archiving data

One of the science fair entries I judged today was on the use of CDs to archive data and their expected lifetime. The students did accelerated lifetime testing at 80C to determine the failure rate of the cyanine dye on which the data is written. They didn't have enough time at 80C to detect any failures. Of more interest to me and /. folks are the conversations they had with people at NIST and the Library of Congress. The students learned that CD lifetimes have greatly improved d
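For those curious how testing at 80C translates into a room-temperature lifetime estimate: accelerated-aging results like these are commonly interpreted with the Arrhenius model. A minimal sketch, where the 0.8 eV activation energy is purely illustrative (the students' actual value, if any, isn't given):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K


def acceleration_factor(t_stress_c, t_use_c, e_a_ev):
    """Arrhenius acceleration factor between a stress temperature
    and a use temperature, for a given activation energy (eV)."""
    t_stress = t_stress_c + 273.15  # convert C to K
    t_use = t_use_c + 273.15
    return math.exp(e_a_ev / K_B * (1.0 / t_use - 1.0 / t_stress))


# Example: baking at 80C vs. storage at 25C, assuming E_a = 0.8 eV.
af = acceleration_factor(80.0, 25.0, 0.8)
print(f"Each hour at 80C is roughly {af:.0f} hours at 25C")
```

With these (assumed) numbers, each hour in the oven stands in for on the order of a hundred hours on the shelf, which is why a short test window can still fail to catch dyes with multi-decade lifetimes.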
PHP

Submission + - Fake OS project attempts to copyright Pligg code

xeit writes: "Pligg, the open source Digg-like CMS, has been sent a C&D from a project called SuperGu, another open source project run by a passionate little man named James Phelps. He claims he owns their CSS and other aspects of the project, including RSS and the RSS icons (all of them). It looks like people haven't learned their lesson about claiming ownership of things that are clearly not theirs. The issues in question can be found here"
The Matrix

Submission + - Fundamental particles not so fundamental

SpinyNorman writes: A new "string-net" theory of matter by researchers Xiao-Gang Wen and Michael Levin, initially created to explain the fractional quantum Hall effect (FQHE), has been shown to give rise to fundamental particles such as quarks and gluons, as well as naturally giving rise to Maxwell's equations of light. The new theory also predicts a new state of matter that appears to have been experimentally verified, and oddly enough it also occurs naturally in a gemstone, Herbertsmithite, that was discovered over 30 years ago. The theory builds on the work of Nobel physicist Robert Laughlin and, according to the New Scientist report, has already attracted the attention of luminaries such as Fields medallist Michael Freedman, who describes it as beautiful.
Google

Submission + - Super Mega Zoom Hack for Google Maps

Criticker writes: "Select a location and switch to satellite view. Zoom in as far as you can, and click 'link to this page' at the top right. Now replace the 'z' parameter in the URL with a higher value, e.g. 20, 22, or 23, and wait. Some locations will now show more detailed imagery.

More details, specific URLs and images: Link"
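The trick boils down to rewriting a single query parameter. As a sketch, here is the URL manipulation in code; the parameter name 'z' comes from the submission, and the example URL is illustrative, not a guaranteed working link:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qs, urlencode


def boost_zoom(maps_url, new_z=22):
    """Rewrite the 'z' (zoom) query parameter of a 'link to this page'
    Maps URL, leaving all other parameters intact."""
    scheme, netloc, path, query, frag = urlsplit(maps_url)
    params = parse_qs(query)
    params["z"] = [str(new_z)]  # higher than the UI normally allows
    return urlunsplit((scheme, netloc, path,
                       urlencode(params, doseq=True), frag))


# Hypothetical example URL; other parameters are preserved.
print(boost_zoom("http://maps.google.com/?ll=48.858,2.294&z=18"))
```

Whether the higher zoom level actually returns sharper tiles depends on the location, as the submission notes.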
Operating Systems

Submission + - Version control for config files, scripts, etc.

TokyoCrusaders92 writes: Like a lot of other organizations (800 staff, 5,000 students) we have a mix of Windows, Novell and Linux (primarily Linux) for our IT infrastructure. We now have a multitude of config files, firewall rulebases, shell scripts, etc., which are managed by multiple people and groups. Recently we started using RCS for version control of the firewall rulebase, but this doesn't seem like it would scale up to larger groups of users. What are other people using to manage their config files? Nice features would include version control, logging, multiple users, secure authentication and integrity checking.
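One common answer is to snapshot the files into a central git repository, which gives per-change history, diffs, multi-user attribution, and (when served over ssh) authenticated access. A minimal sketch of the idea; the paths in the example call are hypothetical, and a real deployment would want hooks, permissions handling, and per-host layout:

```python
import subprocess
from pathlib import Path


def snapshot(repo, files, message, author):
    """Copy config files into a git-tracked directory and commit them.

    This flattens files into the repo root for simplicity; a real tool
    would mirror the source directory structure.
    """
    repo = Path(repo)
    for src in map(Path, files):
        (repo / src.name).write_bytes(src.read_bytes())
    subprocess.run(["git", "-C", str(repo), "add", "-A"], check=True)
    subprocess.run(
        ["git", "-C", str(repo), "commit", "-m", message,
         f"--author={author}"],
        check=True,
    )


# Hypothetical usage:
# snapshot("/srv/config-repo", ["/etc/ssh/sshd_config"],
#          "tighten sshd config", "Alice <alice@example.edu>")
```

`git log` and `git diff` then answer "who changed what, when", which is exactly the audit trail RCS struggles to provide across a large group.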
Linux Business

Submission + - SCO Admitting the End May Be Near?

inetsee writes: "According to Groklaw, SCO has admitted in a 10-K filing that if the court grants any or all of IBM's six motions for summary judgment, 'We can not guarantee whether our claims against IBM or Novell will be heard by a jury.'"
Google

Submission + - Things to know before Google indexes your website

Etherfast writes: "Just as you clean up your house before your guests arrive, you should get your website ready for Google's crawler, as this is one of the most important guests you will ever have. With that in mind, here are 10 things you should double-check before submitting your website to the index. If you want, you can view this article as a correction of the top 10 mistakes made by webmasters.
1. If you have a splash page on your website, make sure you have a text link that allows visitors to get past it. I've seen many websites with a fancy Flash introduction on the index page and no other way to navigate around it. Google can't read into your Flash page, and therefore it cannot bypass it. All you have to do is put a text link to your website's second index page, and the deed is done.
2. Make sure you have no broken links. I know this is kind of obvious, but you'd be surprised how many errors the Google crawler encounters daily due to broken links. Therefore, you'd better check and double-check every internal link of your webpage before submission. Don't forget that your links are also your visitors' paths to your content. It's not all about Google, you know :)
3. Check the TITLE tags. Since you can search within title tags on Google, and since the title tag is displayed at the top of your browser window, I'd say this is an important aspect you need to check. This doesn't mean you have to compile a 20+ keyword list there. Instead, make it a readable sentence, since it's viewable by both crawlers and surfers.
4. Check the META tags. Rumors about Google not caring about META tags are not 100% correct. Google relies on these tags to describe a site when there's a lot of navigation code that wouldn't make sense to a human searcher, so why not make sure you're all in order and set up a valid KEYWORDS list and a valid DESCRIPTION? You never know.
5. Check your ALT tags. The ALT tags are probably the most neglected aspect of a website, since no one bothers to put them in order. It's definitely a plus if you do, so the Google spider can get a clue about all of your graphics. However, don't go to extremes and start explaining in an ALT tag that a list bullet is a list bullet.
6. Check your frames. If you use frames on your website, you might not be indexed 100%. Google actually recommends reading Danny Sullivan's article on search engines and frames. You have to make sure either that Google can read from your frames, or that it has an alternative, defined via the NOFRAMES tag.
7. Do you have dynamically generated pages? I know the web has evolved a lot recently, and more and more websites based on dynamic scripting languages (PHP, ASP, etc.) are coming out every second, but Google has said they limit the number of dynamic webpages they index. It's not too late to consider a compromise and include some static content in your pages. It helps.
8. Update your content regularly. This is an important aspect to consider, since Google indexes pages that are updated on a regular basis more quickly. You will notice that the number of pages indexed by the search engine increases day by day if you update, but stagnates or decreases if you don't bring something new. I suggest setting up a META option in the header that tells Google how frequently it should come back for reindexing.
9. The robots.txt. This file is a powerful resource if used properly. You have the chance to filter out the bots that crawl your website, and to restrict access to certain URLs that should not be indexed (login pages, admin backends, etc.).
10. To cache or not to cache? Google caches some webpages for quick access, but some webmasters do not like that. The procedure to opt out is quite simple: all you have to do is write one line between your HEAD tags. <META NAME="ROBOTS" CONTENT="NOARCHIVE"> should be enough to stop all robots from caching and archiving the page in which the code is embedded. All this being said, you can now submit your website to Google's index."
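Several items on the list above (a readable TITLE, META description, ALT text on images) can be checked offline before submission. A minimal sketch using only the standard library; the specific checks and messages are illustrative, not an official validator:

```python
from html.parser import HTMLParser


class PageCheck(HTMLParser):
    """Collect the <title> text, named <meta> tags, and a count of
    <img> tags that lack alt text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.metas = {}
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.metas[attrs["name"].lower()] = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def check(html):
    """Return a list of problems found in the given HTML source."""
    p = PageCheck()
    p.feed(html)
    problems = []
    if not p.title.strip():
        problems.append("missing <title>")
    if "description" not in p.metas:
        problems.append("missing meta description")
    if p.images_missing_alt:
        problems.append(f"{p.images_missing_alt} <img> tag(s) without alt text")
    return problems
```

Running `check()` over each page before submission catches the title, META, and ALT issues from points 3 to 5; broken-link checking (point 2) needs a crawler and is left out here.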
Announcements

Journal Journal: PDF to become an open, ISO standard

This is great news: "Adobe Systems Inc. on Jan. 29 announced that it has released the full PDF (Portable Document Format) 1.7 specification to AIIM, the Association for Information and Image Management. AIIM, in turn, will start working on making PDF an ISO standard."
Now I won't have to start endless discussions with people not liking PDF because it is 'proprietary', an argument that IMHO made no sense because Adobe has alwa
