The Internet

Google Letting Users Rank Search Results

Myriad writes "C|Net News is running an article about Google testing a new system that would let users rank pages. From the article, 'Two weeks ago, Google began quietly testing a Web page voting system that, for the first time on a large scale, could eventually let Web surfers help determine the popularity of sites ranked by the company's search engine.'" As someone who has a lot of experience with systems where users self-rate content, let me just wish Google the best of luck. Especially since for many unscrupulous businesses, ratings in search engines directly translate to dollars.
This discussion has been archived. No new comments can be posted.

  • Perhaps they could disqualify corporate business websites from being ranked.

    Thanks,

    Travis
    forkspoon@hotmail.com
    • Or perhaps have users check off that they are searching for commercial products. Thus two ranking systems would be in place. It would be a small operation, but one that would provide us with junk-free results. If it were possible, I'd be willing to give up an extra keystroke or two.

      Users who perennially search outside of corporate sites could be able to customize their settings so that they'd have to select when they want to include corporate sites. Could it work? I don't know.
      • by Bonker ( 243350 ) on Wednesday November 28, 2001 @05:19PM (#2626875)
        Users who perennially search outside of corporate sites could be able to customize their settings so that they'd have to select when they want to include corporate sites. Could it work? I don't know.

        Google already has a 'customized' interface that allows users to do things like change language, etc...

        I think the suggestion of separating corporate and non-corporate searches has its merits. I hate searching for anime fanfiction and being directed to Best Buy's website because they happen to carry the anime title I mentioned in the search query.

        It has its problems too, however. Tagging each of the pages in Google's truly massive search database with a corporate or non-corporate tag is a non-trivial problem. For obvious reasons, website owners cannot be trusted to tag their own pages.

        You're also opening a can of worms here, since many website owners will protest either a commercial or a non-commercial tagging.

        Even if you tagged sites by domain, you'd still have hundreds of thousands... possibly millions of domains, not to mention sites that carry both corporate and private content like Geocities, Tripod, or other free webhosts.

        Then you have to consider what to do with semi-for-profit pages. Many pages have 'tip jars' now. Many open-source software development pages carry information about for-profit works, or are developed by for-profit organizations. Should companies like Red Hat be excluded from non-profit searches? Probably. How about Ogg Vorbis? That's not nearly so clear. How about web comics, almost all of which give away their content freely but sell merchandise, dead-tree books, or other premiums?

        In the end, I think that I'd rather put up with having to sort through twenty or so highly relevant results to get the search result I wanted rather than having to search twice to make sure that I get all the possible relevant results.
  • by jaredcat ( 223478 ) on Wednesday November 28, 2001 @04:21PM (#2626409)
    Well, as the marketing director for an unscrupulous business, let me be the first to say how much I am looking forward to being able to rate my competitors' websites on one of the most popular search engines.
  • this is certainly a valid problem to try to solve. for example, i just searched for "clueless phony" and jon katz's name was nowhere to be found.
  • an IP address doesn't necessarily equate to a person. Companies can have thousands of IPs, and Google can't tell if it's just one entity or 3,000. I would predict that if this goes into effect, the Gator advertising thing that's bundled with just about any free download these days will be modified to rank up the pages of those who pay them the most.
  • that is all that will happen. How are they going to stop multiple "votes"? By a cookie (that the voter can erase)? By tracking IPs (they won't put the resources into that large and complex a system)?

    I wish it would work, but it will be an abysmal failure... in fact it wouldn't surprise me if some corporations hire people just to "vote" for their sites...

    just look at ANY top 50/100 voting site and you'll know what I am talking about
    • So we're going to see "vote here before entering this site" screens on major websites now :-). Great... will it come with infinitely looping pop-up XXX ads too?
    • That's assuming their voting interface will be a simple web page...I'm guessing they would write their own client-side application, similar to the google toolbar. With a google-written client communicating with their server, they should be able to come close (or at least make it very difficult to vote twice). There are lots of techniques that could work...dynamically generated keys, encryption, etc.
      • With a google-written client communicating with their server, they should be able to come close (or at least make it very difficult to vote twice). There are lots of techniques that could work...dynamically generated keys, encryption, etc.

        "If you think encryption will solve your problem, you don't understand encryption and you don't understand your problem." Bruce Schneier

        Anything that could be done by your hypothetical client could also be done by a person who has used a debugger on it. It's just not theoretically possible to prevent something like that with an authentication key embedded in the software. Or per-client keys...remember, they have to get the key somehow. How do you restrict it to one key per person? That's back to the original problem.

  • Ugh, ugh, ugh. (Score:2, Insightful)

    by Mr_Matt ( 225037 )
    People who specialize in pushing sites into the top rankings--a technique known as search engine optimization--say the company's success has made Google a new frontier to conquer. And they assert that its system, like any other, can be outsmarted.

    This is particularly repugnant, especially given the goals set in the article (Google wants to make the search engine process more of a democracy, etc.) Is anybody else tired of soulless marketdroids essentially destroying all the good things that are the Net(C)(TM)(R)?

    On the bright side, maybe there's room to add Slashdot-styled moderation and meta-moderation to Google rankings - imagine a "+1 Funny" rank for the Onion or a "-1, Offtopic" page rank for every time you go surfing for something honest and end up at Yet Another pr0n Site. :)
    • imagine a "+1 Funny" rank for the Onion or a "-1, Offtopic" page rank for every time you go surfing for something honest and end up at Yet Another pr0n Site. :)

      And imagine a "-1, Offtopic" page rank for every time you go surfing for pr0n and end up at Yet Another Honest Site!

  • Google Attack Engine (Score:5, Informative)

    by Caball ( 58351 ) on Wednesday November 28, 2001 @04:25PM (#2626447)
    While on the subject of Google, there is an interesting article at The Register detailing how search terms are used to exploit servers, switches, routers, etc.

    http://www.theregister.co.uk/content/6/23069.html
  • Though Google claims the voting system won't have any direct (and, more importantly, immediate) effect on search results, I think they're going to have to spend a lot of money on abuse detection.
  • Options (Score:4, Insightful)

    by felipeal ( 177452 ) on Wednesday November 28, 2001 @04:28PM (#2626467) Homepage
    Even if the system works fine (i.e., without abuse), it would be nice if the user still had the option to use it or not (as the current system works very well).

    Better yet, they could have a slashdot-like user customization mechanism (i.e., where the user can set the threshold and moderate/vote a search result in many ways).

    Anyway, I wish them luck too (Google rules :)
  • Scenario: You just put up a new webpage and want to be sure it gets top hits on google.

    How you do it: After putting the page up, write a tool to hit google's voting engine over and over and over... giving yourself good ratings.

    Question: How would the system prevent this type of abuse from happening - especially the opposite approach - rating competitors' sites poorly to drop them in the list?

    Devil's Advocate Question: If you don't allow this abuse to occur, doesn't that then unfairly give extra ranking to sites based on age? A new site won't have accumulated as many votes as an old one yet, and so the ranking would always favor old (and likely to be out of date) sites over new ones.

    • As one who got his entire university cut off from Google in a shell-scripting nightmare (they turned it back on after they learned all the hits were for an AI project), let me say that Google knows when you hit them too hard. Read their terms of service [google.com].

      Also, I think they would know a thing or two about normalizing the data to correct for the age of a site.
    • After putting the page up, write a tool to hit google's voting engine over and over and over


      If I were Google, I would just keep a table mapping IP addresses to voting records. So your second vote for a page merely replaces your first vote, instead of counting as another vote. Would that be enough?
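
      A minimal sketch of that table idea (Python; the structures and names here are illustrative, not anything Google has described):

        votes = {}  # maps (ip_address, url) -> vote, so a revote overwrites

        def record_vote(ip, url, vote):
            # A repeat vote from the same IP for the same page replaces
            # the earlier one instead of stacking up.
            votes[(ip, url)] = vote

        def score(url):
            # One vote per IP per page, summed.
            return sum(v for (ip, u), v in votes.items() if u == url)

      As noted elsewhere in this discussion, though, one IP isn't one person: corporate proxies and NAT fold thousands of users into a single vote, while dial-up users get a fresh address every session.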

    • Google would certainly know when the site was added to the database and so perhaps they could normalize the votes over a period of time...
  • by dmoen ( 88623 ) on Wednesday November 28, 2001 @04:29PM (#2626482) Homepage
    A few weeks ago, I encountered "spam" on google. 8 of the top 10 links had been captured by a spammer using "cloaking" technology:
    One method, called "cloaking," sets up a dummy page including lots of relevant information for keywords hidden through a special link. The cloaked page is fed to the search engine to boost a site's search ranking for specific terms such as "games," "sports" or "books." When surfers go to that link, however, they see a page that is different from the one indexed by the crawler.
    I can't show you what it looks like, since Google has already fixed the problem.

    What I wanted then was a "moderate" button I could click beside the link to indicate that it was spam. With a voting system like this, Google could locate and remove spam a lot quicker. Maybe that's what this is all about.

    Doug Moen.

  • Great, but .. (Score:5, Insightful)

    by Eloquence ( 144160 ) on Wednesday November 28, 2001 @04:29PM (#2626483)
    .. this will only work when combined with trust metrics. There are certainly different views on what constitutes a quality site, and if you just let everyone vote, you get a fuzzy average (plus you have problems filtering false votes). So what you need is a system of identity where you can say "Show me all pages rated highly by people in my trusted user list".

    To establish such a system, Google needs to get users to create accounts. A more feasible solution may be cooperation with instant messaging providers, using their identity pool and friends lists as filter criteria. But if they want people to create accounts, they need to turn Google into a community. One way to do that would be to have an automatic discussion forum for every major website.

    That, again, would create a lot of traffic, so they might be better off using a peer-to-peer app residing on the users' systems instead, which would also allow you to add website-specific real time chat, file sharing, micropayments and other nifty things. It would also make it easier to create responsive user interfaces, which is always a problem with web UIs.
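
    As a rough sketch of the trust-metric filtering described above (Python; the data structures and names are assumptions for illustration):

      trusted = {"alice", "bob"}   # the searcher's trusted-user list
      ratings = {                  # user -> {url: rating from 1 to 5}
          "alice":   {"http://example.org/": 5},
          "bob":     {"http://example.org/": 4, "http://spam.example/": 1},
          "mallory": {"http://spam.example/": 5},
      }

      def trusted_score(url):
          # Only votes from the trusted list count; strangers (and their
          # forged votes) are simply ignored.
          votes = [r[url] for user, r in ratings.items()
                   if user in trusted and url in r]
          return sum(votes) / len(votes) if votes else None

    Here mallory's ballot-stuffing never touches this searcher's results, which is the point of the trust metric.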

    • So what you need is a system of identity where you can say "Show me all pages rated highly by people in my trusted user list".

      Which would require user accounts, as you said, but I wouldn't have any problem at all with having a Google cookie on my browser. Once you've got that, then maybe something like Amazon.com's system would work without setting up explicit user groups: "The following pages were high-ranked by users whose page rankings were similar to yours: ..."
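
      A sketch of that Amazon-style matching (Python; the similarity measure, a plain count of identically rated pages, is one crude choice among many):

        def similarity(a, b, ratings):
            # How many pages did the two users rate the same way?
            ra, rb = ratings[a], ratings[b]
            return sum(1 for url in ra if url in rb and ra[url] == rb[url])

        def recommend(me, ratings, k=5):
            # Find the k users most like me, then surface pages they
            # rated highly that I haven't rated yet.
            peers = sorted((u for u in ratings if u != me),
                           key=lambda u: similarity(me, u, ratings),
                           reverse=True)[:k]
            seen = set(ratings[me])
            return {url for u in peers for url, v in ratings[u].items()
                    if v >= 4 and url not in seen}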

    • The Slashdot ranking process is a work in progress but seems to be working somewhat. Google could apply a similar "karma" system for registered users and perhaps have "metamoderation" too. Can it be abused? Sure. So what? As a small influence on ranking, it may be a good thing.

      Certainly the current Google ranking system (counts number of links) should always be of greater significance in any applied approach.
      • Re:Great, but .. (Score:2, Insightful)

        by talesout ( 179672 )
        You call the Slashdot ranking process something that's "working"? Even somewhat? Are you mad?

        I wouldn't call this working. Not in a million years would I say that. It depends entirely on the mentality of the original moderators. Dissident opinions (or in Google's case, if they were fucking stupid enough to implement such a system, sites promoting different views) from the original moderators are not modded up, thus you never gain the ability to moderate. Meta-moderation is a hack job at best, a fucking beast of a problem at worst. And nothing fixes the original problem of Slashdot's moderation system. Groupthink is promoted, dissident views are demoted, no matter how well reasoned. You don't believe me? Check the score of this post after a few hours.

        • You might be right. The score for your post is not negative yet.
        • Perhaps if you set your threshold higher than 1 then you would solve the problem, no?

          i mean, if I had then I wouldn't have had to bother reading your post, would I? (unless it gets modded up in the future)
        • Re:Great, but .. (Score:3, Insightful)

          by SEE ( 7681 )
          Hmm. Then how in hell did I ever get to a Karma of 98? Or are you telling me that the /. groupthink has the following opinions (all expressed in posts that got net positive moderation):

          1) The antitrust actions against Microsoft were gross abuses of government power.

          2) RMS is fundamentally mistaken on the nature of property. The case for intellectual property is in fact stronger than the case for physical property, since IP is entirely the product of the creator's labor, while physical property includes preexistent matter to which no one can claim a natural right.

          3) Money doesn't corrupt governments; governments corrupt money. "Unchecked corporate power" isn't a problem on its own. The problem is that whenever a government allows itself to move beyond laissez-faire, it creates an incentive for the entrenched corporate powers to pay for regulations and laws that protect them and squash competitors.
      • The only problem is that, when I'm searching for a site on Google, I'm not interested in taking the time to see how other people rated other sites. Metamoderation works (somewhat) on Slashdot because (when/if) people are willing to take the time to care, and look at other comments. I don't think anybody would be interested in metamoderating or generating karma on a search engine.
        • There may be situations where moderation of sites makes sense. Slashdotters might be a good example. Folks who visit this site probably peruse technology pages frequently. (Sometimes links are provided in Slashdot messages.) The Slashdot user opinion of tech pages might be worth something. If a site gets good scores from users that have "high" karma, maybe there is something better about those sites? Or maybe there is at least a curious thing about those sites that other users might like too?


          I'm not interested in taking the time to see how other people rated other sites


          If Google stored a cookie with your ID and the visited page was smart enough to include a special "Google Moderate" link, you could easily and quickly rate a site when you visit it. If it is easy, you might do it.

          I would welcome that ranking option as something that I could turn on or off on Google when I do a search.
    • Maybe this could be incorporated into Microsoft's <wink><wink><nudge><nudge> punishment </wink></wink></nudge></nudge> !

      Judge: OK, Bill, in addition to spreading Windows more effectively than finely ground anthrax in a crop duster over Los Angeles, you are also going to have to allow Google to integrate their services with your .Net framework.

      Bill: Damn, I'm good.
  • Might work if... (Score:5, Interesting)

    by BluePenguin ( 521713 ) on Wednesday November 28, 2001 @04:29PM (#2626489) Homepage
    You know, this might work if Google implemented it the right way. I'm just thinking there are a few simple things they could do right off...

    1. Don't put "rate this site" next to every hit. Instead, use a system of random assignment. Every x (where x is a random number) hits, give the user a "rate this site" dialogue. This cuts down on the potential for direct abuse. (See the sketch below.)
    2. Add an option to sort by user rating, or sort by the current standard. This way, if people don't want to see user rated results, they don't have to.
    I love google and all, but some of the things that make it to the top of the list from time to time are as useful to me as a 16 bit dos driver (for my RS/6000). It'd be good to see something resembling peer review on the web after all. Who knows, even if it fails, it might spark something that works! Best of luck google!
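
    A tiny sketch of the random-assignment idea in point 1 (Python; the probability is an arbitrary stand-in for "every x hits"):

      import random

      PROMPT_PROBABILITY = 0.05  # illustrative: about one prompt per 20 hits

      def should_show_rating_prompt():
          # Because the prompt appears at random, a surfer can't summon
          # it on demand for the page they want to ballot-stuff.
          return random.random() < PROMPT_PROBABILITY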

    • It'd be good to see something resembling peer review on the web after all.

      Don't You Yahoo! [yahoo.com] ???

      Seriously, there is so much out there that lofty goals fall apart for any but the smallest of niches. The original efforts of Yahoo! [yahoo.com] just couldn't keep pace with 'Net growth, leading to gridlock on entries and deletions and deterioration of its organization and review process.

      A more recent attempt, dmoz.org [dmoz.org] seems to have stagnated under the weight of the Web. All of the specific sections I visit lack moderation, and are about as useful as a random web search ("Gee! I wonder where they get the initial link collections.")

      Peer review is a great idea, and does take place, but requires a lot of effort and cooperation to pull off on any scale above microscopic. Been there, done that, got tired.


  • Why not just monitor which links searchers choose?
  • Currently, Google's proprietary system ranks sites primarily by words listed on the page, terms used in a page's title or similar factors. It also ranks a page's popularity, determined by the number--and importance--of sites linking to a page. For example, a page that is linked to 100 times from a reputable newspaper's Web site would rank higher than a page linked to 500 times from a porn site.


    I do like this feature; it truly shows the worth of a web page in other people's eyes, not just the eyes of the webmaster who created the thing...
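
    For flavor, here is a toy version of that weighted-link idea (Python). The damping constant and iteration count are textbook PageRank defaults; Google's production formula is proprietary and surely more involved.

      def pagerank(links, iters=20, d=0.85):
          # links: {page: [pages it links to]}.  A link from a page that
          # itself scores highly is worth more than many links from
          # low-scoring pages, which is the effect described above.
          pages = list(links)
          rank = {p: 1.0 / len(pages) for p in pages}
          for _ in range(iters):
              new = {p: (1 - d) / len(pages) for p in pages}
              for p, outs in links.items():
                  for q in outs:
                      if q in new:  # ignore links out of the crawled set
                          new[q] += d * rank[p] / len(outs)
              rank = new
          return rank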
  • I don't know how many times I have searched for something and gotten a perfect-looking search result, only to find out it is a broken link. I have not used all of the search engines out there, but I don't remember any of the ones that I have used having an obvious method to flag a link as broken.

    I know that their spiders go through the database and verify links, but I'd be willing to bet that it takes months to go over it once. Why not flag links as broken and have the spider verify/remove those first?

    Just cleaning up the broken links could improve the search results.
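
    A sketch of that flag-and-reverify idea (Python; the two-level priority queue is an assumption, just to show flagged URLs jumping the line):

      import heapq

      recrawl = []  # min-heap of (priority, url); lower sorts sooner

      def flag_broken(url):
          heapq.heappush(recrawl, (0, url))  # user-flagged links go first

      def schedule_routine(url):
          heapq.heappush(recrawl, (1, url))  # ordinary revisits go after

      def next_to_verify():
          return heapq.heappop(recrawl)[1] if recrawl else None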

    Help out Project Gutenberg!
    Distributed Proofreaders [dns2go.com] http://charlz.dns2go.com/gutenberg
  • Win32 (Score:5, Informative)

    by jeriqo ( 530691 ) <jeriqo&unisson,org> on Wednesday November 28, 2001 @04:32PM (#2626513)
    This feature is only available from the 'Googlebar'.
    The problem is that this GoogleBar only plugs into Internet Explorer, so *nix geeks won't be able to rate sites..

    It consists of small faces (happy or unhappy) on which you click.

    -J
    • Googlebar for non-ie (Score:4, Informative)

      by WPL510 ( 196237 ) on Wednesday November 28, 2001 @06:48PM (#2627453)
      The problem is that this GoogleBar only plugs into Internet Explorer, so *nix geeks won't be able to rate sites..
      Well, yes and no. There is currently a project [mozdev.org] on Mozdev that aims to duplicate some if not all of the functionality of the toolbar for Mozilla, and while the current version 0.4 is still somewhat lacking, a new version that duplicates the look as well as the major search functionality (though not pagerank etc) is on the way soon, apparently. However, since this is an independent project and not affiliated with Google, I'm not sure if it would be able to access the rating system. Still, Mozilla users DO have the toolbar, and, since mozilla is cross-platform...
    • This feature is only available from the 'Googlebar'.

      This is not true. Or rather, while it *is* true that the happy/sad face voting feature is only available on the toolbar, which is only available on Win32, I've gotten search results recently from Google which included at the bottom a questionnaire about the accuracy of the search results, allowing me to rank various items. I don't know if it was a temporary thing or a random selection, but it was on the page itself, and I was on a Mac. No toolbar, but still soliciting user feedback.

  • Google has taken to spamming you [tru7h.org] when they detect a robots.txt file.

    This is truly idiotic, since robots.txt has never been a default part of any web server installation I've ever done, so it's completely a voluntary thing to create the file, and every webmaster should be WELL AWARE of what this file does (by virtue of the fact that they had to create it). I mean, duh guys.

    Yeah, so I'm off topic. But I just got the spam this morning, and I used to respect Google quite a bit, and witnessing them resorting to spam emails, begging us to let them spider our sites really tarnished their image, so let me rant a little. :p

    Oh, and let's not forget about Google suggesting robots.txt [google.com] as a method to protect sensitive data recently [cnet.com]. It'd be nice if they could decide whether they want us to create robots.txt or not..

    • I think that you misunderstood the purpose of the email. They do acknowledge and recognize that you set up the robots.txt, but offer a suggestion to allow Google to index your site and no other bots (unless their bots lie and say they are Google).

      I think it's perfectly clear in there: the line that states that if this is your intention, you should ignore the rest of the email. I can understand your stance, but I hope you take into account that this was simply an informational email, and my guess is a lot of people don't know you can set up a robots.txt entry to allow one crawler but block others.

      They still advocate and encourage robots.txt, and I personally find their email handy, well-worded, and non-intrusive. If they change their policy and start emailing them to you on a regular basis, let us know because then it's violating your personal space and I back your stance 100%. Till then, one email doesn't hurt - and it does provide useful information as well as acknowledging (not begging at all) that you may be perfectly content blocking the crawler bots.
      • > I think that you misunderstood the purpose of
        > the email.

        I think I understand it perfectly. They noticed a website has a robots.txt file, so they send an email to the webmaster making the webmaster aware the file existed (as if they weren't already), in effect asking us to remove it. It was veiled under the guise of being nice and polite and thoughtful, but they still requested it.

        > but I hope you take into account that this was
        > simply an informational email -

        informational schminformational. Spam is spam is spam, and for me to remove robots.txt would be directly to their benefit.. it makes their engine more accurate, which makes people happier, which makes more customers, which makes them more money.

        Just because they worded it nicely doesn't make it less of a spam email.
        • First point: it isn't bulk. Therefore, a count against it being spam.

          Second point: They never ask you to remove it; they suggest that if you want to let Google search, you add User-Agent: Googlebot. Big difference.

          Third point: SPAM is SPAM. Google sending one email to a site is not. It isn't a commercial email, even though more traffic to their site and happier customers do make them money. It would be spam if they advertised anywhere in there that you can be listed as a sponsored link for the low low cost of $4.95 for the first hit, $0.99 each additional. It wasn't.

          It was unsolicited email, yes. It is not bulk, nor commercial. Therefore it is only 33.3% spam.
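
          For reference, a robots.txt along the lines the email apparently suggests (let Googlebot in, keep everyone else out):

            User-Agent: Googlebot
            Disallow:

            User-Agent: *
            Disallow: /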
    • I think what they were suggesting is that you had done something seemingly boneheaded like:

      User-Agent: *
      Disallow: /

      Think about it - how many people do you think are out there with a half-clue who decide that they want to prevent evil robots from indexing their site without realizing that they therefore won't wind up on search engines? Apparently Google seems to have run into this situation and now e-mails webmasters who have potentially accidentally blocked all robots from indexing their pages.

      Now there may be a valid reason to completely block your site from all robots. But think about how pointless it really is - how many webmasters really want to drive away search engines? Most people want to show up on search engines, especially people whose site shows up as a domain (i.e., http://slashdot.org/ as opposed to http://www.wherever.edu/~they/started/).

      Seriously, why did you block the entire domain from web crawlers? While there definitely are good reasons, it seems sensible for Google to send an "are you really sure you want to do that" message, especially since the linked "spam" was sent to someone who apparently had four domains they had blocked off from search engines. This sounds like something that an amateur webmaster may have accidentally done without thinking about the consequences. In which case the e-mail makes sense: "Did you really mean to do that? If so, ignore this message - if not, here's a way to fix it."

      I really think you're overreacting to a fairly innocent e-mail.

  • A concern (Score:2, Insightful)

    by zmokhtar ( 539671 )

    I have seen all kinds of warez sites that force you to vote in order to get to parts of the site. Others could have frames that forge a vote each time a visitor comes to their site. While this is an intriguing idea, I don't see how it could work.

    The whole idea of Google's PageRank [google.com] was to count each link from another indexed site as a vote. What was wrong with that scheme? Doesn't everyone currently think Google is the best engine out there? If so, why "fix" it?

    I like the suggestion someone else made about showing the vote results but not having them actually affect the search results.

  • by eAndroid ( 71215 )
    Especially since for many unscrupulous businesses, ratings in search engines directly translate to dollars. Taco, you moron, have a little faith. This is Google we're talking about. Name one feature they've screwed up that badly. If it can't be done so that companies can't take advantage of it, realise that it won't be done at all. And Taco, you ignorant bastard, you'd think that after you created a user-ranked web site that companies can't take advantage of, you would realise that anyone can do it.
  • When "Timmy's 3r337 Perl Hax0ring Site!!!" gets ranked #1 for a search on "duck mating habits", we'll all get a good idea what would go wrong with a system like this.

    The existing Google ranking system is already exploited by users who set up hundreds of dummy sites that all link to a certain site using a variety of keywords, thus feeding the G! machine bogus "popularity" information.

    A ranking system will just make this easier to do. Your average skript kiddie could easily bombard Google with a heap of "Yeah this is great!" ratings for his site, thus bumping it up many notches.

    User-ranking systems work as long as there's no huge incentive to game them. Slashdot doesn't have *too* many problems because nobody really cares that much if they get rated "+6 - Rad!"... however, there's a much greater motivation to have one's website come up tops in one of the most popular search engines....

  • by FTL ( 112112 ) <slashdot@neil.frase[ ]ame ['r.n' in gap]> on Wednesday November 28, 2001 @04:39PM (#2626587) Homepage
    Google aren't idiots. Don't be so quick to post about how this will be a huge failure and how easy it will be to defeat the system.

    Think about it. According to the article, the system is currently just collecting information; it isn't affecting rankings -- yet. So in a couple of weeks Google will look at this new data, look at the corresponding pages, then figure out what should be done. Why are we assuming that they will just do a linear mapping between the number of happy faces and relevance?

    I wouldn't put it past them to dynamically map relevance with a far more complicated function. User rankings are another non-random data stream. All information (even negative information) is useful, as long as one strips it of its labels and looks at it blindly. Can you say neural networks?

  • The idea of rating based on user "votes" is one I see bound to failure. Google would need an enormous trusted user base, and logins would be required ('cause I could spoof any IP out there for votes). Talk about unnecessary complexity for a search engine.

    What is more interesting is what a few companies have been doing recently in the search engine world (there really still is business after the dot-com fallout, even if it isn't profitable). At my work, we recently looked into a product by a company called Recommind. Their search engine was able to find similar words in documents, and could give you related documents that didn't have key words. It could even distinguish between java (the coffee), Java (the language), and Java (the island where Jakarta is)! Pretty cool stuff. Combine that type of "concept matching" instead of "keyword matching" with Google's technology, and you've got the next generation search engine.

    All very cool stuff. I hope they don't kill it.
  • Provided that they can keep users from voting multiple times through IP tracking (can you imagine the size of the database for that), they will probably run into the same symptoms that mp3.com's top 40 boards had: the same group of artists or songs usually dominated that board because few people ever explored the rest of the mp3.com archives. But maybe since Google isn't the place housing the content, it will be different.
  • Especially since for many unscrupulous businesses, ratings in search engines directly translate to dollars.

    But we've all seen first hand how easy it is to stop unscrupulousness through meta-moderation!

  • by Tom7 ( 102298 ) on Wednesday November 28, 2001 @04:49PM (#2626665) Homepage Journal

    It's not that hard to make it really expensive to forge votes. For instance, check out the captcha project [captcha.net] at CMU. (Basically, it generates images that are difficult for a computer to recognize, but easy for a human, and challenges the user to respond to them in some way to prove that they are human.) If they could find the right balance of convenience for humans and difficulty for perl scripts, I think they'd have a great thing going. I have always wanted this feature in a search engine ... I'm glad to see it happen.
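
    In the same spirit, a bare-bones sketch of a visual challenge (Python with the Pillow imaging library; the sizes, font, and noise here are arbitrary illustrations, nowhere near as robust as the real CAPTCHA project's tests):

      import random, string
      from PIL import Image, ImageDraw, ImageFont

      def make_captcha(path="captcha.png", length=5):
          # Pick the answer, then render it with per-character jitter and
          # speckle noise so a naive OCR script has a harder time.
          text = "".join(random.choices(string.ascii_uppercase, k=length))
          img = Image.new("RGB", (160, 60), "white")
          draw = ImageDraw.Draw(img)
          font = ImageFont.load_default()
          for i, ch in enumerate(text):
              draw.text((10 + i * 28 + random.randint(-3, 3),
                         20 + random.randint(-8, 8)),
                        ch, fill="black", font=font)
          for _ in range(200):
              draw.point((random.randint(0, 159), random.randint(0, 59)),
                         fill="black")
          img.save(path)
          return text  # the server keeps this to check the human's answer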
    • For instance, check out the captcha project [captcha.net] at CMU.

      I looked at captcha and found that it may generate problems with disability legislation in some jurisdictions. For instance:

      • Blind people and people behind text terminals can't pass bongo because it requires GIF images.
      • Deaf people, people behind text terminals, people too poor to afford a sound card whose hardware interface is documented, and people not highly fluent in one of the six chosen languages can't pass byan.
      • The fbw test generates sentences that still make perfect sense. For instance, it often chooses a proper name as the word to substitute, and users who do not have knowledge of the geography ("Evansville, CA" instead of "Los Angeles, CA") or the personal names of a particular region will often fail. The long sentences common to pre-1923 English literature produce a "needle in the haystack" effect. (The developers acknowledge that the fbw test is still under development.)
      • Gimpy delivers broken images.

      The only accessible test (fbw) doesn't always work, and the other three are not accessible to those with disabilities. Watch somebody get sued under the ADA.

  • They aren't stupid (Score:5, Informative)

    by 90XDoubleSide ( 522791 ) <ninetyxdoublesid ... minus herbivore> on Wednesday November 28, 2001 @04:57PM (#2626712)
    If you read the article before you post, you will notice that Google doesn't plan to make user opinions a large factor in their relevance equation, if it applies to individual sites at all:

    Rather than using the votes to tinker with the specific rankings of particular pages or sites, he said, the feature would most likely be used to bolster the relevance of overall results.

    "It will most likely have more of an aggregate impact," Krane said. "We have indexed more than 1.6 billion Web pages, so it is extremely inefficient to go after individual pages."

    Also remember that this is only one of many of Google's tools to improve relevance. You can already do your part to stop spammers by reporting them to search-quality@google.com. [mailto]

  • by thraxil ( 54926 )
    is if i had a way of decreasing the ranking of my own site for particular search terms.

    eg, my site used to be called '/dev/random' but i changed the name when i realized that it was in the search engines for that term and that most people who were searching for '/dev/random' probably weren't looking for my weblog. i'd love to have some kind of 'anti-keyword' meta tag that i could use to tell the googlebots that i'd rather not be associated with that search term anymore.

    i know... somewhat off topic and boring... sue me.
  • Meta-Rating (Score:4, Insightful)

    by jeriqo ( 530691 ) <jeriqo&unisson,org> on Wednesday November 28, 2001 @05:05PM (#2626769)
    As Slashdot has Meta-Moderation, I think Google should use Meta-Rating, so users could help detect spammers.

    Oh, by the way, if you're already a Slashdot moderator and want to know if you can Meta-Moderate, just check /metamod.pl.

    -J
    • Actually, the system seems to be more like Meta-Moderation than Moderation. Google uses the number of links to a site as a qualifier of relevance, IOW as a positive moderation. The votes on the actual relevance by the users will then be used to identify sites whose links are not a good indicator of relevance, to then be (mostly) ignored in future searches, IOW similar to Meta-Moderation.
  • by rjamestaylor ( 117847 ) <rjamestaylor@gmail.com> on Wednesday November 28, 2001 @05:06PM (#2626770) Journal
    Is there a provision for meta-modding at Google?
  • by loosenut ( 116184 ) on Wednesday November 28, 2001 @05:12PM (#2626823) Homepage Journal
    There was an article [newscientist.com] in New Scientist about some technology similar to this. It would analyze what parts of a web page were hit the most and bring those to the foreground (think bigger, bolder links), and shrink or kill off the unused links.

    It's all part of the process of creating a more "intelligent" web.
  • OpenDirectory (Score:5, Insightful)

    by Ratbert42 ( 452340 ) on Wednesday November 28, 2001 @05:32PM (#2626980)
    Maybe OpenDirectory could add a rate-an-editor feature for their users. If you wanna talk about abuse, look there, not to Google.
  • This is the kind of idea (put two known good ideas together, mix in "internet", boom!) that seems like it would have a bogus patent on it.
  • by telebear ( 234209 ) on Wednesday November 28, 2001 @05:53PM (#2627170)
    As someone who used to use Epinions all the time (making over $1000 from them), I have to say that the epinions "Web of Trust" system seems to work rather well, at least on a small scale (100,000 users).

    Basically, you can see which users rated an article as useful. If you think that certain people have similar tastes to yours, you put them in your Web of Trust. You'll get articles presented in a different order depending on whom you trust.

    It is actually more complicated than that, as there are epinions "Experts" who are judged by epinions to have good ratings. I think Amazon has a similar system (and has way more users, but the system still seems to work ok).

    The big problem is that the internet at large has so many bloody users and so many bloody pages... I think introducing groups of users or groups of groups that you trust might be a better way for the Web of Trust idea to work with the internet at large.
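
    To make that ordering concrete, a sketch (Python; it assumes each review carries its author and simply floats trusted authors to the top):

      def order_reviews(reviews, my_trust):
          # reviews: list of (author, text) pairs.  Python's sort is stable,
          # so reviews by people in my Web of Trust move ahead of the rest
          # while ties keep their original order.
          return sorted(reviews, key=lambda r: r[0] not in my_trust)

    Two users with different trust lists thus see the same reviews in different orders, which is the behavior described above.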
  • The choices in advertising accepted by any service, such as Google, reflect that organization's political beliefs. I have noticed that whenever you search for drug info on Google, say LSD [google.com], cocaine [google.com], heroin [google.com], etc., you always get an ad from freevibe.com [freevibe.com], a government-sponsored anti-drug web site.

    However, recently the ad which appears for marijuana [google.com] changed to NewScientist.com [newscientist.com], a science journal which has been publishing much more balanced and thorough information on weed, some of which advocates that weed is less dangerous than alcohol. Also the top result is NORML [norml.org], a legalization-advocacy group. (This is probably not due to tampering w/ the search engine, but is interesting)

    I believe that The Powers That Be within Google have taken the more moderate, academic drug stance, as opposed to gov't-sponsored propaganda. Google's pretty influential, Internet-culture-wise. Food for thought.

    (Offtopic, sort of, i know, but I saw a Google story and had to run with it!)

  • 1) Get anyone who wants to rate sites to make an account. Yes, it's a pain, but that way you can track people's rating activities, like on /.

    2) Use the Yahoo! style system of having an image that you have to type the word in from to create an account. Keep changing the way the image is formed. This should *help* to prevent account creation spam.

    3) Give people a certain number of points per day / week / month (ala /.)

    4) Make it so that everyone has to balance out +ves and -ves - that is, somehow make sure that they can't just do one or the other.

    5) Make it so that each account can only rate a particular site once. Now this requires quite a bit of storage, because you've got to store every rating ever individually instead of just a counter, but that way you can prevent multiple rating on some corporate site.

    Note that this prevents the idea of rating a site based on how appropriate it is for a particular search, which is admittedly one of the really exciting parts of this (that is, if I search for Transistors and get www.electronics.com then I rate it 'Good'. If I search for Open Source and get www.electronics.com then I rate it as 'Bad'.)

    With this system, instead of that, I just rate www.electronics.com according to how good the site is, not how relevant it is. Maybe that's what they're aiming for, maybe it's not.

    I think that would help stop it but it all depends on the security of the account creation process - if it's easy to spam then the whole system becomes a waste of time.

    It also doesn't prevent the problem of people being paid for ratings, which is possible, or for a company getting every single one of its employees to vote for the company. Thinking about that, one solution could be to just say that a company's rating can't go above a certain level and can only increase at a certain speed.

    Or you could have metamoderation. This sounds more and more like Slash based code all the time!
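
    A sketch of points 3 and 5 together (Python; the daily budget and the storage scheme are invented for illustration):

      import datetime

      DAILY_POINTS = 5
      spent = {}    # (account, date) -> points used that day
      ratings = {}  # (account, site) -> rating; re-rating overwrites

      def rate(account, site, value):
          today = datetime.date.today()
          if spent.get((account, today), 0) >= DAILY_POINTS:
              return False                    # out of points until tomorrow
          if (account, site) not in ratings:  # only a new rating costs a point
              spent[(account, today)] = spent.get((account, today), 0) + 1
          ratings[(account, site)] = value    # one rating per account per site
          return True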
  • It won't be too hard (Score:2, Interesting)

    by MagPulse ( 316 )
    No Slashdot poster has been able to reliably get their posts modded to +5 yet.
  • by dasunt ( 249686 ) on Wednesday November 28, 2001 @09:53PM (#2628285)


    If Google (or another search engine) set up all links to visit an internal Google page that quickly redirected the user to the target site, it could rank sites by how many people actually visited them, instead of relying on a potentially biased rating by users.


    Of course, shady websites could still influence it, either by hitting the pages themselves, or by crafting their page so that the google-selected text is tempting to search engine users, but the system still has the advantage of not requiring active participation of users.


    Just my $.02
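
    A minimal sketch of such a redirect counter (Python standard library; the /click path and port are invented for the example):

      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs
      from collections import Counter

      clicks = Counter()  # url -> number of searchers who chose it

      class RedirectHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              # e.g. GET /click?url=http://example.org/ -- count the hit,
              # then bounce the surfer on to the real page.
              target = parse_qs(urlparse(self.path).query).get("url", [None])[0]
              if target:
                  clicks[target] += 1
                  self.send_response(302)
                  self.send_header("Location", target)
                  self.end_headers()
              else:
                  self.send_response(400)
                  self.end_headers()

      HTTPServer(("", 8000), RedirectHandler).serve_forever()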
