
On Counting Website Traffic 145

Logic Bomb writes: "The San Francisco Chronicle has an interesting article about measuring website traffic. This is kind of an obnoxious issue, but it means everything to commercial websites seeking investors. Apparently the figures reported by the sites themselves through analysis of server logs are often much higher than the ones given by firms like Media Metrix (whose numbers I see all the time in articles from CNET and the like). The basic dispute is over whether sampling, a la Nielsen, is appropriate for the web. It seems counterproductive to purposely use an inaccurate statistical measure when exact counts are readily available, but I can't imagine many things easier to fake than a server log. Anyone have a good idea about how to approach this?"
This discussion has been archived. No new comments can be posted.

  • On a serious note, I work at a fairly busy web site in the data warehousing/business reporting section. We simply don't have _time_ to fake enough server log entries to make it worth our while, we're too busy processing the _legit_ stuff and filtering out such non-reportables as crawler hits. As to the discrepancies between inside and outside numbers, well, I can _guess_ about what Mars looks like from some telescopic photos, but nothing beats going and looking. Understand I'm not shooting at those outside companies for the differences in said numbers, but one must be aware that different methods can produce different numbers, and that using statistical methods to arrive at metrics.... well, that's how the US Census works.... and if you live in the right neighborhood you're going to find an awful lot of _dark_-skinned 'Caucasians' ;>
  • Revenue is more important than "hits". I just wish that some of these content sites could pay their employees and back up all the hype.
  • Short of having the measuring company install a sealed black box that counts each request and bounces it right back to the website, I don't see many other ways to do it if the server operator is malicious.

    Of course some sanity could be put in the data if the usage bandwidth from the site's upstream provider is taken into account (5 million hits and you transferred only 2 megs? hmmmm). A pity that this data is usually not publicized or available to third parties, for obvious reasons. (So you pay $5/gig upstream and you charge me $10/gig if I go over my quota, eh?)

    I am sure, though, that through the wonders of reverse engineering, IP spoofing, etc., it would be possible to foil even black boxes. I mean, how much would it take to just take a machine in the office, connect it to the black box, and send HTTP requests crafted so that they appear to be from other IPs? If the site owner has physical access to the site's hardware, this would be really easy to pull off.

    It would be more complicated if one had just a colocated box or a virtual host, but even then with some 3l33t h4x0r skills one could make the black box's head spin in whatever direction one wants...

    The web, though, is very interesting in this respect, because unlike in the TV case (where you have to count the viewers at the viewers' location) it is theoretically feasible to do a precise, repeatable and cost-effective analysis by monitoring only the point of origin. The money that would go into finding an acceptable demographic group, providing them with set-top boxes that analyze their habits, etc. could all be invested in one single monitoring device installed at the site's location.

    On a related note, I have a digital cable set-top box, and I am sure that behind my back the cable company is collecting my viewing habits (I mean, how hard could that be? The digital set-top box already connects to their network to download programming information, and it has a unique ID). Does anybody know more about this? My cable company is pushing the digital set-tops heavily, even towards people interested only in basic cable, and I don't think they are doing it for charity...
  • Strange that you categorize it like that; don't you ever buy anything? Actually, 2 of the 3 things I've bought off the net during the previous 2 months have been from shops linked from Slashdot. Most of the people I know who are interested in Slashdot have more money than those who are not.
  • Radar detectors aren't reliable, because many troopers use only stopwatches and a known distance. The radar is off.

    The police radios, however, are always on.

  • You could probably set up a Perl script in a few minutes to make up the numbers for you.
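Probably true. A minimal sketch of what such a forger might look like (in Python rather than Perl; the paths and the Common Log Format details are illustrative, not any particular site's):

```python
import random
import time

# Hypothetical page paths to scatter through the forged log
PATHS = ["/index.html", "/products.html", "/about.html"]

def fake_log_line(rng):
    """Produce one forged Apache Common Log Format entry from a random IP."""
    ip = ".".join(str(rng.randint(1, 254)) for _ in range(4))
    stamp = time.strftime("%d/%b/%Y:%H:%M:%S +0000", time.gmtime(0))
    path = rng.choice(PATHS)
    size = rng.randint(500, 50000)
    return '%s - - [%s] "GET %s HTTP/1.0" 200 %d' % (ip, stamp, path, size)

rng = random.Random(42)  # seeded, so the fake "traffic" is reproducible
forged = [fake_log_line(rng) for _ in range(1000)]
print(forged[0])
```

A thousand plausible-looking "visitors" in a loop, which is exactly why a raw log on its own proves nothing.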
  • In this country (I needn't specify which), we elect representatives. We don't directly vote on most issues, even though it would be technologically feasible for us to do so (especially with the advent of the internet). Why? Because we're not just looking for an accurate measure of what people want. We want what they ought to want, and we hope that representatives will better reflect that than their actual choices.

    Advertisers don't just want to know what the most visible piece of real estate is in the world so they can erect a billboard on it. They want to know what the next upcoming innovation is so they can be the first to ride the upsurging wave of popularity. It doesn't help that altavista [] is the most popular search engine in the world today if placing a big banner ad on google [] tomorrow will catch the as-yet unseen mobs.

    Take Netcraft [] and server operating systems. You don't just want to know what people are actually running. You want to know what they dare to tell you they're running. This is why it's ok for Netcraft to base its statistics on what servers tell each other they're running, rather than on some complicated fingerprint of their tcp/ip stacks.

    It comes down to this: Adam Smith had it wrong with his theory of the invisible hand of market forces. It's not just what the markets do that's interesting, for that tells you nothing more than what, empirically, they do. If you pretend otherwise, then you're behaving no differently from all the Linux bandwagoners or Microsoft bandwagoners who base their decisions only on the herd. Herd mentalities are antithetical to proper advertising, and advertisers are finally waking up to this fact.

  • Now, correct me if I'm wrong, but isn't "Fromage" cheese in some language?
    Coupled with the common short version of "Richard" that is pretty funny.

    My personal favorite fake name which is on my Fake ID (I'm over 21, I have it just in case) is Justin Case ;-)
  • There are other companies doing exactly this.
    One example is Raydium []
    I'm sure there are others as well.
  • If anyone is interested, we've found a unique way of measuring the web by tapping into non-confidential data at the backbone (ISP) level. You can see the results at (7 day trial available) or (currently free). ISPs earn revenue from the sale of this info to marketers, advertisers, etc. If there are any ISPs out there in Europe, Asia or the US who are interested in partnerships, please contact me.
  • by Kaa ( 21510 )
    The punishment in this country for fraud is not public hanging.

    No? Really? [shakes his head in wonder] Those are strange times we live in...

    The punishment is that you have to give all your money to a lawyer.

    I was under the impression that having to give all your money to a lawyer was the punishment for needing (or thinking you need) a lawyer.

  • Silly po boy! Of course I didn't let them have my dl number. I put down some random number which matched mine for about four digits. Just in case they asked.

  • WEB BANNERS DONT WORK! Sorry to shout, but I really believe that. I've run banner ad programs for several companies, and at the same time ran website analysis software that I wrote, so I know the results are solid. I'd buy a block of 10K clickthrus, and at the same time I'd watch and count the clickthrus on my side. In *all* cases, I'd get to just over 50% of my purchased clickthrus when I'd be notified by the banner provider that I'd reached my limit... and did I want to buy more? Bullsh*t! They overcharged me by double, and I dealt with many of the same ad companies that you've listed. Run like hell while you can...
  • Seems to me the most accurate count could be had from the advertising services a major site uses. DoubleClick could get an actual count, without requiring sampling, by counting referrals to their ads. Their obnoxious cookies would make an estimate of unique visitors quite good, too. So they could give the same statistics as the audited site, with some measure of third-party independence.

    'Course, DoubleClick can be fooled by having cookies disabled, a JunkBuster proxy [], or whatever, but I'd imagine at this point only a tiny percentage of users are sufficiently clued to use JunkBuster or Cookie Pal []. Certainly too few to make the count less accurate than sampling.

  • An ad that's being viewed by 4 million people has significantly more value, and thus has a higher cost, than one that's only being viewed by 2 million people

    Unlike a television ad, in which an advertiser pays a large amount for one ad, banner ads are charged per impression. So whether a banner is served up 4 million times or 2 million times, the advertiser is charged the same per impression (well, assuming it's on the same site, and assuming no volume discount for the additional 2 million impressions, but you get my point).
  • by Kaa ( 21510 ) on Monday September 25, 2000 @09:45AM (#755647) Homepage
    Well, falsifying server logs in order to get better rates for banner ads would probably count as fraud, which happens to be a criminal offense in the US. A couple of show trials followed by public hangings should solve this little problem.

    Besides, banner ads are typically served from a server NOT controlled by the company which owns the page. So people like DoubleClick know for sure how many times their ad was ignor^H^H^H^H^Hseen.

  • The idea of hiring a company to generate web statistics to test for commercial viability seems impractical.

    If a company truly wanted to, they could easily obtain numerous IPs to forge the logs ahead of time. And think about a script kiddie exploiting Java, Perl, or whatever; that would certainly make a website's statistics look better. The list of ways to inflate a website's usage goes on.

    I think the only way to get this done fairly is to post a raw log, and let the investors (or whoever the target is) decide for themselves. Apache logfiles are fairly straightforward, and require little to no effort in deciding what is an actual hit and what is not. Of course, this would require honesty on the part of the company, which seems to be the real issue.
  • It's really too bad that the Alexa software is so annoying and invasive, although it would be more akin to the Nielsen type deal. Maybe if someone made a less visible type of "Alexa", they could give voluntary users "coupons" or discounts for online businesses in exchange for running their software, then sell their stats. Or it could be all open source style, with stats posted freely and no compensation for using the software... blah
  • If you are out trying to get advertisers to buy space on your site, or looking for someone to invest in your site for expansion, they may very well care how many hits you have and who counted them.
  • by Fervent ( 178271 ) on Monday September 25, 2000 @09:49AM (#755651)
    I can't imagine many things easier to fake than a server log.

    Are you kidding? When I worked at my last internship the boss would take the server stats from WebTrends, plop it in a Word file (to look good for investors) and then sometimes "moderately improve" some of the stats before printing the document.

    Fact is, most investors don't get a verbatim server log with all the technical "mumbo-jumbo". They get a simplified version with only the information the CEO wants them to hear.

  • In fact, by looking at your own logs, you can say, "Well, Yahoo sends 10,000 people a day to my site

    Yes, but you can only look at your own logs to see that Yahoo sends you 10,000 people a day after paying them substantial money for what they said should average 20,000 people a day, based on their (or Media Metrix's) logs.
  • What we do is have each server here (here being where I work, not nerdfarm, which I lose money to operate *g*) completely ignore, for the most part, what's going on.
    We've hacked out a packet sniffer that runs on the network picking up the good data (i.e., where the hit is coming from), and then between that and our DNS load balancing software we get a really good monitoring package to find out exactly how busy our servers are. And BTW, our servers serve approximately 2000-6000 hits a second. Yes, a second. Yeah, it is a lot. And it all runs Apache.
  • The matter at hand here is not the mathematical accuracy of web logs vs. the statistical consistency and ability to predict behavior, such as those produced by Nielsen, Metrix, and the rest.

    It's a matter of trust

    Sponsors and advertisers simply need to rely on a consistent, reliable, and _reputable_ company to provide the numbers by which they purchase advertising. It is the same reason that makes large corporations with huge accounting departments hire Ernst & Young to run their 10-Ks before posting them to the SEC: reputation and consistency.

    Everyone, including the site managers, advertisers, and Metrix, know that the numbers can be faked anywhere along the chain. The most honest option, and least culpable in terms of liability, is to have a measuring company run its analysis for all, even if that analysis is statistical predictions. If a site manager fakes numbers, he's a little untrustworthy; If Metrix screws up, they're outta business.

  • It was just an example of the size of 800 million downloads. Still, I think it was quite inaccurate. You must also remember that most people on earth don't have, and don't care to have, internet access; and of those who do, who would have heard of this woman's site? People in North America.

    -- iCEBaLM
  • The problem is that when people invest $10+ million in a web company, not only do they want numbers, but they want EVERYONE to know those numbers. I work for a website that gets ~8 million hits/day and has many regulations to conform to. The accuracy of our logs is what keeps our company alive. I've seen Nielsen and Media Metrix report numbers for us, and they're all off from what we get. That's to be expected from sampling. What *really* matters from a marketing perspective is how much granularity you can get from these numbers. If your logs show 20% less than what Nielsen shows, but you can drill down and get demographic/session/referrer/etc. data, then you're in a much better position. Hit counts are useless nowadays, but being able to break that number up into geographic location, time of day, site path, avg. session length, etc. is what makes logs useful.
  • One reason we are not exchanging anything REAL, nothing that is tangible, is because we live in the INFORMATION age. If you want something tangible, go back to the bronze age or something. People don't sell things; we sell mindshare. This is what web ratings (and TV) are all about: selling a piece of people's minds. Why do you think AllAdvantage even had an inkling of a chance? Because they thought that they could buy people's minds at a cheaper rate than they could sell them...
    But then a lot of people just got around that, and just got to freeload.
  • A thought that I have had (though since I am busy working on another business opportunity at the moment, I am not pursuing it) is that there are some obvious places to gather very accurate data about website interest.

    These are the main routers for the hosting providers hosting the website in question. At some level these machines "know" not just how much traffic is requesting the given website, but how many different IP addresses are requesting, how many of the requests are short/quick Cache refreshes vs. long "real" sessions etc.

    Capturing this data would require some sort of sniffing, which could have a performance hit and does raise security implications, but these could be overcome. Additionally, the main routers for most hosting companies are outside the control of the client companies, so this data could be seen as more trustworthy than the server logs from a machine to which the client company has root.

    Anyway, just my thoughts. A fair amount of work would need to take place to turn this into a new profitable line of business for the hosting companies, but given the market need for accurate data I think it would be worth pursuing.

  • It is obvious that the basic methodology behind the collecting of web stats is flawed. For one thing, most web sites count requests, when in fact it is individual sessions that should be counted.

    Now if a company is interested in gathering web statistics in order to steer corporate decision making, then they should really look at collaborative filtering [http] as a means to do this. No matter what else you have to say about, their implementation of the Net Perceptions [http] collaborative filtering engine is incredibly accurate at analyzing and predicting their customers' needs/desires.
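Counting sessions instead of requests is mostly a matter of picking an idle timeout; a minimal sketch (the 30-minute window is a common convention, and the IP/timestamp pairs are assumed already extracted from the log):

```python
def count_sessions(requests, timeout=1800):
    """Count sessions: a request starts a new session when its IP has been
    idle for more than `timeout` seconds (or has never been seen)."""
    last_seen = {}
    sessions = 0
    for ip, t in sorted(requests, key=lambda r: r[1]):
        if ip not in last_seen or t - last_seen[ip] > timeout:
            sessions += 1
        last_seen[ip] = t
    return sessions

# Four requests, but only three sessions: the second visit from "a"
# comes more than 30 minutes after its previous request.
reqs = [("a", 0), ("a", 60), ("b", 100), ("a", 4000)]
print(count_sessions(reqs))  # 3
```

Keying on IP alone undercounts users behind proxies and overcounts dial-up users with rotating addresses, which is part of why no two measurement outfits agree.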

  • This reminds me of the fun I have with Radio Shack asking for my name/address/etc. When I lived in Indiana I would regularly tell them I'm "Richard Fromage, 1060 W. Addison, Chicago IL, 60613"-- if that address doesn't ring a bell, go rent the (original) Blues Brothers movie.

    Makes me wonder how much junk mail the Chicago Cubs get and routinely dispose of...

  • by Mtgman ( 195502 ) on Monday September 25, 2000 @09:50AM (#755661)
    If someone is willing to take the hosting site's word at face value with regard to eyeball real-estate, then I've got some banner ads (and a bridge) to sell them.

    And this is the really sad part. The information age has created a new type of cyber-criminal: the false information broker. Society is moving away from products and building multi-purpose machines. As a whole we're more service oriented than we used to be. This means all our assets and business transactions are on paper. Nothing tangible is being exchanged. And typically we have such a high volume of data being transferred that it can't be checked for 100% accuracy. I signed up for one of those "saver" cards at a local grocery store (part of a national chain) and totally faked the information on the signup sheet (I get enough spam as it is, thank you very much). No one caught it, even though an application with an address of 1600 Penn Ave in Ft. Worth, Utah with a completely made up ZIP code and a Texas DL number showing up at a store in Tennessee _should_ have raised an eyebrow or two.

    So now we have the buyers and the sellers. A buyer can't always trust a seller and a seller can't always trust a buyer. Enter the middleman who keeps both parties honest. Am I the only one saddened by the necessity of a service like this?

  • The only useful info that you may get from a server log is the number of UNIQUE visitors to your particular page, not page views or hits. This is why sampling may be useful! Sampling lets us see in general what people are going to see and allows a more accurate count of unique users.
  • Long term, how much traffic a traditional web page gets, even if it is a powerhouse, will not matter that much. Traditional web pages are somewhat dull and are not a very attractive target for advertising dollars.

    Where it starts to matter is in measuring the audience for streaming media. And neither server logs nor sampling will give an accurate vision of what is actually happening. This is where real audience management comes in, from the likes of companies such as Reliacast []: the ability to get exact counts of the number of participants in a streaming event, regardless of whether it is a unicast or multicast event.

    all persons, living and dead, are purely coincidental. - Kurt Vonnegut

  • I wonder about this every time I hear some no name website say that they get X,000,000 hits per month.

    I've seen some stats pages, and there are usually over ten times as many hits as there are page views.

    I'm assuming that the average visitor views more than one page. So when they say "We get X million hits a month", are they only getting roughly X hundred thousand visitors or so? Or are they using "hits" as a term for visitors?

    Do these people even know their own real stats?
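The parent's arithmetic can be made explicit; a sketch where the 10:1 hits-to-page-views ratio comes from the stats pages mentioned above and the pages-per-visit figure is a pure guess:

```python
def visitors_from_hits(hits, assets_per_page=9, pages_per_visit=3):
    """Deflate a raw 'hits' figure: each page view drags in ~9 asset hits
    (images, etc.), and each visitor views a few pages per visit.
    Both ratios are guesses, which is exactly the point."""
    page_views = hits / (1 + assets_per_page)
    return page_views / pages_per_visit

# A claimed 3,000,000 "hits" a month shrinks to about 100,000 visitors.
print(int(visitors_from_hits(3_000_000)))
```

Change either guessed ratio and the "visitor" figure moves by an order of magnitude, which is why unqualified hit counts are nearly meaningless.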

  • I would like to get one of those detectors. Police frequencies aren't hard to determine.

    Wonder what kind of range it has?

  • If you're turning your logs over to someone else for analysis, you might as well post your savings account number, PIN, and SSN to the Internet. The information contained in your logs is, IMHO, some of the most proprietary data an Internet company owns. DarkSparks
  • ...and a Texas DL number...

    You let them have your DL number? Seems kind of pointless to lie about the rest of the stuff when your DL number is on there.

    I guess I am assuming that you didn't lie on your drivers license (about more than your height and weight)

  • but I'm really stoned right now, could you repeat that?

    peas, -Kabloona
  • by crisco ( 4669 ) on Monday September 25, 2000 @11:51AM (#755669) Homepage
    The server logs don't tell you who is coming to the site. Sure, you know that (completely made up) stopped here and you can even do a reverse DNS on it, but the advertisers that pay for banner ads and the corporate marketing types want to know how much disposable income is behind that IP and what they might spend it on. That is why DoubleClick and all want to track you and even correlate you with a name and address, that info lets them classify you and sell your eyeballs to the advertisers. Have you seen the higher prices that they get for targetted ads? Nearly double their normal rate last time I looked [].
  • You could configure George Schlossnagle's mod_log_spread to multicast apache log entries to a third party audit host. That would be realtime, very hard to fake, and transparent to your config.
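The general shape of that scheme (mirroring each log entry in real time to an audit host the site operator does not control) can be sketched without mod_log_spread itself; this toy uses plain UDP on localhost and is not the module's actual protocol:

```python
import socket

AUDIT_ADDR = ("", 9514)  # hypothetical audit collector address

def open_collector(addr):
    """The auditor's side: bind a UDP socket that collects raw log lines."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(addr)
    sock.settimeout(5.0)  # don't hang forever if a datagram is lost
    return sock

def mirror_line(line, addr):
    """The web server's side: fire each log line at the auditor as written."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(line.encode(), addr)
    sock.close()

collector = open_collector(AUDIT_ADDR)
mirror_line(' - - [25/Sep/2000:09:45:00] "GET / HTTP/1.0" 200 1024',
            AUDIT_ADDR)
received, _ = collector.recvfrom(4096)
collector.close()
print(received.decode())
```

Because the entries leave the box the moment they are logged, retroactively doctoring the local file no longer changes what the auditor saw.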
  • I'm using Wusage...

    Wussgage? Is that some kind of measurement of the tendency to give up in a fight, or complain, or something? Exactly what kind of scale does one use to gauge the amount of wuss in a person? Is this different than measuring the amount of wussy in a person?

    Or is this the thing I keep hearing in that Budweiser commercial: "Wusaaaaage?", "yeah, wusage.", "Wusssaaaaaggeee!".


  • Yes, you're right, but it seems nobody has mentioned what is painfully obvious to me from dealing with my own clients. "What's a server log? Do we have one of those? How much is that?" Most businesspeople (suits) need somebody to translate the tech for them, and whenever there's a translation, there's the opportunity for deceit.

    The Divine Creatrix in a Mortal Shell that stays Crunchy in Milk
  • Whoever ranked that funny is on the ball
    It is virtually impossible for a device to detect what radio station you are tuned to. There's no way for anyone to tell what station you're listening to, short of getting into your car and looking at your radio. If you still have an all-analog radio, then maybe you could detect harmonics caused by the filters at the local oscillator, but I don't see that as being reliable from any distance. If you've got a recent stereo, then it's probably DSP-driven anyway. So how could you possibly tell what station the person is tuned to? Telnet to the proc and do a ps aux | grep LO-RF?

    I'm sorry... I just don't buy it... a device that could detect what radio station you are listening to? Nope. Don't buy it.


  • Get bandwidth statistics from their ISPs if you can.

    The ISP that I work for generates stats on almost every interface on our network, save a few odd pieces of hardware that do not support it or are not worth supporting it. You cannot count the hits, but you can count the proverbial p0rn that they are pushing... or pulling.
  • by iCEBaLM ( 34905 ) <icebalm@[ ] ['ice' in gap]> on Monday September 25, 2000 @09:58AM (#755677)
    I was thinking about this when I heard on Entertainment Tonight about Guinness crowning the "Most downloaded woman on the internet". And when I heard her astronomical number of 800 million downloads I thought it was incredibly inaccurate. Every man, woman and child in the US would have to download 4 of her pictures. How does Guinness come up with the final numbers? Do they even check the logs themselves? Are thumbnails viewed on a page included in the final numbers?

    When I eventually went to her site (I can't even remember her name, for god's sake) she had almost no pictures of herself on it, though lots of other girls. I tried in vain to find some of her, and I was thinking to myself that the numbers were severely inflated.

    While this might be an "obnoxious" question, I think a standard way of evaluating just how many hits and downloads a site gets needs to be determined, especially for awards like the Guinness Book.

    -- iCEBaLM
  • This is entirely different. Police re-broadcast... your car's radio doesn't.

  • For example, on one of my web sites I actually keep track of many different statistics on a daily basis. I count each individual page on my site by itself and also all pages as a whole. For all of the above I keep two counts: one is the # of times each individual page is requested, and the other is the # of times each individual page is requested by a different IP. This allows me to do two things: count the # of raw page servings, and count the # of unique visitors on a daily basis. Of course, if I really wanted to go crazy I could decide that a unique visitor is each unique IP on a per-month basis, or perhaps count a unique visitor as each hit on a per-IP basis as long as no visits have been made by that IP address in the last hour... personally I just keep a count of everything and ask sponsors which count they prefer :)
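That bookkeeping amounts to a couple of dictionaries; a sketch of the two counts the parent describes (daily granularity assumed, page paths and IPs made up):

```python
from collections import defaultdict

class PageStats:
    """Per page: raw request count plus the set of distinct requesting IPs."""
    def __init__(self):
        self.raw = defaultdict(int)      # page -> total servings
        self.uniques = defaultdict(set)  # page -> distinct IPs seen today

    def record(self, page, ip):
        self.raw[page] += 1
        self.uniques[page].add(ip)

    def report(self, page):
        """Return (raw servings, unique visitors) for one page."""
        return self.raw[page], len(self.uniques[page])

stats = PageStats()
for page, ip in [("/", "a"), ("/", "a"), ("/", "b"), ("/faq", "a")]:
    stats.record(page, ip)
print(stats.report("/"))  # 3 raw servings from 2 unique visitors
```

Keeping both numbers and letting the sponsor pick, as the parent does, at least makes the ambiguity explicit instead of hiding it in a single "hits" figure.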
  • When the FBI has carnivore attached to every ISP and RIAA / MPAA / DC / AnyOtherNameForADMCALovin'Company / UCTIA backed spyware company I'm sure they'll have LOTS of reliable statistics about your website (and shopping patterns, and bank accounts, and how much porn you download off of newsgroups)... and you'll be able to get it... for a price of course (unless you do something the Gvm't doesn't like, then you'll get it free under the rules of discovery. Too bad you'll be in jail soon after) Nathan "They're watching me, I swear" Cento....
  • I know someone (via the newsgroups) who had one of these "pay for clickthrough" deals at his site. So like any good script kiddie, he wrote a VB app to do lots of clickthroughs to get paid.

    He got caught. He deserved to be caught. I thought it was damned hilarious!

    Basically, he had just enough knowledge to be dangerous, but not enough to make his hits look like real, unique hits. The best thing about it is the guy has a massive ego and it took a huge deflationary hit ;-)

  • A television ad, like a magazine ad, is guaranteed a certain viewership. The Super Bowl has such expensive ads because it has a huge rating, and you get lots of unique impressions that are impossible to get otherwise. Four ads on a 15 share program like the Olympics aren't worth anywhere close to one ad on a 60 share program. That's because TV ad buyers would rather have 60 million people see their ad than 15 million people see it four times. So a banner ad served to 4 million unique people is worth more than a banner ad shown twice to 2 million people.

  • I agree that the 800 million is probably calculated in whatever way is most favourable to her goal (which is to say, horribly inflated).

    Of course, that's for the total amount of time she's had the site. I have no idea how long that is, but I seriously doubt it existed before '96.

    Most people on earth don't have internet access or don't care about the internet, but that still leaves a lot of people who do. On a quick look, I couldn't find any numbers to supply. Considering she's backed by Playboy, I would expect her audience to be worldwide.

    (Not to mention it probably comes up in every search done on the net.)

    Darth -- Nil Mortifi, Sine Lucre

  • Please, send her over! I'll gladly give her triple what she received for her last album, gratis, in the name of continuing art.

    [technos begins scrawling in the checkbook.. Pay to the order of: Courtney Love, Date: September 25, 2000, Amount: $3,000 and no cents]
  • I strongly disagree with this.

    I work for a company that recently made its way into the Media Metrix top 20, and I know that we built our name by focusing on popular, yet niche, content. Some of it didn't "rock," but that's all subjective, and we invested in the numbers.

    Once we have the numbers, we can develop the investment and provide quality assurance in creative and informational aspects of the network. In fact, that happens to be what more than a few people do around here.
  • Those that are pointing out that "sessions" are a more accurate measure overlook the fact that banner ads are typically delivered per page pull - and users who view 10 pages may see 10 completely different banner ads in that time.

    These numbers are important for both advertisers and web operators alike, because many banner ads pay on a cost per thousand (CPM) basis. Advertisers need to see how many people are actually seeing their ads, and operators need accurate numbers to be sure they are paid fairly.

    Any web operator you meet can tell you that the page impressions counted by the banner ad company *never* match the page impressions reported by the logs, and are often on the order of 1/3 the number WebTrends or Webalizer reports. Add that to the idiotic way companies like Media Metrix are "projecting" traffic based on relatively small sample sets, and the result is that website operators always manage to get screwed in the deal.
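Since banner ads are billed per thousand impressions, that 1/3 discrepancy translates directly into money; a sketch with made-up numbers (the $5 CPM and both counts are purely illustrative):

```python
def cpm_revenue(impressions, cpm_dollars):
    """Revenue for impressions billed on a cost-per-thousand (CPM) basis."""
    return impressions / 1000 * cpm_dollars

site_count = 3_000_000     # what the operator's own log analysis reports
network_count = 1_000_000  # what the ad network agrees to pay on
shortfall = cpm_revenue(site_count, 5.0) - cpm_revenue(network_count, 5.0)
print(shortfall)  # 10000.0 dollars a month the operator never sees
```

Whichever count is actually right, the party that controls the billing count controls the money, which is the whole argument for a neutral third-party auditor.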

  • The information age has created a new type of cyber-criminal. The false information broker.

    Nope, that sort of criminal has been around for quite awhile. The classic example is the "blue book" purporting to give average market values of used cars. In fact, the blue book is put out by the used car industry with higher-than-market prices, solely for the purpose of allowing used car dealers to advertise that their prices are "below blue book", and/or convincing consumers to agree to artificially high prices for used vehicles. (There are other used car price guides which are more accurate in their values.)

  • Fact is, most investors don't get a verbatim server log with all the technical "mumbo-jumbo". They get a simplified version with only the information the CEO wants them to hear.

    Good point... and it's not like web logs are the only thing that gets treated like this.

  • I would guess the device works by picking up the audio coming from your car, then comparing it to the output of known radio stations in the area.

    Just gotta mic each parking space.

  • When the FBI has carnivore attached to every ISP and the RIAA / MPAA / DC / AnyOtherNameForADMCALovin'Company / UCTIA backed spyware companies are collecting all of that data on you I'm sure they'll have LOTS of reliable statistics about your website (and shopping patterns, and bank accounts, and how much porn you download off of newsgroups)... and you'll be able to get it... for a price of course (unless you do something the Gvm't doesn't like, then you'll get it free under the rules of discovery. Too bad you'll be in jail soon after)

    Nathan "They're watching me, I swear" Cento....

    (Yeah yeah yeah... I should learn to use PREVIEW)

  • by technos ( 73414 ) on Monday September 25, 2000 @10:04AM (#755691) Homepage Journal
    NOTE: By reading this post, you have agreed to run around the room which you are currently in, flapping your arms, and squawking like a chicken.

    Okay, I did it. Unfortunately, I was reading your post at the same moment my boss was entering the cube, and I've been fired. Under the terms of the 'technos' AUP (As amended September 12, 2000), and UCITA, you are hereby notified that you owe me $28,941,285.42.

    Referencing clause two of the AUP, this number reflects the sum of my maximum earnings potential until retirement age, as well as the cost of obtaining said employment (six years of college at a major University), as well as an additional 34% transgressive penalty and a 9% compounded cost-of-living increase.

    You have ten business days to remit the sum, in whole, or I will be forced to submit a class B lien request against both your holdings and those of your employer in the State of Maryland.

    Clause six clearly states you indemnify me against any legal malfeasance or action, so don't even try to get cutesy with a countersuit. It has a binding compensation clause of $2,000,000.
  • by komet ( 36303 ) on Monday September 25, 2000 @10:26AM (#755692) Homepage
    Where the fuck does the idea come from that you should show your web server stats to marketing/sales people? Current stats programs are really just measurements of technical data, useful for planning server loads and Internet uplinks, but not for demographics. PHBs want something like this:
    • Yesterday, 1308 people visited your site. Of those:
    • 183 weren't paying attention at all anyway.
    • 22 were your competitors.
    • 318 were poor college students drooling over, rather than contemplating buying, your products.
    • 139 were actually looking for pornography and left your site immediately.
    • 38 were webdesigners stealing your HTML code.
    • 133 were here to compare your prices with the competitors. Of those, 29 decided to buy your product.
    • 84 were in your target demographic, but were so stoned at the time that they didn't read your sales pitch.
    • 12 people actually bought something online.
    • 18 people liked your product and went out and bought some offline.
    • Of those 30 people who bought something, 28 sent the URL to a total of 56 friends to show off what they had just bought. Of those friends, 3 subsequently bought something.
    Ok, so where's the software which can get that data out of your server logs?
  • Actually, every superheterodyne radio receiver (which almost all are these days) generates a radio signal of its own. This is mixed with the incoming signal to produce a signal at 455 kHz, IIRC. The generated signal can be detected and used to determine which frequency your radio is tuned to.....

  • Well, you could still fake the data for something like DoubleClick too. You'd just have to write a script that simulates clicks on the link, and you could "stuff" your total clickthroughs.

    The trick would be writing a script subtle enough not to be caught by someone analyzing the logs, yet generating enough clickthroughs to make it worth the profit.

    Darth -- Nil Mortifi, Sine Lucre

  • I have a cron job set up to run Webalizer on my access_log every day. It gives me referrers.
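For anyone wanting the same setup, a nightly Webalizer run is one crontab line; every path here is an assumption, not the poster's actual configuration:

```shell
# Hypothetical crontab entry: run Webalizer against the Apache access log
# every night at 04:30. Adjust paths to your install.
30 4 * * * /usr/bin/webalizer -c /etc/webalizer.conf /var/log/apache/access_log
```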
  • This is one reason why a genuine 'audience' is going to be lower than the raw logs. Local traffic and robots aren't real traffic. I could increase the raw hits on a site to almost any level, simply by throwing a few htdig processes at it. Wouldn't mean anything though.
  • by Anonymous Coward
    Our investors, partners, advertisers, etc. aren't interested in hits -- at least not since maybe 1995 or 1996.

    We get asked for detailed reports on "impressions" (an old print advertising concept) and "page views" and "unique visits" and "return visits", and "length of visit", and "pages per visit" ... and all kinds of other goodies.

    Who the hell gets paid for hits?
  • by Anonymous Coward
    I just urinate on the corner of the website and 'mark' my territory in a way familiar to all canines and trolls.

    You mean they prefer post-its? I was wondering why they always had the shotgun ready when I returned....
  • The point isn't to prove it to yourself, it's to prove it to the advertisers who might want to put an ad on your site. You dolt.
  • by oliverk ( 82803 ) on Monday September 25, 2000 @10:29AM (#755700)
    I work for a major ad agency that produces the full spectrum of work (online banners and applications, broadcast and print spots, etc.), so from our perspective it's really about comparable measurability. We deal in a world where the media mix can contain any number of mediums, and right now the online space is the most difficult to measure and justify to our clients.

    I come from a good (read: more than five years :) ) background in the interactive territory, and I've gotten pretty used to the issues of measurability on the Internet. The reality is that those of us creating work online have gotten overly accustomed to the nuances of the medium and too often forget to explain it all over again. There's also no major player that will admit that measurability across sites and users is nothing more than a statistical crap-shoot. I don't know why none of them will admit this -- certainly the polling done by Nielsen and the like is nothing more than statistical projection, and really it's a lot better to have something imperfect than nothing at all.

    In reality, our clients still really don't understand why these numbers are so different and then question our recommendations based on what they read. It challenges our reputation and affects the trust the clients typically feel in our creative or media teams. Broadcast and print, as well as the other "offline" mediums, really then have one big advantage: those mediums have been in use long enough that our clients no longer ask the questions of "how can we justify those reach numbers" or "sure I see what you're saying, but my other consultant says that you're only reaching half that audience with that commercial."

    So, maybe the challenge really lies with each of these "measurement" firms not admitting that they could be wrong. Maybe it's that the sites being polled are financially incented to inflate their numbers to justify acquisition or second-round financing. Maybe it's that the technology exists to perfectly track a user's path anywhere, anytime, but one of the first "features" in the browser was anonymity. Maybe it's the convergence of all of these different pieces at the same time (which is most likely the case).

    Sad. The interactive space has such opportunity to get around lofty advertising and blink-tag-style direct marketing. But unless we can justify the funds, apportioned largely based on reach to the market, we won't end up with the type of experience marketing that actually adds value to those of us online.
  • If you want to buy banner ads on a particular site and you're worried that their demographic info is inflated, here are some things to try.
    • Have them pull the banner from your server, not theirs - never ever let them put your ad banner on their server. Do a test ad run with them, then analyze your own server logs. You'll be able to see if your banner was really pulled, say, 10K times or if they quit showing it after far fewer impressions. I've caught several places shorting me. You can expect some discrepancies due to caching and other issues, but if you're supposed to get 10K impressions and the image only gets served 2K times, consider it a lesson learned and advertise somewhere else.

    • If you want proof of their traffic claims, ask them to embed a 1x1 GIF from your server (or one of those little FastCounters set to 1x1 size) on their page. Check your own logs, or view the FastCounter in full size, to see if they're really getting the traffic they say they are. Most one-man websites will be happy to do this when faced with the chance to gain you as an advertising customer; but don't expect Excite et al to bend over for you like this.

    • Whenever possible, purchase ads by click-through, not CPM. Click-throughs will cost you more, but I'd rather get 1K guaranteed clicks than 10K ignored impressions.
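The first tip above -- serve the banner from your own box and count it in your own logs -- is a few lines of log parsing. A sketch, assuming Common Log Format; the banner path is a hypothetical example:

```python
# Count how many times a given banner image was actually served, from
# your own access log (Common Log Format assumed). The banner path is
# a made-up example, not a real ad network convention.
import re

REQUEST_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

def count_impressions(log_lines, banner_path="/ads/banner.gif"):
    served = 0
    for line in log_lines:
        m = REQUEST_RE.search(line)
        # Count only successful (2xx) responses for the banner itself.
        if m and m.group(1) == banner_path and m.group(2).startswith("2"):
            served += 1
    return served
```

If the network promised 10K impressions and this reports 2K, you have your answer (allowing some slack for caches and proxies, as the comment notes).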
  • We've been with several banner ad networks, and well, if you're getting more than $1.50 CPM (cost per thousand impressions), you're incredibly lucky.

    I was recently speaking with a company who advertised on our site through a network. They were spending $25 CPM for the ads, and we saw about $1 CPM by the time it made it our way (due to advertising agency and network costs, which seem to be much larger than stated).

    My suggestion to content sites: learn how to sell your own ads. Even if you sell just a handful of your impressions, you'll probably make more than any network could bring you. Keep the sales in house.
  • Some Solutions:

    • make web application code open and available for audit in order to prevent invalid/illegal logging.
    • cryptographically sign the logs at periodic intervals and/or when the applications are stopped and started. This will help prevent tampering. Even encrypting the logs so that only particular individuals can access them might be suitable.
    • use W3C [] standard log file formats.
    • hire a reputable, independent auditor to validate your metrics at regular intervals.
    What's all the fuss about?
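The log-signing bullet above can be sketched with an HMAC over each log chunk. Key management is deliberately simplified here; in practice the auditor, not the web server, would hold the key:

```python
# Sign each day's log chunk so later tampering is detectable.
import hashlib
import hmac

AUDIT_KEY = b"shared-secret-held-by-the-auditor"  # placeholder, not a real scheme

def sign_log(log_text: str) -> str:
    # Hex HMAC-SHA256 over the log chunk.
    return hmac.new(AUDIT_KEY, log_text.encode(), hashlib.sha256).hexdigest()

def verify_log(log_text: str, signature: str) -> bool:
    # Constant-time comparison against the recorded signature.
    return hmac.compare_digest(sign_log(log_text), signature)
```

An auditor who recorded the signatures at rotation time can later prove whether the logs handed to investors are the logs that were actually written.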


  • See what happens when your morning is full of meetings? Sheesh...

    Anyway, although now it's looking old and stale, I still consider the following paper of mine, which was published a few years ago, to be relevant to this topic (IOW, things haven't changed enough since then to make it irrelevant):

    Examining the Validity of World-Wide Web Usage Statistics []


  • ContentZone calculates their page views slightly differently than other advertisers. It's a bit tough to explain, but if you don't set up enough unique page codes or whatever they call them, you won't make as much money, it's true (and I think they say so on their page). Engage, on the other hand, may credit you for more ads, but DO NOT USE THEM. Flycast used to make me 30 bucks a day; then Engage came in, and I now make 10 bucks a day on TWICE the traffic. They can't sell even 25% of my ad inventory, and their pay is pitiful on what they do sell. Incidentally, I get about the same amount of hits as you, so Engage will probably equal 10 bucks a day for you too. They blow hard.


  • Server logs can tell you a variety of things, but I don't think they're useful for marketing purposes except to the owner of the server, and not so much for advertising. I run a small site that gets about 100 unique visitors a day and about 25 regulars. Using the logs and parsing out the data, I can determine that almost all of the people who visit my site stick around for a little while, but don't come back later. At least, that's what the logs say.

    I can also see the referring site, which tells me where any advertising should be focused, as well as whether someone actually clicked on a link or entered the URL directly (or from bookmarks), which would indicate a returning user. Of course, any user on a dialup connection will probably have a different host/IP the next time they visit, and the logs will still show them as a separate user. AOL's proxy is especially bad, as the host can change EVERY TIME the user makes another hit on the site, which makes it very difficult to track. Cookies and user accounts would be much more useful for determining exactly how many visitors you have and how many of them visit frequently. However, I still believe this information is really only useful to the server operator, not to someone looking to advertise on the site.

    Marketing as it stands should probably be a trial-and-error operation. Spend some money and see what happens. When I ran a business several years ago I tried advertising in a variety of different places. Ads for computer sales got practically no response from a computer magazine but got a LOT of response from a simple 4-line classified ad in the newspaper. Sometimes you just have to throw some money around and see what you get back. Yes, there is some risk, and yes, you will probably lose some money before finding a medium that works well for you, but that's the name of the game.

    -Restil
  • What if a third-party company, independent of both the advertiser and the web hoster, were to set up a box through which all Internet traffic to the server was transparently passed? The third party logs the traffic to determine whether its logs match what the hoster is claiming. The advertiser can trust the third party because he hires one he trusts to provide this service for him.

    The hoster's people can't access the box because it is literally black-boxed (locked up, no physical access, and no knowledge of the logins/passwords).

    The third-party logger can remotely access his box, download logs or whatever, and provide that info to the advertiser. The advertiser can then check the logs of the hoster and compare them to the third party's (aka the verifier's). If the verifier's logs match the hoster's, you know the data is somewhat accurate (at least as accurate as these things can be).

    I mean, Nielsen does this with those boxes they give to their test families; why can't some enterprising third-party verification company (hmmmmm?) do the same with web hosts?

    This looks like a nice little niche market for exploitation and mucho money to be made off of. I mean you write a few scripts to keep control over your logs and to send the logs back to a central server that formats this stuff into nice pretty print outs for the suits to drool over at their next board meeting.

    Just a thought...
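A minimal sketch of the pass-through "verifier" box described above, assuming a hypothetical upstream address and log path; a real deployment would also need POST support, TLS, and tamper-proof log storage:

```python
# Independent pass-through logger in front of the real web server:
# every relayed request gets its own Common-Log-style entry in the
# verifier's log. Upstream address and log path are assumptions.
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://127.0.0.1:8080"   # the audited site's actual server (assumed)
LOGFILE = "verifier.log"

def log_line(client_ip, method, path, status):
    # One Common-Log-style entry per relayed request.
    ts = time.strftime("%d/%b/%Y:%H:%M:%S +0000", time.gmtime())
    return f'{client_ip} - - [{ts}] "{method} {path}" {status}'

class VerifierProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Relay the request upstream, then record it in the verifier's log.
        try:
            with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                body, status = resp.read(), resp.status
        except OSError:
            body, status = b"", 502
        with open(LOGFILE, "a") as log:
            log.write(log_line(self.client_address[0], "GET", self.path, status) + "\n")
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run the box: HTTPServer(("", 8000), VerifierProxy).serve_forever()
```

The advertiser then diffs the verifier's log against the hoster's report, exactly as the comment proposes.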
  • There's no way to monitor traffic effectively for a web site from server logs alone. As others stated, problems with proxies, robots, and whatnot make it impossible to tell if you have a live human on the other side, or just 1000 AOL users behind a cache, or just Google on its monthly visit.

    The only effective measurement of web traffic is to have volunteers use a special proxy that reports the sites they visit back to a server, and to extrapolate from there. That's exactly how the Nielsen boxes do it for television, which unfortunately means the same problems will crop up (Nielsen families tend to cluster on the east/west coasts, making shows that appeal to midwest or plains-state viewers look less popular than they are). Additionally, getting volunteers might be a problem, as you'll most likely create a biased sample by whom you select. And probably most importantly, privacy issues are more apparent for net ratings.

  • by jd ( 1658 ) <{moc.oohay} {ta} {kapimi}> on Monday September 25, 2000 @10:33AM (#755712) Homepage Journal
    Measuring web traffic accurately is a complex science, and not for the faint-of-heart. Why? Let's start with:

    • European users, especially, use web caches rather than direct-through connections, so there isn't a 1:1 correspondence between server accesses and users.
    • Connection freezes, time-outs, etc, will end up showing more connections than actual users. (The user has to reconnect, which is a fresh "access".)
    • Framing, deep-linking, etc, will "smudge" the access count between any number of arbitrary sites in an unpredictable manner.
    • Browser caches don't refresh on every access.
    • Dynamic IP allocation means that there is an n:n correspondence between addresses and users.
    • Network Flooding != Popular Site
    • Search Engines != Users
    • You don't control how the content is used. For all you know, Joe Bloggs, down the road, has linked his web browser to Internet Conference (a whiteboard from VocalTec that lets you send stuff via OLE to other machines)

    In the end, the only way to gauge how many people have read your site is to place unique or unusual information on it, and then find out who knows it.

  • by waldoj ( 8229 ) <> on Monday September 25, 2000 @10:15AM (#755715) Homepage Journal
    I've got a problem with that right now. A site that I operate, [], serves up about 600,000 pageviews each month. But we're regularly credited by 24/7 Media (aka ContentZone []) for just over 400,000. But they don't give two shakes for our logs, and say that we just have to trust them. That's like the U.S. government saying, regarding carnivore, "trust us."

    BS. So I applied to Engage [] (formerly Flycast) last night to get our ads through them. Are they any better? I have no idea. But I do know that ContentZone is screwing us over, and that's incentive enough for me.

  • I disagree. When 300 UNIQUE visitors view my page using the same proxy, they look like one visitor. The only thing web logs can tell you is how many requests your web server received. I call those "hits". Your definition sounds different.
  • by jafac ( 1449 ) on Monday September 25, 2000 @10:41AM (#755718) Homepage
    Gee, I guess somebody finally figured out what the third kind of lie is!

    I think that if you're investing in a web company, you should IGNORE the statistics. Go to the site. If it's lame, don't give them your money. If it rocks, go for it? How hard could that be?
  • My admin has that too. In fact, we just (read: lazy) turned off any LOCAL referrers (which come from people surfing around different pages in one session). It's amazing what you'll find sometimes linked to your page.

    For some bizarre reason, there were 15 counts of a referrer from OSDN regarding the Slashdot cruiser. I checked the Slashdot cruiser web page only to find nothing linked to my site. Strange (but then again, a page doesn't have to be linked; it could be that 15 people were at that page first and then went to something on my site).

    Other than that, I found the usual Google returns, and plenty from articles I commented on from here.

    And when I hosted a real wacky e-zine (the Boulder News Frenzy), which had tons of vulgar language, every pervert with keyword searches of "toilet sex", "rape", etc. went to the zines on my pages, only to be disappointed to find an ASCII rag.

    So take a look at those referrers. You'll be amazed what you find. Often you'll see someone on a webboard post a link to one of your pages with a positive/negative comment.
  • by Mike Schiraldi ( 18296 ) on Monday September 25, 2000 @09:31AM (#755722) Homepage Journal
    But when you advertise on the web, you can look at your web logs to gauge the audience - you don't need to trust their logs, or Media Metrix', or anyone else's.

    In fact, by looking at your own logs, you can say, "Well, Yahoo sends 10,000 people a day to my site, but only 10 of those people buy anything.. Meanwhile, Slashdot sends 1,000 people, but 500 of them end up buying stuff."

    So why are such ratings needed?
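The per-referrer comparison described above is a one-line computation once visits and purchases have been tallied from your own logs; the dict shapes and figures here just restate the comment's hypothetical numbers:

```python
def conversion_rates(visits, purchases):
    """Purchases per visit for each referring site.

    visits/purchases map referrer -> count; both the shape and the
    example numbers are hypothetical, taken from the comment above.
    """
    return {ref: purchases.get(ref, 0) / n for ref, n in visits.items() if n}
```

On these numbers the smaller referrer is the far better buy, which is the comment's point: your own logs already answer the question the ratings firms are selling.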
  • by Anonymous Coward on Monday September 25, 2000 @09:31AM (#755723)

    Carnivore is the answer. Let the feds provide accurate and unbiased information!

  • Good luck. That's why we have audio amplifiers in our radios. After the IF stages you're left with a filtered signal that is extremely weak. The purpose of the local oscillator is to mix the frequency you're listening to down to the 455 kHz IF. Past the local oscillator, the signal has to be amplified. You aren't going to be able to detect the broadcast frequency being mixed with the LO... if you could, you wouldn't have needed an audio amp to begin with.

  • You can only look at your own logs to see that Yahoo sends you 10,000 people a day after paying them substantial money for what they said should average 20,000 people a day.

    In the example you provided, 10K clicks when 20K were expected can be chalked up to crappy banners by your graphic artists. However, there are brokers out there who buy ad space from many websites and sell it at a reduced rate to companies. Some of these brokers buy space from the outfits that pay their users to click on banners, in which case you'll get a high clickthrough rate, but nobody who clicks through is actually interested in your website; they'll hit the back button immediately. If you are contacted by a broker, ask them what websites your banner will be on. If they do not mention any of the sites where people are compensated for banner clicks, yet it turns out that the majority of your banners are going to such companies, tell the broker to pull your ad, then contact your credit card company and dispute the charge. This has happened to my company several times; the brokers have never attempted to get their money after being told the charge was disputed.
  • Sampling is for media where it wasn't feasible to produce accurate counts (I say "wasn't" because media like TV can do much better than methods like the Nielsens these days -- they've got a severe case of "but-that's-the-way-we've-always-done-it"-itis). Measuring website traffic, by contrast, can be done with some accuracy, as long as you're careful about what you're measuring -- some sites still count each page hit as a separate visitor! And provisions should be made for filtering out (to the extent possible) "visits" from the usual 'bots and trolls.

    As for the ease of faking server logs, not a problem (inserting standard I-am-not-a-lawyer disclaimer here): if you're using them as proof of traffic to your advertisers, write that into the contract -- then faking the server log becomes fraud, with the appropriate legal remedies available. This is not my favorite solution (especially not for anything to do with the Internet), but displaying advertisements for money is a business relationship, and can be managed as such.

  • I hate to break it to you, but you seem to be harkening back to an idealistic time that never was. Deceitful business practices are nothing new. If they were, we wouldn't already have such things as:
    • Laws against fraud
    • Underwriters Laboratory
    • the Better Business Bureau
    • Truth-in-advertising laws
    • Consumer Reports
    • Ralph Nader

    Sure the Internet is providing a new avenue for many past practices, and the information-centric focus does create greater opportunity for "fudging", but this isn't anything that hasn't happened before.

  • the web is a popularity contest because in the "new economy", it's all about marketshare. That's it. Nothing else matters. Revenue doesn't matter. Profitability doesn't matter. A business plan doesn't matter.

    The premise behind "marketshare is everything" is that, since the Internet is a "new thing", the guy who takes over the most marketshare first is going to be the dominant player. People think this way because they saw what happened when Microsoft entered a new market and got the most marketshare: they dominate. They're damn near owning the whole freakin world. If they had played it more laid back, and done more honest hard work up front, they probably would have avoided this whole DOJ mess, and ten years from now *would* 0wn the whole world. But no, the execs got lazy and greedy, and when it became apparent early on that Microsoft was only interested in putting out "good enough" products and killing off competition (instead of allowing competition to exist, albeit in a weakened state), the threat was so obvious they had to be stopped. Act like a bunch of gangsters, get treated like gangsters.

    Anyway, the investment and business community is expecting SOMEONE to take over, and they want a piece of the action, of course, so that's why people are willing to risk a few investment bucks on who they perceive will be the Genghis Khan of the Internet.

    That's the "new economy" in a nutshell. And frankly, AOL/TW is "it".
  • Are you kidding? The punishment in this country for fraud is not public hanging. The punishment is that you have to give all your money to a lawyer.
  • It seems counterproductive to purposely use an inaccurate statistical measure when exact counts are readily available
    How exact are these `exact counts'? ISTM that all sorts of things (caches, reloads, etc) can cause hits to be mis-counted.
  • by 198348726583297634 ( 14535 ) on Monday September 25, 2000 @09:35AM (#755752) Journal
    I've been put in charge of producing the stats for my company's websites. I'm using Wusage [], which is plenty configurable, scriptable, very well-priced for its functionality, etc., and I've set up a number of exclusion-filters.

    What I'm blocking out so far is:

    • our company's internal IP traffic
    • funky robots like Keynote-Perspective that the old webmaster had let loose on our sites.

    This gives us some numbers I have confidence in (even though they're 10x less than the numbers the old guy was producing through Webtrends), but I'd like to find out what others are doing for making their own web stats.


  • Why must the web be a popularity contest? At most, a website itself should only be concerned with how many people visit it so they can keep their servers up to speed. They can get this from their own logs.

    Seriously, who really cares if NewsTrolls is visited more than Slashdot (just an example). The important thing is that they're getting visitors and the owners are enjoying their job.

  • by Webmonger ( 24302 ) on Monday September 25, 2000 @09:37AM (#755755) Homepage

    Excuse me while I go "Grumpy old man". This is an old, old problem. It goes back to the days when I first started using the web. See "Why web statistics are (worse than) meaningless []." It's an old article. That's the point.

    In short, spiders, proxies and caches make it impossible to be accurate in measuring traffic. But everyone else is affected the same way. So your relative stats are relevant -- they just aren't hit-for-hit accurate.

    What your server logs are really for is resource planning. They'll help you find out how much traffic your server is serving, which should help you plan bandwidth and hardware upgrades as needed.

  • the one thing I can count on is that my site doesn't (and won't) get any hits :)

  • I recently had to analyse some company's web statistics as well. What a nightmare! They had changed the structure of their web site, changed hosting company, and lost some logs. Not to mention how no one was even sure when those changes had taken place...

    Fortunately, I was only interested in changes over time, so I concentrated on inventing a measurement that gave a fair comparison.

    My work was done the old-fashioned way... Look at some server log, filter out the obvious (gifs, anything with a session ID in it, internal or developer hits, etc.), throw SPSS (a statistical analysis tool) at it, and start scratching your head...

    I started my presentation by telling everyone to please ignore the absolute figures and focus on trends and variations.

    What really bothered me was the thought of how good a site analysis tool I could have hacked together in the hours I spent decrypting archived data. The interesting part was seeing how much some people really care about being anonymous on the web. Makes a Slashdot addict glad to see stuff like referer="none of your business" and cookie: Note="like most people I prefer my browsing habits to be anonymous"

  • When 300 UNIQUE visitors view my page using the same proxy, they look like one visitor.

    I still hope to get some time to work on BBStats [] again, my webstats package - in terrible shape as it is - but I was hoping to solve this by using optional session cookies or the ability to import other cookies from the site.

    For example, /. could log my IP to see whether I am unique, but they could also fetch my cookie (and still fall back on IP if it doesn't exist).
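The cookie-with-IP-fallback idea above fits in a few lines; a minimal illustration (the record shape is hypothetical, not BBStats' actual data model):

```python
def unique_visitors(records):
    """Count unique visitors from (ip, session_cookie) pairs.

    Prefer the session cookie when present; fall back on the IP, as the
    comment suggests. The record shape here is an invented example.
    """
    seen = set()
    for ip, cookie in records:
        # A cookie identifies a browser even behind a shared proxy;
        # the IP is the best available identifier without one.
        seen.add(("cookie", cookie) if cookie else ("ip", ip))
    return len(seen)
```

With cookies, 300 distinct users behind one AOL proxy stay distinct; on raw IPs alone they collapse into a single "visitor", which is exactly the parent comment's complaint.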

  • The fundamental thing with statistics, and the reason that most people are so easily confused by them, is that statistics are meaningless without a strict context. Not only that, statistics can be harmful when used for purposes outside of that context. We can use these facts to look at the current example of Web Site hit counts.

    First, suppose I am using a number of web sites to promote my online store, In this case, I may be most interested in the amount of sales each site produces from click through users. For this purpose, I can simply assign a sale to a certain site. For the purposes of this discussion, I will assume that all sales can be assigned to a certain web site. At certain intervals, I can find the percent profit attributable to each site, and create a statistic with the ratio of the % profit from a site to the cost of advertising on that site. This statistic will create a valid comparison between sites.

    Second, suppose I am most interested in branding, as Verizon is of late. In this case, I might want to pay an external agency to monitor the sites on which I advertise. Such an agency would presumably use a consistent and statistically sound method to determine the number of eyes that have seen my brand. I can then set up a statistic with the ratio of # of eyes to the cost of advertising for each site. Again, this will create a valid comparison.

    It is notable that in either case the web logs for particular sites are not clearly useful. Even if the information itself were not suspect, web logs would not be comparable between sites. It would be difficult to set up a useful statistic to compare the value of each site with respect to my product. To put it another way, the web logs for a particular site are useful to that site for generating a number of site-specific statistics, but few if any of those are going to be of interest to me as a paying advertiser.
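Both statistics described above (profit-to-ad-cost and eyes-to-ad-cost) are the same simple ratio computed per site; a sketch with invented site names and figures:

```python
def ratio_by_site(value, ad_cost):
    """Ratio of attributed value (profit, or 'eyes') to ad spend, per site.

    Both dicts map site name -> number. Names and figures in the usage
    example are invented for illustration.
    """
    return {site: value[site] / ad_cost[site] for site in ad_cost}
```

Sites with a higher ratio are the better buy for this advertiser, regardless of how their raw server logs compare -- which is the comment's point about context.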

  • The FBI could get into the business of counting hits. I mean, they'd be reading through all the traffic anyway; they might as well do something useful with it...

  • by Erasmus Darwin ( 183180 ) on Monday September 25, 2000 @09:38AM (#755775)
    So why are such ratings needed?

    They're needed because they have to have numbers to show to their advertisers. An ad that's being viewed by 4 million people has significantly more value, and thus has a higher cost, than one that's only being viewed by 2 million people. If someone is willing to take the hosting site's word at face value with regard to eyeball real-estate, then I've got some banner ads (and a bridge) to sell them.
