The Real Problem With Alexa
As the de facto 'Guy in Charge' of a reasonably large web site, I am routinely asked questions by a variety of people that lead inevitably to Alexa. It might be a question from my boss at SourceForge about traffic. Or it might be a sales guy asked by a prospective advertiser why some other random website is bigger or smaller than Slashdot. Most often it's a random reporter doing background for a story that has nothing to do with Slashdot. Why I'm considered an expert is confusing enough, but why they always regard Alexa rankings as meaningful is even more so.
Here's the problem: Alexa doesn't work because of who will install it, and perhaps more importantly, who won't. Let's start with a place I'm very familiar with: Slashdot readers. Until recently Alexa didn't work on Firefox... instead only IE users participated. On the internet as a whole that's fine: like 80% of users run IE. But on Slashdot only like a quarter of you do.
What about re-installing the plug-in after you update your browser? When Firefox 2.0 came out, almost a third of Slashdot readers upgraded within a few days. Minor Firefox releases get adopted overnight. Even the IE users on Slashdot update relatively fast, from 6 to 7 or even between minor revisions. New versions often break old plug-ins. When you get that alert that a plug-in is out of date, do you just forget about it? I know I do. And that's not even counting clean OS installs. Meanwhile, when I visit random non-technical friends and family, I frequently see versions of software so dated it makes me cringe.
And that's before we even talk about the fact that Alexa's toolbar is pretty much spyware. How many Slashdot readers are giddy to install spyware? Not you either? Big surprise. Because of who we are, and what it is, our population self-selects out of consideration.
Did you know Alexa excludes SSL? How many E*Trade users do you think there are? Now personally I'm glad they aren't tracking my browsing at my credit card company, but it's just another factor reducing accuracy.
Equally perplexing is the accounting of iframes. Look at DoubleClick's Alexa rating. It's hard to say for sure, but I don't think I've ever visited their website. Have you? Yet according to Alexa, they have nearly a 1% share of the internet. I'd tend not to believe it... but they have iframes on zillions of web pages, and counting those sure would account for such a huge ranking. What about all those badges for the popular social networking websites? What influence are those iframes having on Alexa rankings? Alexa's FAQ says they don't count, but I'm skeptical.
In fact, Alexa KNOWS that it is a flawed metric. Have you ever tried actually looking up Alexa on Alexa? Unsurprisingly, it is unavailable. Why? Visitors to Alexa.com would be the most likely of any user population on-line to have installed the plug-in. I don't know what their 'Rank' would be, but I bet it would be an apples-to-oranges comparison against ANY other site on-line.
Of course, who do you think actually will go out of their way to install something like this? I have a good guess... if you are obsessed with acronyms like SEO or terms like PageRank, you are very likely to care about these rankings. I spend a real percentage of my week dealing with people flooding my systems with garbage content designed to game these ratings. And you know they all have the toolbar installed, so their zillions of worthless spam websites are being counted.
This problem has parallels elsewhere, of course: the Nielsen ratings struggle to account for PVRs. Since you got a TiVo, when was the last time you watched "live" TV? This is part of why science fiction shows struggle on television... sci-fi fans are early adopters. So we stopped getting counted, and our favorite genres are butchered by networks and lost to the void. PVR users tend to be wealthy (those boxes are expensive) and educated. Now I'm not saying that the dumbing down of TV is exclusively the fault of TiVo, but it sure didn't help that we weren't being counted as excellent "smart" TV shows got canceled while we kept getting more seasons of Survivor. Who we are and how we live causes us not to be counted, and this has unintended consequences.
So what do we do? I wish I had a good answer. My first suggestion would be that if anyone mentions Alexa to you, you freak out and go on a five-minute rant about how Alexa is stupid and how anyone using it to seriously make a business decision should be fired. It doesn't actually help, but I estimate that every time I do this, I burn the same number of calories as I might on an elliptical trainer. I assure you the beer gut ain't getting smaller on its own.
Alternatively you could just install the toolbar on every machine you can find and skew the numbers ridiculously towards people who are likely unrepresented. Of course, the conspiracy theorists amongst you will just bitch that I'm trying to fudge Slashdot's own rankings in a system I claim to hate. But that only helps prove my point... the conspiracy theorist is a demographic strongly represented on Slashdot and unlikely to trust this software. We all ignore a broken status-quo "gold" standard that would fail a 100-level college science class on the grounds of flawed methodology. And this only leads to us not being counted.
Alexa's Spiders (Score:3, Interesting)
Turns out, Alexa's spiders were ignoring the robots.txt file, and capturing usernames and passwords. It logged into the administrative area, and followed the "delete" link for every entry. My dumbass boss still didn't want to uninstall Alexa. Could have strangled the man.
Been complaining for years (Score:5, Interesting)
I've been complaining about this for years. The problem (or actually just what marketers perceive as the problem) is that there is no generic public way to compare web site traffic. The only true way to get traffic metrics is from the web site owners themselves. And they could easily make the numbers up to attract more advertisers. So people in advertising look to Alexa as the only third-party source.
The biggest sites don't have as much of a problem because they can work closely with advertising partners. Medium and small sites, however, don't get as much personal attention. So proving themselves as worthy web space for ads is more difficult.
The only people I've heard of that install the Alexa toolbar are web site owners because they want to see their rank often. Ironically so few people have the toolbar installed that they drastically boost their own rank.
We need to convince marketers that Alexa is pointless. But I'm afraid that without a good replacement they'll keep using it.
Re:Do it to ourselves, and that's what really hurt (Score:3, Interesting)
Re:Do it to ourselves, and that's what really hurt (Score:5, Interesting)
"It isn't surprising that people who spend money on advertising want to have some metric by which to predict (estimate, guess, what-have-you) the impact of each dollar spent on web advertising."
There are several easy ways:
If you're so naive as to not insist on hard numbers for actual views, you deserve to get hosed. Log files are best: you can analyse them and factor out multiple views per host IP to get the actual number of real views and reduce fraud; ditto with geolocation of IP addresses to factor out bots in third-world countries; ditto for bots that crawl every link on a page; ditto for pages that are loaded and then immediately dumped for another page.
As an advertiser, I'd want unique eyeballs - real human eyeballs - that can be verified.
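The de-duplication the parent describes can be sketched in a few lines. This is a minimal illustration only, assuming an Apache/NGINX "combined" log format; the ad path and the crude bot-word list are my own placeholders, not anything from the thread:

```python
# Minimal sketch: estimate unique human views of an ad from an access log.
# Assumes the common "combined" log format; bot filtering here is the
# crudest possible (user-agent keywords), purely for illustration.
import re
from typing import Iterable

LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)
BOT_HINTS = ("bot", "crawler", "spider")  # placeholder bot-word list

def unique_views(lines: Iterable[str], path: str = "/ad.gif") -> int:
    ips = set()
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, method, url, status, agent = m.groups()
        if url != path or status != "200":
            continue
        if any(h in agent.lower() for h in BOT_HINTS):
            continue  # drop obvious crawlers
        ips.add(ip)   # count at most one view per host IP
    return len(ips)
```

The same loop could be extended with the other "dittos" (geolocation lookups, dwell-time checks), but counting one view per IP already kills the most naive inflation.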
Re:Do it to ourselves, and that's what really hurt (Score:5, Interesting)
I would LOVE to have a similar scenario for other ad-driven media. Imagine if I could flag TV commercials with "not interested" and then never see that commercial again, or any commercial for a similar product. Once it got a good feel for what I really like and don't like, I probably wouldn't feel the need to skip commercials. The same could be said of web ads. If I could cherry-pick which ads I was interested in and which I wasn't I might not be so inclined to block ALL of them.
Ads are useful to me sometimes, but picking the signal out of the noise is usually such a hassle that I'd rather just skip the whole process. If everyone could make a very personal statement about what they want to see ads for and what they don't, I think the benefit for both parties would improve.
Spyware yup. (Score:5, Interesting)
http://www.symantec.com/security_response/writeup
Re:whine, whine, whine (Score:3, Interesting)
I'd argue it is rather different. TV is one way. Your television browsing habits are slightly less revealing than say, your banking activities or the blog entries you post.
Also, Alexa claims to give you some value in exchange for letting them piggyback on your browsing. Nielsen is more public and more respected, which helps mitigate the sampling problems.
If his "boss" (or any of the other scores of people who accost him about the popularity of websites) would let him pick the metric, he wouldn't have this problem.
The point of the article is that he has to defend someone else's choice of metric.
Or perhaps, the point is more of an "Ask Slashdot" sort of thing...
As in, "Hey all you /. geeks, what's a better way to do this?" Taco's comments on the flaws in Alexa's system and Control Group's comments on some of the particular challenges against this demographic in general support that.
Heck.. it seems like an interesting enough problem to me, but then again, I don't have a sig like yours:
If you hate it that much, why are you hanging out here?
(Sorry, I really need to stop feeding the trolls...)
Re:Do it to ourselves, and that's what really hurt (Score:2, Interesting)
And add to this mix that we collectively HATE advertising. So we all use ad blockers, flash blockers, script blockers, image blockers, and anything else we can find which reduces or eliminates advertising which gets in the way of reading the content of a web site.
So even if we do get "counted" and the advertisers can determine what it is that we browse, the current method of "in your face" ads will quickly push us towards either blocking the ads or simply not going there any more.
And I DO click on ads, but only if they are:
- NOT in the way of the content
- NOT blinking, flashing, moving
- NOT trying to distract my eye towards them
If ANY of the above happens, I am gone from the site, and will NEVER go there again.
(Hey, this is my 1,000th post. Woo Hoo!)
Pfft, screw that. (Score:3, Interesting)
Re:Asked and answered (Score:5, Interesting)
I am not saying that Alexa is good for looking at traffic trends either - their numbers vary WILDLY from what our actuals are. Oddly enough, Hitwise does a much better job, but I suspect that is a lot of blind luck on their part as I think they take data in a similar fashion.
I'm not sure I had a point, except that web logs aren't really feasible when your traffic crosses a threshold - I'm sure
Re:Alexa's Spiders (Score:3, Interesting)
The HTTP spec clearly says that GET requests should only be used for idempotent [wikipedia.org] actions. Technically, deleting an entry is an idempotent action, so using a GET link to delete an entry is - well, brain-dead stupid. But it doesn't break the spec.
See, an idempotent action is simply an action which has the same outcome the second time you attempt it. Deleting an entry twice doesn't change the final state of the system - the entry is still deleted. That makes it idempotent.
Of course, anyone with an ounce of sense would realize that what they really meant was that GET requests shouldn't change state and that POST requests should be used to change a system's state. (Or PUT, or DELETE. But no one ever uses those.) Which was the point of the parent poster in any case.
But before someone pulls out the "GET is supposed to be idempotent" part of the HTTP spec, remember that deletes are, technically, idempotent. They're safe to attempt multiple times, and leave the system in the same state afterwards.
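The distinction is easy to demonstrate with a toy model (plain Python, nothing to do with Alexa's or anyone's actual code): a delete is idempotent, because repeating it changes nothing further, but it is not *safe*, because the first call changes state - which is exactly why a crawler following GET "delete" links wipes the database.

```python
# Toy model of why a delete is idempotent but NOT safe.
entries = {1: "first post", 2: "second post"}

def delete_entry(entry_id):
    entries.pop(entry_id, None)  # second call is a harmless no-op
    return dict(entries)         # return a snapshot of the state

after_once = delete_entry(1)
after_twice = delete_entry(1)
assert after_once == after_twice  # idempotent: repeating changes nothing
assert 1 not in entries           # but not safe: the first call changed state
```

"Safe" (no state change at all) is the property GET actually needs; idempotence alone only guarantees the damage happens once.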
Alexa ratings (Score:3, Interesting)
Re:Rant as news (Score:3, Interesting)
On that note, I don't actually have anything to say about the topic at hand, but then again, neither did the parent.
Re:Asked and answered (Score:3, Interesting)
-Mandrake
Re:Asked and answered (Score:1, Interesting)
Re:The Rant and the Slashdot problem. (Score:3, Interesting)
Re:*I* figured out why Taco's on a rant! (Score:5, Interesting)
No, like another poster said, it is quality over quantity.
If you think some of the arguments on Slashdot are asinine, wait until you read the ridiculous ones on Digg. And give everyone the power to moderate and you have people burying others' comments because they disagree with them.
Add bad grammar, bad spelling and l33t speak, and you have a ridiculous combination of utter rubbish that only a bunch of emo sixteen-year-olds can spew forth. Give me Slashdot any day.
At least some of you trolls have character.
Re:Alexa's Spiders (Score:1, Interesting)
Still, many apps do use GET requests to delete things when the desired UI is a delete link (i.e. text rather than a button). This is somewhat preferable to having a hidden form that gets submitted (or triggering an XMLHttpRequest), since it doesn't require the user to be browsing with JavaScript enabled. The practice is becoming less acceptable, though, since you can style an HTML button to look like a link in almost any browser.
Firefox... problem solved (Score:2, Interesting)
dealing with http logs on busy sites (Score:3, Interesting)
At W3C [w3.org] we log almost everything as well, and we end up with way too much data as a result.
But we use the logs to detect and prevent certain classes of abuse as well (e.g. too many requests in a short time interval [w3.org] or re-requesting the same resources over and over [w3.org]), and we also want to be able to track trends over time, so we have been reluctant to just throw that data away.
I have a plan that I have yet to implement, which is to log only 0.001% of the requests for certain very popular resources (e.g. HTML DTDs and valid-HTML icons), which would allow us to monitor trends without logging tens of gigs of data per day; we'd just need to compensate for it when calculating stats later.
Then I planned to monitor for abuse by also logging every request to a script that watches for abusive traffic patterns, an easy adaptation from the current script that wakes up and skims the logs every 10 mins.
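The sample-and-compensate plan above can be sketched in a few lines. The names and the sampling rate here are illustrative, not W3C's actual setup: keep each request record with probability p, then divide the sampled count by p when reporting.

```python
# Sketch of sampled logging for very hot resources: log only a fraction
# of requests, then scale counts back up when calculating stats later.
import random

SAMPLE_RATE = 0.00001  # log 0.001% of requests (illustrative value)

def maybe_log(record, logged, rate=SAMPLE_RATE, rng=random.random):
    # Keep this request's record with probability `rate`; drop it otherwise.
    if rng() < rate:
        logged.append(record)

def estimated_total(logged, rate=SAMPLE_RATE):
    # Compensate for the sampling: each logged record stands for 1/rate hits.
    return len(logged) / rate
```

The trade-off is variance: at 0.001% you need millions of hits before the estimate stabilises, which is fine for DTD fetches but useless for rare resources.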
(In your journal entry, when you say you are MD5ing IP addresses for privacy reasons, are you adding a random bit of data to the IP address before calculating the MD5? If not, it's pretty easy to find out which IP address corresponds to a given MD5 sum.)
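For the curious, here is why a bare MD5 of an IP is so weak: there are only 2^32 IPv4 addresses, so anyone can brute-force the whole table and invert the hashes. A secret salt kept out of the logs blocks that. A minimal sketch (the secret and addresses below are made up):

```python
# Why unsalted MD5(ip) is reversible, and how a secret salt fixes it.
import hashlib

def weak_hash(ip):
    # Brute-forceable: the IPv4 space is tiny by hashing standards.
    return hashlib.md5(ip.encode()).hexdigest()

def salted_hash(ip, secret):
    # With a secret salt, the attacker can no longer precompute the table.
    return hashlib.md5((secret + ip).encode()).hexdigest()

# An attacker inverts weak_hash just by trying candidate addresses:
target = weak_hash("10.0.0.1")
found = next(f"10.0.0.{i}" for i in range(256)
             if weak_hash(f"10.0.0.{i}") == target)
```

(A keyed construction like HMAC would be the more standard choice than simple concatenation, but the point is the same: without a secret, the anonymisation is only cosmetic.)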