Best website statistics package?

goodminton asks: "As the webmaster for a small but growing e-commerce site, I'm becoming increasingly interested in the quality of our site metrics. We currently use a JavaScript-based counter that provides good but basic information; however, a recent Slashdot posting has me thinking the stats from our system may not be as accurate as we'd like. What do you think is the best website statistics package, and why?"
  • Google (Score:4, Interesting)

    by $exyNerdie ( 683214 ) on Saturday May 27, 2006 @12:05AM (#15414384) Homepage Journal
    You might want to try this: http://www.google.com/analytics/ [google.com].

    It's free!!
    (you can register for an invite until it becomes publicly available)
  • by perlionex ( 703104 ) * <[joseph] [at] [ganfamily.com]> on Saturday May 27, 2006 @12:08AM (#15414390)

    For those subscribers using Slashdot's new discussion system, this link [slashdot.org] will work better.

    From the posting, though, I don't understand why you think your (JavaScript-based) stats would be inaccurate, since only about 1.34% of users had JavaScript disabled or unsupported.

    That said -- I personally use Analog [analog.cx], and although it does give some fairly useful statistics such as search engine terms, most popular directories, referrers, etc., I don't find it gives me a very high level of insight into surfing habits. A log analysis tool such as that may be a good starting point for you, though, if you don't currently do analysis of that sort.

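    [For the log-analysis route the parent describes, here is a minimal sketch of the kind of referrer mining a tool like Analog reports, assuming Apache's combined log format; the log path and the search-engine/parameter table are illustrative assumptions, not anyone's real configuration:]

        # Pull search terms out of the Referer field of an Apache "combined" log.
        # A rough sketch of a search-term report; the log path and the
        # engine/parameter map below are assumptions.
        import re
        from collections import Counter
        from urllib.parse import urlparse, parse_qs

        LOG_LINE = re.compile(r'"[^"]*" \d{3} \S+ "(?P<referer>[^"]*)"')
        SEARCH_PARAMS = {"google": "q", "yahoo": "p", "msn": "q"}  # engine -> query param

        terms = Counter()
        with open("access.log") as log:                # hypothetical path
            for line in log:
                m = LOG_LINE.search(line)
                if not m or m.group("referer") in ("", "-"):
                    continue
                url = urlparse(m.group("referer"))
                for engine, param in SEARCH_PARAMS.items():
                    if engine in url.netloc:
                        for query in parse_qs(url.query).get(param, []):
                            terms[query.lower()] += 1

        for term, count in terms.most_common(20):
            print(f"{count:6d}  {term}")
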
  • by Toveling ( 834894 ) * on Saturday May 27, 2006 @12:08AM (#15414391)
    Webalizer [mrunix.net]. Just feed it some nice Apache logs and let it do the talking. Or, if you're less of a command-line guy, I've heard Google Analytics [google.com] is great.
  • None (Score:5, Interesting)

    by Bogtha ( 906264 ) on Saturday May 27, 2006 @12:10AM (#15414397)

    If you are trying to find out how many people are visiting your site, or how popular particular browsers are, just give up now. No stats package can tell you that. Some pretend to, but it's snake oil.

    The basic problem is that not only are you fighting the nature of a stateless protocol, but the things that skew your numbers (proxies, caching, etc.) do so by an unknowable amount. Some things inflate your numbers; some things hide visitors from you. They don't cancel each other out, whatever some people tell you (just think about it). In some cases, your visitors might not even communicate with your server at all.

    Web statistics are good for measuring server load and monitoring things like search terms people use to find your site, inbound links from referrers, etc. What you will find is that you can install any old stats package, and it will give you lots of pretty charts and numbers, but at the end of the day, you might as well make the numbers up, because they don't reflect reality. And yet for some reason, people still like having them, even when they know the numbers are totally wrong. I have yet to figure out why.

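    [The server-load point above is worth stressing: requests per hour is one of the few numbers a log actually measures directly, rather than infers. A minimal sketch, assuming a standard Apache common/combined log; the path is an assumption:]

        # Count requests per hour straight from the %t timestamp field.
        # Unlike "visitors", this measures real load on the server.
        import re
        from collections import Counter

        TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}):(\d{2})")  # e.g. [27/May/2006:00

        hits_per_hour = Counter()
        with open("access.log") as log:                # hypothetical path
            for line in log:
                m = TIMESTAMP.search(line)
                if m:
                    hits_per_hour[(m.group(1), m.group(2))] += 1

        # Lexicographic sort; good enough within a single month of logs.
        for (day, hour), hits in sorted(hits_per_hour.items()):
            print(f"{day} {hour}:00  {hits:8d} requests")
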
  • by ChaosDiscord ( 4913 ) * on Saturday May 27, 2006 @01:02AM (#15414568) Homepage Journal
    Just because the information you get is flawed doesn't mean the information is worthless. Most data in the real world is deeply flawed, and yet useful information can be extracted and useful trends determined. Sure, your log files will be skewed by who chooses to participate (that is, who isn't caught by caches and proxies, or, if you're using JavaScript, who is allowing the JavaScript in question). But any survey is skewed by those who choose to participate.

    Throwing your hands up in the air and declaring it all garbage because you cannot be sure is foolishness. Know the limitations of your tools, accept the error, and take what you can get.

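    [In that spirit, trends are the thing to extract: if the unknown bias is roughly stable from week to week, the direction of the curve is still meaningful even when the absolute counts are not. A toy illustration; the daily figures are invented:]

        # Smooth noisy daily counts with a moving average; the *level* may be
        # off by some unknown factor, but the trend direction still shows.
        daily_hits = [980, 1020, 1150, 990, 1300, 1280, 1420, 1390, 1500, 1610]

        def moving_average(values, window=7):
            return [sum(values[i - window + 1:i + 1]) / window
                    for i in range(window - 1, len(values))]

        for day, avg in enumerate(moving_average(daily_hits), start=7):
            print(f"day {day}: 7-day average = {avg:.0f}")
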
  • Er, nope (Score:3, Interesting)

    by cliveholloway ( 132299 ) on Saturday May 27, 2006 @02:48AM (#15414784) Homepage Journal
    We use Urchin - now "Google Analytics". Unless you want to delete cookies on every page hit and use the Web Developer Firefox plugin to remove hidden fields from every form submission, we pretty much have you tracked. This isn't 1995, y'know...
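
    [For the curious, the general mechanism behind cookie-based trackers like Urchin is simple: mint an ID on first visit, read it back on every later request. A self-contained sketch of that generic technique, not Urchin's actual code; every name here is invented:]

        # Minimal cookie-based visitor tracking (the generic technique, not
        # Urchin's real implementation; all names here are invented).
        import uuid
        from http.cookies import SimpleCookie

        def app(environ, start_response):
            cookie = SimpleCookie(environ.get("HTTP_COOKIE", ""))
            extra_headers = []
            if "visitor_id" in cookie:
                visitor_id = cookie["visitor_id"].value    # returning visitor
            else:
                visitor_id = uuid.uuid4().hex              # first visit: mint an ID
                extra_headers.append(("Set-Cookie",
                    f"visitor_id={visitor_id}; Max-Age=63072000; Path=/"))
            # A real tracker would log (visitor_id, path, referrer, timestamp) here.
            start_response("200 OK",
                           [("Content-Type", "text/plain")] + extra_headers)
            return [f"visitor {visitor_id}\n".encode()]

        if __name__ == "__main__":
            from wsgiref.simple_server import make_server
            make_server("", 8000, app).serve_forever()
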
  • web mining (Score:1, Interesting)

    by Anonymous Coward on Saturday May 27, 2006 @06:40AM (#15415209)
    For me, the most interesting feature of a statistics package is being able to do web mining. Very few do this; the only one I know of is the one I am using (metriserve web analytics [metriserve.com]). Basically, it allows you to find hidden links between pages of your site even when they do not directly link to each other. This gives interesting results on one of my pr0n sites. You would not believe the hidden relations you can find between models, poses, etc. as surfed by my visitors. I am sure the data could actually be used for an interesting PhD study on the matter ;-) Or maybe I should implement a nice "we also recommend..." feature.
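
    [What the parent calls "hidden links" can be computed from session data alone: pages that co-occur in the same visits are related even if nothing links them. A bare-bones sketch with invented session data; in practice you would reconstruct sessions from your logs, by cookie or by IP plus user-agent within a time window:]

        # Rank page pairs by how often they appear in the same session --
        # the raw material for a "we also recommend..." feature.
        from collections import Counter
        from itertools import combinations

        sessions = [                                   # invented example data
            ["/models/a", "/models/b", "/poses/standing"],
            ["/models/a", "/models/b"],
            ["/models/b", "/poses/standing", "/models/c"],
        ]

        pair_counts = Counter()
        for pages in sessions:
            for pair in combinations(sorted(set(pages)), 2):
                pair_counts[pair] += 1

        for (page_a, page_b), count in pair_counts.most_common(5):
            print(f"{count}x  {page_a}  <->  {page_b}")
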
  • Re:Er, nope (Score:2, Interesting)

    by mabinogi ( 74033 ) on Saturday May 27, 2006 @07:28AM (#15415287) Homepage
    > Unless you want to delete cookies on every page hit and use the Web Developer Firefox plugin to remove hidden fields from every form submission, we pretty much have you tracked. This isn't 1995, y'know...

    Or just completely block *.google-analytics.com, because Urchin is the single most annoying thing on the internet.

    I'm so sick of waiting for pages to load, only to see "contacting google-analytics.com" in the status bar.

    It's the one thing that made me install the adblock extension. I don't care if you're tracking me. I do care if you're ruining my browsing experience.
  • by Ankh ( 19084 ) * on Saturday May 27, 2006 @08:51PM (#15418285) Homepage
    As with most things, it's not really that one package is "better" than another so much as that one might be more useful to you at any given time.

    I use my own package when a Web site is smaller (say, below a million hits per month) because I would rather sample some actual sessions and see where people went and what they were searching for than get an overview. If you see people are searching for Argyle Socks and are finding your page about the Duke of Argyll, you might want to add an extra page and link to it, "if you were looking for...".

    The statistic you most want is the searches that might have brought people to your Web site but didn't, and that's the one you can't easily find!

    For a site getting under 1,000 hits per day, look at the server logs in detail at least once a week; make navigation easier, add more content where it looks promising, think about why some areas don't get traffic, and so on.

    When you're getting 10,000 hits/day, unless most of them are for graphics, the data can become overwhelming. And if you're over 100,000 hits per day you probably need to go to the sorts of reports that give you a very broad overview.

    A link checker and a 404 report can be useful (a sketch of the latter follows after this comment) -- Cool URIs don't change! [w3.org]

    Oh -- for anyone interested, although I do have hololog [sf.net] set up on, for example, my words and pictures from old books [fromoldbooks.org] Web site (in a private directory, sorry), the SourceForge page doesn't have a download, mea culpa. If it looks useful to anyone, I've shared copies of "hololog" in the past. It could do with some cleaning up, alas!

    Liam

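    [On the 404 report mentioned above, a bare-bones version takes only a few lines over a standard Apache common/combined log; the path is an assumption:]

        # Which missing URLs are people -- or stale inbound links -- actually
        # hitting? Count 404s per requested path.
        import re
        from collections import Counter

        REQUEST = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

        missing = Counter()
        with open("access.log") as log:                # hypothetical path
            for line in log:
                m = REQUEST.search(line)
                if m and m.group(2) == "404":
                    missing[m.group(1)] += 1

        for path, count in missing.most_common(20):
            print(f"{count:6d}  {path}")
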