Long ago we measured traffic to our site via the log files of our webserver. That was pretty reliable, because you could measure exactly which files were requested by whom: you woke up in the middle of the night because a certain trend in the logs made you make abrupt changes to your site.
Nowadays these logs are mostly seen as interesting for “technical issues” and not so much for analysing website traffic.
But… a couple of things have happened since then:
- rise of the client-side tracker – First came web-based “counters”, which simply insert an element in a webpage and count the number of requests along with some meta information (and which we placed on our sites mostly because they were cool as graphical elements). Then Google Analytics came on the scene and gained a huge following, followed by companies using the same technology, e.g. Clicky, or the site stats inside every WordPress blog running JetPack. It has gotten to the point where these “counts” (page views, uniques), which were merely “cool” long ago, are now actually used to determine site value, marketing strategy and so on, and are treated by a lot of folks as core metrics: on Flippa, for example, they are often requested by potential buyers.
- rise of the client-side anti-tracker – At the same time, users actively started to block these trackers. 15,527,966 users run AdBlock Plus for Firefox, 800,000 run Ghostery, 300,000 run DoNotTrackMe, and there are by now so many blocking products, installed by default by so many users (including blocking certain hosts directly from their hosts files), that I must conclude the only thing client-side trackers are tracking nowadays is users who are not technically able to install plugins, plus the bots. Which Firefox users do not have the most popular plugins installed? The Firefox users who have no clue what a plugin is!
What is your estimate of the number of users that have a client-side blocking plugin installed?
p.s. I expect a rise in statistical cloud apps that trigger on server-side calls (so e.g. a request for my page1.aspx triggers, on the server side, a call to the statistics server of whatever cloud app visualizes the stats), as opposed to cloud apps you sync your webserver logs with to run analytics on a non-real-time basis (and, as a note: aimed at “the masses”). So e.g. on an Nginx-based server, some app that monitors the ngx_http_log_module output in real time (or an extension of it) and interfaces with a stats service on a remote server, or one layer higher, such as http://code.google.com/p/php-ga/.
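The real-time, server-side setup above can be sketched in a few lines of Python: follow the access log file (the one ngx_http_log_module writes) and POST each parsed hit to a remote stats endpoint. This is a minimal sketch, not a product: `stats.example.com` is a placeholder for whatever cloud service would visualize the data, and the regex assumes nginx's default “combined” log format.

```python
import json
import re
import time
import urllib.request

# Placeholder endpoint -- stand-in for whichever stats cloud service you use.
STATS_URL = "https://stats.example.com/collect"

# nginx "combined" log format, as written by ngx_http_log_module by default.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Turn one access-log line into a dict, or None if it does not match."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None

def follow(path):
    """Yield new lines appended to the log file, like `tail -F`."""
    with open(path) as f:
        f.seek(0, 2)              # jump to the end; only new traffic matters
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)   # nothing new yet, wait a bit
                continue
            yield line

def forward(hit):
    """POST one parsed hit to the stats service as JSON."""
    req = urllib.request.Request(
        STATS_URL,
        data=json.dumps(hit).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# To actually run it against a live server:
# for raw in follow("/var/log/nginx/access.log"):
#     hit = parse_line(raw)
#     if hit:
#         forward(hit)
```

The same idea would work one layer higher (inside the application, php-ga-style) by calling `forward()` directly from the request handler instead of tailing the log.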
In fact… this is what I expect to land on my cool page somewhere soon, from some hip company:
It would give me both real-time info and longer-trail info, and it would give me the technical information I need right next to the other essential information needed to understand what is going on. (I have not dived deep enough into OWA to know whether it already does this.)
I notice that a lot of websites, even from large companies, do not use this model, since “long-tail” 404s still exist all over the place. 404s might be a good measurement of ‘fail’: the main pages on a domain might have some fancy 404 page, but long-forgotten subdomains that no longer exist are often deserted places. There are also places that never existed but might be interesting: when I type http://forum.apple.com I would expect a different reply than the one I get (of course it says the server does not exist), but what if 1,000 users typed this into their browsers? Would you ever notice that on your own domains?
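As a minimal sketch of measuring that long-tail ‘fail’ metric yourself, here is a Python snippet that tallies 404s per path from a combined-format access log. Note the limitation it shares with every log-based approach: it only sees requests that actually reach your server, so the forum.apple.com case above (a subdomain that does not resolve at all) would never show up; catching that would take DNS-level data.

```python
import re
from collections import Counter

# Pull the request path and the status code out of a combined-format log line.
STATUS_RE = re.compile(r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) ')

def tally_404s(lines):
    """Count how often each path returned a 404 in an iterable of log lines."""
    counts = Counter()
    for line in lines:
        m = STATUS_RE.search(line)
        if m and m.group("status") == "404":
            counts[m.group("path")] += 1
    return counts

# Usage against a real log file:
# with open("/var/log/nginx/access.log") as f:
#     for path, n in tally_404s(f).most_common(20):
#         print(n, path)
```

Running this periodically and watching which forgotten paths climb the list is exactly the kind of signal the fancy dashboards above would surface, minus the hip company.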