Eric Peterson has an interesting post on his blog about a couple of ideas he’s had for measuring ‘Web 2.0’ usage – by which he means sites that use AJAX for in-page events and, in particular, mash up other apps or web services.
Essentially, Eric suggests two methods:
- Web service apps (e.g. Google Maps) accept a user identifier when they’re called, and expose an API for extracting usage data for that identifier at a later date, so it can be rolled into the calling site’s usage data for that user
- Apps expose an API for passing in a tag destination URL and a user identifier; the app then pings that URL with activity data for the user, so that the app can contribute usage data to whichever web analytics tool the mashup’s creator is using
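The second method amounts to the embedded app firing a beacon request at the host site’s analytics endpoint. A minimal sketch of what that ping might look like – all the names here (`buildPing`, the `uid`/`ev`/`ts` parameters, the endpoint URL) are my own illustration, not anything Eric specifies:

```javascript
// Hypothetical sketch of the "ping" approach: the embedded app builds a
// tag request carrying the user identifier it was handed, plus details of
// the in-app event, and fires it at the mashup creator's analytics URL.
function buildPing(endpoint, userId, event) {
  const params = new URLSearchParams({
    uid: userId,                 // user identifier passed in by the mashup
    ev: event.name,              // e.g. "map_drag", "zoom"
    ts: String(event.timestamp), // when the event happened
  });
  return endpoint + "?" + params.toString();
}

// In a browser the app would typically request this URL as an image
// (a tracking pixel) so it works cross-domain:
const url = buildPing("https://analytics.example.com/ping", "user-123",
                      { name: "map_drag", timestamp: 1136073600 });
console.log(url);
```

In practice the endpoint, parameter names, and user-identifier format would all need standardising – which is exactly the gap Eric’s proposal is trying to fill.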
I think both of these ideas have merit, but the second seems far more practical to me: the first would impose an enormous overhead on retrieving the usage data, since the analytics tool would have to pass in a list of user IDs that could run to millions of entries.
We went through this kind of pain at WebAbacus when a client implemented a SOAP API for retrieving CMS data; because the API returned information about one document at a time, it was appallingly slow when you wanted to retrieve data about thousands of documents in one go.
The challenges facing the second approach are also considerable, but manageable. The key one would be constructing a tag request with the right information in it. The method Eric suggests is too primitive; a better approach would be to pass through the name of a JS file for the remote app to include, plus the name of a JS function for it to call. Even then, capturing the individual events within the app would remain a problem.
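To make the callback idea concrete, here is one way the embedded app could wire it up, sketched under my own assumptions – `createReporter`, `trackMashupEvent`, and the event shape are all illustrative names, not a real or proposed API. The app loads the JS file it was told about, then routes its internal events through the named function:

```javascript
// Hypothetical sketch: the mashup tells the embedded app which function
// (defined in the JS file the app was asked to include) to call with
// activity data. The app wraps that lookup in a small reporter.
function createReporter(scope, callbackName) {
  // scope is the global object (window in a browser); callbackName is
  // the function the host's included JS file is expected to define.
  return function reportEvent(eventData) {
    const cb = scope[callbackName];
    if (typeof cb === "function") {
      cb(eventData); // hand the in-app event to the host's analytics code
    }
    // If the callback isn't defined (script not yet loaded, typo in the
    // name), the event is silently dropped – a real standard would need
    // to say what happens here.
  };
}

// Simulated usage: the host's JS file defines the tracking function...
const fakeWindow = {};
fakeWindow.trackMashupEvent = function (e) {
  console.log("tracked: " + e.name);
};

// ...and the embedded app fires its internal events through it:
const report = createReporter(fakeWindow, "trackMashupEvent");
report({ name: "marker_click" });
```

This still leaves the harder problem untouched: the app itself has to decide which of its internal events (drags, zooms, clicks) are worth reporting, and in what vocabulary – another thing a standard would have to pin down.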
Some of this commentary rather misses the point of Eric’s post, which is to propose standards for Web 2.0 analysis and reporting – something I wholeheartedly endorse.