October 09, 2017

The Electrification of Marketing

[Image: weave room]

At the tail end of the nineteenth century, electricity was starting to have a profound effect on the world. As dramatized in the excellent novel The Last Days of Night, and soon in the forthcoming film The Current War, Thomas Edison battled with George Westinghouse (the latter aided by Serbian genius/madman Nikola Tesla) for control over the burgeoning market for electricity generation and supply. The popular symbol of the electrical revolution is of course Edison’s famous light bulb, but arguably even more important was the humble electric motor.

The electric motor was so important because it revolutionized manufacturing, enabling factories to create assembly lines and realize huge efficiency dividends. The Ball Brothers Glass Manufacturing Company, for example, replaced 36 workers with a single electric crane for moving heavy loads across the factory where they made their famous Mason jars.

But for all the benefits of electric motors, many factories were slow to embrace the new technology. As this article from the BBC World Service’s “50 Things that Made the Modern Economy” podcast explains, by 1900, almost twenty years after Thomas Edison started selling electricity from his generation plants in Manhattan and London, only 5% of factories had switched from steam to electric power. Powering a factory with a steam engine was costly, complicated, and dangerous. So why the reluctance to move to electricity?

The reason lies in the way those factories were organized to take advantage of steam power generation. A typical nineteenth-century factory – a textile mill, for example – looked like the image above. Mechanical power was generated by a single large steam engine which ran more or less continuously, and was transferred to individual machines (such as looms or lathes) via a series of drive shafts, gears and drive belts. Because the power was being transferred mechanically, the machines were packed closely together. This, combined with the constant spinning of the drive shafts, made these factories very dangerous to work in; in 1900, over half a million people in the US (almost 1% of the population) were maimed in factory accidents.

Simply replacing the central steam engine with an electric motor did not deliver significant benefits – the drive shafts and belts to the machines still broke down, factories were still crowded, inefficient and dangerous, and the central motor (now powered by comparatively expensive electricity) still had to be kept running constantly.

To truly capitalize on electrification, factories had to reinvent themselves, replacing all their individual machines with versions that were powered by their own electric motors, with power transferred to them via unobtrusive wires rather than spinning drive shafts. In turn this meant that machines did not need to be so tightly packed together; factories could be reorganized to be more spacious and facilitate the flow of items, paving the way for the production line and improving factory conditions and safety. Ultimately, it was the qualitative transformation in the way things were made which was electrification’s biggest benefit.

Reorganizing the marketing factory

The story of electrification and how it impacted manufacturing in the first decades of the twentieth century provides an interesting parallel to the impact of data and AI on the marketing industry in the first decades of the twenty-first.

Today, many marketing organizations have adopted data in a similar way to how factories first adopted electricity: by applying it to existing business processes and ways of working. In direct marketing, the core processes of list-generation and campaign delivery have not changed fundamentally in fifty years – marketers build target audience lists, map messages to this list, deliver those messages, and then measure the response. The sophistication and complexity of all these steps have changed dramatically, but the process itself is still the same.

However, as electricity led to the development of new kinds of manufacturing machines, so data is leading to the development of new kinds of marketing machines, powered by AI. These new systems, which I have written about before, promise to transform the way that digital marketing is done. But just as before, getting there won’t be easy, and will require marketing leaders to embrace disruptive change.

The current ‘factory layout’ for many marketing organizations is based around individual teams that have responsibility for different channels, such as web, search, email, mobile and so on. These teams coordinate on key marketing calendar activities, such as holiday campaigns or new product launches, but manage their own book of work as a sequence of discrete activities. At Microsoft we’ve made progress in the last few years on bringing many of these teams together, and supporting them with a common set of customer data and common marketing automation tooling. But individual campaigns are still largely hand-crafted.

AI-driven marketing systems use a wide range of attributes at the customer level, combined with a continuous testing/learning approach, to discover which of a range of creative and messaging options should be executed next, for which customers, in which channels. They break down the traditional campaign-centric model of customer communications and replace it with a customer-centric, ‘always on’ program of continuous nurture. For these systems to work well, they need a detailed picture of the customer, including their exposure and response to previous communications, and they need a wide range of actions that they can take, including the ability to choose which channel to communicate in for a given message and audience.
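At its core, this ‘decide what to send next, then learn from the response’ loop can be sketched as a multi-armed bandit over (message, channel) pairs. The sketch below is an illustrative toy, not any vendor’s implementation – the arm names and the NextBestActionPicker class are invented for the example:

```python
import random
from collections import defaultdict

# Hypothetical "arms": each is a (message, channel) pair the system may choose.
ARMS = [("discount_offer", "email"), ("new_feature", "email"),
        ("discount_offer", "push"), ("new_feature", "push")]

class NextBestActionPicker:
    """Epsilon-greedy bandit: mostly exploit the best-performing arm,
    occasionally explore a random one to keep learning."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = defaultdict(int)     # times each arm was tried
        self.rewards = defaultdict(float)  # cumulative responses (e.g. clicks)

    def choose(self):
        # Explore with probability epsilon, or if we have no data yet.
        if random.random() < self.epsilon or not self.counts:
            return random.choice(ARMS)
        # Otherwise exploit: pick the arm with the best observed response rate.
        return max(ARMS, key=lambda a: self.rewards[a] / self.counts[a]
                   if self.counts[a] else 0.0)

    def record(self, arm, responded):
        self.counts[arm] += 1
        self.rewards[arm] += 1.0 if responded else 0.0
```

Real systems condition the choice on customer-level attributes (a contextual bandit) rather than learning one global ranking, but the exploit/explore/record cycle is the same.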

A fairly traditional marketing organization that is looking to evaluate the potential of AI-driven marketing will, prudently, lean towards trying the technology in a relatively limited pilot environment, likely choosing just one campaign or program in a single channel for their test. These choices make sense – few companies can easily try out new technology across multiple channels, for both technical reasons (i.e. wiring the thing up) and organizational reasons (getting multiple teams to work together).

But this approach is a bit like a 1900s factory owner deciding to replace just a single machine in the factory with an electric version. Dedicated (and expensive) wiring would have to be laid to power the machine; it would still be crammed in with all the others, so its size and design would be limited; and it would likely need a dedicated operator. In this environment, it would be unlikely that the single machine would be so transformatively efficient that the factory owner would rush out to buy twenty more.

And so it is with AI-driven marketing. A test within a single channel, on a single campaign, will likely generate modest results, because the machine’s view of the customer will be limited to their experience with that brand in that channel; its message choices will also be limited, since it can only communicate within the single channel. These problems are exacerbated by the expense of laying dedicated data ‘lines’ to the new system, and of building many creative variants, to give the system enough message choice within a single channel.

What’s needed is for AI-based optimization to be applied as an enabling capability in all marketing campaigns, across multiple channels and products. This requires significant investment in data and channel integration; but even more importantly it requires marketers, and marketing organizations, to operate differently. Digital advertising, CRM and e-commerce teams, and their budgets, need to be brought together; instead of marketers creating many discrete campaigns, marketers need to create more evergreen programs that can be continuously optimized over time. The marketing factory needs to be organized around the customer, not the product or channel.

This kind of model represents very disruptive change for today’s marketing organizations, as it did for yesterday’s factory owners. In the end, much of the rise of electrified factories a hundred years ago was due to the efforts of newcomers to the field such as Henry Ford, who jumped straight to an electrified production line to produce his Model T. Today’s marketing chiefs would do well to heed this lesson from history, as disruptors like Amazon, Tesla and Stitch Fix use process innovation to create streamlined, customer-centric marketing functions that are poised to exploit the transformative technology of AI.


October 24, 2011

Wading into the Google Secure Search fray

There’s been quite the hullabaloo since Google announced last week that it was going to send signed-in users to Google Secure Search by default. Back when Google first announced Secure Search in May, there was some commentary about how it would reduce the amount of data available to web analytics tools. This is because browsers do not make page referrer information available in the HTTP header or in the page Document Object Model (accessible via JavaScript) when a user clicks a link from an SSL-secured page through to a non-secure page. This in turn means that a web analytics tool pointed at the destination site is unable to see the referring URLs for any SSL-secured pages that visitors arrived from.
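To make the mechanics concrete, here is roughly what an analytics tool does with the Referer header it sees on the destination site – a minimal sketch, in which the function name referring_search_keyword is my own invention:

```python
from urllib.parse import urlparse, parse_qs

def referring_search_keyword(referer_header):
    """Extract the search phrase from a Google referrer URL, if present.

    referer_header is the HTTP Referer value the browser sent. For a
    secure-to-insecure click-through the browser sends nothing at all,
    so the tool can attribute neither the source nor the keyword.
    """
    if not referer_header:
        return None  # HTTPS -> HTTP referral: no referrer available
    parsed = urlparse(referer_header)
    if "google." not in parsed.netloc:
        return None  # not a Google referral
    terms = parse_qs(parsed.query).get("q")
    return terms[0] if terms else None
```

The first branch is the whole story of this post: once the search page is served over SSL and the destination is not, the function only ever sees None.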

This is all desired behavior, of course, because if you’ve been doing something super-secret on a secure website, you don’t want to suddenly pass info about what you’ve been doing to any old non-secure site when you click an off-site link (though shame on the web developer who places sensitive information in the URL of a site, even if the URL is encrypted).

At the time, the web analytics industry’s concerns were mitigated by the expectation that relatively few users would proactively choose to search on Google’s secure site, and that consequently the data impact would be minimal. But the impact will jump significantly once the choice becomes a default.

One curious quirk of Google’s announcement is this sentence (my highlighting):

When you search from https://www.google.com, websites you visit from our organic search listings will still know that you came from Google, but won't receive information about each individual query.

This sentence caused me to waste my morning running tests of exactly what referrer information is made available by different browsers in a secure-to-insecure referral situation. The answer (as I expected) is absolutely nothing – no domain data, and certainly no URL parameter (keyword) data is available. So I am left wondering whether the sentence above is just an inaccuracy on Google’s part – when you click through from Google Secure Search, sites will not know that you came from Google. Am I missing something here? [Update: Seems I am. See bottom of the post for more details]

I should say that I generally applaud Google’s commitment to protecting privacy online in this way – despite the fact that it has been demonstrated many times that an individual’s keyword history is a valuable asset for online identity thieves, most users would not bother to secure their searches when left to their own devices. On the other hand, this move does come with a fair amount of collateral damage for anyone engaged in SEO work. Google’s hope seems to be that over time more and more sites will adopt SSL as the default, which would enable sites to capture the referring information again – but that seems like a long way off.

It seems like Google Analytics is as affected by this change as any other web analytics tool. Interestingly, though, if Google chose to, it could make the click-through information available to GA, since it captures this information via the redirect it uses on the outbound links from the Search Results page. But if it were to do this, I think there would be something of an outcry, unless Google provided a way of making that same data available to other tools, perhaps via an API.

So for the time being the industry is going to have to adjust to incomplete referrer information from Google, and indeed from other search engines (such as Bing) that follow suit. It always seems to be two steps forward, one step back for the web analytics industry. Ah well, plus ça change…

Update, 10/25: Thanks to commenter Anthony below for pointing me to this post on Google+ (of course). In the comments, Eric Wu nails what is actually happening that enables Google to say that it will still be passing its domain over when users click to non-secure sites. It seems that Google will be using a non-secure redirect that has the query parameter value removed from the redirect URL. Because the redirect is non-secure, its URL will appear in the referrer logs of the destination site, but without the actual keyword. As Eric points out, this has the further unfortunate side-effect of ensuring that destination sites will not receive query information, even if they themselves set SSL as their default (though it’s not clear to me how one can force Google to link to the SSL version of a site by default). The plot thickens…
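If Eric’s reading is right, the scrubbing step amounts to something like the following – a speculative sketch of the mechanism, not Google’s actual code, and scrub_query_term is an invented name:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def scrub_query_term(redirect_url):
    """Blank out the q parameter's value while keeping the rest of the URL.

    Because the resulting redirect is served over plain HTTP, its URL
    shows up in the destination site's referrer logs -- so the site still
    sees google.com as the source, but the keyword arrives empty.
    """
    parts = urlparse(redirect_url)
    params = [(k, "" if k == "q" else v)
              for k, v in parse_qsl(parts.query, keep_blank_values=True)]
    return urlunparse(parts._replace(query=urlencode(params)))
```

This explains how Google can truthfully say destination sites “will still know that you came from Google, but won’t receive information about each individual query.”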


April 21, 2009

Google adds rank information to referral URLs

An interesting post from Brett Crosby appeared on the official Google Analytics blog last week, in which he announced that Google is to start introducing a new URL format in its referring click-through URLs for organic (i.e. non-paid) results. From Brett’s post:

Starting this week, you may start seeing a new referring URL format for visitors coming from Google search result pages. Up to now, the usual referrer for clicks on search results for the term "flowers", for example, would be something like this:

http://www.google.com/search?hl=en&q=flowers&btnG=Google+Search
Now you will start seeing some referrer strings that look like this:

http://www.google.com/url?sa=t&source=web&ct=res&cd=7&url=http%3A%2F%2Fwww.example.com%2Fmypage.htm&q=flowers
Brett points out that the referring URL now starts with /url? rather than /search? (which is interesting in itself in its implication for the way Google is starting to think about its search engine as a dynamic content generation engine); but the really interesting thing, which Brett doesn’t call out but which was confirmed by Jason Burby in his ClickZ column today, is the appearance of the cd parameter in the revised URL, which indicates the position of the result in the search results page (SRP). So in the example above, where cd=7, the link that was clicked was 7th in the list.
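Pulling the new parameters out of a referrer is straightforward with standard URL parsing – a minimal sketch, where the function name organic_rank_and_keyword is mine, not any vendor’s API:

```python
from urllib.parse import urlparse, parse_qs

def organic_rank_and_keyword(referer_url):
    """Extract the result position (cd) and query (q) from Google's
    newer /url? style referrer. Returns (rank, keyword), with None for
    anything missing -- the older /search? format carries no cd at all."""
    query = parse_qs(urlparse(referer_url).query)
    rank = query.get("cd", [None])[0]
    keyword = query.get("q", [None])[0]
    return (int(rank) if rank else None, keyword)
```

A tool that logs this pair per visit can report, for each keyword, the SRP position visitors actually clicked from.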

As Jason points out, this new information is highly useful for SEO companies, who can use it to analyze where in the SRPs their clients’ sites are appearing for given terms. Assuming, of course, that web analytics vendors make the necessary changes to their software to extract the new parameter and make it available for reporting (or, alternatively, you use a web analytics package that is flexible enough to enable you to make this configuration change yourself).

As you can see from the example above, there are various other new parameters that are included in the new referring URL, which may prove useful from an analytics perspective (such as the source parameter). It’s also worth noting that whereas the old referring URL is the URL of the search results page itself, the new URL is inserted by some kind of redirection (this must be the case, since it includes the URL of the click destination page).

Using a redirect in this way means that as well as providing more information to you, Google is now also capturing more information about user click behavior, since the redirect can be logged and analyzed. Crafty, huh?


