Tuesday, 28 February 2012

Google+ is backwards. I can't believe it took me this long to notice.

I've said before that I like Google+. I still like it even though I'm not really using it for the moment, and that's because most other people seem to have said 'meh' and carried on regardless with Facebook.

What Google+ has been quite good at is grabbing some smaller communities for whom look and feel matters. My circles have fewer marketers than Twitter, but more data visualisation people. Photographers have jumped on. There are a lot of sports enthusiasts (yes, I'm talking about paragliding again) sharing photos and video.


Circles were supposed to make those distinct groups work, but they don't. Google+ is engineered backwards, on a push rather than pull model.

Take yesterday's Wallpapering Fog post. I shared that on Google+ and I shared it publicly, because I'd like as many people as possible to read it (let's face it, we're pretty much all vain that way). It was a post about analysis software and so probably most interesting to my data visualisation circles, but if I restrict it to them, then people who follow me, but whom I haven't placed in a circle, can't see it.

So I write a blog post on marketing or software and post it publicly. Everybody who follows me sees it.

A few people follow me on Google+ because I also post about paragliding. They don't want to read about Visual Basic and Excel. Oddly, some people do, but the paraglider pilots don't.

Circles are absolutely no help to divide up the content you post. They were supposed to let you have multiple online 'personalities' but they don't, because most of the time you want to post publicly and if you do that, circles go out of the window. You're forced to post primarily on one topic and have people follow you for that reason. If you were very polite, you could put everybody who follows you into a circle and split up the content that way, but it would be far too much effort. It would also end the asymmetry of allowing followers that you don't follow back.

There's a solution, but it goes to the heart of how Google+ works and isn't a quick fix. We don't just need circles - circles aren't enough to control our content - we need topics.

What if you visited a Google+ profile and it said that the owner of the profile (see if you can guess who the owner is) posts on marketing, data analysis, paragliding and politics? When you add them to a circle, you get a quick choice to subscribe to some of those topics, or all of them. Instantly, you've segregated that user's online personalities and allowed them to post as much as they like on any topic, without bombarding followers of other topics with content they're not interested in. All it would need is for the user to be asked, at the point of posting, "what's this post about?" and to be presented with a list of their topics.
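The mechanics are simple enough to sketch in a few lines of code. This is purely illustrative (made-up names, nothing to do with how Google actually builds anything): each post carries one topic, and a follower only receives posts for the topics they've subscribed to.

```python
# Sketch of a topic-based follow model: followers subscribe to a subset
# of a profile's declared topics and only see posts that match.
class Profile:
    def __init__(self, name, topics):
        self.name = name
        self.topics = set(topics)   # topics this profile posts about
        self.followers = {}         # follower name -> subscribed topics

    def follow(self, follower, topics=None):
        # Subscribing with no explicit topics means "all of them"
        wanted = self.topics if topics is None else set(topics) & self.topics
        self.followers[follower] = wanted

    def post(self, topic, content):
        # Deliver only to followers subscribed to this topic
        return [f for f, wanted in self.followers.items() if topic in wanted]

neil = Profile("Neil", ["marketing", "data analysis", "paragliding", "politics"])
neil.follow("pilot_pete", ["paragliding"])
neil.follow("data_dan", ["data analysis", "marketing"])

print(neil.post("paragliding", "New ridge soaring video"))    # ['pilot_pete']
print(neil.post("data analysis", "Yesterday's software post"))  # ['data_dan']
```

The pull model falls straight out of the data structure: the paraglider pilots never see the Visual Basic posts, and nobody had to build a second account to make that happen.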

Just think, you could follow your friends but untick the 'babies' topic. Now that would be fabulous.

You'd need a way to be informed if somebody you follow changes their topics, but that's easy enough as a separate news feed. It would even be an interesting little aside, "oh look, Paul's started posting about sailing, I didn't know he did that."

Topics would shift Google+ to a pull model where you consume the content that you're interested in, which has always been key to how the web works. At the moment, all users are broadcasters. Circles don't fix that.

Monday, 27 February 2012

Why VBA macros got everywhere

Last week, I wrote a post that suggested Visual Basic for Applications (VBA) could be declining in importance for today's analysts. I do think that it is and that it will continue to do so, because it's not evolving to meet the needs of bigger data, or to compete with new and better ways of building dashboards.

VBA has got a valuable lesson to teach though. Never mind that it may now be feeling the pressure from new competitors, how on earth did a BASIC programming language that was bolted to the back of Microsoft Office - and particularly Excel - get to be so important in the first place?

I touched on one of the reasons in that previous post.

"it will let you do things that are otherwise the preserve of IT, which should be the ambition of any good analyst. If you need IT to sort data out for you, then you've failed."

VBA is a fantastic tool for empowering analysts to build their own solutions to problems. It gives analysts the power to create innovative new bits of kit without learning the sort of heavyweight programming that is the preserve of full-time coders with computer science degrees. What analysts produce in VBA - and I speak from personal experience here - is quite often horrifying to their IT departments. Even very good code by analyst standards is a world away from the way that a good programmer might choose to solve a problem. For one thing, no programmer worth the name would have started their build in VBA.

The thing is, even with that coding deficiency, VBA works. It makes an awful lot of businesses run. (And along the way, it's completely hamstrung Microsoft with Excel upgrades, because it's too embedded in too many places to change it now.)

You could see sloppy programming as a failure on VBA's part; as evidence that all these macros should have been built by IT in a proper language, with version control and a detailed specification. I disagree. I'm an analyst, not an IT person and so I think what VBA did is amazing and we should be trying to repeat it.

VBA survived and it prospered. It did that because it met a need, and it's a need that large companies in particular go out of their way to stop other software from meeting. Excel was the Trojan Horse that put IT capability into the hands of people who aren't supposed to have it. As VBA starts to show its age, we're in danger of drifting towards a world where centralised IT departments control access to data and access to the tools that can work with it.


There's no Trojan Horse yet for Big Data. Sure, there's free software which can work with it, but as a junior analyst, you're likely to struggle to persuade IT that they need to unlock the admin account on your PC so that you can install R, MySQL and FileZilla (and especially that you also need some space on an SQL server). Getting software paid for will be even harder, as the prevailing attitude is still one of, "you've got Office and that's all you need." The majority of IT departments never liked VBA in the first place, because macros that they didn't build crash and cause problems that the IT department is asked to fix. Never mind that a few macros which don't crash are saving hundreds of man hours per year in the finance department.

I should say at this point that I'm not trying to give IT procurement a kicking. What I do want to do is recognise that there are smaller, incredibly useful tools, which don't need a long IT build and which we can't always specify precisely at the start. They need to evolve and they need to be developed by the people who work with data and spend time with clients. By the people whose job it is to recognise opportunities for data analysis and exploit them. VBA did that. As VBA ages, what's going to do it in future?

There are three ways that this could go. One is depressing and the other two are interesting.

First, the depressing one. The age of amateur coding within large companies could be coming to a close. I don't think this is all that likely as the benefits are too great, but we might be entering a phase where IT controls access to any kind of developer tools, before the pendulum swings back the other way. What will swing the pendulum back, is larger companies realising that they're taking a pasting from smaller and more agile competitors, where analysis teams are able to run with their ideas.

As a second possibility, somebody could develop the next VBA Trojan Horse. That somebody won't be Microsoft, which is unfortunate because in Office, they've still got the capability to deliver it. Microsoft currently seem most concerned with creating tools for centralised IT to use, which is why from an end-user point of view, all of Microsoft's BI tools are crap. If the Trojan Horse comes, I think it's more likely to be from a new developer delivering a platform that doesn't need admin rights to install on an analyst's PC. That platform could well be cloud based, which is awkward where data is highly confidential, but not an insurmountable problem. In the same way as for VBA, by the time people who are inclined to centralise IT processes work out what's going on, it will be too late.

The final possibility is by far and away my favourite and I think, also the most likely. We could finally recognise the benefits of giving all sorts of teams - not just analysts - some control over the software that they choose to use to do their job. When you think about it, the way we look at software currently is awfully nannying...

"Here's a PC, it's got Office and a web browser on it. That's what you get."

"I could do a much better job with a copy of xxx"

"Write a business case that costs more in terms of your time invested than just paying for the software would have, we'll think about it and get back to you in three months. Probably with a no."

Of course you still need some central control, but there are huge benefits to a flexible approach to software. On a factory production line, you use the tools you're given, but the companies we work in aren't a production line. A better analogy would be a construction site, where you have all sorts of skilled technicians doing different jobs and where you wouldn't dream of telling the carpenter that he can't use his choice of chisel, because it's non-standard.

The CIA recently hinted that it might be heading in a more flexible direction, when it told vendors that it wants to start paying on a metered "pay as you go" basis for its software. You'd do that so that you can install many and different pieces of software and pay for the good ones that end up being used a lot. You'd do it so that you don't have to enforce the same few tools across multiple departments doing different jobs.

We don't know how many analysts the CIA has or what its budget is, because both are classified, but some old guesses put it at around $27bn. The Twin Towers and quite a bit of inflation have happened since then, so I'd say it's a fair bet that we're looking at well above $30bn. That's a lot of analysts and a lot of software. Definitely worth keeping an eye on how they choose to do procurement.

If I were a software developer, I'd be looking for Trojan Horses to sneak my product past IT. Tableau Public is a nice idea, which aims to create critical mass from outside companies by letting bloggers use the software for free. It's not quite there though... Google Docs is probably closest to the cloud idea of not requiring an install and could be the future, but it's nowhere near mature enough for use by analysts. Just a good toy for the minute.

And for me, as once again a small cog in the enormous WPP wheel, I'm hoping that marketing can move to the more flexible software model outlined by the CIA. Back to the original argument, we should recognise what VBA does so well and look for ways to make it happen again. Give staff responsibility for knowing what tools they need and let them do their job even better.

Wednesday, 22 February 2012

Pinterest's interesting. But...

The next darling of the social web has arrived and its name is Pinterest. A lot of people seem to like it and I have to say, I'm quite impressed. If you like paragliding videos (and let's face it, who doesn't?) then feel free to take a look.

Edit: Don't click that link to my favourite videos - it will 404. Pinterest's got a huge copyright problem that led to me deleting my account not long after I wrote this post. Whatever the legalities of Pinterest's attempts to transfer copyright violation responsibilities onto its users, I'm not going to use a website that's obviously built to share content from all over the web, which then attempts to absolve itself from responsibility for being used for that purpose.

Beyond a first play though, I'm finding Pinterest a bit disappointing. It's very dry looking and not like real life pinboards at all, which can be fun and have all sorts of things stuck to them at all sorts of angles, overlapping and scribbled on.

It's nice that you can put lots of cool stuff in one place for people to find but it would be so much better if Pinterest...
  • Took a leaf out of Twitter's book and restricted boards to the visual equivalent of 140 characters.
    Give people an actual board - with a set size - to fill up and when it's full, they need to start a new one. You'll get much higher quality content if you force users to refine their choices.
  • Let users play with their boards.
    Stick pins on sideways, stick them on top of other pins, let users go nuts and have fun, like the Parisians do with Post-it notes. Right now, all boards look the same; they're just a load of links to pictures and videos and don't have an identity of their own. Users need to be able to give their boards a personality.
  • Stick more than pictures to them, like people do with real pinboards.
    They write each other notes, they hang a calendar off them and they pin up tickets to events they're going to. It's part of the more flexible look and feel but, as a for instance, why can't visitors to my boards write on them and say hello if I stick a Post-it note pad on there?

You could do some of these things by pinning pictures to other people's boards, but the site doesn't make it obvious and the board structure is too clean to encourage you to play. Pinterest's nice, but if it's going to hold our attention, then it needs to be more than just a collection of links to pictures and video that people think are cool. We need to be able to make the boards look individual. We need to be able to play with it.

Tuesday, 21 February 2012

Nikon show their workings

Here we go again with another of Wallpapering Fog's occasional "Showing Your Workings" awards, reserved for ads which put viewers right at the heart of the creative agency's brainstorming process.

How do you sell the benefits of a camera that starts taking photos before you press the shutter button? Hmmm, that's tough. Get the flip charts out because it's brainstorm time. Let's play a game to get the creative juices flowing... What if it wasn't a camera? What if it was other random things like a kid's toy? Or a coffee machine? Then what would it do?

Hang on, this brainstorm might just make a reasonable ad!




Sorry about the American voiceover but I couldn't find the UK version on YouTube. (unless you're American, obviously, in which case you probably don't mind. Still, apologising's very British, so sorry anyway. Sorry.)

Tuesday, 14 February 2012

Losing touch... or why Excel and VBA won't cut it any more

Thinking through this post is making me feel old. There's going to be a lot of 'in my day' type reminiscing and I'm only 34. It's all this new fangled technology that's doing it. The world's changing fast. I hate people who say that the world's changing fast, but this time it's true.

I got my first proper job twelve years ago this month, as a junior analyst with a small econometrics consultancy and although the statistical techniques I use are roughly the same as back then, I've started to realise that our software tools are going through a revolution. Hence this post - I'd like to stop and look around for a minute to see what's happened.



Fairly quickly after starting that first job, I discovered that data processing in Excel was a hell of a lot faster and easier if you learned Visual Basic for Applications (VBA), so I did. With the help of our IT department and a lot of practice, I got pretty good and it went a long way to getting me promoted, because I could make dull work happen quickly, make other people's lives easier and build some nice interactive spreadsheet tools for our clients.

Up until fairly recently, if an aspiring analyst asked what they should do to get ahead at work, I'd say get good in Excel. Really good. And learn VBA. The first bit's still true, but VBA? Not so much.

The trouble is, VBA's getting left behind. It's still worth knowing some, but it's nowhere near as important as it was, because creating tools in Excel is nowhere near as important as it used to be. It's also not a good gateway into other types of programming because as a language, its structure is out of date. Although some programming skills are always transferable, you need to pretty much start again when you want to learn another language after VBA.

There's also a problem for the next generation in that they need to get luckier with where they start work to get exposed to the right kit. Everybody uses Excel, so at some point, every inquisitive analyst ends up in VBA. The new generation of tools probably won't be on your PC unless you decide to put them there.

So, you're ambitious and you're six months into your first analyst's role. What do you learn now? Even if your company doesn't use these, this is where I'd start. It's the kit I'm using (and still learning) and it's free, so you can pick it up as a CV booster without buying expensive software. If you're a junior analyst reading Wallpapering Fog then I hope this list might help. You also have excellent taste in blogs, so well done on that.

Let's look at what you need to be able to achieve, as an ambitious analyst...

Collect data

This is much more important than it used to be. Ten years ago, if you didn't have the dataset and the client didn't have it, then you'd have to buy it. Either way, almost certainly it would turn up on a spreadsheet or csv file. You often needed VBA macros to clean it up and make a tidy spreadsheet.

Now, some of your data will arrive like that (so a few simple macros are still handy) but very often, you'll want to trawl the web for it. Senior staff love it when you tell them you can scrape the data that they want off the web, automatically and for free. It will make you famous.
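To give a flavour of what "scrape the data off the web" means in practice, here's a minimal sketch that pulls a table out of a page's HTML. It uses Python's standard library purely for illustration (R can do the same job), and the page is inlined so the example is self-contained; in real use the HTML would come from fetching a URL, for example with urllib.

```python
# Toy scrape: extract the rows of an HTML table.
# In practice the html string would come from
# urllib.request.urlopen(url).read().decode()
from html.parser import HTMLParser

html = """
<table>
  <tr><th>Month</th><th>Spend</th></tr>
  <tr><td>Jan</td><td>120</td></tr>
  <tr><td>Feb</td><td>95</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False  # are we inside a <td> or <th>?
        self.row = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self.in_cell = True
        elif tag == "tr":
            self.row = []  # start collecting a fresh row

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self.in_cell = False
        elif tag == "tr":
            self.rows.append(self.row)  # row finished

    def handle_data(self, data):
        if self.in_cell:
            self.row.append(data.strip())

scraper = TableScraper()
scraper.feed(html)
print(scraper.rows)  # [['Month', 'Spend'], ['Jan', '120'], ['Feb', '95']]
```

Thirty-odd lines, no licence fee, and a job that would otherwise mean copying and pasting from a browser every week.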

You could learn a proper programming language, but we're statisticians, not programmers, so unless you want to do that for yourself anyway, you need a tool which is designed specifically to work with statistical data. For analysts, R is the new VBA. It's free and it's well worth the effort that it takes to learn.

Learning R gives you the same head-start that VBA gave ten years ago. You don't need to buy new software (just like VBA, which was always in your copy of Excel anyway) and it will let you do things that are otherwise the preserve of IT, which should be the ambition of any good analyst. If you need IT to sort data out for you, then you've failed.

If you get good in Excel and good in R, you'll be in a promising place from which to get your data assembled, which brings me onto...

Process data

Excel worked well when data came in thousands of rows. It still works well for lots of things and the latest versions have finally broken the 65k row limit, but there's a problem. If you throw lots of data at Excel - properly lots - you'll break it. Or wait forever for it to calculate. Excel isn't designed for processing databases and that's what we're working with now.

R can do it, but you need a good level of SQL too, even if it's just to make Access work properly. SQL turns up everywhere and it's easy to learn.

To be fair, you've needed SQL for ages but I keep coming across analysts who aren't comfortable using it. You can't get away with that any more.
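The appeal of SQL is that a couple of lines replace a sheet full of lookups. A sketch of the basic pattern, run here through Python's built-in sqlite3 so there's nothing to install (the sales figures are made up):

```python
import sqlite3

# An in-memory database standing in for the kind of table
# that would grind Excel to a halt at a few million rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, units INTEGER)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120), ("South", 80), ("North", 60), ("South", 40)],
)

# The whole 'pivot table' in one query
for region, total in con.execute(
    "SELECT region, SUM(units) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)
# North 180
# South 120
```

The same GROUP BY works unchanged whether the table has four rows or forty million, which is exactly the property Excel lacks.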

Build your models

Excel for the simple ones if you like - it's still a very powerful bit of software. For more complex statistical models, you need something else. Again, R is good. Some of the older competition like SAS (which is another reason to get a good SQL grounding) is starting to look very dated. It's also hugely expensive, particularly when compared to open source.

There's no way I'd adopt SAS now and it's being kept afloat by a legacy of systems embedded in big firms. If you end up using it, fine, but don't learn it unless you have to.

I'd go with R again. And I have.

Make some output


The days of the interactive Excel workbook, emailed to a client, are over. Or rather, they're not quite but they should be and soon will be.

You need to be able to make good looking charts and output in Excel (start here) so that you can illustrate your PowerPoint decks because unfortunately, PowerPoint is still an essential tool to know.

For interactive output, you want dashboards. There's only one bit of kit to learn for the moment and that's Tableau. If you can't persuade your company to buy you a copy, then get the free version and have some fun publishing to Tableau Public. Give it a couple of years and there are going to be some exciting roles around for people who can do good things with this piece of software.


So there you go. Learn a few macros by all means and definitely get very good with the front end of Excel, but take it from someone who's invested a lot of time in VBA and never uses it any more, there's a new world of software coming and you need to learn it. What worked ten years ago, won't cut it in another five.

The scary thing is, that means old buggers like me need to learn a load of new kit, and quickly. Back to the books...

Thursday, 26 January 2012

Rigging the Scottish Independence vote

Yesterday, Alex Salmond released his preferred wording for the question that will decide whether Scotland should remain as part of the UK.

"Do you agree that Scotland should be an independent country?"

And immediately, a cross-section of the research community cried foul.

Anyone who works in marketing research will be pretty familiar with the best ways to rig a survey question; it's how PR companies get exciting sounding press releases to plant their client's name in the newspapers. In a previous job, we "proved" that British women would swap a shopping spree for a night of passion and bagged the Daily Star front page under the headline, "Sex? I'd rather go shopping." Was it true? Frankly, who cares? Probably not, but it was a PR fluff piece that got us loads of free publicity and the survey was designed to produce exactly those kind of answers.

One of the more subtle ways to rig a survey is to ask people to agree with something. When in doubt, respondents have a tendency to agree with a statement, particularly if it's a complicated concept that they don't understand, or if they don't really care either way (which is handy if you're trying to rig a PR survey.) It's called Acquiescence bias and Wikipedia explains the issue well, with a few examples.

The Guardian today suggests that acquiescence bias could easily create a 9% swing in the response to a positively worded question. That's a lot and could easily decide a tight vote.
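It's worth making the arithmetic behind a swing like that concrete. The figures below are illustrative, not the Guardian's: suppose the decided respondents split 50/50, and some fraction of the sample has no firm view. If the wording nudges the unsure towards agreeing, the measured result moves with the question.

```python
def measured_yes(true_yes, unsure, agree_lean):
    """Share answering 'yes' to a survey question.

    true_yes:   share of decided respondents who genuinely favour 'yes'
    unsure:     share of respondents with no firm view
    agree_lean: how far the unsure lean towards agreeing (0.5 = no bias)
    """
    return (1 - unsure) * true_yes + unsure * agree_lean

# A 50/50 electorate with 18% of respondents unsure (illustrative numbers)
neutral = measured_yes(0.5, 0.18, 0.5)  # neutrally worded question
agree = measured_yes(0.5, 0.18, 1.0)    # "do you agree..." and the unsure all agree

print(round((agree - neutral) * 100, 1))  # 9.0 - the swing, in points
```

Nothing about the electorate changed between the two lines; only the wording did.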

The first thing I'd want to do with that question above, is to get rid of the word "agree", which loads it towards a "yes" response.

"Should Scotland be an independent country?"

Yes, or no? That's much better.

Better still, would be not to demand a yes / no response at all. There's still an issue with "Should Scotland..." because you could ask:

"Should Scotland be an independent country?"

Or... 

"Should Scotland remain part of the UK?"

You're not asking for agreement, but there's still an element of potential bias. Actually, this time the option to "remain" is likely unfair, as it invites respondents to stick with the status quo, which they will have a tendency to do when in doubt.

You could argue that this is pedantry (fun though, isn't it? And if nothing else, you know how to rig a survey now) but for me, it's very important. The one thing that you don't want from a referendum is the possibility that the answer is ambiguous and can be challenged. Alex Salmond's preferred question undoubtedly can be. 

For the same reason, politicians shouldn't be allowed to avoid questions about what they will do, for example if the vote is very close, by saying "it's hypothetical". Yes it is, but it's very, very important and we need to know up front what we'll do in that situation, not to argue the meaning of the result afterwards.

The best way to ask about independence is an option that won't sit well with politicians at all, because it isn't a yes or no question.

Which of these would you prefer for Scotland?

1. To be a country within the UK

2. To be an independent country that is not part of the UK

And if you're going to be really thorough, rotate the answers so that remaining in the UK only appears in the top slot half of the time.


Don't get me started on the suggestion of third options and "Devo-max". What are you going to do if they come out with 33% of the vote each? Have a bloody great row, that's what. Which is exactly where we're headed.

And finally... if you really want to know how to rig a survey, ask the experts.


Monday, 16 January 2012

Are you ready for Real Time Planning?

This is a reproduction of an article I've written for this month's Admap. They've chosen to title it 'Track the data on the dashboard', which I think rather misses the point but there you go. On Wallpapering Fog, I choose the headlines.


Real-time planning is a tactical tool that, through analysis of customer behavioural data, enables the short-term refinement of communications strategy, explains Neil Charles of MediaCom.


Real-time planning is one of those marketing terms that has a danger of meaning different things to different people, so I'd like to start off with a brief definition. For me, real-time planning means adapting marketing schedules on the fly, in reaction to new data about how customers behave.


The challenge that this type of adaptable marketing presents is to process new data and then react quickly enough to take advantage of opportunities as they are identified. However, too often, marketers expect data on its own to be enough, and that deep insights will reveal themselves if only we can bring different data sources together. Analysts have known for a long time that this is rarely the case, but large quantities of consumer data are seductive. Surely we could build a more efficient, more flexible media schedule if we had more up-to-date tracking of consumer behaviour?

Inevitably, the data that has provoked this new marketing philosophy flows from the web. We have faster access to more granular data than ever before, both in terms of marketing response through clicks and traffic tracking, and also the ability to ask questions of large online research panels cheaply, and to see the results in a very short period of time.

In practical terms, the web will largely be the focus for the outputs from real-time planning too. Traditional media - where the creative process and buying deadlines are longer - lend themselves much less readily to the type of quick schedule changes which allow us to take advantage of new data. This online focus should put real-time planning in context for marketers as an exciting new possibility, but one which must never be allowed to compromise an overall campaign. The Internet Advertising Bureau and PricewaterhouseCoopers put UK internet spend at £4 billion in 2010, accounting for 25% of all advertising spend. So while we may have the ability to monitor consumer behaviour (on the web at least) in almost real-time, only a part of the marketing budget is as agile as the response data that we can monitor. Of course, TV or press schedules can be adjusted, but once a commitment to TV has been made, barring disaster, the ads will run largely as planned.

Crucially, most ads should run largely as planned. We often preach the benefits of consistency in advertising and of seeing a brand campaign through, for its full benefits to be felt. Real-time planning doesn't replace the normal planning process, but is about tactical adjustments to a campaign that has been well planned in advance. If our understanding of new data is allowed to constantly re-shape a brand's proposition then we risk compromising our ability to put across a consistent message to consumers.

So, with real-time planning in context as a tactical, rather than strategic tool, and one that is based on very recent data about our customers, what do we need to do to make it work?

It is easy to generate and to track extremely large volumes of customer data. Over the past few years, dashboard software has become cheap and capable, and for a small IT investment, marketers can easily bring together their sales information every week, their own brands' and competitors' advertising spends, response data from off-line direct marketing channels and web tracking from a count of homepage visitors, right down to the number of clicks on individual Google keywords. We can also incorporate brand mentions and sentiment from social networks, track PR coverage both online and offline and conduct quick consumer research polls.

Collecting this data and visualising it, in the hope that it will provide insight and lead to greater marketing efficiencies usually results in disappointment. Large volumes of data, without analysis, are more of a hindrance than a help and, unfortunately, analytical insights very rarely jump off the page from a single chart.

Even where a relationship is obvious - such as when the number of brand term searches is charted against TV investment - what do we do with this information? It's not enough to know that TV is driving additional customers to search for us on Google. We need to know whether this means we should increase the TV budget, attempt to convert more of the online interest that TV is shown to be generating, do both, or possibly neither. After all, the current schedule appears to be working!

Rather than tracking large volumes of data and hoping to generate insights from them that will lead to more efficient marketing, the data that we choose to track should flow from analysis work that has already been completed. We need analysts to identify, from the vast quantity of available information, variables which are useful, to show how they can be used and then to hand that information to marketers.

A recent client example concerned a business which had no concrete data on overall sales volumes in its market, but many variables that might indicate whether they were rising or falling. Sales in the client's business were rising and they wanted to know whether - as some believed internally - this was bucking the market trend, or following it. The answer would have significant implications for advertising, since if the overall market wasn't getting stronger, then the most likely candidate to have caused the extra sales was a recent increase in marketing spend.

Large volumes of data were available that might provide insight, from a set of total market sales estimates that may or may not have been reliable, through to the number of Google searches for various brand and product terms and government economic data on the health of related sectors. The data sources contradicted each other and tracking alone raised many more questions than it answered.

A long-term econometric study into the drivers of sales had recently been completed, which identified a few key Google search terms that accurately mirrored market trends. This prior analysis flagged up data that was worth tracking and which could answer the question: No, marketing response didn't appear to have changed, and yes, increasing sales were being led by a market recovery.

The key point here is that the data we track to aid our marketing efforts, and which we aim to use to refine campaigns on the fly, should already have an identified purpose at the point when we decide to track it. Data that we do not yet understand in detail doesn't allow us to plan in real-time, it raises questions, which first need to be answered. Answering those questions is an analysis process that can take from a few weeks, to several months.

Together with data that is already well understood, a second ingredient is needed for real-time planning to work. We need to know beforehand what our likely reaction will be to a change in the data.

Marketing dashboards, metrics and tracking should be like the petrol gauge or the speedometer on a car. When they change, we already know why, and so we already know what to do about it. When the petrol gauge gets too low, we stop and fill up, to avoid an embarrassing call for a tow from the side of the road.

A lot of information about your car isn't displayed on the dashboard. Not because it isn't useful at all, but because it isn't useful minute-by-minute and would be a distraction from driving. This sort of information - on engine efficiency for example - is checked annually when the car is serviced. Marketing analysis should work the same way, meaning that we track what we already understand and can respond to, and ask mechanics (our planners and analysts) to react to more complex data once or twice a year. What the analysts discover might increase the scope of real-time planning as different data becomes well understood. To stretch the car dashboard analogy, we might gain new warning lights on the dashboard, but we are unlikely to start visualising large quantities of new data.

There is a subset of data that is useful even without prior analysis and, realistically, this is where a lot of brands already do 'real-time planning', whether it is labelled as that or not. Based on direct response data from clicks or phone calls, under-performing press insertions, search keywords and display placements can be pruned from a schedule in real time without any need for further analysis. Their budget is reallocated to ads with a better response rate, so all we need to know is that there is a better ad where we could be spending the money instead. It doesn't matter what the true return on investment of a display ad is; it only matters that we can move budget from an under-performing ad to a stronger one that generates more clicks for the same money.
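That reallocation rule needs only a relative comparison, not a true ROI estimate, which is why it works without prior analysis. As a minimal sketch of the idea, with invented placement names and figures (nothing here comes from a real schedule):

```python
# Hypothetical illustration: prune budget from the weakest placement and
# move it to the strongest, using clicks per pound as the only signal.
# No absolute return-on-investment estimate is required.

def reallocate(placements, cut_fraction=0.5):
    """placements: dict of name -> {'spend': pounds, 'clicks': count}.
    Moves cut_fraction of the weakest placement's spend to the strongest."""
    rate = {name: p['clicks'] / p['spend'] for name, p in placements.items()}
    weakest = min(rate, key=rate.get)
    strongest = max(rate, key=rate.get)
    moved = placements[weakest]['spend'] * cut_fraction
    placements[weakest]['spend'] -= moved
    placements[strongest]['spend'] += moved
    return placements

# Invented example schedule: three placements, equal-ish budgets.
schedule = {
    'display_a': {'spend': 1000.0, 'clicks': 50},   # 0.05 clicks per pound
    'display_b': {'spend': 1000.0, 'clicks': 200},  # 0.20 clicks per pound
    'search_x':  {'spend': 500.0,  'clicks': 150},  # 0.30 clicks per pound
}
reallocate(schedule)  # half of display_a's budget moves to search_x
```

The point of the sketch is that the decision rests entirely on the ranking of response rates; the true value of a click never enters the calculation.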

I should point out here that I'm absolutely not arguing against collecting marketing data. We have incredible quantities of information at our disposal to track and better understand consumers, and we should keep it, because we don't always know what will be useful until later. This article is about reacting to that data in real time, and day-to-day, those volumes of data become a hindrance rather than a help. Once we focus only on the data that we genuinely understand, a lot of what is available - from follower counts to web traffic - becomes surplus to requirements until somebody can work out why it's useful and what it means when it changes.

Even a measure of total sales, or of footfall to a store, is of dubious value to a marketer who doesn't know what impact the brand's marketing activity will have on that metric. A drop in sales presents two immediate possibilities: spend more on marketing to restore sales to where they were previously, or cut marketing in response to a worsening business environment. The data only becomes useful, and real-time planning only becomes possible, if econometric models or other in-depth response analyses are already in place. Then it is possible to estimate what marketing can achieve, given the data we're tracking, and to decide on the best course of action.

In summary, I would argue that real-time planning is a tactical, rather than a strategic tool. It creates efficiencies on smaller parts of an over-arching marketing strategy and allows us to quickly remove inefficient parts of the marketing mix, or to take advantage of short-term opportunities. It also allows us to increase the amount of marketing investment when that money is shown to be working harder than usual. The overall marketing plan, though, should be driven by longer term in-depth insight work and certainly shouldn't be compromised by trying to make too many short-term tactical gains.

To make real-time planning work, we need data and we need to have done some prior analysis. Monitoring data series that start a debate when they change can be helpful, but it doesn't allow us to make rapid changes to a marketing schedule. An upfront investment in statistical modelling, so that we fully understand the data that we monitor, allows us to predict the likely outcomes of making a change to the marketing schedule.

Real-time planning is about investing in analysis and preparing for situations that could be faced in the future, and if you haven't done that prior analysis, then you're not ready for real-time planning.

As a small illustration of these principles in action, Brilliant Media has a retail client for whom analysis has revealed that strong online sales can be generated by up-weighting search activity against a competitor's television schedule. The competitor's TV activity is largely predictable and the benefits of diverting the online interest it generates have been proven. As a result, competitor TV schedules are closely tracked and search terms are up-weighted to take advantage of the spikes in search volume that they generate. This adaptable schedule delivers real benefits in additional sales, and it arose from a piece of investigative analysis that identified data that was worth tracking and could be responded to very rapidly.
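The rule behind that example can be stated very simply: when a competitor spot is known to air, raise search bids for a fixed window afterwards, then drop back to baseline. The sketch below illustrates that shape only; the bid levels, uplift multiplier, window length and spot times are all invented for illustration and bear no relation to the client's actual figures:

```python
# Hypothetical sketch of the rule described above: up-weight search bids
# for a fixed window after each known competitor TV spot. All numbers
# are invented for illustration.

BASE_BID = 1.00       # assumed baseline bid
UPLIFT = 1.5          # assumed multiplier during a post-spot window
WINDOW_MINUTES = 30   # assumed length of the window after each spot

def bid_at(minute, competitor_spots):
    """Return the search bid at a given minute of the day, given a list
    of minutes at which competitor TV spots are expected to air."""
    for spot in competitor_spots:
        if spot <= minute < spot + WINDOW_MINUTES:
            return BASE_BID * UPLIFT
    return BASE_BID

spots = [120, 480]            # spots expected at 02:00 and 08:00
bid_at(125, spots)            # inside a window: up-weighted bid
bid_at(300, spots)            # outside any window: baseline bid
```

The prior analysis does the heavy lifting here: it established that the spikes exist and are worth chasing, so the real-time part reduces to a lookup that can run without further debate.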

In the end, I would argue that real-time planning is something of a contradiction in terms. We shouldn't attempt to plan in real-time; we plan and we analyse, so that we can react in real-time.

Reproduced with permission of Admap, the world’s primary source of strategies for effective advertising, marketing and research. To subscribe visit www.warc.com/admap. © Copyright Admap.