Thoughts about media in Canada

Data isn’t that evil, if you avoid common mistakes

Very few media planners came into the industry thinking “I’ll create the most epic pivot tables!” or “I just love databases” or “I really am eager to calculate the ROI of this campaign.”

Very few digital planners came into the industry thinking “Trafficking is going to be awesome” or “In 5 years, I see myself writing snippets of code for Google Analytics.”

The reality is: you don’t get into this industry thinking you’ll need a math or a programming degree. So when you realize that you kinda do (a bit less true for bigger offices, but still…), it’s easy to panic.

Now now… relax. Take a few seconds to calm down while looking at this cute cat picture, and then read below.


Data isn’t THAT evil.

Whether you crunch the numbers yourself or have your analytics team do it, it doesn’t have to be a nightmare. You just need to avoid these few common mistakes.

#1: Execute First, Think After

This is by far the worst thing that can happen to you and your data team. If you arrive at the end of the campaign having promised the client you would report on lift in purchase intent, but you ran Flash banners with no interaction, no site analytics access, no survey implemented, and no predetermined test per market… how are you supposed to track purchase intent?


All you usually need is a quick chat with your data guy before starting the campaign. Explain your KPIs & your campaign. Or just ask yourself “Will my CTR tell me anything about Purchase Intent?”

#2: Botch Trafficking

Yes, trafficking sucks. We all know. The thing is, if you bought 3 types of targeting/segmentation with a vendor in an effort to figure out which one drives the most conversions on site, you need at least 3 trafficking lines, one per targeting type. Trust me: the minutes you spend trafficking will save you hours later. You don’t want number-crunching nightmares, you don’t want your data guy to hate you with a passion, and you don’t want your client to think your report is terrible.
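To make the payoff concrete, here is a minimal sketch in Python. The line names and numbers are entirely invented, but they show what separate trafficking lines buy you at reporting time:

```python
# Hypothetical ad-server rows: one trafficking line per targeting type.
# All line names and figures are made up for illustration.
rows = [
    {"line": "Vendor_Retargeting", "clicks": 1200, "conversions": 48},
    {"line": "Vendor_Contextual", "clicks": 1500, "conversions": 15},
    {"line": "Vendor_Demo_F25_34", "clicks": 900, "conversions": 36},
]

# Three lines: you can see which targeting actually converts.
for r in rows:
    cvr = r["conversions"] / r["clicks"]
    print(f'{r["line"]}: {cvr:.1%} conversion rate')

# One combined line: all you would ever see is the blended rate,
# and the question "which targeting works?" becomes unanswerable.
total_clicks = sum(r["clicks"] for r in rows)
total_conv = sum(r["conversions"] for r in rows)
print(f"Blended (single line): {total_conv / total_clicks:.1%}")
```

The few extra minutes of trafficking setup are what make the per-segment breakdown possible at all.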


#3: Say anything other than “I’ll get back to you” when you don’t know the answer

Don’t commit to something that can’t be done; your client will be displeased. Don’t commit to something “for free” that will take 40 hours to do. Don’t say “no, we don’t provide that service”; it might’ve been a good growth opportunity. Never say “I don’t know”; you will lose credibility. You are the expert, so act like it. Saying “I’ll get back to you” is always best. Always.


#4: Jump to conclusions

I have a catch phrase that I use as often as possible: “If a number looks weird or if it creates an emotion in you, call me”.

How often have you optimized a campaign, added a vendor that looked promising to the plan… only to be very displeased by the end result?

– An Online Video campaign with a CTR of 5% may mean people are annoyed and are trying to close the video.

– A paid search AdGroup’s CTR of 5% may look awesome, but it may be the worst at driving on-site conversions.
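A quick sanity check makes the second point concrete. The numbers below are invented, but they show how the ad group with the impressive 5% CTR can still be the worst converter:

```python
# Made-up search ad groups: a high CTR does not imply a high conversion rate.
adgroups = {
    "brand_terms": {"impressions": 10000, "clicks": 500, "conversions": 5},
    "generic_terms": {"impressions": 50000, "clicks": 500, "conversions": 40},
}

for name, m in adgroups.items():
    ctr = m["clicks"] / m["impressions"]      # click-through rate
    cvr = m["conversions"] / m["clicks"]      # on-site conversion rate
    print(f"{name}: CTR {ctr:.1%}, conversion rate {cvr:.1%}")
```

Here "brand_terms" posts a 5% CTR but converts at 1%, while the 1% CTR group converts eight times better. Ranking by CTR alone would optimize toward the wrong ad group.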

We recently ran a banner campaign and were seeing terrible results for onsite sales from noon to 1pm. Given the nature of the client, it just didn’t look right.

So we dug a little.


Obviously, if your pages take 20 seconds to load during lunch hour, it’s not very good. Jumping to conclusions would probably have made us stop advertising during one of the best periods of the day instead of just fixing server capacity.

#5: Be Narrow Minded

Remember what you’ve just read about Average Page Load Time 3 seconds ago? You or your data person will not catch this if the contract says “Report on metrics A, B & C” or if you say “plunk the numbers into this template I’ve made”. You will always miss important insights that way. Implement flexibility somewhere.


#6: Disregard Costs

Too often, I see very good and detailed analyses that go like this:

Excellent Placement #1 rocks!


Then I ask for a “Cost per” analysis, and the picture often changes completely.
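Here is the kind of flip a “cost per” analysis reveals, sketched in Python with made-up placement figures:

```python
# Hypothetical placements: raw conversion volume vs. cost per acquisition.
placements = [
    {"name": "Excellent Placement #1", "cost": 20000.0, "conversions": 200},
    {"name": "Modest Placement #2", "cost": 3000.0, "conversions": 60},
]

# CPA = media cost / conversions
for p in placements:
    p["cpa"] = p["cost"] / p["conversions"]
    print(f'{p["name"]}: {p["conversions"]} conversions at ${p["cpa"]:.2f} each')

# Ranked by raw volume, #1 "rocks"; ranked by CPA, #2 wins.
by_volume = max(placements, key=lambda p: p["conversions"])["name"]
by_cpa = min(placements, key=lambda p: p["cpa"])["name"]
print("Most conversions:", by_volume)
print("Cheapest cost per conversion:", by_cpa)
```

In this sketch the “excellent” placement delivers more conversions but at twice the cost per conversion, which is exactly the insight a volume-only report hides.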



#7: Botch Methodology

A while back, I executed a survey. I called residential phone numbers at random and asked, “Do you own and/or rent a home?”

Results: my survey eradicated homelessness! By sampling only residential phone lines, I had guaranteed that every respondent already had a home.


#8: Assume that you have all variables at hand or that one test is enough to generalize

If you use the tips above you will, for sure, make everyone’s life easier and everybody happier. But what do you think will happen if you treat these 8 little tips as the answer to all problems, including world hunger?

[Image: “Uh-Oh” – Dustin Hoffman – Rain Man – 1988]

The same applies to data. Don’t assume Rich Media is better than everything because it performed better in one campaign. Don’t assume creative A was better than creative B if you used different targeting for each. Bottom line is, you generated a directional insight. Something interesting to retest, to reconsider. It doesn’t mean ANYTHING else.
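The creative A vs. B trap can be sketched with invented numbers. Below, creative A looks better overall only because it got more of the high-performing targeting; within each targeting segment, B actually converts better:

```python
# Made-up confounded test: creatives A and B ran on different targeting mixes.
data = {
    ("A", "retargeting"): {"clicks": 900, "conversions": 90},
    ("A", "prospecting"): {"clicks": 100, "conversions": 2},
    ("B", "retargeting"): {"clicks": 100, "conversions": 12},
    ("B", "prospecting"): {"clicks": 900, "conversions": 27},
}

# Overall rates: A wins, because 90% of its clicks came from retargeting.
overall = {}
for creative in ("A", "B"):
    clicks = sum(m["clicks"] for (c, _), m in data.items() if c == creative)
    conv = sum(m["conversions"] for (c, _), m in data.items() if c == creative)
    overall[creative] = conv / clicks
    print(f"Creative {creative} overall: {overall[creative]:.1%}")

# Per-segment rates: B beats A in BOTH segments.
for (creative, segment), m in data.items():
    print(f"{creative}/{segment}: {m['conversions'] / m['clicks']:.1%}")
```

Unless both creatives get the same targeting mix, the overall comparison tells you about the media plan, not the creative.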

I invite you all to share tips like these. Drop me a line whenever you think of one. Let’s all make our jobs easier. Cheers!

