Published: 25 August 2016

As a Digital Marketer, I religiously follow blogs like Think with Google to get a daily dose of statistics and inspiration for new ways to engage audiences. Currently, I’m keeping an eye on how VR will affect advertising and thinking about the challenges VR faces in becoming more mainstream.

The wonderful thing about statistics and data gathering is that it’s so freely available. With a little time spent in Google Analytics you can learn a great deal about your audiences, their interests and their viewing habits. From that, you can determine where your strongest content lies, or develop an entire strategy for a website redesign based on device-usage ratios.

However, these conclusions and decisions need to be corroborated by every ounce of data you can squeeze out. It’s very easy to fall victim to confirmation bias – finding data to support a predetermined hypothesis and ignoring the wealth of data that doesn’t support it, or worse, proves it wrong.

My favourite example of this is what happened to a hugely unsung artist named Drew Struzan. You may not know the name, but you definitely know his work.

During a career spanning decades, Drew hand-painted more than 150 movie posters, including the iconic artwork for legendary series like Indiana Jones, Back to the Future and Star Wars. The man was so talented and reliable that he painted the poster for 1982’s The Thing (bottom row, second from the right) in a single night.


Amazing work, right? So what happened to Drew? In the mid-2000s, a group of ‘analysts’ were hired to find ways to cut a Hollywood studio’s advertising expenditure. The analysts quickly identified Drew’s art as an easy target and set out to remove it from the equation. They did this by showing a focus group one of Drew’s painted pieces alongside a cheaper poster made in Photoshop, and asking a string of questions like “Which one makes you want to buy popcorn more?”

When the members of the focus group shrugged and answered each irrelevant question with a befuddled “neither” or “I don’t know”, the analysts diligently checked their boxes, but did not include the group’s added comments of “but I prefer the painted one”, “I’d buy the DVD if it had the painted artwork on the cover” or “I’d buy that poster and hang it on my wall”.

The analysts weren’t asking those questions. The comments did not support their hypothesis, so they ignored them. In the end, they used the results to ‘prove’ that a Drew Struzan poster made no difference to the revenue of a movie release, which in turn brought about the end of Drew’s career and ushered in the age of blue and orange.

We’ll never know whether Drew’s artwork made any real difference to movie revenue, as that data was never given enough attention and is now lost to time. A movie never flopped because of its poster (as far as I know), but perhaps more DVDs would have been sold, or a longer-lasting brand impression would have been made by someone buying the poster and framing it on their wall. It may not sound like much, but these things matter to studios.

Remember how in science classes you learned what a ‘fair test’ is? When you use Google Analytics (or any analytics platform), you’re essentially undertaking a science experiment – your tests need to be fair. The conclusions you draw and the decisions you make need to be backed up by unquestionable data. Cross-reference your stats with different segments. Adapt your hypotheses when presented with new evidence, and cross-reference again. Investigate anomalies like huge jumps in views and try to determine whether there’s an external cause. What you find really might surprise you.
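To make that last check a little more concrete, here’s a minimal sketch in Python for flagging suspicious jumps in daily views. It assumes you’ve exported your figures to a CSV with ‘date’ and ‘pageviews’ columns – both names are just placeholders for whatever your own export uses.

    import csv
    from statistics import mean, stdev

    # Load daily pageview counts exported from your analytics platform.
    # The filename and column names ("date", "pageviews") are assumptions
    # about your export; adjust them to match your own data.
    with open("daily_pageviews.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    views = [int(row["pageviews"]) for row in rows]
    avg, sd = mean(views), stdev(views)

    # Flag days sitting more than two standard deviations above average -
    # candidates for a closer look (campaign launch, press mention, bot traffic?).
    for row, v in zip(rows, views):
        if v > avg + 2 * sd:
            print(f"{row['date']}: {v} views (average {avg:.0f}) - check for an external cause")

A spike flagged this way isn’t a conclusion in itself – it’s a prompt to go looking for the external cause before you build a hypothesis around it.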


Kris Boorman

Digital Marketing Executive

