
The True Measure of a Marketing Campaign's Worth

By Jim Dicso CRM Buyer ECT News Network
Oct 18, 2012 5:00 AM PT

Last-click attribution assigns the entire value of a conversion to the last campaign the consumer clicked prior to converting. Naturally, last-click attribution favors campaigns that influence the lower-end of the funnel. In contrast, first-click attribution favors campaigns that influence earlier in the funnel.
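To make the two single-touch models concrete, here is a minimal sketch (not from the article; the campaign names and function are illustrative) of how first- and last-click attribution assign all of a conversion's value to one touchpoint:

```python
def single_touch_attribution(path, conversion_value, model="last"):
    """Assign the full conversion value to a single touchpoint.

    path: ordered list of campaigns the consumer interacted with
    model: "last" credits the final touchpoint, "first" the initial one
    """
    credited = path[-1] if model == "last" else path[0]
    return {campaign: (conversion_value if campaign == credited else 0.0)
            for campaign in set(path)}

# Hypothetical path: a consumer sees a video ad, clicks a display ad,
# then a search ad, and converts for $100.
path = ["video", "display", "search"]
last = single_touch_attribution(path, 100.0, model="last")    # search gets all credit
first = single_touch_attribution(path, 100.0, model="first")  # video gets all credit
```

Whichever model is chosen, every other touchpoint on the path receives zero credit, which is exactly the distortion the article goes on to describe.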

Both are simple to use and widely supported by analytics tools, and despite their shortcomings, they remain the most common attribution models. So why do the majority of marketers still rely on last-click attribution to measure performance-oriented campaigns?

Last-click is an ineffective way to gauge the true influence campaigns have on the consumer's path to conversion. For online video campaigns -- including advertising, email or on-site -- last-click attribution suffers from similar inaccuracies. Marketers know something is wrong with last-click attribution, but they still view the world through it.

For marketers, it can be a curse to budgets, returns or both. Because evaluating campaigns and understanding cross-campaign interactions are critical in assigning budget and planning the marketing mix, companies that embrace video strategies are employing a different approach.

For performance marketers who solely rely on last-click attribution to effectively measure online advertising campaigns, I've got four metaphors for you.

The Elephant-Riding Mouse

A tiny mouse riding a massive elephant exclaims, "Wow, look how much dust WE are creating!" Yes, this is a bit bizarre, but so is relying only on last-click attribution. Last-click (and first-click) completely undervalues marketing efforts that influence the middle of the sales funnel, just as this mouse is undervaluing the elephant's size and strength compared to his own.

Measurement models that assume a single touchpoint influence are not great at estimating value, and sorry, Mr. Mouse, neither are you.

Packaging vs. Product

Choosing a product according to its packaging and price, rather than the product quality, is akin to measuring a campaign by a single click or touchpoint.

Last-click does not accurately capture a campaign's real influence, such as purchases, just as packaging does not accurately reflect the true quality and benefits of the product inside.

Finishing Touches

Would you build a house but pay only the painter, because he was the last person to work on it? If only this were a real-life scenario. What an incredible investment: paying US$2,000 for the paint job and getting a 2,500-square-foot house worth $350,000!

However, this would never happen, so why is it happening with online marketing measurement models? As customers journey down the sales funnel, they engage with numerous advertisements and marketing messages.

If these multiple touchpoints are not all measured, then marketers are not receiving an accurate return-on-investment measurement of their online ad campaigns.
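One simple alternative that does credit every "worker on the house" is a linear multi-touch model, which splits the conversion value evenly across the path. This is a hedged sketch of the general technique, not a model the article prescribes; the figures reuse the house metaphor above:

```python
from collections import Counter

def linear_attribution(path, conversion_value):
    """Split conversion value evenly across every touchpoint on the path.

    A campaign that appears twice on the path earns two shares.
    """
    share = conversion_value / len(path)
    return {campaign: share * n for campaign, n in Counter(path).items()}

# Four touchpoints on the way to a $350,000 "house": each earns a quarter
# of the credit, instead of the painter (last click) taking it all.
path = ["video", "email", "display", "search"]
credit = linear_attribution(path, 350000.0)  # each campaign: 87,500.0
```

Linear attribution is itself a crude assumption (every touchpoint weighted equally), but it illustrates how multi-touch models spread credit where single-touch models cannot.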

Thanks for Everything

A graduating high-school senior thanking her last teacher for all her education and success would be undervaluing her four-year academic experience by acknowledging only one teacher.

The message of all these metaphors is the same: Any click-based attribution model fails to accurately capture the real influence of an online ad campaign, and therefore undervalues brand impact. Video, for example, a more personal and intimate medium that holds user attention and engagement for longer periods, has more potential to influence a buy decision at a distance, with or without a click.

Okay, you get it. Now what?

Video Attribution Strategies

A strategy for video attribution offers new potential to execute performance-driven marketing -- particularly through advertising. This is somewhat of a paradigm shift for traditional video marketing, which placed more focus on brand messaging than online, conversion-driven performance. This shift means that online marketers need to adopt adequate performance (and brand) metrics.

Last-click attribution models used for video programs suffer from the same shortcomings described for other campaigns. Modern attribution modeling techniques and comparison between multiple models are effective ways to gain better insights into smarter digital advertising strategies, as well.

For video, however, any click-based attribution model undervalues the effect of merely viewing the video. As a more personal medium that captures consumer attention and engagement for longer periods, video has more potential to influence a buy decision at distance, with or without a click.

Consider omnichannel strategies, in which tactics contribute to online and offline impact. Video advertising can be measured in incremental orders, but there's also brand impact that drives value beyond the 15-to-30 days after a viewer sees the ad.

Control Group Analysis

Remember that attribution models assign value to campaigns based on the marketer's view of the weight of each point in the funnel. As such, attribution is subject to the effects of intuition, politics, budgets and corporate structure. But it is not a true, bias-proof measure of value.

To get a true value measurement for a campaign, enhance your toolbox by using a control group methodology. Control group analysis delivers a true measure of a campaign's effect, regardless of clicks, funnel position, or personal taste.

Not every marketing channel supports this approach (e.g., organic search). However, many marketing channels do, and marketers would be wise to select technologies, vendors and processes that enable it.

Using control group testing takes the guesswork out of attribution modeling. And if you employ both tools, you can use the results of control group analysis to tune and validate your attribution models.

Jim Dicso is president of SundaySky.
