Archive for the ‘Marketing’ category

Algorithmic Attribution SES Chicago

November 7th, 2013

At SES Chicago I introduced Algorithmic Attribution and discussed the implications for search marketers.  Please feel free to download and let me know if you have any questions!

Download pdf:  Algorithmic Attribution SESChicago2013

Steve Latham
@stevelatham

 

Ad:tech NY Attribution Case Study

January 7th, 2013


In November 2012, between a hurricane and a Nor’easter, I presented a case study on Full-Funnel Attribution at one of the premier industry conferences: Ad:tech NYC.

For the presentation I was joined by Brad May of KSL Media, who is not only a client but also an early adopter and supporter of Attribution.  Building on the insights from the Attribution case study presented at Ad-Tech in San Francisco, I was honored to speak again and present a case study illustrating how advanced analytics and full-funnel, cross-channel Attribution can be utilized to maximize performance and boost Return on Spend.

Among the highlights of the case study, we demonstrated:

  • After modeling the impact of assist impressions and clicks, Display advertising accounted for almost 20% of achieved actions.
  • Mobile ads generated low-cost actions (this year’s theme – mobile, mobile, mobile).
  • Search played largely a supporting role.
  • Frequency is an issue that all advertisers need to keep a close eye on.

For those who didn’t make the show, I’m happy to share the case study in two formats (both are hosted on SlideShare):

 

As always, feel free to comment, tweet, like, post, share, or whatever it is you do in your own social sphere.  Thanks for stopping by!

@stevelatham


Encore CEO Steve Latham to speak at ad:tech NYC: November 7

October 24th, 2012

Encore CEO Steve Latham will be speaking at ad:tech New York on November 7. His session, “Next-Gen Ad Attribution: Make the Most of Effective Accountable Measurement,” will be led by Forrester analyst Tina Moffett and will feature case studies from Encore and other leading attribution vendors.

Read more here:  http://www.encoremetrics.com/encore-ceo-to-speak-at-adtech-new-york-on-next-gen-ad-attribution


Crossing the Chasm: an Insider’s Perspective on Media Measurement

September 15th, 2012

 

Back in 2005 I had a digital marketing agency and we were buying display ads, email sends and paid search for FedEx Kinko’s.  I still remember the anxiety I felt when we presented results from our media buys.  Paid search of course looked great, with a very low Cost Per Action.  Email produced decent results but clearly took a back seat to search.  The media stepchild we call Display accounted for a majority of the spend, and appeared to be a complete waste of money with a CPA that was 10x that of search.  Naturally my client asked the question: “why on earth are we spending money on display when search is so much more efficient?”

My answer was that: 1) we were buying all the search that was available, and 2) we believed the investment in display was creating awareness that ultimately drove more searches.  We cited the increase in clicks and leads from search that correlated with the growth in display advertising, but lacked hard data to support our thesis.  Fortunately my client agreed, but also challenged me to do a better job of justifying the media plan.  We did what we could at that time and found that 20% of those who converted from a paid search had previously clicked on a display ad.  But that was the extent to which we could attribute credit for awareness created by Display ads.
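That early overlap analysis is simple to reproduce if you have event-level logs.  Here’s a minimal sketch (with made-up user IDs and dates, not our FedEx Kinko’s data) of how one might compute the share of paid-search converters who had previously clicked a display ad:

```python
# Sketch: what share of paid-search converters had earlier clicked a display ad?
# All user IDs, dates and data below are hypothetical illustrations.
from datetime import datetime

display_clicks = {  # user_id -> time of first display click
    "u1": datetime(2005, 3, 1),
    "u2": datetime(2005, 3, 5),
}

search_conversions = [  # (user_id, conversion time)
    ("u1", datetime(2005, 3, 10)),
    ("u3", datetime(2005, 3, 12)),
]

# Count converters whose display click preceded their conversion.
assisted = sum(
    1 for uid, when in search_conversions
    if uid in display_clicks and display_clicks[uid] < when
)
overlap_rate = assisted / len(search_conversions)
print(f"{overlap_rate:.0%} of search converters had a prior display click")
```

The same join works for any pair of channels once the logs share a common user identifier.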

Now, 7 years later, we’ve evolved significantly as an industry.  We can now view comprehensive conversion paths and use machine learning and statistical analysis to allocate credit to each impression, click and interaction that influences an online conversion.  We can show the true ROI from online advertising, show the role and ROI for each channel, publisher and placement, calculate optimal frequency, and quantify wasted spend efficiently and cost-effectively.  Capability-wise, we’ve come a long way since the days of last-click reporting, and since 2010 the industry has been ripe for massive adoption (learn why: Five Forces Driving Attribution).

But despite these circumstances, most advertisers still measure and optimize the same way we did back in 2005.  They realize there’s a better way to do it, but for a variety of reasons (read more: “Ph.D Targeting, 1st Grade Metrics”) they stay rooted in outdated metrics that preclude them from optimizing spend and maximizing ROI.

In Geoffrey Moore’s classic high-tech marketing book “Crossing the Chasm” he defined five groups of adopters: innovators, early adopters, early majority, late majority and laggards.  According to Moore, an industry “crosses the chasm” when the early majority adopts new technology, which sparks rapid growth and value creation in that sector.  Adoption by the early majority constitutes the evolution from “fad” to “norm”.  When it comes to measurement of online advertising, these days are still ahead of us.

While some advertisers (innovators and early adopters) have raised the bar in media measurement and optimization, the early majority is still in the consideration stage.  For years they’ve been hearing about Attribution from innovators, and are now seeing case studies and thought-leading research from early adopters.  Resembling the tortoise more than the hare, the early majority are reading POVs, evaluating solutions and building the business case to invest in Attribution and advanced analytics.  For the past 2 years I’ve wondered “will this be the year we cross the chasm?”  Despite high hopes for 2011, the big event has not yet happened.  A handful of brands and agencies dipped their toe in the water, but only a small percentage of advertisers have taken the plunge.

Earlier this year I again wondered if 2012 would be the year we cross the chasm.  Given that the year is 2/3 over, and most new initiatives take place in Q1 or Q2, I have my doubts.  But there is a lot more activity and focus on Attribution than in prior years, helped in part by Big Data fever that has everyone atwitter.  As noted in my prior post, mass adoption will happen when Advertisers start demanding better metrics, deeper insights and demonstrable improvement in ROI from their agencies.  It’s happening today in small numbers, but at an increasing rate.  It “feels” like we are close to the Tipping Point and I am hoping 13 (as in 2013) will be our lucky number.

As always, feel free to comment, tweet, like, post, share, or whatever it is you do in your own social sphere.  Thanks for stopping by!

@stevelatham


Takeaways: Display Ecosystem Panel Discussion

May 7th, 2012

 

Last month I had the pleasure of moderating the Display Ecosystem panel (View the Video) at Rapleaf’s 2012 Personalization Summit.  On my panel were experts from leading companies that represented numerous categories within the display landscape.  Panelists included:

  • Arjun Dev Arora – CEO/Founder, ReTargeter @arjundarora
  • Key Compton – SVP Corporate Development, Clearspring @keycompton
  • Tod Sacerdoti – CEO & Founder, BrightRoll @todsacerdoti
  • Mark Zagorski – CEO, eXelate @markzexelate

Our discussion addressed many of the issues that we are grappling with in the Ad-Tech industry, including:

  • Complexity: The challenges of planning, executing, measuring and optimizing display media are exacerbated by the complexity in our space.  How can we reduce the cost and level of effort required via integration, prioritization, standards, etc.?
  • Consolidation: What will the landscape look like in 2 years?  Will there be more or fewer players?  Where will consolidation take place?  Who will be acquired and by whom?
  • Effectiveness: What can the industry do to improve performance and effectiveness of advertising? How will better targeting, data-driven personalization, frequency management and 360 customer-centric approaches improve efficacy of online marketing?
  • Accountability: Where are the gaps today, and how should we be measuring results, performance, ROI, etc.?
  • Outlook: What must publishers, ad networks, DSPs and agencies each do to survive / thrive in this hyper-competitive marketplace?
  • Other issues: privacy, legislation, new platforms, etc.  In order to fully realize the potential of display advertising (i.e. Google’s $200bn forecast) these will need to be addressed.

After our discussion, I thought about the implications for the Display Ad ecosystem, and for the Ad-Tech industry as a whole.  Here are a few of my thoughts…

  • No other industry is as innovative, adaptive and hyper-competitive as ad-tech. Where else can new niches evolve to multi-million dollar categories overnight with hundreds of startups raising billions in financing every year?  We’ve all seen industries where startups disrupted an established ecosystem for a period of time.  But where else does this happen over and over and over again?  Our industry is all about disruption and it doesn’t take long for the challenger startups to become the established incumbents or targets.
  • No other industry creates wealth like ad-tech.  Where else can companies launch, raise capital and exit for hundreds of millions (or more) in less than 18 months?  Where else are so many successful entrepreneurs (and their benevolent VC backers) rewarded with lifetime wealth for 1-3 years of work?  It’s pretty amazing if you think about it… our modern day decade-long gold rush.
  • Success in our industry requires mastery of several disciplines: marketing, technology and data science.  You can’t be a world-class ad-tech company without expertise and experience in all 3 of these categories.
  • While we are making progress as an industry, we still have so far to go.  Despite the advances in targeting, real-time bidding, dynamic creative optimization, analytics and optimization techniques, most media buying is still done the same way it was 5 years ago.
  • There is still much confusion about how real-time exchanges work, and how they can be utilized by agencies and advertisers.  When you overlay that with efforts to aggregate 1st party data, creating proprietary cookie pools and using that data to find new audiences, many marketers become quickly overwhelmed.
  • We still have a scale problem that must be addressed.  While there is a huge supply of impressions available for real-time bidding, there are only so many unique audiences in the warehouses operated by the data providers.  The more granular you get from a targeting standpoint, the smaller your reach will be.  Frequency capping is challenging, so you end up with hundreds or even thousands of impressions being served to a small pool of unique users.
  • We still have a people problem.  All the technology in the world won’t save us if we don’t have people trained to leverage these capabilities.  We also need a deeper pool of managers and leaders who can bring operational excellence to a fledgling, always-evolving industry.

The wall mural below sums up the discussion – and made for a nice graphic snack for attendees.

 

As always, feel free to comment, tweet, like, post, share, or whatever it is you do in your own social sphere.  Thanks for stopping by!

@stevelatham


 

Encore CEO To Present on Attribution at ad:tech San Francisco

March 9th, 2012

Latham to present Beyond the Last Ad: Better Decisions Through Better Attribution

Conversion reporting has become an everyday part of how campaigns are optimized.  As digital budgets grow in size and complexity, collective skepticism continues to build against the current “Last Ad” reporting standard.  The demand for more advanced attribution methods has spawned a host of new analytics and technology capabilities, yet going beyond the Last Ad has yet to “cross the chasm.”  Spiritually – advertisers, agencies and publishers are on board – but their reporting and optimization methods have yet to budge.  Led by Young-Bean Song, an expert in digital advertising effectiveness research, this session will reveal how current standards bias our view of the digital marketing world, and our spending as a result. We’ll also feature the latest data and case studies that quantify the impact of new attribution models.

Key takeaways:

  • Learn how leading brands are tying display ads directly to purchases
  • Discover pre- and post-campaign testing methods that are cost-effective and easy to execute
  • Get insight into some of the most cutting-edge attribution research

Session Leader: Young-Bean Song, Principal and Founder – AnalyticsDNA

Presenter: Steve Latham, Founder and CEO, Encore Media Metrics

Session:

ad:tech San Francisco
Tuesday, April 3 at 3:45pm
http://na.ad-tech.com/sf/sessions/i-love-data-attribution-and-online-display-advertising-beyond-the-last-click/

 


MediaMind and Encore Partner to Deliver Integrated Attribution

March 7th, 2012

MediaMind announced today its partnership with Encore Media Metrics to help marketers understand Attribution credit across digital campaigns. The integration between the MediaMind platform and Encore will give marketers immediate access to attribution reports that show what worked and what didn’t, in order to easily implement budget allocation to the best performing ads. Below is the first in a series of articles on Attribution written by Steve Latham, Founder and CEO, Encore Media Metrics.

Read the Press Release

Read CEO Steve Latham’s Guest Post on MediaMind’s CreativeZone!


It’s Hard to Solve Problems from an Ivory Tower

March 2nd, 2012

Today a colleague sent me a link to a new article on Attribution and media measurement with a request to share my thoughts. Written by a statistician, it was the latest in a series of published perspectives on how Attribution should be done. When I read it, several things occurred to me (and prompted me to blog about it):

  1. Are we still at a point where we have to argue against last-click attribution?  If so, who is actually arguing for it?  And are we already at a point where we can start criticizing those few pioneers who are testing attribution methodologies?
  2. Would a media planner (usually the person tasked with optimizing campaigns) understand what the author meant in his critique: “the problem with this approach is that it can’t properly handle the complex non-linear interactions of the real world, and therefore will never result in a completely optimal set of recommendations”?  It may be a technical audience, but we’re still marketers… right?
  3. The article discusses “problems” that only a few of the largest, most advanced advertisers have even thought about.  When it comes to analytics and media measurement, 95% of advertisers are still in first grade, using CTRs and direct-conversions as the primary metric for online marketing success. They have a lot of ground to cover before they are even at a point where they can make the mistakes the author is pointing out.

In reading the comments below the article, my mind drifted back to business school (or was it my brief stint in management consulting?) and the theoretical discussions that took place among pontificating strategists.   And then it hit me… even in one of the most innovative, entrepreneurial and growth-oriented industries, an Ivory Tower mindset somehow still persists in some corners of agencies, corporations, media shops and solution providers.  Not afraid to share my views, I responded to the article in what I hope was a polite and direct way of saying “stop theorizing and focus on the real problem.” Here is my post:

“…We all agree that you need a statistically validated attribution model to assign weightings and re-allocate credit to assist impressions and clicks (is anyone taking the other side of this argument?).  And we all agree that online is not the only channel that shapes brand preferences and drives intent to purchase.

I sympathize with Mr. X – it’s not easy (or economically feasible) for most advertisers to understand every brand interaction (offline and online) that influences a sale. The more you learn about this problem, the more you realize how hard it is to solve.  So I agree with Mr. Y’s comment that we should focus on what we can measure, and use statistical analyses (coupled with common sense) to reach the best conclusions we can. And we need to do it efficiently and cost-effectively.

While we’d all love to have a 99.9% answer to every question re: attribution and causation, there will always be some margin of error and/or room for disagreement. There are many practitioners (solution providers and in-house data science teams) that have studied the problem and developed statistical approaches to attributing credit in a way that is more than sufficient for most marketers.  Our problem is not that the perfect solution doesn’t exist. It’s that most marketers are still hesitant to change the way they measure media (even when they know better).

The roadblocks to industry adoption are not the lack of smart solutions or questionable efficacy, but rather the cost and level of effort required to deploy and manage a solution.  The challenge is exacerbated by a widespread lack of resources within the organizations that have to implement and manage them: the agencies who are being paid less to do more every year.  Until we address these issues and make it easy for agencies and brands to realize meaningful insights, we’ll continue to struggle in our battle against inertia. For more on this, see “Ph.D Targeting & First Grade Metrics…””

I then emailed one of the smartest guys I know (data scientist for a top ad-tech company) with a link to the article and thought his reply was worth sharing:

“I think people are entirely unrealistic, and it seems they say no to progress unless you can offer Nirvana.”

This brings me to the title of this post: It’s hard to solve problems from an Ivory Tower.  Note that this is not directed at the author of the article, but rather a mindset that persists in every industry.  My point is that armchair quarterbacks do not solve problems. We need practical solutions that make economic sense.  Unless you are blessed with abundant time, energy and resources, you have to strike a balance between “good enough” and the opportunity cost of allocating any more time to the problem.   This is not to say shoddy work is acceptable; as stated above, statistical analysis and validation is the best practice we preach and practice.  But even so-called “arbitrary” allocation of credit to interactions that precede conversions is better than last-click attribution.  It all depends on your budget, resources and the value of advanced insights.  Each marketer needs to determine what is good enough, and how to allocate their resources accordingly.

Most of us learned this tradeoff when studying for finals in college: if you can study 3 hours and make a 90, or invest another 3 hours to make a 97 (recognizing that 100 is impossible), which path would you choose?  In my book, an A is an A, and with those 3 additional hours you could have prepared for another test, sold your textbooks or drunk beer with your friends.  Either way, you would extract more value from your limited time and energy.

To sum up, we need to focus our energies away from theoretical debates on analytics and media measurement, and address the issues that prohibit progress.  The absence of a perfect solution is not an excuse to do nothing. And more often than not, the perfect solution is not worth the incremental cost and effort.

As always, feel free to comment, tweet, like, post, share, or whatever it is you do in your own social sphere.  Thanks for stopping by!

@stevelatham


 

 

Conversion Paths vs. Full Attribution

February 24th, 2012

Attribution is a hot topic!  As marketers are shifting their focus to measurement and optimization, Attribution is rising to the top of the priority list for 2012.  However, like many things, Attribution has many flavors and often means different things to different people.  In this and future posts, I will shed some needed light on this topic and help marketers make sense of this complicated and ever-evolving discipline.

For starters, let’s define the term: Attribution is simply the process of attributing credit to each interaction in a user’s path to conversion.  These interactions may include display ads, paid searches, natural searches, emails, social and other media.  To truly optimize our online marketing efforts, we must measure each channel, vendor, placement and keyword’s contribution, and give appropriate credit in the final analysis.  While the industry generally agrees on the problem (last-click measurement is woefully insufficient) and the objective (give credit where it’s due), there are many divergent opinions on which approach is best for solving this problem.  With the goal of illuminating and educating (vs. selling), here is my perspective.

Analyzing Conversion Paths

Conversion path analysis is quite popular these days and is usually at the top of marketers’ wish lists.  Not to be confused with site-specific conversion analysis, media-centric “conversion path analysis” looks at the digital channels that influence customers throughout the conversion cycle.  In short, marketers want a macro-view of all the touch points (we call them “assists”) that drive a conversion.

To capture the data needed to view conversion paths, you need to match impression cookies (set by your ad server when a user is exposed to display ads) and your site visitor cookies (set by your site analytics software).  You’ll also need to maintain all the details for each impression or visit as time-stamped, individual records are a key requirement for conversion path analysis and more advanced attribution.
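As a sketch of that matching step, here’s one way to stitch the two event streams together once they share a common user ID.  The field names and rows are hypothetical, not any particular ad server’s or analytics vendor’s schema:

```python
# Sketch: unify ad-server impression logs and site-analytics visit logs
# into one time-stamped history per user.  Data is illustrative only.
import pandas as pd

impressions = pd.DataFrame({
    "user_id": ["u1", "u1", "u2"],
    "ts": pd.to_datetime(["2012-02-01", "2012-02-03", "2012-02-02"]),
    "event": ["display_imp"] * 3,
})

visits = pd.DataFrame({
    "user_id": ["u1", "u2"],
    "ts": pd.to_datetime(["2012-02-05", "2012-02-04"]),
    "event": ["paid_search_visit", "natural_search_visit"],
})

# Union the two streams and sort to get each user's ordered event history.
history = (
    pd.concat([impressions, visits])
    .sort_values(["user_id", "ts"])
    .reset_index(drop=True)
)
print(history)
```

Keeping every record time-stamped at this granularity is what makes both path visualization and later modeling possible.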

Once you have the detailed history of impressions, clicks, visits and actions for each visitor, you can query the data to visualize the conversion paths for those who converted.

The table below shows the “average” path for all visitors, as well as the common paths for 4 unique groups of converters (segmented into natural clusters by a machine-learning algorithm).  As noted, the “average” converter saw 6.8 display ads and visited the site 2.9 times before converting, with natural search accounting for 0.4 visits, paid search 0.4 visits and display ads 0.9 visits.
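For illustration, once each converter’s path has been summarized as counts, the “average” path is a simple aggregation.  The numbers below are invented for the sketch, not the campaign figures above:

```python
# Sketch: computing the "average" converter path from per-converter counts.
# All counts are hypothetical.
import pandas as pd

paths = pd.DataFrame({           # one row per converter
    "display_imps": [4, 10, 6],
    "natural_search_visits": [1, 0, 1],
    "paid_search_visits": [0, 1, 1],
    "display_visits": [1, 1, 0],
})
paths["total_visits"] = paths[
    ["natural_search_visits", "paid_search_visits", "display_visits"]
].sum(axis=1)

print(paths.mean())              # the "average" converter path
```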

 

Most marketers are content with channel-specific conversion paths, but we’re seeing more and more interest in vendor- and placement-specific paths and expect this will become more common over time.

Conversion path analysis is a good start towards cross-channel / full-funnel Attribution and should provide a foundation for more advanced (and necessary) analysis.  That said, there are a few limitations that marketers should be aware of when looking at conversion paths.

First, it’s important to note that averages can be misleading, and there is usually a broad distribution of paths that is not represented by the mean.  While the average number of impressions was 6.8 in the case above, the number varied between 1.5 and 20 across the groups (that’s a big range).  Likewise, while Display accounted for 70% (on average) of interactions that led to a conversion, it ranged between 38% and 88% among the four clusters.

Second, while conversion path analysis is insightful (and may help justify your display buys), you’ll need more information to truly understand campaign performance and determine how to optimize your media plan.  This is where Attribution comes into the picture.

Moving Beyond Conversion Paths to Full Attribution

If you have detailed conversion paths for each visitor, you have the data you need for advanced analysis.  Now you need a model that allocates credit for every impression and click assist in a way that makes sense.

And now we move into the realm of debate and disagreement that is characterized by “my math is better than your math.”  Truth is, Attribution models come in all shapes and sizes; some are proprietary and some are based on well-known statistical methodologies.  While there is no universally-accepted algorithm that constitutes the gold standard in Attribution modeling, there are numerous approaches that are more than sufficient.  The good news is that you don’t need a 99.9% solution to be successful.  In most cases, a 90% solution is sufficient and more cost-effective.
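To make that concrete, here is a minimal sketch of one common weighting scheme, time decay, where credit halves with each additional week between a touch point and the conversion.  This is just one of many reasonable models, not any vendor’s proprietary algorithm:

```python
# Sketch: time-decay attribution weights for one conversion path.
# The one-week half-life is an illustrative assumption.

def time_decay_weights(hours_before_conversion, half_life_hours=168.0):
    """Weight each touch by recency: credit halves every half_life_hours.
    Weights are normalized to sum to 1 across the path."""
    raw = [0.5 ** (h / half_life_hours) for h in hours_before_conversion]
    total = sum(raw)
    return [w / total for w in raw]

# A path with touches 2 weeks, 1 week and 1 hour before converting:
weights = time_decay_weights([336, 168, 1])
print(weights)  # most recent touch earns the most credit
```

Swapping in a different decay curve, or a statistically fitted model, changes only the weighting function; the bookkeeping stays the same.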

So without getting too deep into Attribution modeling, let’s talk about the questions your attribution model should answer, such as:

  • What is the relative contribution of each channel, vendor, placement or keyword (i.e. how many conversions should each get credit for)?
  • What is the attributable cost per action (or return on spend) for each channel, vendor, placement or keyword? (see sample report below)
  • How many impressions are required to influence a visit and/or a conversion?  (i.e. what is the optimal frequency?)
  • How does the optimal frequency vary by vendor or placement?
  • What was the actual frequency (and how many impressions were wasted)?
  • What is the appropriate look-back period (how far back should we give credit for assist impressions and clicks)?
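The attributable cost-per-action question above falls out directly once fractional credits have been summed by channel.  A minimal sketch with hypothetical spend and credit figures:

```python
# Sketch: attributed CPA per channel from modeled (fractional) conversion credit.
# Channel names, spend and credit totals are hypothetical.
spend = {"display": 50000.0, "paid_search": 20000.0}
attributed_conversions = {"display": 400.0, "paid_search": 250.0}

attributed_cpa = {ch: spend[ch] / attributed_conversions[ch] for ch in spend}
print(attributed_cpa)  # display: 125.0, paid_search: 80.0
```

The same division, run at the vendor, placement or keyword level, yields the sample report described above.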


As always, feel free to comment and share!

The Encore Team


FAQ: Can you tell me about a success story you’ve had with a particular engagement?

February 16th, 2012

In a recent campaign we measured for a leading media agency and their client (national retailer), we not only identified the top performers from a CPA standpoint, but we also found that 60% of impressions were wasted.  In fact, 82% of impressions seen by visitors were served to 6% of visitors with an average Frequency of 515 impressions over 45 days (average frequency for remaining visitors was 9).  We then determined that only 4.2 impressions were needed to drive a conversion, indicating a significant opportunity to optimize the campaign by limiting frequency and re-allocating budget to the top performing media vendors.
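For the curious, the wasted-impression math is straightforward once you have a per-user frequency count.  This sketch uses made-up user IDs (not the campaign data above) and rounds the ~4.2 optimal frequency to 4 for simplicity:

```python
# Sketch: quantifying wasted frequency from an impression log.
# User IDs and counts are hypothetical illustrations.
from collections import Counter

impression_log = ["u1"] * 500 + ["u2"] * 8 + ["u3"] * 3  # one entry per impression

freq = Counter(impression_log)   # impressions served per unique user
optimal = 4                      # assumed impressions needed to drive a conversion
wasted = sum(max(0, n - optimal) for n in freq.values())
share_wasted = wasted / len(impression_log)

print(f"{wasted} of {len(impression_log)} impressions "
      f"({share_wasted:.0%}) served beyond frequency {optimal}")
```

Even this toy example shows how a few heavily over-served users can dominate total waste.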

As always, feel free to comment and share!

The Encore Team
