Cross-Device Attribution

December 13th, 2016 by Steve Latham No comments »

AdExchanger just published a great article about the latest cross-device attribution work we did for Monarch Airlines. By incorporating cookieless tracking and the Tapad device graph, we were able to deliver unparalleled insights into the cross-device customer journey. In the article, our client discusses the growing need for cross-device insights and how they enabled a new level of campaign optimization. Read the article on cookieless, cross-device attribution.

In conjunction, we also announced our cross-device partnership with Tapad and the eye-popping results from the Monarch case study. Read our cross-device partnership announcement.

What’s Wrong with Cookie-Based Tracking?
As data increasingly shows (see case study below), the cookie is becoming less and less reliable for identifying users. Whether on desktop (Safari, Firefox) or mobile/tablet (iOS), cookie blocking and deletion is disrupting the ability to capture the customer journey via cookies (even on the same browser). If you take into account the growth in cross-platform engagement (searching on mobile, buying on desktop), the problem is exacerbated and the cookie-based customer journey quickly fragments.

Solution: Cookieless Tracking + Cross-Device Attribution 
To address these challenges, two key ad-tech innovations are needed: cookieless tracking and device mapping from partners like Tapad and Oracle / Crosswise.  Cookieless tracking uses probabilistic matching of non-PII data to identify browsers and devices without the use of cookies.  Cross-Device data enriches fractional attribution by linking touchpoints across disparate browsers and devices, enabling marketers to stitch together the cross-device customer journey.
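To make "probabilistic matching of non-PII data" concrete, here is a toy sketch in Python. The signal names, weights and threshold are invented for illustration; they are not Tapad's (or anyone's) actual methodology.

```python
# Toy probabilistic matcher: score the likelihood that two anonymous
# browser/device records belong to the same user, using only non-PII
# signals. Signal names and weights are hypothetical.

def match_score(a: dict, b: dict) -> float:
    """Return a 0-1 likelihood that records a and b are the same user."""
    weights = {"ip_prefix": 0.4, "user_agent_family": 0.2,
               "timezone": 0.1, "active_hours": 0.3}
    score = 0.0
    for signal, weight in weights.items():
        if a.get(signal) is not None and a.get(signal) == b.get(signal):
            score += weight
    return score

desktop = {"ip_prefix": "81.145.12", "user_agent_family": "Safari",
           "timezone": "Europe/London", "active_hours": "evening"}
mobile = {"ip_prefix": "81.145.12", "user_agent_family": "Safari",
          "timezone": "Europe/London", "active_hours": "morning"}

# Link the two IDs only when the score clears a confidence threshold.
print(match_score(desktop, mobile) >= 0.6)  # True
```

Real device graphs model far richer signals and decay them over time, but the core idea is the same: accumulate evidence across observations and link IDs only above a confidence threshold.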

By incorporating cookieless tracking and a third-party device graph to join user IDs, we can:

  • Provide a more comprehensive view of browser-specific and cross-device engagement
  • Allocate fractional credit to Impressions and Clicks that would otherwise be undercounted
  • Deliver even more accurate insights and recommendations, enabling better optimization

Case Study: Monarch Airlines
With the objective of providing comprehensive, cross-device insights, Encore by Flashtalking utilized cookieless tracking and the Tapad Device Graph™ to provide cross-device, fractional attribution across mobile and desktop media.  A few of the key takeaways are summarized below:

  • 38% of visitors blocked or deleted cookies
  • 42% of converters were matched to the Tapad Device Graph™
  • 61% of matched converting IDs were bridged (>1 browser ID)
  • Among bridged converters, there were 2.0 IDs per user
  • Unified, modeled data resulted in a 35% lift in ROI attributed to Display advertising

“Both desktop and mobile play important roles in the customer journey. As such, the need for accurate, cross-device insights is critical.  By analysing the cross-device customer journey, we can measure and optimise media investments with even greater confidence” – Monarch Airlines

For more on cross-device attribution, please feel free to contact me below.

Find me on LinkedIn

It’s Time for Brands to Lead

January 23rd, 2016 by Steve Latham No comments »

“Better” begins with the Advertiser
I love the new Vonage ad about “Better.” It’s not only clever and funny, it emphasizes that “Better” should be the goal of every brand.

When it comes to digital attribution, “Better” is being achieved by too few advertisers; most still rely on antiquated last-touch performance metrics for two reasons: operational friction (multi-touch attribution has historically been complex, costly and cumbersome) and a lack of involvement from brands, who rely on their agencies to figure it out on their own. A new breed of attribution vendors is taking the friction out of the measurement process. It’s time for brands to become active participants in developing (and demanding) better insights that will drive better business results.

Brands and Digital Attribution

Smart advertisers realize they can no longer abdicate responsibility for insights and optimization to their agencies. They recognize the need to make insights a priority and participate in the process of developing strategy, assigning responsibility, defining requirements and selecting a solution. In other words, they need to lead. And, they know it.

As many early adopters of attribution have learned, not all solutions are the same and “one size fits all” does not apply. Without a clear understanding of goals, requirements, limitations and tradeoffs, advertisers often invest large sums in solutions that never meet expectations. So leading goes beyond approving a test. It means partnering with their agency to ensure they are aligned and have a shared understanding of what they need and what it will cost in terms of budget, time and energy – both upfront and over time.

Attribution Readiness Roadmap

To avoid common pitfalls and realize the potential of attribution-informed planning, advertisers should work with their agency to define their needs, evaluate partners and choose the solution that best fits. To assist in that process, here is an attribution readiness roadmap for success:

1. Prioritization: Make “Better” a priority by elevating measurement, insights and active optimization. Set clear expectations and work with your agency to ensure everyone is on the same page.

2. Active Involvement in Strategic Planning: Brands should either lead or be active participants in strategy development, resource allocation and operational oversight. It’s OK to delegate; just don’t abdicate. Work with your teams or agencies to outline a simple plan that defines conversions, performance metrics, criteria for success and a timeline for implementation. Keys to successful planning include:

Define key requirements and document what you need in a solution. Prioritize capabilities based on what you need vs. what is possible.

Crawl, walk, run. Before trying to tackle device bridging, multi-platform conversion pathing and audience integration, become proficient at fractional attribution and optimization of desktop media on a regular basis. Build a solid base from which you can grow.

Pursue quick but meaningful wins, with the goal of always getting “Better.” Remember Voltaire’s quote “perfect is the enemy of good” and focus on incremental progress.

3. Sourcing Up: After determining what you need, define the criteria for selecting a solution before diving into each vendor’s capabilities. Key considerations should include:

  • Accuracy: Fractional insights should be based on validated algorithmic models; it should be easy and efficient to ingest and match spend for each placement
  • Actionability: Insights should translate into clear recommendations, with customizable forecasts for quantifying expected improvements
  • Set-up: Confirm how much time and effort is required to onboard the solution, and if it will utilize existing data vs. redundant tagging of ads. Set-up times may range from one week to three months. Make sure you ask for details.
  • Usability: A good tool should be intuitive and easy to use. There may be a slight learning curve, but it should not take an analytics expert to get value from the solution.
  • Extensibility: How easily can 3rd party data (e.g. audience segments, device bridging, offline sales) be incorporated? Make sure you ask.
  • Portability: Make sure the underlying data is exportable and useful for other applications (i.e. your Big Data initiatives). Also ask who will own the data. You may be surprised.
  • Timeliness: Modeling, analysis and reporting cycles may range from one day to four weeks. If you want fast insights, make sure your vendor can deliver them.
  • Support: Efficient onboarding, training and ongoing support is a key requirement for success. Find out what you’ll get and what it will cost.
  • Costs: Understand the fee structure and how it will scale as you grow. Given that fees may range from $2,500-$25,000 (or more) per month, you need to know the total cost of ownership. Make sure to include the cost of log file data in your analysis.
  • Integration: As mentioned earlier, vendors are taking friction out of the process by offering integrated solutions. Tight integration between the ad server, site tagging, data unification and spend reporting is critical. Fully integrated systems now enable advertisers to receive better data and sharper insights while saving time, energy and money.

4. Reporting and Optimization: Schedule monthly meetings to review results, discuss the learnings and agree on actions to optimize performance. Celebrate your successes, learn from your disappointments and make the ongoing pursuit of “Better” a priority.

Through active involvement and participation, brands can help their agencies be more successful and achieve “Better” results that will benefit us all.

As always, thanks for reading and sharing.

Connect on LinkedIn


Omni.Digital Attribution Recap

September 17th, 2015 by Steve Latham 2 comments »


Last week I had the pleasure of speaking on Attribution at AdExchanger’s Omni.Digital conference in Chicago. Our panel “The Next Wave of Attribution Vendors” was moderated by AdExchanger’s Lead Research Analyst Joanna O’Connell. As usual, there wasn’t time to fully answer all of the questions, so here is a recap in Q&A format.

What is Attribution and what’s it good for?

While most agree on how to define “multi-touch attribution” (attributing fractional credit to the interactions that result in a conversion), each set of stakeholders often uses it for different reasons. For example:

  • Analysts view it as a means to delivering more accurate reports to the media team.
  • Media buyers often use it to validate performance of their media buys.
  • Advertisers often use it to confirm that their media budgets are being properly invested.

While each use case is useful, it is also limited. Fractional attribution by itself is not an end, but rather a means to learning and optimizing. Through statistically validated insights, brands and agencies glean a much better view into which media partners, strategies, formats and creatives work (and which do not). They can also make more intelligent decisions for cutting waste and re-allocating budgets. If marketers want to get the full value from Attribution, they need to act on the insights. If they don’t, they are leaving money on the table.

What is wrong with Attribution solutions, and where is disruption needed?

This answer has several parts.

First, we can all agree that last-click attribution is a flawed approach. Whether desktop or mobile, last-click rewards the lowest-funnel media and penalizes everything above it.

Second, static multi-touch models (e.g. even weighted, U-shaped, time-decay) are better than last click, but only marginally. These still reward vendors who over-serve likely converters and perpetuate the epidemic of Retargeting Gone Wild.
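For reference, the static models named above can be sketched in a few lines of Python. The weights shown are the conventional textbook choices, not any particular vendor's implementation:

```python
# Static multi-touch models applied to one conversion path
# (touchpoints ordered first -> last).

def even_weighted(n: int) -> list:
    """Equal credit to every touchpoint."""
    return [1.0 / n] * n

def u_shaped(n: int, end_weight: float = 0.4) -> list:
    """40% each to first and last touch; remainder split across the middle."""
    if n == 1:
        return [1.0]
    if n == 2:
        return [0.5, 0.5]
    middle = (1.0 - 2 * end_weight) / (n - 2)
    return [end_weight] + [middle] * (n - 2) + [end_weight]

def time_decay(hours_before_conversion: list, half_life: float = 24.0) -> list:
    """Credit decays exponentially with time before the conversion."""
    raw = [0.5 ** (h / half_life) for h in hours_before_conversion]
    total = sum(raw)
    return [r / total for r in raw]

print(u_shaped(3))                    # first and last touch get 0.4 each
print(time_decay([72.0, 24.0, 1.0]))  # most recent touch earns the most
```

Each model hands out fixed weights regardless of what actually influenced the user, which is exactly the weakness described above: a vendor that piles on late-path impressions is rewarded under all three.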

Third, the first wave of advanced (algorithmic) attribution solutions weren’t viable for most advertisers: complex and lengthy implementations, continued reliance on services and a high price tag that only the largest advertisers can justify.

The “new wave” of vendors recognize that advertisers need fast, easy and affordable technology-based solutions that leverage readily available data. Extensibility and automation obviate the need for complex integrations and labor-intensive analysis while reducing the time lag between implementation, production and insight. Through rapid onboarding, automated processing and timely reporting, the value proposition is fundamentally changing. Validated insights, recommendations and forecasting, delivered quickly, efficiently and affordably… these are the new table stakes in the Attribution space.

How big a problem is mobile in the world of attribution?

When you consider that attribution is based on conversion path modeling, the lack of user-level mobile data makes analysis very challenging. To assemble mobile conversion paths, you either need a cookie-less ad server or a partner, such as a DMP or mobile conversion vendor, to aggregate publisher data for each device. There are workarounds (i.e. manually aggregating data from publishers and DSPs) but it’s a lot of work. See The Dark Side of Mobile for more on this topic.

What about Google and Facebook?

Google, Facebook and now Verizon aren’t making life easier for advertisers seeking independent validation and advanced insights. Walled gardens can be scaled with a running start, but as these publishers consolidate the market, their walls are looking more like the Wall of the North that was built to defend us against the Wildlings. Those of us on the side of openness and transparency are hopeful brands and their agencies will vote with their budgets to reverse the trend toward data protectionism.

What progress are marketers making in adopting attribution?

The adoption curve is steepening but it’s still early. Surprisingly (or not), the majority of advertisers still rely on last-touch (click or impression) to reward conversions. Some of the pioneers are still nursing their wounds (especially those who tried to run before learning to walk) and a majority of the settlers are waiting for assurance that the path is safe before proceeding. It’s taken longer than anticipated, but progress is happening.

Among those that are leveraging attribution, most are still picking low-hanging fruit with a focus on desktop media spend. Very few have figured out mobile and even fewer are connecting the dots to gain a multi-platform view of users. While there are solutions available, few brands or agencies have the resources needed to take advantage of these new opportunities. I expect this will change in 2016.

Are advertisers using attribution outputs to plan media mix?

Savvy agencies and brands are acting on the insights, but too many just use Attribution to validate that their media is working. As most media spend is still committed through IOs, media buyers must take action to optimize spend, whether it’s pausing underperforming campaigns, re-allocating budgets to top performers or addressing frequency issues with their vendors.

While all agencies claim to be active in their approach to campaign management, too often they tend to “set it and forget it.” We have some great clients who actively review results and take action to improve performance.  But there is also a subset of agencies who are either too busy or lack the resources and/or commitment to capitalize on the insights.  As Brands become more involved in the process, I expect these agencies to become better stewards of their clients’ budgets.

Are they pushing the output of attribution into media buying systems?

While modeled outputs can be sent to a DSP as inputs for buying decisions, the idea of a self-optimizing, closed loop is still a bit futuristic. First, it can only be done for programmatic buying, which represents just one of many vendors on a given media plan (IO-driven media still dominates). Second, it will require close oversight, as there are numerous factors that can produce false positives or negatives on a real-time basis. A few examples that can wreak havoc: accidental removal (or duplication) of a conversion tag, a glitch in how a confirmation page is served, a hiccup in the ad server or a disruption in the delivery of log files. These events happen often, so if you’re going to send buying signals in real time to your DSP, you’ll need some guard rails.

In a more practical sense, attribution-based insights are used to compare the accuracy and effectiveness of operational (post-click and post-view) KPIs which are often relied on for daily buying decisions. In some cases we’ve seen these KPIs are sufficient for real-time decisioning. But in many cases they are subject to being gamed by vendors (cookie bombers) who are inadvertently rewarded while quality placements are penalized.

The bottom line is the industry is making progress, but we’re still a long way from Nirvana (self-optimizing systems that use modeled attribution KPIs to guide real-time decisions).

What advice would you give marketers? 

There are two important components to success in measuring and optimizing media: the System and the People.

On the People (behavioral) side of the equation:

  1. Get aligned. Many brands still have silos that make cross-channel initiatives challenging. Internal stakeholders need to agree on the end goal (omni-channel proficiency), which requires integrated planning and measurement.
  2. Delegate, but don’t abdicate. If brands choose to delegate measurement to their agency, they need to be active participants in the process. One way is through monthly meetings to review results (beyond the top line). Review the latest results (vendors, strategies, formats), discuss the lessons learned and define changes to make. Trust but validate, and keep your finger on the pulse of the campaign.
  3. Do something! Don’t let the absence of a perfect solution prevent you from moving forward (remember perfect is the enemy of good). Set expectations that will be easy to meet. Each discovery will surface many new questions, as well as insights.
  4. Rationalize Incentives. Unfortunately, advertiser objectives (maximum efficiency) are not always aligned with those of their Agencies and Publishers (maximum spend). Recognizing there is waste in every campaign, incentivize your agency to identify underperforming spend, re-allocate what they can and use the remainder to test and learn. Provide your agency with incentives to optimize efficiency, even if it means spending less in the aggregate (e.g. give them a bonus for saving you money).

On the System (technology and data) side, consider the following:

  1. Focus on your key needs: determine what objectives you’re seeking to achieve, and the questions you’re trying to answer. Then ask vendors how they can help you achieve your specific goals. Ask “how can you help me _____?” rather than “what do you do?”
  2. Leverage your existing infrastructure: If you have an ad server and/or a DMP, you should be able to receive a unified data set (impressions, clicks, visits and conversions per user) for attribution modeling. Tagging every ad is no longer viable (too much effort, latency and data loss) or necessary. Rather than re-invent the wheel, seek to use data that already exists.
  3. Focus on ROI. A good attribution platform should yield $20+ in savings and $50+ in revenue for every $1 invested. Put in this light, you can’t afford not to invest in insights that can drive dramatic improvement in efficiency.  And while it used to be that only the largest brands could afford algorithmic attribution, it’s much more affordable today with solutions starting in the low 4 figures per month.
  4. Learn to use it. While attribution has become much more intuitive and user-friendly, advertisers need to invest some time upfront to learn the new KPIs, reconcile them against older metrics, and teach the organization how to use them.
  5. Crawl > Walk > Run.  Start with desktop and online conversions, then connect offline conversions.  Once you’ve picked the low hanging fruit, add A/B testing to validate causality. Once you’ve mastered desktop, tackle mobile media (by then there should be more options for obtaining conversion path data).  Once you figure out Desktop and Mobile, then add device bridging to get true cross-platform, omni-channel insights.  Remember you have to walk before you run.  If you set reasonable goals and manage expectations, the probability of success will be significantly higher than if you try to do it all at once.

I hope you found this informative and thought-provoking. As always your comments and questions are welcome!

Steve Latham | @stevelatham


The Dark Side of Mobile Attribution

August 14th, 2015 by Steve Latham No comments »

Repost of my Data Driven Thinking byline published by AdExchanger August 2015.

The good news: Mobile will be the freight train that drives the media industry.

The bad news: The lack of data availability and transparency will cost marketers billions of dollars.

Since the iPhone’s 2007 introduction, the media industry has deemed every year to be “The year of mobile.” It took longer than expected to mature, but desktop’s awkward little brother is about to dwarf big bro and steal his girlfriend along the way. Mobile surpassed desktop in consumption in 2014 and will surpass it in spending in 2016. eMarketer predicts mobile media will reach $65 billion by 2019, or 72% of digital spending.

As we move towards a “mobile-first” world, we need to address a very big problem: we still can’t accurately measure performance. The ability to target customers in new and innovative ways outpaces the ability to measure the effectiveness of those tactics.

Mobile’s Measurement Problem

The digital media ecosystem was built on cookies to target, track and measure performance. Cookies are imperfect but good enough to develop accurate insights into customers’ journeys. Using cookie data to assemble and model conversion paths, marketers can use fractional or multi-touch attribution to optimize media campaigns much more effectively than with last-click metrics.
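A minimal sketch of that path-assembly step, with invented field names and data, might look like:

```python
# Assemble per-user conversion paths from cookie-keyed event logs --
# the precursor to any fractional attribution model. Data is illustrative.
from collections import defaultdict

events = [  # (cookie_id, timestamp, event) rows
    ("c1", 1, "impression:display"),
    ("c1", 2, "click:search"),
    ("c1", 3, "conversion"),
    ("c2", 1, "impression:display"),
]

paths = defaultdict(list)
for cookie, _ts, event in sorted(events, key=lambda e: (e[0], e[1])):
    paths[cookie].append(event)

# Keep only paths that end in a conversion for the attribution model
converting = {c: p for c, p in paths.items() if p[-1] == "conversion"}
print(converting)
# {'c1': ['impression:display', 'click:search', 'conversion']}
```

When third-party cookies are blocked, the `cookie_id` key simply isn't there, which is why mobile breaks this step, as the next paragraphs explain.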

In mobile, third-party cookies are blocked on most devices and privacy regulations limit device tracking. Consequently, traditional ad servers are limited to reporting on last-click conversions where possible.

For brands seeking to drive app installs, mobile attribution companies like Kochava, Tune, Appsflyer and Apsalar can track the click that led to the download in Apple or Google stores. Some are working on post-click and post-view reports, but these will be of limited help to advertisers seeking actionable insights.

The lack of mobile data means advertisers cannot quantify reach and frequency across publishers. They also cannot measure performance across publishers via multi-touch attribution. The cost and complexity of device bridging further obfuscates user-level engagement.

Rays Of Light

Mobile data and measurement challenges won’t be solved overnight, but a convergence of factors points to a less opaque future. Here are my predictions:


1. Ad servers will adapt to device IDs

Conceptually, a device ID is not unlike a cookie ID, privacy issues notwithstanding, but it takes time and money to introduce a cookie-less ID system. Following the lead of Medialets, traditional ad servers will introduce their own anonymous IDs, instead of cookies, that map to probabilistic and deterministic device IDs. Like cookies, these IDs will allow them to log user-level data that can feed fractional attribution models. We’ll probably see some early announcements before the end of the year, with more to come in 2016.

2. Data unification will become readily available

To date, demand-side platforms, data management platforms, tag managers and data connectors have fixated on using data to help advertisers target, retarget, cross-sell and remarket. The same data that is used to drive revenue can also be used to connect user-level data for measurement purposes. Companies, such as Liveramp, Signal, Exelate and Mediamath, are already unifying data for analysis. More will follow.

3. Device bridging will become ubiquitous

To date, connecting devices across publishers has been a luxury afforded only by the largest advertisers. In time that will change, as wireless carriers (and possibly some publishers) offer device graphs independent of media, and standalone vendors such as Tapad and Crosswise reach economies of scale. At the same time, ad servers and data connectors will build or license device graphs and offer bridging as an extension of their service.

As ad delivery, data management and device bridging become more integrated (e.g. see announcement by Tapad and Medialets), costs will come down and advertisers of all sizes will be able to measure engagement across devices.

4. Mobile attribution vendors will be forced to evolve

As ad servers and data connectors incorporate device-level conversions in their data sets, including app installs, mobile attribution companies will have to expand their offerings or risk becoming redundant. Some may stick to their knitting and delve deeper into mobile analytics and data management. Others may pivot towards media and expand into desktop or addressable TV. Others may just be acquired. Regardless, it’s unlikely this category will remain as-is for much longer.

5. Last-touch attribution may finally go away.

We’ve been predicting the end of the click as a key performance indicator for years. But inertia, apathy and a continuous stream of shiny objects have allowed last-touch metrics to survive while brands and agencies fought other battles.

Now that we’ve tackled video, programmatic, social, native, viewability, fraud and HTML5, the new focus on insights and big data may finally drive the roaches away. The click will be hard to kill, but as we become smarter about measurement, it will become much less visible.

As the mobile data gaps are filled, the promise of cross-platform, cross-device, cross-channel attribution can become a practical reality for advertisers of all sizes.  From a measurement perspective, our best days are still ahead.  But as mentioned in the headline, getting there is going to be quite costly.

Steve Latham

The Problem With Attribution

July 17th, 2015 by Steve Latham No comments »

Repost of my Data Driven Thinking byline published by AdExchanger

In recent months we’ve heard some noise about the problems with using multi-touch attribution to measure and optimize ad spend (see articles in Adexchanger and Digiday).  Some claim attribution is flawed due to the presence of non-viewable ads in user conversion paths. Others say attribution does not prove causality and should therefore be disregarded.

My view is that these naysayers are either painting with too big of a brush or they’re missing the canvas altogether.

Put The Big Brush Away 

The universe of attribution vendors, tools and approaches is large and diverse. You can’t take a broad-brushed approach to describe what they do.

If the critics are referring to static attribution models offered by ad servers and site analytics platforms, such as last touch, first touch, U-shaped, time-based and even weighting, I would agree that these are flawed because of the presence of non-viewable ads. Including every impression and click and arbitrarily allocating credit will do more harm than good. But if they’re referring to legitimate, algorithmic attribution solutions, they clearly don’t understand how things work.

First, not all attribution tools include every impression when modeling conversion paths. In some cases, non-viewable impressions can be excluded from the data set via outputs from the ad server or a third-party viewability vendor. For the majority of cases where impression-level viewability is not available, there are proven approaches to excluding and/or discounting the vast majority of non-viewable ads. Non-viewable ads and viewable, low-quality ads almost always have a very high frequency among converters, serving 50, 100 or more impressions to retargeted users. By excluding the frequency outliers from the data set, you eliminate a very high percentage of non-viewable ads. You also exclude most viewable ads of suspect quality.
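That outlier-exclusion step can be sketched as follows; the frequency cap of 50 is an illustrative choice, not a prescribed value:

```python
# Drop users whose impression frequency is an outlier before modeling --
# a cheap proxy for removing cookie-bombed / non-viewable inventory.
from collections import Counter

impressions = (  # (user_id, placement) log rows; illustrative data
    [("u1", "prospecting"), ("u1", "retargeting"), ("u2", "retargeting")]
    + [("u3", "retargeting")] * 120  # u3 looks cookie-bombed
)

freq = Counter(uid for uid, _ in impressions)

FREQUENCY_CAP = 50  # illustrative outlier threshold
clean = [(uid, p) for uid, p in impressions if freq[uid] <= FREQUENCY_CAP]

print(len(clean))  # 3 -- u3's 120 impressions are excluded
```

In practice the cap would be derived from the frequency distribution of the campaign itself rather than hard-coded, but the effect is the same: the heaviest-bombed users, and with them most non-viewable inventory, never reach the model.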

Second, unlike static models, machine-learning models are designed to reward ads that contribute and discount ads that appear in the path without influencing outcomes. Because cookie bombing is inefficient, producing lots of wasted impressions of questionable value, these ads are typically devalued by good algorithmic attribution models.

By excluding frequency outliers and using machine-learning models to allocate fractional credit, attribution can separate much of the signal from the noise, even the noise you can’t see. And while algorithmic attribution does not necessarily prove causality, a causal inference can be achieved by adding a control group. While not perfect, it’s more than sufficient for helping advertisers optimize spend.
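The control-group arithmetic behind that causal inference is straightforward; here is a sketch with made-up numbers:

```python
# Estimate incremental lift by comparing conversion rates between users
# exposed to the campaign and a holdout (PSA/control) group.
# All numbers are illustrative.

def incremental_lift(exposed_conv: int, exposed_n: int,
                     control_conv: int, control_n: int) -> float:
    """Relative lift of the exposed conversion rate over control."""
    exposed_rate = exposed_conv / exposed_n
    control_rate = control_conv / control_n
    return (exposed_rate - control_rate) / control_rate

# 1.2% conversion among exposed users vs 1.0% in the holdout
lift = incremental_lift(1200, 100_000, 500, 50_000)
print(f"{lift:.0%} incremental lift")  # 20% incremental lift
```

A proper study would also test statistical significance, but even this simple comparison turns "attribution credit" into an incrementality estimate.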

You Missed The Entire Canvas

Complaining that attribution models are not accurate enough is like chiding Monet for being less precise than Picasso, especially when many advertisers are still painting with their fingers.

It’s easy to split hairs and poke holes in attribution, viewability, brand safety, fraud prevention, device bridging, data unification and other essential ad-tech solutions. But the absence of a bulletproof solution is not a valid reason to continue relying on last century’s metrics, such as click-through rates and converting clicks.

As Voltaire, Confucius and Aristotle said in their own ways, “Perfect is the enemy of good.”
Ironically, so is click-based attribution.

While no one claims to have all the answers with 100% accuracy, fractional attribution modeling can improve media performance over last-click and static models. And while not every advertiser can be the next Van Gogh, they can use the tools and data that exist today to get a solid “A” in art class.

The Picture We Should Be Painting
I’m a big fan of viewability tools and causality studies, and I’m an advocate for incorporating both into attribution models. I am not a fan of throwing stones based on inaccurate or theoretical arguments.
Every campaign should use tools to identify fraud, non-viewable ads and suspect placements. The outputs from these tools should be inputs to attribution models, and every advertiser should carve out a small budget for testing. While this is an idealistic picture, it may not be too far away. As the industry matures, capabilities are integrated and advertisers, including agencies and brands, learn to use the tools, we will get closer to marketing Nirvana.

In the mean time, advertisers should continue to make gradual improvement in how they serve, measure and optimize media. Even if it’s not perfect, every step counts.

Ad-tech companies should remember we’re all part of an interdependent ecosystem. We need to work together to help advertisers get more from their media budgets. And we all need to have realistic expectations. From a measurement perspective, the industry will always be in catch-up mode, trying to validate the shiny new objects being created by media companies.

All that said, we can do much more today than only one year ago. We’ll continue to make progress. Advertisers will be more successful. And that will be good for everyone.

Steve Latham

Shedding Light Beneath the Attribution Canopy

May 22nd, 2015 by Steve Latham 2 comments »

AdExchanger recently published a timely article, “Breaking Through the Attribution Canopy,” on the Attribution marketplace (view it on Encore’s Facebook page). Overall they did a good job of highlighting the conflicts of interest that are inherent when your media vendor is also your trusted source of insights. They also touched on the emergence of new solutions that are designed to address the needs of the larger market. Along with other industry executives, I was quoted in the interview.

During the interview, we discussed a lot of issues surrounding media attribution and optimization.  But as with any interview, only a few of my comments were published.  To provide some context and clarify our POV, here are the key takeaways:

  • We are glad to see that Attribution has (finally) reached a tipping point.  Brands, agencies, DSPs and media platforms are scrambling to leverage machine-based insights to optimize media spend.  Continuing to rely on last-touch KPIs is simply a lazy and irresponsible approach to measuring media.
  • We believe measurement, analysis and optimization decisions should be driven by the advertiser, its agency or an independent solution provider, not its media vendor.  Even if the fox is friendly, it shouldn’t be in the hen house.
  • We also believe data should be easily ported, integrated and made available for analysis, regardless of who sells the media or who serves the ads.  Openness, transparency and portability are not only ideological values; they also make business sense.
  • The growing concentration of power of leading media and technology vendors should be on everyone’s radar as a threat to transparency and openness.  If you look at the markets for programmatic display, video advertising, search, social marketing, mobile advertising* and ad serving, the dominant players are making it difficult and expensive to independently analyze their data in the context of other media. The path to marketing and advertising success does not end in a walled garden.
  •  To date, advanced insights (e.g. algorithmic attribution and data-driven optimization tools) have been reserved for the largest advertisers who can afford six-figure price tags.  As the article points out, there is a large unmet need beyond the top 200 advertisers.  To address the needs of the thousands of middle market advertisers, a new model (no pun intended) is needed.  Heavy, expensive and service-intensive solutions cannot scale across the broader market.  The next phase of adoption will be won by light and agile solutions that are affordable and easy to implement.
  • To deliver modeled insights at scale, the solution must be automated, efficient, flexible and customizable for each advertiser.  It should also be affordable.  On this point, we wholeheartedly agree with Forrester’s Tina Moffett: “I think one advantage [attribution start-ups] do have is they were able to see the market needs and where the gaps were … and where existing players were falling short.”

For these reasons, we are very excited about the prospects for innovators who are able to address unmet needs for the large and growing middle market.

*For more on my quote that Google gets half of all mobile ad dollars, please see the eMarketer report published earlier this year.

As always, thanks for reading and feel free to share comments or contact me if you have any questions.

Steve Latham

The Value of Data: Our POV on Verizon-AOL

May 19th, 2015 by Steve Latham No comments »

I was recently interviewed by Advertising Age on the data angle of the Verizon-AOL deal (read the AdAge article).  While still fresh in my mind, I thought I’d share our POV.

First, Verizon already has unprecedented insight into what people are doing:

  • They know which devices are speaking to their network and see the packets and requests sent to and from each device (i.e. they capture all data sent and received)
  • They record every user session (e.g. using an app, typing an email or browsing the web)
  • Whether you’re using Verizon’s wireless service or local wifi to access the Internet, all data is captured
  • They track online behavior via cookies and relationships with 3rd parties (e.g. AOL owned Huffington Post)
  • They connect devices to each other and to desktops and households better than anyone else.
  • We believe Verizon already collects more data than any other provider

Acquiring AOL gives Verizon the ability to analyze the data and use it for advanced targeting of digital media.  While AOL’s sites (e.g. HuffPo) have some value, the real value is selling advanced audience targeting through AOL’s programmatic buying platforms for advertisers and publishers.  In short, Verizon has the diamond mines and AOL provides the mining equipment, sales and distribution.

And Verizon’s diamonds will have superior cut and clarity compared with what today’s competitors can offer, as it can provide deeper insight into customer behavior across platforms and devices.  While competitors are trying to stitch together the pieces from the outside in, Verizon has already bolted them together from the inside out.

Not to say that AOL’s data isn’t valuable too.  In recent years AOL has done a great job of developing a very large proprietary data platform.  The Verizon deal will enhance AOL’s data in numerous ways:

  • Expand the reach of users
  • Expand the data on each user: demo, geo, behavioral, etc.
  • Enable better multi-platform / device bridging
  • Improve resolution and accuracy

So at the end of the day, this deal is about mining all that data and converting it into revenue.  As noted in the AdAge article, they will need to do this very carefully and responsibly. Verizon has a spotty record among privacy advocates so it would be smart to proceed with caution.

Thanks for your time and interest.  I look forward to your comments.

Steve Latham


Investing Confidently (and Safely) in Programmatic

March 28th, 2015 by Steve Latham No comments »

Over the past few years, we’ve spent a lot of time advising Brands and Agencies on the challenges and risks associated with Programmatic buying (which for this post will encompass exchange traded media, RTB, etc.).  While the idea of machine-based buying is exciting, it’s not without significant challenges and risks.  Having analyzed dozens of programmatic campaigns, we’ve found that a blind leap into Programmatic is almost always a costly endeavor.  The thesis for taking a smart approach to programmatic buying is summarized below:

  • While the promise of self-optimizing buying is intriguing, it doesn’t replace the need for objective, rational analysis.
  • Programmatic optimization is typically based on a broken model.  The continued reliance on clicks, post-click and post-view metrics may do more harm than good.
  • Algorithmic attribution is critical for measuring and optimizing media.  Fractional, statistical analysis is needed for accurate and impactful cross-channel, full-funnel insights.
  • As brands shift more of their budgets to programmatic, the need for objective, attribution-based insights will become even more critical

I recently documented some of the key lessons learned in the embedded presentation, “Investing Confidently in Programmatic”.  I thought about calling it “How to Avoid Wasting Half of Your Media Budget” but opted for the more positive spin. Either would be sufficiently accurate.

In it, I address some of the risks and challenges of Programmatic buying, along with recommendations for ensuring a successful investment in this rapidly changing arena.  Also included is a SWOT analysis to frame the strengths, weaknesses, opportunities and threats that advertisers must deal with to be successful in this new area of machine-based buying.

As always, your comments and questions are welcome – just post!  If you’d like a copy of this presentation please contact us.

Steve Latham


The Growing Need for Device Bridging

December 1st, 2014 by Steve Latham No comments »


As an industry, we are quickly moving to a “mobile first” world where mobile engagement is becoming an increasingly important part of the customer journey in most considered purchases. From a targeting standpoint, digital publishers have done a decent job of assembling the components to engage individuals across desktops, laptops, mobile phones and tablets. But on the measurement side of the spectrum, marketers are way behind the curve.

Defining the Problem
Traditional platforms and measurement tools use cookies or alternative IDs to track the behavior of each browser as an individual user. As consumers are increasingly using multiple devices as part of their everyday lives, single-screen tracking provides a very limited view of user-level engagement. Even within your mobile device, it’s difficult to connect your mobile browser and applications, meaning most publishers and advertisers won’t know you’re the same person who saw an in-app ad before navigating to their site through a search on your mobile browser. When you factor in the use of tablets, desktops and laptops, the challenge becomes even more complex. Let’s use the following example to illustrate the challenge:

Suppose you see an ad for TravelX (fictitious travel site) on your iPhone’s mobile browser and you remember you need to book a plane ticket home for the holidays. So you go to the App store and download the TravelX app and book a flight home. Unfortunately, TravelX will not know that the mobile ad you saw on Safari drove the purchase you just made on their mobile App. To them, you appear to be two different users.

In your confirmation email is an ad for a great rate on a rental car. You see it on your phone and make a mental note to do some research tomorrow. While at work the following day you visit the TravelX site on your laptop to check out rates. You’re not ready to buy but you’ve definitely shown interest and are deep in the funnel. But unless you sign in using the same ID as your mobile app, TravelX will not know you are the same person who just booked a flight through their app. Again, they will classify you as a unique user.

To sum up, you responded to TravelX’s ad, downloaded their app, booked a flight, and are now considering a rental car.  While Google, Facebook and others may help them retarget you on your laptop and mobile device, TravelX won’t know how to accurately attribute credit for the conversions.  Given the gaps in “traditional” digital measurement, TravelX doesn’t know its integrated media plan is working so well.  They are still wondering what caused you to download their app in the first place while questioning the value of the seemingly ineffective mobile browser ad that got your attention, but not your click.
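The stitching problem in this example can be sketched with a toy identity graph. Given pairwise device matches of the kind a device graph provider like Tapad supplies (the match pairs and device names below are hypothetical, purely for illustration), a simple union-find structure merges the three “users” TravelX sees into one:

```python
# Toy sketch of cross-device identity stitching (illustrative only;
# the device IDs and match pairs below are hypothetical, not real
# device-graph output).

class IdentityGraph:
    """Union-find over device/browser IDs: linked IDs share one user."""

    def __init__(self):
        self.parent = {}

    def find(self, device_id):
        # Unseen IDs start as their own user.
        self.parent.setdefault(device_id, device_id)
        # Walk to the root, compressing the path as we go.
        while self.parent[device_id] != device_id:
            self.parent[device_id] = self.parent[self.parent[device_id]]
            device_id = self.parent[device_id]
        return device_id

    def link(self, a, b):
        # A device-graph match says a and b belong to the same person.
        self.parent[self.find(a)] = self.find(b)


graph = IdentityGraph()
# Pairwise matches for the TravelX journey above:
graph.link("iphone-safari", "iphone-app")  # mobile browser <-> app
graph.link("iphone-app", "work-laptop")    # phone <-> laptop

# All three touchpoints now resolve to the same user ID:
touchpoints = {"iphone-safari", "iphone-app", "work-laptop"}
assert len({graph.find(d) for d in touchpoints}) == 1
```

With the three device IDs resolved to one root, the mobile-browser impression, the in-app booking and the laptop research session can all be credited to a single customer journey rather than three unrelated users.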


This example illustrates the challenges marketers face in creating a unified view of each customer (even on the sites you frequent).  If you’re like the majority of advertisers who do not require users to authenticate, the challenge of measuring the source of new conversions is even harder.

Beyond the bridging problem, we also see a lot of advertisers who rely on their publishers to serve mobile ads and report the results of the campaign.  As with site-served desktop ads, this leaves a lot of unanswered questions about the true reach, frequency and timing of ads being served and/or rendered.

Solving the Device Bridging Problem

In recent years some innovative companies have emerged to address the gaps in device bridging and data unification.  A few of them are even independent of your media buy, which is very important (let’s keep the foxes out of the hen house). While no one solution offers the silver bullet to address this problem, point solutions are available to:

  • Serve ads to mobile browsers, using a device ID vs. cookies to track user engagement
  • Bridge mobile browsers to in-app engagement
  • Bridge mobile devices to other devices, desktops and laptops used by the same individual
  • Connect device IDs to a universal user ID that can be used for online and offline CRM efforts

By bridging device-specific data at the user level, advertisers can connect all impressions, visits and conversions associated with each customer journey, regardless of platform or device.  For those brands going head-first into mobile, this will be required to truly understand reach, frequency and cross-platform synergies for each campaign.

Armed with these insights, marketers can plan more effectively, reduce waste and optimize spend while better understanding how and when consumers engage with their brand.  Without these insights, there will be some lingering questions about the relative contribution, impact and ROI from each digital media buy.

(Images courtesy of Tapad)

As always, comments are welcome!

Steve Latham


Encore Launches New Attribution Platform

November 13th, 2014 by Steve Latham No comments »

Encore’s Customer-Driven Solution Makes Algorithmic Optimization Easy and Affordable 

New York City — Encore Media Metrics, a pioneer in digital attribution, today announced the launch of its new cloud-based analytics platform. With the introduction, Encore is making advanced measurement and impactful optimization easy, efficient and affordable for brands and agencies of all sizes.

According to Tina Moffett of Forrester, “Marketers need guidance on development, management and analysis of complex marketing efforts. Attribution vendors need to deliver what customers are looking for: turning insights into action, calculating accurate performance metrics, providing a holistic view of the customer purchase path, and taming the messiness of big data.”

Leveraging six years of research and innovation, Encore designed its platform to address these unmet customer needs, delivering clear and compelling insights that are readily available and easy to act on – without data overload. Beyond automated ingestion, algorithmic modeling and visualization of paid, earned and owned media, Encore’s new platform provides powerful insights, and prescriptive recommendations for reducing waste and optimizing spend. Encore’s platform enables customers to make real-time updates and modifications directly in the web-based User Interface, providing unique personalization and efficiency. Best of all, Encore’s solution is easy to implement, easy to use and priced to fit budgets of all sizes.

“Encore’s new platform delivers the robust analysis, insights and recommendations marketers need to measure and optimize campaigns with confidence,” said Gunnard Johnson, SVP Analytics, Centro. “The new Interface delivers the insights that matter without overwhelming us with data. They’ve done a great job of delivering statistically modeled outputs while giving us control and flexibility in how the insights are presented.  I believe it’s a smart and affordable solution that fills the gap in today’s marketplace.”

“We wanted to make advanced analytics easy and efficient for brands, agencies and partners,” said Steve Latham, founder and CEO of Encore. “Therefore, we had to develop a scalable, machine-based solution that would provide clear and compelling insights, along with actionable recommendations for optimizing spend. We also had to simplify the user experience by reducing the complexity and level of effort typically required by a customer. Lastly, we had to make it affordable for the majority of the market.”

Encore’s new cloud-based platform aggregates and analyzes conversion path data using proven models for attributing fractional credit to each interaction along the way. Programmatic reports, insights and recommendations for optimizing spend are presented in a customizable User Interface that allows each customer to personalize how insights are presented and acted on. A predictive analysis module forecasts future results and the expected lift in performance from optimizing on Encore’s insights.
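To make the idea of fractional credit concrete, here is a minimal sketch of one common fractional-attribution scheme, time decay, where interactions closer to the conversion earn more weight. This is a generic illustration only, not Encore’s proprietary models; the channel names and half-life parameter are hypothetical:

```python
# Minimal sketch of fractional attribution via time decay (a generic,
# commonly used scheme -- NOT Encore's proprietary models).

def time_decay_credit(touchpoints, half_life_days=7.0):
    """Split one conversion's credit across its touchpoints.

    touchpoints: list of (channel, days_before_conversion) pairs.
    Each touchpoint's weight halves for every `half_life_days` that
    separate it from the conversion; weights are then normalized so
    the fractional credits sum to 1.0.
    """
    weights = [0.5 ** (days_before / half_life_days)
               for _, days_before in touchpoints]
    total = sum(weights)
    return {channel: w / total
            for (channel, _), w in zip(touchpoints, weights)}


# Hypothetical conversion path: (channel, days before conversion)
journey = [("display", 10), ("search", 3), ("email", 0)]
credit = time_decay_credit(journey)

# Unlike last-touch, credit is shared -- weighted toward recency.
assert abs(sum(credit.values()) - 1.0) < 1e-9
assert credit["email"] > credit["search"] > credit["display"]
```

In a last-touch report the email would receive 100% of the credit; under a fractional model the earlier display and search interactions also earn a share, which is what makes upper-funnel media measurable at all.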

“Encore’s platform delivers real savings and improved returns for our clients” said Kunick Kapadia, Channel Analytics Supervisor, The Gate Worldwide. “Analyzing the true impact of each media channel is an enormous undertaking, but Encore’s real-time dashboard unlocks these data points and turns them into actionable insights. They have more than paid for themselves through the savings and enhanced ROI as a result of our partnership. I would recommend Encore as the go-to solution for clients of all sizes.”

Learn more about Encore’s new Attribution Platform.