Google is updating how marketers measure the effectiveness of their advertising campaigns, shelving last-click attribution (LCA) in favor of a supposedly smarter machine learning tool with a broader view of the conversion pipeline. The Drum probes marketers on how this will impact them.
What is last-click attribution?
Last-click attribution is a metric designed to quantify which digital ad drove a product sale. From this, the effectiveness of placements can be inferred. But the framework is biased toward over-crediting the last ad in the chain, sometimes at the expense of the broader marketing funnel that created product demand and awareness in the first place. Followed religiously, it can also paint a very skewed picture of how people discover and buy products – especially considering how much media we consume each day.
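To make the bias concrete, here is a minimal sketch of the logic in Python (illustrative only – the channel names and journey are invented, and this is not Google's implementation). Every touchpoint except the last one receives zero credit, however much work it did:

```python
def last_click_attribution(touchpoints: list[str]) -> dict[str, float]:
    """Assign 100% of the conversion credit to the final touchpoint."""
    credit = {channel: 0.0 for channel in touchpoints}
    credit[touchpoints[-1]] = 1.0  # the last click takes everything
    return credit

# A typical multi-channel journey: display built awareness and generic
# search fed the research phase, but brand search gets all the credit.
journey = ["display", "social", "generic_search", "brand_search"]
print(last_click_attribution(journey))
# {'display': 0.0, 'social': 0.0, 'generic_search': 0.0, 'brand_search': 1.0}
```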
Why is Google moving away from last-click attribution?
Data-driven attribution will be made available to all advertisers in Google Ads beginning in October; Google has been testing it with a small group of advertisers. The shift is informed by the changing privacy landscape: new frameworks such as Google’s FLoC and Apple’s App Tracking Transparency (ATT) will hinder how effective LCA can be anyway.
How data-driven attribution works
Google is including more touchpoints than just the last click in the new model, and it is obfuscating the individual. It believes that by adding in “relevant data”, advertising could become more effective on its platforms. Google says it will “better predict the incremental impact a specific ad will have on driving a conversion and adjust bids accordingly to maximize ... ROI”.
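Google hasn’t published the exact algorithm, but data-driven attribution is generally described as a counterfactual model: it estimates how much each touchpoint lifted the probability of conversion and splits credit in proportion. The Python sketch below uses a simple removal-effect heuristic over made-up, aggregated path data to illustrate the idea of fractional credit – an assumption-laden toy, not Google’s model:

```python
# Hypothetical aggregated journey data: (touchpoints, conversions, visits).
paths = [
    (("display", "brand_search"), 30, 100),
    (("brand_search",), 10, 100),
    (("display",), 5, 100),
]

def removal_effect_credit(paths):
    """Credit each channel by the drop in conversion rate without it."""
    total_rate = sum(c for _, c, _ in paths) / sum(v for _, _, v in paths)
    channels = {ch for p, _, _ in paths for ch in p}
    credit = {}
    for ch in channels:
        # Conversion rate over only those journeys that never touched `ch`.
        remaining = [(p, c, v) for p, c, v in paths if ch not in p]
        rate_without = (sum(c for _, c, _ in remaining) /
                        sum(v for _, _, v in remaining)) if remaining else 0.0
        credit[ch] = max(total_rate - rate_without, 0.0)
    norm = sum(credit.values()) or 1.0
    return {ch: round(lift / norm, 2) for ch, lift in credit.items()}

# Both channels now earn fractional credit (~0.33 display, ~0.67 brand
# search) instead of brand search taking 100% as the last click.
print(removal_effect_credit(paths))
```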
What marketers say about the end of last-click attribution
Brendan Clarke, director of operational performance at TMWI, believes “patience is wearing thin with outdated systems”. The industry is realizing what data is useful and, more often than not, that now means stripping out “unnecessary personally-identifiable information”.
He believes that while Google has put this decision down to the changing privacy landscape, “the chances are that there was self-interest at play too”.
The effectiveness of the new measurement tool will “differ wildly” on a case-by-case basis, but machine learning will play a bigger part in advertising, and he’s enthused to see Google embrace it.
Owen Hancock, marketing director at Impact, knows last-click attribution “isn’t perfect” but explains it’s been the go-to because of its practicality. “It’s clear, easy to understand and there’s only one winner.”
Reality’s rarely so neat. “In our fragmented media landscape, a variety of media types and formats contribute to our purchasing decisions – whereas focusing on the last click alone offers a very one-dimensional view of marketing performance.”
As for Google, the more campaigns it puts through its machine-learning system, the better it’ll get. As Hancock explains: “Google may have access to more data from advertisers to further feed its models.”
More broadly, he can see Google inputting contextually relevant information (as it plans to with FLoC) rather than personally identifiable information. “Moving away from data for data’s sake, and toward a model with more nuance, is certainly no bad thing.”
René Plug, chief business development officer of 1plusX, says the phase-out has been a long time coming. LCA, after all, “represents only a fraction of marketing value creation” and is traded on a dying currency, the third-party cookie.
The machine learning “cuts the direct link between a user and the data that is captured about them”. You’re no longer a number, but have a statistical membership of a group or cohort (football fan, for example).
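That cohort idea can be pictured in a few lines of Python (the cohort labels and the hashing scheme here are invented for illustration – FLoC’s actual clustering works differently). The point is that the ad system only ever handles the group label, never the individual:

```python
import hashlib

# Hypothetical cohort labels; a real system would derive thousands.
COHORTS = ["football_fans", "home_cooks", "gamers"]

def assign_cohort(browsing_interests: list[str]) -> str:
    """Map a set of interests to a cohort, discarding who the user is."""
    digest = hashlib.sha256("|".join(sorted(browsing_interests)).encode())
    return COHORTS[int(digest.hexdigest(), 16) % len(COHORTS)]

# Measurement and bidding see only the returned cohort label.
print(assign_cohort(["premier league", "transfer news"]))
```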
Plug acknowledges a “big shift” is needed – marketers knew it was imperfect. But at least they knew it. “Moving forward, there will need to be a new universally-accepted framework for predictive standards, methods and verification that is verifiable by partners in the value chain. This problem – as far as I know – still needs to be resolved, and will take a lot of time and debate before progress is achieved at scale.”
Until then, will marketers be happy to trust Google and its still-lots-to-learn machine learning algorithm?
Paula Gómez, head of data and adtech at Making Science, looks at what mediums will be most impacted by the changes.
“Search – especially brand search – is usually a very late touchpoint anyway so that shouldn’t see much difference, but definitely some. Brand search’s impact will be reduced, but it will still attribute the lion’s share of impact to the last touchpoint; brand will continue to hold huge value in DDA, even if it will be less than before.”
Now we’ll see a “huge uplift” of value attributed to generic and broader search terms used during the research phase of any purchase. “Now they’ll be reporting fractional conversions from their assistance.”
Dominic Tillson, marketing director at Inskin Media, says the shift makes sense... if the new tech works.
Inskin has skin in the game. Its high-impact digital ad format is, it says, “27% more likely to be looked at, is looked at for 39% longer, and is 140% more likely to have an impact compared to an ‘un-primed/identical’ MPU”. Tillson sees his format getting more “kudos” from this new system.
He’s keen for more clarity on what data protection the move to machine learning requires, but agrees that it “certainly does not rely on personal information to work effectively”. He believes the innovative new formats on offer don’t rely on personal data in order to work; rather, “it’s about the effective application of technology combined with human creativity”.
Tobias Knutsson, chief executive of Adverty, believes the machine learning algorithm will work well for advertisers. “Google has always been a company that recognizes the value of, and trades in, data. Ensuring that its data models are fit for purpose forms a critical part of this strategy.”
Google is set to establish “even more dominance” in the market as a result, if the new framework is accepted by advertisers. “There are a plethora of alternatives in the market today and it’s imperative that brands shop around and choose providers that will help them to achieve their goals.” He points to a current Google and Facebook blind spot in in-game advertising as an example.
He advises: “It’s never a wise move for marketers to jump in blindly. My advice would be to ask questions about the algorithmic solution and to query how it will work for each brand and each campaign. It’s early days, and is by its nature an iterative process.”
Justin Wenczka, chief revenue officer of Verasity, makes a good point about the need to remove fraudulent accounts and bots from the machine learning system. Any non-human behavior could contaminate the model and the feedback it gives.
“You have to identify fraudulent activity first and remove it from your data set. If you have a skewed data set to start with, you’re building your systems on poor foundations that will soon crumble when scaling.”
And finally, Tom Jenen, chief revenue officer of Brand Metrics, pokes a hole in the framework.
“The problem with last-click attribution is the first two words. ‘Last’ means that only the final action gets credit for a huge amount of work moving the user through the sales funnel. Certain sites get too much credit for too little work; others get no credit for doing most of it. The other word, ‘click’, is a problem because it’s clear that most people don’t click on ads, and that many clicks are actually fraudulent.”
A more “holistic view” of the consumer journey is welcome, and he believes that machine learning, AI and automation have reached the point where they can justify increasing their control over this area of measurement.
The update is set to roll out as the default model across Google Ads from next month, so marketers have only a short window to get up to speed with the changes.