The state of attribution in 2018

Author: Dora Moldovan, Managing Director - Braidr
Category: Analytics, Digital Marketing, Opinion
Read Time: 4 Minutes
Last Updated: 13 May 2024

The world of marketing data science is eagerly awaiting the Attribution product from Google, due to launch in Q1 of 2018. This is probably one of the most highly anticipated launches from Google, as it excites not only the data buffs but the whole community. While we wait for the news from Google, we take a look at the state of attribution and the reasons why everyone should get on board the data-driven train.

The current state of play:

Measurement unit: Hit -> Session -> User
Attribution: Largely last-click

As Google Analytics has developed, matured and become an enterprise-level product, there have been a few shifts in the unit-of-measurement paradigm.

Firstly, measurement and tracking have moved from the simplistic ‘hit’ approach to the more sophisticated ‘session’ metric. The need to understand user behaviour beyond the page view made this an obvious decision. With the shift in behaviour driven by mobile, the internet has become user-centric, and in a world where experience is now key, the ‘session’ metric is somewhat lacking – hence we’re seeing a second shift in measurement paradigms, towards the ‘user’ metric.
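To make the three paradigms concrete, here is a minimal sketch that rolls the same toy hit-level records up to session and user level. The field names and data are illustrative only, not GA's actual schema:

from collections import defaultdict

# Toy hit-level records (illustrative fields, not a real GA export).
hits = [
    {"user_id": "u1", "session_id": "s1", "page": "/home"},
    {"user_id": "u1", "session_id": "s1", "page": "/products"},
    {"user_id": "u1", "session_id": "s2", "page": "/checkout"},
    {"user_id": "u2", "session_id": "s3", "page": "/home"},
]

# Hit level: every interaction counted on its own.
total_hits = len(hits)

# Session level: hits rolled up into visits.
sessions = defaultdict(list)
for hit in hits:
    sessions[hit["session_id"]].append(hit["page"])

# User level: sessions rolled up into people - the paradigm we are moving towards.
users = defaultdict(set)
for hit in hits:
    users[hit["user_id"]].add(hit["session_id"])

print(total_hits, len(sessions), len(users))  # 4 hits, 3 sessions, 2 users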

We’ve become far more sophisticated in our advertising – considering audiences, and introducing channels that build brand or contribute to the user experience outside the “in-market” pool – so why are we still stuck measuring conversions on a last-click model?

Well, last-click attribution is widely understood: it’s easy to grasp, and everyone in the industry is able to operate within its parameters. Alongside these “advantages”, it is also easy to reverse engineer, meaning that tracking or creative-tagging mishaps can easily be remedied. This forgiving nature means that clean data is not a stringent requirement.
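For clarity, here is a minimal last-click sketch, assuming conversion paths are simple ordered lists of channel touchpoints (toy data, not a GA export format):

from collections import Counter

# Toy conversion paths, ordered from first to last touchpoint.
paths = [
    ["Display", "Organic Search", "Paid Search"],
    ["Email", "Direct"],
    ["Paid Search"],
]

# Last-click: 100% of the credit goes to the final touchpoint in each path.
last_click_credit = Counter(path[-1] for path in paths)

print(last_click_credit)  # Counter({'Paid Search': 2, 'Direct': 1})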

Does all of this make it “wrong”? Definitely not – it’s just a little tired, fairly unsophisticated and somewhat short-sighted when it comes to measuring the more sophisticated channels.

While other static attribution models have been available in GA for quite a while, and the multi-channel funnel reports on assisted conversions have been useful in shifting the mindset away from last click, they’ve lacked the dynamic ability to measure an ever-changing landscape.

Introducing data-driven attribution

The discovery and consideration phases of the journey to conversion happen on different devices and via different channel interactions, disrupting the last-click order. It is only right that we finally start looking at attribution through a more dynamic, fair lens.

Google understands this and employed a sophisticated game theory concept (the Shapley value) in their data-driven approach to attribution. While the concept has been around for quite a while, the execution at scale is definitely a job for a giant. The algorithm compares conversion rates across paths with and without each touchpoint to determine each channel’s contribution, and assigns a fraction of the conversion/revenue to each.
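To make the idea tangible, here is a minimal Shapley-value sketch. It assumes we already know the conversion rate observed for each combination (coalition) of three channels; the numbers and the brute-force permutation loop are purely illustrative, not Google's production implementation:

from itertools import permutations

channels = ["Display", "Search", "Email"]

# Hypothetical value function: conversion rate observed when a path contains
# exactly this set of channels. The empty coalition converts at 0.
conv_rate = {
    frozenset(): 0.00,
    frozenset({"Display"}): 0.01,
    frozenset({"Search"}): 0.04,
    frozenset({"Email"}): 0.02,
    frozenset({"Display", "Search"}): 0.06,
    frozenset({"Display", "Email"}): 0.03,
    frozenset({"Search", "Email"}): 0.07,
    frozenset({"Display", "Search", "Email"}): 0.09,
}

# Shapley value: a channel's marginal contribution to the conversion rate,
# averaged over every order in which channels could have joined the path.
shapley = {c: 0.0 for c in channels}
orderings = list(permutations(channels))
for order in orderings:
    seen = frozenset()
    for channel in order:
        marginal = conv_rate[seen | {channel}] - conv_rate[seen]
        shapley[channel] += marginal / len(orderings)
        seen = seen | {channel}

print(shapley)
# {'Display': 0.015, 'Search': 0.05, 'Email': 0.025} (up to floating-point rounding)
# The credits sum to 0.09, the conversion rate of the full channel mix.

The credit is fractional, and it shifts as soon as the observed conversion rates for the coalitions shift – which is what makes the model dynamic rather than rule-based.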

The definitive guide from Google is available here.

Why is this a better approach? Simply because it democratises attribution: it can dynamically change the rules with every new channel combination that results in a conversion, and it takes every single touchpoint into account. It also shifts the focus from attribution for the sake of adoption to attribution as a true and accurate means of optimisation. No surprise that Google has already rolled it out into their DoubleClick and AdWords products.

While this all sounds like the perfect solution, there are some things to keep in mind:

  • It is not possible to easily reverse engineer the model, meaning that tagging mistakes are costly and may render the results unusable.
  • Sending data into the wrong channel will change its weighting and may cause unexpected results. Get in touch with Found for a tracking/GA audit.
  • Years of marketing experience have no voice in the model, meaning we cannot tweak it manually to skew it towards a channel we know works well. We are also unable to discount the direct channel – a well-known practice Google adopts in its default reporting model (a sketch of that default behaviour follows this list).
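For reference, here is a minimal sketch of that last non-direct click behaviour – the default GA treatment that DDA does not let you replicate manually. The paths and channel names are toy data:

# Toy conversion paths, ordered from first to last touchpoint.
paths = [
    ["Email", "Paid Search", "Direct"],
    ["Direct"],
    ["Organic Search", "Direct", "Direct"],
]

def last_non_direct(path):
    """Credit the last touchpoint that isn't Direct; fall back to Direct
    only when the whole path is Direct."""
    for channel in reversed(path):
        if channel != "Direct":
            return channel
    return "Direct"

print([last_non_direct(p) for p in paths])
# ['Paid Search', 'Direct', 'Organic Search']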

What to expect with DDA in 2018:

The Attribution product is pretty likely to polarise the ecosystem. There will be the “Happy adopters” – data-driven people who trust and understand the machine revolution. They will likely jump on this bandwagon, make the effort to understand it, and cleverly use their data to feed the model while using its output to make better marketing decisions.

We’re also expecting a certain degree of confusion, and perhaps reticence, from people who will find it hard to let go of last-click – and rightly so: with no historical DDA performance to benchmark against, they will hang on to what is known.

The conclusion is that there are no winners and losers, no report that is right while another is wrong – just degrees of sophistication. As trust in automated models increases, so will adoption of DDA.

As an agency we’re keen to educate ourselves, our clients and our ecosystem as we’re seeing our world shaped and transformed by clever uses of data.