Why you should be sceptical about your web analytics

One of the most valuable assets in your marketing technology arsenal is web analytics. It provides you with easy-to-understand reporting metrics and insights into how your users find and interact with your digital assets. But can you trust everything that those reports are telling you?

The importance of engagement metrics cannot be overstated, as they provide the rationale and evidence to help drive key business decisions. Knowing how much traffic your website is getting, and from what sources, can help you decide where to allocate your marketing and campaign budget. It is key to define the success metrics of your digital assets before tagging them up for reporting, so you can create tangible and achievable outcomes.

 

Success Metrics

Some common success metrics for a digital asset are Unique Devices / Visitors, Total Sessions, Page Impressions, and Conversions. A digital asset can have multiple conversions and conversion funnels, and each of those can have a different value. Consider an eCommerce website with a newsletter signup form. Both the form submission and the purchase checkout events can be conversions with different values: one an implied future value, the other an actual one.

Once you have clearly defined your success metrics, it’s time to start tagging. Some digital experience platforms will include analytics and reporting out of the box; however, additional marketing technology platforms are often required to track specific campaigns such as Search Engine Marketing, Social, and Re-targeting initiatives.

Each technology platform will have its own reporting capabilities to tell you how amazing it is, so it is important to cross-check and validate those metrics against your DXP or an impartial reporting platform such as Google Analytics. This method of validating data creates healthy scepticism, but it can also lead to mistrust.

Let’s hypothesize that we have two different tools for generating a specific call to action: a retargeting platform that drives advertisements across a network of partner websites, and a paid search marketing campaign. Each of these platforms will have tools to track impressions, click-throughs, and the number of conversions they attributed. Let’s imagine that your backend systems recorded 100 conversions in a month, your retargeting platform attributed 54 of those to its success metrics, and paid search attributed 65. How can the combined conversions from both platforms be greater than the actual number of conversions? Which of these is telling the truth, and which is telling a lie?

With any marketing campaign, especially one that runs across multiple channels, it is important to understand there is likely to be crossover. In the above scenario, both platforms are telling the truth. Within the reporting period, someone may have come to your website via SEM and via a targeted advertisement, in which case both platforms will take credit for the eventual conversion.
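A toy illustration of this double-counting, assuming each platform can export the IDs of the conversions it claimed (the order IDs and attribution counts below are hypothetical, matching the numbers in the scenario above):

```python
# Hypothetical conversion IDs attributed by each platform in the same
# reporting period. A real export would come from each platform's API.
retargeting = {f"order-{i}" for i in range(1, 55)}    # 54 attributed conversions
paid_search = {f"order-{i}" for i in range(36, 101)}  # 65 attributed conversions

combined = len(retargeting) + len(paid_search)  # naive sum across platforms
actual = len(retargeting | paid_search)         # deduplicated conversions
overlap = len(retargeting & paid_search)        # conversions claimed by both

print(combined, actual, overlap)  # 119 100 19
```

The 19 overlapping conversions are counted once by the backend but claimed by both platforms, which is why the platform totals add up to more than reality.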

 

A single source of truth

Different reporting tools for different marketing technologies can’t be fully trusted on their own, so it is important to go beyond the base tracking of these platforms and set up a single analytics platform as your single source of truth.

No two analytics and reporting platforms work the same way. Some tools collect data via JavaScript, which, depending on the implementation, can lead to loss of data, whilst others are integrated into your DXP platform and can be more reliable. When it comes to aggregating data, some of the free-to-use platforms use data sampling, and whilst the samples are fairly accurate, there is a margin of error that can produce inconsistent metrics between analytics platforms.
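As a rough sketch of why sampling produces inconsistent metrics, the uncertainty in a sampled conversion rate can be estimated with the standard margin-of-error formula for a sample proportion (the 2% rate and 10,000-session sample below are illustrative numbers, not from any particular platform):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sampled proportion p over n sessions."""
    return z * math.sqrt(p * (1 - p) / n)

# A 2% conversion rate measured on a 10,000-session sample
moe = margin_of_error(0.02, 10_000)
print(f"2.00% +/- {moe:.2%}")
```

Even a well-behaved sample of that size leaves roughly a quarter of a percentage point of wiggle room either way, which is more than enough for two platforms to report visibly different numbers from the same underlying traffic.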

Whilst it is best to have a single platform for tracking and reporting, in the real world it is not always possible, as every platform has benefits and features that the others do not. We are not saying don’t use these platforms for their intended purpose; we are saying don’t try to compare apples to oranges: use a single platform for validation and executive reporting. Once you have identified a platform to become your source of truth, it’s time to start tagging.

It is important to configure your digital asset to track as much information as possible. Use your base tracking to identify trends in user behaviour such as the paths they are likely to take and then iterate on setting up specific tags to track events such as:

  • Entry Page and referral source (campaign, social, SEO, SEM, etc)
  • Landing Pages or Page Types of Interest (article pages, category of content, etc)
  • Low-value calls to action (enquiries, newsletter subs, etc)
  • High-value calls to action (purchases, paid referrals, etc)

With this information, you can set up different conversion funnels to track the effectiveness of your application and content strategy.
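A conversion funnel built from events like the ones above boils down to counting how many sessions reach each stage and where they drop off; a minimal sketch, with hypothetical stage names and counts:

```python
# Hypothetical funnel: sessions that reached each stage in a reporting period.
funnel = [
    ("Entry page",        10_000),
    ("Category page",      4_200),
    ("Newsletter signup",    630),
    ("Purchase",             210),
]

previous = None
for stage, count in funnel:
    if previous is None:
        print(f"{stage}: {count}")
    else:
        # Share of sessions carried over from the previous stage.
        print(f"{stage}: {count} ({count / previous:.1%} of previous stage)")
    previous = count
```

The stage-to-stage percentages tell you where attention is leaking, and the end-to-end rate (purchases divided by entries) is the overall conversion rate for the funnel.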

 

Interpreting the data

Looking at data, interrogating it, and trying to draw meaningful insights from it can be a daunting task, especially when a lot of decision-making is based on the findings. Different people will interpret data differently, and confirmation bias can affect a person’s objectivity, so it’s important to apply a little of the scientific method and try to disprove your hypothesis instead of trying to prove it.

Science doesn’t prove anything, but it holds a body of evidence that strongly suggests the best-known reasons for why things happen in the natural world. If we apply that philosophy to analytics data, we can come away with some meaningful insights even though they might not be the ones we were hoping for.

When looking at your success metrics, if you notice an increase in conversions that coincides with a particular campaign or change made to your website, apply some healthy scepticism and look at it through the lens of “it might also be caused by something else”. Some common factors that could affect your results include, but are not limited to:

  • Seasonal Trends – Remember to check your historical data and look back at the same period 12 months prior, to see if there is already an organic uplift in success for that same reporting period
  • Other Campaigns – Was there another campaign running that may have also been a co-contributor to your success metrics?
  • Social Influences – What are people currently talking about in other media, what is trending, and could those trends have influenced your brand or product during the reporting period?
  • Organic Search – Search algorithms change periodically, so it’s important to keep an eye on announcements from Google to see if there was an uplift in referral traffic from organic sources.
  • Unexpected Spikes – Sometimes your website will have an unprecedented spike in traffic in a reporting period. It is worth either excluding these dates from your analysis or capturing more data to help smooth out the results.
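The seasonal check in particular is easy to automate if you can export daily conversion counts; a minimal sketch, assuming two lists of daily totals covering the same window this year and 12 months prior (the numbers are hypothetical):

```python
def yoy_uplift(current: list[int], prior: list[int]) -> float:
    """Fractional change in total conversions vs the same period last year."""
    return (sum(current) - sum(prior)) / sum(prior)

# Hypothetical daily conversion counts for the same 7-day window.
this_year = [14, 18, 16, 22, 25, 30, 19]  # during the campaign
last_year = [12, 15, 14, 20, 21, 26, 18]  # no campaign running

print(f"{yoy_uplift(this_year, last_year):+.1%}")
```

If last year shows a similar uplift over the same dates with no campaign running, seasonality, not your campaign, is the likelier explanation.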

Once you have ruled out other possible contributors, you have failed to disprove your hypothesis and can report with greater confidence that your changes or campaign had an impact on your success metrics. On the flip side, if you did identify a contributing factor in your reporting period, simply collect more data and try to isolate those contributors from your results. While we don’t always have the luxury of time, the more data you collect the more confident you can be in the result.

 

Next Steps

Now that you have some meaningful insights from your reporting tools, you can use them to power decision-making within your organization. The results each of your marketing technology tools reports can now be validated against an impartial platform, giving you greater confidence in budget allocation.

You can also use analytics data to drive changes to your website that push up the value of your success metrics and ultimately keep users engaged with your brand or product. By creating buckets to segment your users, you can drill further into these insights to deliver more personalized content and a richer overall user experience.
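Bucketing users can be as simple as thresholding on an engagement measure such as sessions per month; a sketch with hypothetical segment names and thresholds:

```python
def engagement_bucket(sessions_per_month: int) -> str:
    """Assign a user to a hypothetical engagement segment."""
    if sessions_per_month >= 8:
        return "loyal"
    if sessions_per_month >= 3:
        return "returning"
    if sessions_per_month >= 1:
        return "casual"
    return "dormant"

# Hypothetical users and their monthly session counts.
users = {"alice": 12, "bob": 4, "carol": 1, "dave": 0}
segments = {name: engagement_bucket(s) for name, s in users.items()}
print(segments)
```

Once users are segmented like this, you can compare conversion rates per bucket and tailor content to each group rather than treating all traffic as one audience.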

 

Until next time,
Chris – Head of Technology
