by Doug

29/06/2015

Google Analytics

GTM Design and planning

An edited version of the first chapter from http://www.conversionworks.co.uk/gtm-best-practices/

A poorly considered system design can bake anti-patterns into your solution very easily. Parents will attest to the value in making sure children don’t pick up bad habits from an early age. Similarly, ensuring your solution doesn’t have flaws designed into it from the start is crucial to success.

Treating digital analytics measurement as an architectural layer in the system is a pattern to employ in the design process.

Good technical architects will give careful consideration to the data, business logic and presentation layers in the system design process. Equally, applying thought to the measurement layer leads to a better and more robust architecture.

Thinking about the measurement of a site, a test or even just a new feature as part of the design and build process encourages you to think early about what data to capture so as to prepare the ideal technical solution for the instrumentation.

Design and Planning

Talking about system design at the start of this book is no accident. Whether your current system design process is iterative or linear, at some point you will address the logical system design and the physical system design. You need to include measurement considerations in this process.

The logical system design examines the data flow through the user interface to the business logic layer, the database layer and back to the user.

Think and plan what user behaviour you want and need to record. How you will do this and why you are capturing this data is dealt with later. Focus on the what initially – you can tune and refine this later if need be. Include monitoring and diagnostics data capture at this point. As you plan how to capture user behaviour data, include failure modes and how they will be tracked in addition to the measurement of correct functionality. We’ll discuss specific use cases for failure mode tracking later.
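As an illustration only – the event name and fields below are hypothetical and assume a GTM container is on the page – a failure mode such as a rejected form submission can be recorded alongside the success path:

  // A minimal sketch: record a failure mode alongside the success path.
  // The event and field names ('formError', formId, errorMessage) are illustrative.
  window.dataLayer = window.dataLayer || [];

  function trackFormFailure(formId, errorMessage) {
    window.dataLayer.push({
      event: 'formError',          // picked up by a GTM custom event trigger
      formId: formId,              // which form failed
      errorMessage: errorMessage   // why it failed
    });
  }

  // e.g. when client-side validation rejects a submission:
  // trackFormFailure('quote-request', 'Email address is invalid');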

This phase of the process is focussed on scrutinising the flow of data from the core system to other connected systems. The measurement layer should receive the same degree of attention at the design phase as a CRM or billing system does. Notice there is no fundamental change to the system design process; a failure to expand your situational awareness to include all inter-system dependencies is where this process often falls down.

The physical system design will focus your attention on the format of data flow into and out of the system. Take a step back from the detail and treat the whole system as a black box for now. Think purely about the nature of the input and output as this will help concentrate your attention on the user interface.

How users engage with the front end is the crux of the measurement challenge. Think about how you extract behavioural data points and what you will use them for. Having considered what data to capture in the logical system design phase, now you need to think about how to capture this data.

The logical system design (what data you want to capture) is a key input to the physical system design process but you will not accomplish this alone.

All individuals responsible for the site need to talk, interact, share, collaborate, support, innovate and deliver together to complete the system design.

Include the right people in the design process

Who do you need to include in the system design process to ensure all relevant stakeholders contribute to a cohesive solution?

Start by considering these questions:

  • Who has ultimate responsibility for the website?
  • Who needs tags (data) and why?
  • Who builds the tags?
  • Who publishes the tags?
  • Who designs the website?
  • Who builds the website?
  • Who tests the implementation?
  • Who maintains the site?
  • Who is responsible for the strategic alignment of branding, functionality and measurement with respect to the organisational goals and KPIs?

The groups of individuals identified by these questions can be categorised by their areas of expertise, the design aspects that they can influence and their key responsibilities.

Get these people in a room before any decisions are made:

Area of expertise/influence  (Stakeholder)

  • UX, design & navigation  (Designers)
  • Aesthetics/branding  (Designers, Marketing)
  • Security & Privacy  (IT/Ops)
  • Organic search  (Marketing & SEO)
  • Measurement  (BI Analysts)
  • Multi device  (Developers & Ops/IT)
  • Accessibility  (Designers & testers)
  • Cross browser  (Developers, Designers & Testers)
  • Speed & page weight  (Developers & Testers)
  • Maintainability  (Developers & Ops/IT)
  • Strategic and tactical alignment  (CxO)

All stakeholders in the design process have a responsibility to be aware of, acknowledge and respect the other aspects of the system design. To ensure the ownership of the system design is shared, it’s fundamental to include all stakeholders in the design process.

The format of the output from the design process is unlikely to be radically different to what you currently have. However, the content may be very different. The quality of the design will be improved because you will have a shared vision of the solution. You will have a consensus that will satisfy all requirements. You may initially take longer to produce the best system design by involving the full suite of stakeholders but over time as each stakeholder learns what requirements are important to other stakeholders, the process becomes easier, quicker and better.

We’re now going to swallow the metaphorical red pill and understand how and why we include measurement in the system design. The scenario described below is a fictional tale of woe caused by a failure to involve the right people in the process.

A Bad Day…

The office HiPPO decides the homepage needs a new carousel, has a chat with the design team and gets the project rolling. Everything seems fine.

The design team present what they think is the best design, based on the brief, to the developers, who build the carousel using Flash and JavaScript.

The bounce rate on the homepage goes through the roof and conversions hit the floor. The HiPPO wants to know why!

The analysts sigh as no data is captured so they can’t explain the issues using conventional analytics techniques. The SEO guy feels that his day has been ruined because the new carousel doesn’t have any h2 elements…

Nobody can answer the question. More expensive development, testing, analysis and reporting is required. The cost of the new carousel has tripled and the ROI has evaporated.

A lack of thought and communication has resulted in political warfare in the office and a homepage feature that is not measured or built properly. The business has suffered needlessly. This situation is common and totally avoidable.

Anti-pattern – treating analytics as a last-minute annoyance

The common problem:

There is a pressing deadline. There is a shortfall in the project budget. Resource is scarce and the site has to go live on time. The snag list seems to grow by the hour and decisions need to be made. Maybe cutting corners is an option now? This is a common reality and this is not an easy prioritisation exercise.

It’s all too common to take an easy option such as ditching Google Analytics for the first roll-out. All that JavaScript is extra weight and complexity that can be retrofitted later, right?

This ‘meh’ is a feeling of ennui towards measurement: it is perceived to add little to the solution other than weight and complexity, which is unwelcome on a tight schedule. What’s the point in measuring a site if the site is not live?

This is true, but only up to a point. You can safely retrofit analytics, but only if you’ve thought through the site measurement at the physical design stage. You need a solution that has been designed to be measured, even if the measurement is not yet implemented.

Good quality markup 

At the heart of the user interface is the markup – the HTML, CSS and JavaScript that makes up the pages. The quality of the markup will dictate whether you can legitimately delay implementing instrumentation. The markup will also have a huge bearing on the expense and risk of moving digital analytics vendor if you choose to switch in the future. Get your markup built well and you’ll have a measurable site that will work well with most digital analytics systems.

You will be aware of the range of topics to consider in the design process:

  • User Experience – functionality & navigation
  • Aesthetics/design – branding
  • Security & Privacy
  • Organic search
  • Measurement
  • Multi device
  • Accessibility
  • Cross browser
  • Speed & page weight
  • Maintainability

The inextricable linkage between these subjects is not that they are all relevant to the site front-end (of course they are); it’s the domino effect each has on the others.

For this discussion we’ll assume that the design artwork has been agreed and signed off, and that the functionality is specified clearly and set in stone. The security of the site has to be assured – this is not a negotiable point. We’ll also assume that the multi-device, cross-browser and accessibility aspects are a given – they have to happen and are part of the design and functional specification. This leaves the following:

  • Measurement
  • Organic Search
  • Speed & weight
  • Maintainability

These site properties cannot dictate or influence the aesthetics or behaviour of the site in any way – the users’ experience comes first. This quartet of facets makes for ideal BFFs as they are potentially massively beneficial to each other and all totally dependent on the markup quality. The utopian goal is getting the markup designed to cater for all of these topics. In doing so, all aspects will be mutually complementary.

Well-designed markup will work well in terms of organic search performance. It will be lightweight and easy to maintain, delivering fast-loading pages that are semantically rich in content and metadata, and therefore supportive rather than obstructive to measurement.
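As an illustration only – the class names and data-* attributes below are hypothetical, not a prescribed standard – markup along these lines gives the measurement layer stable, semantic hooks to attach to without any retrofitting:

  <!-- Illustrative sketch: semantic, measurement-friendly markup.
       The data-* attribute names are hypothetical. -->
  <nav class="main-nav">
    <ul>
      <li><a href="/widgets/" data-track="nav" data-track-label="widgets">Widgets</a></li>
      <li><a href="/pricing/" data-track="nav" data-track-label="pricing">Pricing</a></li>
    </ul>
  </nav>

  <h2>Request a quote</h2>
  <form id="quote-request" action="/quote/thank-you/" method="post" data-track="form">
    <!-- meaningful URLs, headings and attributes serve search engines, users and tags alike -->
  </form>

Attributes like these can be read by GTM click and form triggers, so the measurement requirements can evolve without touching the page again.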

Anti-pattern – failing to think beyond the pageviews

You’ve fully subscribed to the idea of building measurement into the system design process. Great. Now, make sure the measurement coverage is as full and complete as possible. During the logical system design, when you think about what to measure, make sure you consider what user interactions are important to your site.

The scope of website measurements extends beyond basic pageviews – it extends as far as your imagination and the scope of your site and your business will allow. Examine the broad picture of the site, the navigation, all the domains used on the web property (not forgetting mobile domains, secure domains and micro-site domains), on page functionality, off page functionality (consider offline data capture too) and most importantly what the desired outcomes are on the site and what they mean to your business.
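For instance, a newsletter signup is an outcome worth recording in its own right, not just the pageview that follows it. A minimal sketch, assuming GTM is present and using a hypothetical element id, event name and field:

  // Illustrative only – the element id, event name and field below are hypothetical.
  window.dataLayer = window.dataLayer || [];

  var signupForm = document.querySelector('#newsletter-signup');
  if (signupForm) {
    signupForm.addEventListener('submit', function () {
      window.dataLayer.push({
        event: 'newsletterSignup',          // fires a GTM custom event trigger
        signupLocation: 'homepage-footer'   // where on the site the signup happened
      });
    });
  }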

The strategic and tactical goals of the site (what you want your users to do on the site and why) will form the key performance indicators on which you’ll base your definition of success. The process of defining key performance indicators from strategic and tactical goals has been well covered in other textbooks so I’m not going to labour the point here.

Instrumenting the site and the business is not a journey of a single step. Accept the reality that the site will evolve. You’ll embark on testing programmes which inevitably result in change. You will not be able to anticipate all the potential measurements you’ll ever need at this point but designing and building with measurement in mind at all times will serve you well and deliver strong returns.

The point is to avoid the trap of not thinking beyond the pageview metric. Setting up the most basic boilerplate instrumentation is not a robust, long term solution. Failure to plan your digital analytics evolution with sufficient scope and clarity will lead you into the next anti-pattern.

Anti-pattern – focusing on data volume rather than data quality

Beware the temptation to measure everything and anything. Just because you can measure something does not mean it needs to be measured. Think about why the CxO is included in the system design phase. They’re present to help make sure the right things are measured.

Data quality is king – not data volume. Where standard metrics are available in Google Analytics, use them. If every click seems to be important, ask what business question measuring them all would answer. Will knowing about every click as an event provide insight into customer behaviour? Is GA even the right tool to capture this data? Widespread click measurement is often best served by a dedicated tool such as Mouseflow, HotJar or ClickTale.

Effective data capture refers not only to the elegance of the technical solution but also to the elegance and fitness of the data that is captured. Doing more with less is not quite the required mindset, though – do more with better.

Don’t drown in data. Data is the nourishment for your appetite for insight. Analysis needs to be revealing and valuable rather than onerous and costly. Let the data help you make decisions but don’t let it take over.

Designing your data exhaust to be simple and beautiful rather than massive and clumsy will deliver tangible returns.

Anti-pattern – fixing the site with the measurement layer

Baking in the anti-patterns described so far will have an unavoidable knock-on effect. You know you’re going to have to retrofit the measurements and you’ll find that the markup doesn’t easily and robustly support certain tracking techniques. You may find that multiple forms are submitting to the same URL, you’ve encountered a requirement for a new metric that is impossible to track currently or that the pageview data is rubbish as it’s all /index.aspx?documentid=abc123.

What now?

‘What now’ is pain and tears and cost. Of that there is no doubt. But you can rejoice as a mirage appears on the horizon. It manifests as an easy route to salvation. This tempting apparition makes you think you can ‘fix-up’ the site without having to initiate a major refactoring exercise.

This corrupting influence is the prospect of flexing your scripting, GA and GTM muscles rather than addressing the underlying site issues.

Avoid using these hacks:

Forms with 1 URL

Here’s an easy fix – when the form is posted, just drop a value into the dataLayer to flag which form was submitted, then fire an appropriate virtual pageview for use in a goal. N.B. – this is quite a different beast to the AJAX form that will be discussed later.
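For illustration only, the hack typically looks something like the sketch below – the event name, form name and virtual pageview path are hypothetical – with a GTM custom event trigger firing a virtual pageview tag:

  // Sketch of the hack described above – shown to illustrate it, not to recommend it.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'formSubmitted',     // hypothetical custom event trigger name in GTM
    formName: 'quote-request'   // mapped in GTM to a virtual pageview such as /vpv/quote-request-submitted
  });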

Forms with 1 URL will not scale because:

Every time you add a form you need to do extra work. The fundamental issue is that, under the hood, a controller is returning content to the presentation layer without updating the URL in the response. This is lazy and highly wasteful, and it will not scale.

index.aspx?documentid=abc123

A great candidate for a lookup table. Use the ID from the querystring parameter to set the virtual pageview – simplicity itself!
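The sort of GTM custom JavaScript variable this hack relies on might look like the sketch below – the lookup table itself is hypothetical and has to be maintained by hand, which is exactly the problem:

  // Sketch only: a GTM custom JavaScript variable mapping a documentid
  // querystring parameter to a virtual pageview path.
  function() {
    var match = document.location.search.match(/[?&]documentid=([^&]+)/);
    var id = match ? match[1] : '';
    var lookup = {                          // hypothetical, hand-maintained table
      'abc123': '/products/blue-widget/',
      'abc124': '/products/red-widget/'
    };
    return lookup[id] || '/unknown-document/';
  }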

index.aspx?documentid=abc123 is a failure because:

Fixing up the measurement just for yourself is selfish when you’re leaving analysts, search engine bots and users to deal with the awful URLs your system spits out.

Fix the URLs to be meaningful to machines and humans and you’ll offer a better user experience. Maybe you’ll see better organic search performance too. Analysts will be able to use the In-Page Analytics reports and link from reports to the pages.

I certainly wouldn’t want to be the one to manage your virtual pageview lookup table… another difficult-to-scale solution that is also not aligned with organic search performance goals.

Multiple outcomes reported on one URL

Users can choose to sign up for a newsletter, request a quote, buy a widget or submit a ‘contact-us’ form, all of which finish on thankyou.php.

You only need to pop a dataLayer variable on the thankyou.php page to flag which outcome is which and you’ve nailed the tracking solution.
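The hack in question is just a flag rendered on the confirmation page, something like the sketch below (the variable name and value are hypothetical):

  // Sketch of the hack on thankyou.php – shown to illustrate it, not to recommend it.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    outcome: 'newsletter-signup'   // set server-side to one of the four possible outcomes
  });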

Multiple outcomes reported on one URL are harmful to measurement because:

You’re not thinking beyond the pageview! Consider the tracking of other actions on the thankyou page – if you have 4 outcomes all finishing on the same URL, you have 4 logically different pages sharing one URL. You have thankyou.php but you should have:

  • thankyou-newsletter-signup.php
  • thankyou-contact-us.php
  • thankyou-sale.php
  • thankyou-signup.php

Think about measuring clicks on these four logically different outcome pages. If you use the same page, you use the same URL, the same markup – everything. This means you can’t distinguish any other clicks or user interactions on these pages without reference to the page referrer – which may not be reliable and is another unnecessary chunk of complexity.

Make the GA goal definition easier and better. Implement the outcomes on different pages. You’ll have more pages on the site to maintain but the tracking is easier and the pages can then be optimised in isolation.

Occasionally, fixing up a site using the measurement layer and GTM is viable if only to demonstrate that the required fix works.

It is not a viable long term strategy.

Don’t use the measurement layer to make permanent hack-fixes to the site. This is poor software engineering and results in wide ranging long term issues – time, effort, cost and risk are all affected. Having said that, do use temporary proof-of-concept fixes to demonstrate what the long term solution will do. Just don’t leave them there. You’ll be tempted but be strong!

Design pattern

We’ve discussed the potential ramifications of adopting the anti-patterns described in this chapter, the root cause being a preference for short-term gains whilst failing to think ahead. The end results are painful:

  • Technical fragility
  • Cost
  • Risk
  • Effort
  • Performance issues
  • Incomplete ROI

In isolation, the detrimental effects of these results can be assuaged with some effort. In combination with the incomplete ROI data, the compound impact of these effects is tangible, harmful and takes considerable effort to alleviate.

With respect to these anti-patterns, the right approach to designing systems intelligently with future returns in mind leads us to some simple heuristics:

  • Give careful consideration to measurement at the system design stage. Don’t leave it to the last minute.
  • Include measurement design in the system design. Identify the data you need to capture and build the system so those data points can be measured.
  • Design and build the system to be measured from the start and for the future. Cater for current and future measurement in the design.
  • Data exhaust considerations are relevant to all systems and users. Think how feature usage and failure modes will be measured as the functionality is designed and built. Keep the data exhaust relevant – don’t drown in data – focus on data quality rather than quantity.
  • Include all relevant stakeholders in the design process. Ensure decision makers are involved in key design choices.

From the earlier story, let’s see how applying these heuristics can turn a Bad Day into a Great Day:

Our HiPPO talks to a friendly competitor who sings the praises of the product carousel on their homepage and decides to commission the analysts to conduct a little research.

The analysts are tasked to conduct some voice of customer research. Additionally, they look into third-party studies and dig into data on engagement with the current homepage.

The insights are shared with all stakeholders during the design process, and it is collectively decided that the type of carousel suggested is unlikely to work well on their type of site, as the data suggests the performance would be poor.

Using the data, the business collectively decides not to take the risk of a poorly performing homepage. So the chosen feature design uses a conventional menu system with top-level category pages instead of a carousel.

The designers happily deliver the visuals.

The developers know that the menu system can be done with HTML and CSS and doesn’t need Flash or JavaScript. They are aware of the expected user behaviour and build the markup so that the key measurements are supported out of the box. The markup also supports extending the instrumentation in the future if required…

The SEO guy has his h2 elements in the right place on the page.

The key user interactions with the homepage are correctly measured and each stakeholder understands the reasoning behind the decision.

The chosen design gets tested. There is no observed impact on bounce rate, the data quantifies the ROI and voice of customer feedback confirms that users are delighted by the new feature.

The design is permanently implemented and conversion rates soar. The end result was achieved with the least investment and the maximum return.


The moral of this tale is that the value of investing time and effort in the design process is considerable. The time and effort spent at the design stage is not a cost – it’s an investment, and an extremely valuable one.
