by Doug

29/01/2013

Internet Marketing

Online testing – Common sense & simplicity win the day:

A rational approach to testing and conversion rate optimisation

Following on from my previous post, which postulated the advantages of a ‘complexity-averse’ approach to testing, this post covers two case studies in which we ran a series of tests for clients to answer specific business questions. Let’s be really clear here – our primary aim was not to move metrics but to learn about user intent and behaviour.

Case Study 1:

The first example is based on two insights – one objective, drawn from quantitative data in Google Analytics, and one subjective, drawn from experience, an appreciation of best practice and common sense!

Data from Google Analytics clearly highlighted a form in our client’s signup process as a solid candidate for improving the overall signup conversion rate. We love Google Analytics because opportunities like this are easy to spot, but working out why the opportunity exists is another matter.

Anathem by Neal Stephenson contains this poignant line:

They knew many things but had no idea why. And strangely this made them more, rather than less, certain that they were right.

Knowing why users were not completing what appeared to be a simple form was key to capitalising on the opportunity. Look at the original form below.

Original form:

Two things stand out:

  1. All fields apart from one are flagged as required fields
  2. The user has to fill in their email address twice with no explanation as to why

Now, these may appear to be trivial issues. Nit-picking? Maybe… But are users really irritated by them? Let’s test not asking for the email address twice. Let’s also flag the telephone field as optional and lose the ‘required field’ nonsense:

Test form:

Cleaner, isn’t it? Not much, I’ll grant, but running this quick test generated a wealth of data on how these subtle changes affected user behaviour in a surprisingly short period of time.
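
The post doesn’t say which testing tool ran the split, but if you’re curious how a test like this assigns visitors to variants, here’s a minimal sketch in plain Python of one common approach: deterministic, hash-based bucketing. The function and experiment names are illustrative only, not the tooling we actually used.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("original", "test")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor id together with the experiment name gives a
    stable, evenly spread assignment with no state to store: the same
    visitor always sees the same form.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Illustrative usage: this visitor gets the same form on every visit.
print(assign_variant("visitor-1234", "signup-form-email-once"))
```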

Here’s the big win: spending little time, and therefore little money, on seeing how behaviour was influenced by these small changes gave the client the confidence to apply the learning across the signup process. Only the business question was tested – not the whole signup process. Risk and expense were minimised and the eventual returns from the test were amplified.

Having applied the learning to the whole signup process, we observed a 58% increase in the overall signup conversion rate, with 99% confidence.
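
If you want to sanity-check a confidence figure like this yourself, a two-proportion z-test is one common way to do it. The sketch below uses invented visitor and conversion counts purely for illustration – the real test data isn’t published here – but the counts are chosen to show a 58% relative uplift.

```python
from math import erf, sqrt

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert |z| to a two-sided p-value via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Invented counts showing a 58% relative uplift (4.0% -> 6.3% conversion):
p = two_proportion_p_value(conv_a=200, n_a=5000, conv_b=316, n_b=5000)
print(f"p = {p:.6f}")  # p < 0.01 corresponds to > 99% confidence
```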

Small spend = big win! You don’t have to spend a small fortune and wait weeks or months for a complex, cross-site multivariate test (MVT) to yield returns. Think clearly and simply to maximise returns in the shortest possible time.

Case Study 2:

Running with the same simple, clear thinking enabled us to help PayPoint.net develop plans for their mobile strategy in 2013. The growth of mobile usage in the UK is well documented: http://econsultancy.com/uk/blog/61925-50-fascinating-stats-about-mobile-commerce-in-the-uk

How does an organisation position itself to capitalise on the change in browsing habits and build an effective online presence that works for mobile users? One option is to employ responsive web design techniques to cater for desktop, tablet and smartphone users in one hit.

It seems a popular approach. Indeed, our sister agency, WebExpectations, blogged about the growth of responsive design in 2012: http://www.webexpectations.com/blog/2012/04/04/pioneering-responsive-web-design-a-few-examples/

Our friends at Online Behavior recently posted a wise little cartoon based on the African proverb that no one tests the depth of a river with both feet: http://online-behavior.com/cartoons/testing

So is an ‘all guns blazing’ approach to mobile – rebuilding a site with responsive techniques – a prudent spend of capital? We suggest a data-driven approach based on simple, clear thinking is a wiser starting point. Try a smoke test using a minimum viable product.

We advised PayPoint to take this approach:

“Rather than build a complete website for mobile users, start with the homepage. Build a ‘one-page-smartphone-specific’ site and drive traffic to this page using AdWords to only target relevant devices.”

The next question to tackle was: “What should these pages look like? Should they offer all the content found on the desktop site, or should they be cut down?” Why not test multiple scenarios and see what works?

Two initial designs were tested. The first was based on a case study by Conversion Rate Experts: http://www.conversion-rate-experts.com/crazy-egg-case-study/

The second was a massively cut-down ‘one function’ site that simply invited the user to request a call back. Truly the minimum viable lead-gen site.

These pages were static HTML – very quick to design and build. The AdWords budget was tightly controlled but sufficient to generate useful data.
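
How much traffic counts as “sufficient”? The post doesn’t show the budget maths, but a standard two-proportion power calculation gives a rough answer. The baseline lead rate and target uplift below are assumptions for illustration only; multiply the resulting visitor count by your average cost-per-click to turn it into a budget estimate.

```python
from math import ceil, sqrt

def visitors_per_variant(p_base: float, rel_uplift: float,
                         z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Visitors needed per variant to detect a relative uplift.

    Standard two-proportion power calculation: the defaults give 95%
    confidence (z_alpha) and 80% power (z_beta).
    """
    p_test = p_base * (1 + rel_uplift)
    p_avg = (p_base + p_test) / 2
    n = ((z_alpha * sqrt(2 * p_avg * (1 - p_avg))
          + z_beta * sqrt(p_base * (1 - p_base) + p_test * (1 - p_test))) ** 2
         / (p_test - p_base) ** 2)
    return ceil(n)

# Assumed numbers: a 3% baseline lead rate, looking for a 50% uplift.
n = visitors_per_variant(p_base=0.03, rel_uplift=0.5)
print(f"~{n} visitors per variant")  # multiply by avg. CPC for a budget
```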

What happened? Whilst the long page delivered engagement (higher average time on site and more pages per visit), the short, focussed page delivered more, and better-qualified, leads. Perhaps the long-page format is not universally applicable and needs to be tested before being chosen.

That question again – ‘Why?’ Consider the context of the user experience. The user is on a smartphone, probably on the road. Do they want to scroll through lots of [great] content, or do they want to arrange a meeting and move on? The data suggests the latter. This is an interesting and, more importantly, actionable insight. The action was the next step in the test: considering the environment the user is in, the call-back form was optimised for mobile usage – fewer fields, placed inline on the page (see below).

The results: a further 160% increase in lead volume, on top of the already stellar (>500%) uplift that the mobile pages had delivered over the desktop site viewed on a smartphone.

Now, this is not the final implementation of the PayPoint.net mobile site. These tests were conducted to build a picture of users’ mobile habits on PayPoint.net and to understand intent and behaviour without major investment in building a complete site. The investment was modest, but the value to the business was significant.

Last Test Variation:

Conclusion

We conducted multiple tests in a matter of days that delivered significant value by addressing business questions. The investment in testing was modest because the tests were well thought out and executed with minimum complexity. The primary motivation for testing was to learn. Applying the lessons and insights has led to cost savings, time savings and major bottom-line increases.

We suggest this is the right approach to online testing and encourage others to adopt it to experience similar returns on investment. Let us know how you get on with your testing – drop us a line.
