3 common pitfalls of simple A/B testing on your website


According to Google Trends, ‘A/B testing’ is a phrase steadily growing in popularity and is now 4x more popular than it was five years ago. There is no doubt that an evidence-based approach is good for business, but there are some common pitfalls that should be considered when conducting simple split tests.

  • When to stop testing – the statistical significance problem

Let’s start with stopping. How do you know when to stop your experiment and when you are ready to draw conclusions? How many observations are enough?

Simply put, statistical significance gives you the basis to say that an observed performance difference is not a result of randomness. With basic statistical tests you can assess whether variant A performs better than variant B. Your conclusions will reflect the data at a given point in time, with a given probability of error. This error probability compounds over the number of tests you perform, so you may end up drawing conclusions from a coincidence rather than a real pattern.
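As an illustration, comparing two conversion rates at a fixed sample size is typically done with a two-proportion z-test. A minimal sketch, using invented conversion counts (the original post gives no numbers):

```python
# Sketch of a two-proportion z-test for A/B conversion rates.
# All counts below are hypothetical, chosen only for illustration.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: the two rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF, built from erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# hypothetical experiment: 120/2400 conversions for A, 150/2400 for B
z, p = two_proportion_z_test(120, 2400, 150, 2400)
print(z, p)  # with these counts the difference is not significant at 5%
```

Note that the compounding-error problem above means you cannot simply rerun this test every day and stop at the first p < 0.05; repeated peeking inflates the real error rate well beyond the nominal one.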

This problem can be handled by applying sequential analysis, such as Wald’s sequential probability ratio test, but that reaches beyond the basic A/B testing approach and still doesn’t address a changing audience.
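For the curious, here is a minimal sketch of Wald’s sequential probability ratio test on a stream of Bernoulli conversion outcomes. The baseline rate p0, the alternative p1, and both error rates are assumptions picked for illustration, not values from the post:

```python
# Sketch of Wald's SPRT for a conversion stream (1 = converted, 0 = not).
# p0/p1 and the error rates alpha/beta are illustrative assumptions.
from math import log

def sprt(observations, p0=0.05, p1=0.07, alpha=0.05, beta=0.05):
    """Accumulate the log-likelihood ratio; stop when a boundary is crossed."""
    upper = log((1 - beta) / alpha)   # crossing it accepts H1 (rate is p1)
    lower = log(beta / (1 - alpha))   # crossing it accepts H0 (rate is p0)
    llr = 0.0
    for i, converted in enumerate(observations, start=1):
        if converted:
            llr += log(p1 / p0)
        else:
            llr += log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", i
        if llr <= lower:
            return "accept H0", i
    return "continue", len(observations)

# a stream with no conversions drifts toward H0 and stops early
print(sprt([0] * 400))
```

The appeal of the sequential version is exactly the stopping rule: the sample size is not fixed in advance, and the error guarantees hold despite continuous monitoring.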

  • Your visitors are changing – the conclusion generalization problem

After running an A/B experiment there is a natural urge to extend the conclusions to your whole audience and turn them into organizational knowledge. That can be a costly mistake.

Visitors learn. For example, global audience behavior changes as more and more online stores display popups to gather emails or offer coupons. This is commonly known as ‘concept drift’, and by definition the effect it may have on customer experience is hard to predict.

Visitors come from different situational contexts. Whether a visitor clicked on your ad or found your store on their own may influence their motivation and decision process. The very phrasing of an ad also shapes expectations about your website, and it may be silently selecting people with different psychological traits.

Visitors change over time. The perception of your website evolves as you grow your brand and gather customer success stories. The same person who visited your website three months ago may react differently to your messages today.

These are the root causes of ‘selection bias’, which forbids you from drawing conclusions from a sample that was not carefully set up and designed to reflect your audience. Even when stratifying your sample, you might end up repeating your tests and drawing conclusions limited to a particular campaign.
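To make the risk concrete, here is a toy illustration (all numbers invented) of how an uneven traffic mix between variants can make the aggregated comparison contradict every per-segment comparison, a version of Simpson’s paradox:

```python
# Toy numbers (invented): variant B wins inside every traffic segment,
# yet loses on the aggregated comparison, because the two variants
# happened to receive a different mix of ad-click vs. organic traffic.
segments = {
    # segment: (visits_A, conversions_A, visits_B, conversions_B)
    "ad_click": (200, 6, 800, 28),    # A: 3.0%  B: 3.5%
    "organic":  (800, 64, 200, 18),   # A: 8.0%  B: 9.0%
}

for name, (va, ca, vb, cb) in segments.items():
    print(name, "A:", ca / va, "B:", cb / vb)  # B is better in each segment

visits_a = sum(s[0] for s in segments.values())
conv_a = sum(s[1] for s in segments.values())
visits_b = sum(s[2] for s in segments.values())
conv_b = sum(s[3] for s in segments.values())
print("overall A:", conv_a / visits_a, "B:", conv_b / visits_b)  # A looks better
```

The aggregated numbers say nothing wrong about the sample you collected; they simply don’t generalize, because the sample’s composition, not the variants, drove the overall difference.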

  • You may choose the wrong goal – a design fault you cannot mitigate

Optimizing the click-through rate (CTR) of your call-to-action button is not the same as optimizing your conversion rate, because those metrics might be correlated but don’t necessarily have to be.

You might have placed a funny, clickable popup that encourages visitors to click but doesn’t address their needs or their current step in the decision process. This approach would end up promoting popups that increase engagement with your website rather than conversions.

The golden rule for solving this problem is to measure and make decisions based on the same metric you’d like to optimize. If your main goal is to increase the number of orders or the Average Order Value, you have to model the influence of your variants on those particular metrics, although handling multiple optimization objectives at once is again beyond the basic framework.
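A toy example (the per-variant counts are invented) of why the decision metric must match the business goal: computed from the same logs, the variant that wins on CTR can lose on revenue per visitor.

```python
# Invented per-variant counts: B's popup gets more clicks, but A
# produces more orders and more revenue from the same traffic.
variants = {
    # variant: (visitors, clicks, orders, revenue)
    "A": (1000, 80, 30, 2400.0),
    "B": (1000, 140, 22, 1540.0),
}

for name, (visitors, clicks, orders, revenue) in variants.items():
    ctr = clicks / visitors        # what a click-focused test optimizes
    conversion = orders / visitors # what the business usually cares about
    aov = revenue / orders         # Average Order Value
    rpv = revenue / visitors       # revenue per visitor: the goal metric here
    print(name, ctr, conversion, aov, rpv)
```

With these numbers a test scored on CTR would ship variant B, while scoring on revenue per visitor (or orders) picks A, so the choice of metric decides the experiment before any statistics are run.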

Simple A/B testing should be used carefully, as a ‘better than nothing’ tool rather than a silver bullet. Available marketing tools vary in the know-how behind them, but there are genuinely advanced tools on the market that address these problems and bring a proper empirical framework to the business world.

Przemyslaw S. Gliniecki is a Co-Founder of Cart Defender (https://cartdefender.com), which brings machine learning to eCommerce by providing behaviorally targeted popups with automated variant recommendation for online stores. It is currently available on the Magento Connect marketplace, for WooCommerce via a plugin, and for other and custom platforms via an API.

E-mail: psg@cartdefender.com

Twitter: @CartDefender

 
