The primary goal of A/B testing tools is proper bucketing: distributing variations correctly so you get representative samples of your traffic. In most cases, though, the data you receive from optimization platforms isn't enough on its own.
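
To make "proper bucketing" concrete, here is a minimal sketch of the deterministic, hash-based assignment that testing tools commonly use in some form; the hash function and the 50/50 split below are illustrative assumptions, not any specific vendor's implementation.

```typescript
// Minimal sketch of deterministic, hash-based bucketing (illustrative only;
// real optimization platforms use their own hashing and allocation schemes).

// Simple 32-bit FNV-1a hash so the same visitor always lands in the same bucket.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0;
}

// Assign a visitor to "A" or "B" with a stable 50/50 split. The experiment id
// is salted in so one visitor can fall into different buckets across experiments.
function assignVariation(visitorId: string, experimentId: string): "A" | "B" {
  const bucket = fnv1a(`${experimentId}:${visitorId}`) % 100;
  return bucket < 50 ? "A" : "B";
}

console.log(assignVariation("visitor-42", "map-module-redesign")); // stable across calls
```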

You can select one or a few metrics to measure your performance against, and these can deliver real insight into which variation performs better. However, this narrow focus can limit your view of the big picture. For example, Variation A may outperform Variation B, but you may still be asking yourself: why?

Data Sources Explained

External data sources can help you answer that question. Integrations with platforms like Google Analytics, Adobe Analytics, or other analytics tools can give you a more in-depth look into the performance of your A/B tests.
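
In practice, such an integration often starts with forwarding the assigned variation to your analytics platform so results can be segmented there. Here is a minimal sketch using Google Analytics 4's gtag.js; the event and parameter names are our own illustrative choices, not an official schema.

```typescript
// Minimal sketch: report the assigned variation to Google Analytics 4 via
// gtag.js so analytics reports can be segmented by experiment variation.
// The event and parameter names here are illustrative, not a required schema.

declare function gtag(command: "event", eventName: string, params?: Record<string, string>): void;

function reportExposure(experimentId: string, variation: string): void {
  // Fires a custom event carrying the experiment context; register the
  // parameters as custom dimensions in GA4 to use them in reports.
  gtag("event", "experiment_exposure", {
    experiment_id: experimentId,
    variation_id: variation,
  });
}

reportExposure("map-module-redesign", "B");
```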

For instance, if you want to test a variation but don't know what impact it will have on page speed, an external data source can help. A/B testing tools can't measure things like loading time, record sessions, or produce heatmaps on their own. By integrating data sources into your optimization process, you simply gain more valuable data.
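
As a sketch of the kind of measurement a testing tool's dashboard won't give you, the browser's standard Navigation Timing API can capture page load time, labeled with the active variation; the metric name and reporting format below are illustrative assumptions.

```typescript
// Minimal sketch: capture page load time with the standard Navigation Timing
// API and tag it with the active variation, so load-time regressions caused
// by a variation show up in your analytics data.

function reportLoadTime(variation: string): void {
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (!nav) return; // API not available in this browser

  // Time from navigation start until the load event finished, in milliseconds.
  const loadTimeMs = nav.loadEventEnd - nav.startTime;

  // Ship it wherever your analytics lives; console.log stands in here.
  console.log({ metric: "page_load_time_ms", variation, value: Math.round(loadTimeMs) });
}

window.addEventListener("load", () => {
  // Wait a tick so loadEventEnd is populated.
  setTimeout(() => reportLoadTime("B"), 0);
});
```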

How Data Sources Deliver Results

To see how connecting data sources like Google Analytics or Adobe Analytics can deepen your organization's insights, let's walk through an example together. First, we'll start with some results from an A/B test. Then, we'll show how connected data sources give you a more detailed, behind-the-scenes look into those results.

In one experiment we ran with a large ecommerce client, we tested a redesigned map-based location selection module. Results in our optimization tool's dashboard showed a 7% drop in the main metric: conversion to a successful location selection. We now knew that visitors had trouble with the new module, but we didn't know exactly why.
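
As an aside, before acting on a drop like that, it's worth checking that the difference isn't noise. A two-proportion z-test is one common check; in this minimal sketch, the visitor and conversion counts are made up for illustration.

```typescript
// Minimal sketch: two-proportion z-test to check whether a conversion-rate
// difference between control and variation is statistically meaningful.
// The visitor and conversion counts below are made-up illustration values.

function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pA - pB) / se;
}

const z = twoProportionZ(2400, 20000, 2232, 20000); // ~7% relative drop in B
console.log(z > 1.96 ? "drop is significant at 95% confidence" : "could be noise");
```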

Through an integration with a session recording tool, we noticed that many of the sessions interacting with the variation contained rage clicks. The reason: visitors were trying to interact with the map directly, when we actually wanted them to start typing their location, which would then appear dynamically on the map.
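
Session recording tools surface rage clicks for you, but for intuition, the heuristic boils down to roughly this: several clicks landing in quick succession within a small radius. Here is a minimal sketch; the thresholds are illustrative assumptions, not any tool's actual tuning.

```typescript
// Minimal sketch of a rage-click heuristic: several clicks in quick
// succession within a small radius. Real session recording tools use their
// own tuned thresholds; the values here are illustrative assumptions.

const CLICK_WINDOW_MS = 700;  // max gap between consecutive clicks
const RADIUS_PX = 30;         // clicks must land close together
const RAGE_THRESHOLD = 3;     // clicks in a burst to count as "rage"

let burst: { x: number; y: number; t: number }[] = [];

document.addEventListener("click", (e) => {
  const click = { x: e.clientX, y: e.clientY, t: performance.now() };
  const last = burst[burst.length - 1];

  const isContinuation =
    last !== undefined &&
    click.t - last.t <= CLICK_WINDOW_MS &&
    Math.hypot(click.x - last.x, click.y - last.y) <= RADIUS_PX;

  burst = isContinuation ? [...burst, click] : [click];

  if (burst.length >= RAGE_THRESHOLD) {
    // In practice you'd report this to your analytics with the variation id.
    console.log("rage click detected at", click.x, click.y);
  }
});
```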

This led to a clear action point: promote the location search bar more prominently in the module and deprioritize the map. The next iteration of the test performed much better, with a 4% lift that enabled us to fully release the new module. Without the insight derived from the session recording tool, we would never have realized we were dealing with such a UX issue.

Summary

We hope this example shines a light on the importance of connecting data sources to improve A/B testing. A/B testing tools are an excellent way to test changes in your digital experiences, but they are just the beginning of gaining the insights your organization needs.

If you need help getting started with building a testing plan, check out our eBook: How to Build a Strong Optimization Practice. Building a testing plan can seem like a lengthy process, but this step-by-step guide breaks it down so you can better understand your data and create more successful digital experiences.

About the author

Robert Petrescu
