Hello, my name is Jeevan Padiyar. This is my personal and professional blog. It's a place for me to think out loud and learn, talk about things that inspire me, and share my observations with the world. If you feel my musings are misguided or just plain wrong, please feel free to reach out and correct me. I would relish the opportunity for discourse. Thanks for visiting.


Saturday, Sep 18, 2010

The statistics of A/B testing pt 1

Hypothesis testing to find your way.

Statistical methods can help us with more than just examining trends in a given population. Used correctly, they can also help us determine the better of two available options. This is exceptionally important in A/B testing land.

For those of you not familiar with A/B testing, here is a brief primer. A/B testing is the process of modifying site elements to increase conversions. What constitutes a conversion is defined by the site owner and can be anything from purchasing a product to visiting a deeper page within the site. In a properly set up A/B test, either page A (with no change) or page B (with the change being tested) is shown at random to each site visitor, and the conversion rate is measured for each variant. At the end of the test the data is aggregated to determine which page, A or B, converted better, and the site is changed to reflect the winning modification.
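The random split described in the primer can be sketched in a few lines of Python. The visitor loop and the tally dictionaries here are illustrative assumptions, not an actual testing implementation:

```python
import random

def assign_variant():
    """Randomly show a visitor page A (control) or page B (the change)."""
    return random.choice(["A", "B"])

# Tally views and conversions per variant
views = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

for _ in range(1000):  # simulate 1000 visitors
    variant = assign_variant()
    views[variant] += 1
    # In a real test, you would increment conversions[variant] here
    # whenever the visitor completes the goal action (a purchase,
    # a deeper page view, etc.)
```

At the end of the test, dividing each variant's conversions by its views gives the per-page conversion rates to compare.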

Below is a great diagram of the A/B testing process from the Unbounce Blog.

Once the data is gathered the real fun of assessing the statistical significance of A vs. B begins.

You might be asking: can’t we just look at the conversion rate and move on? The answer is sometimes. Only when there is an overwhelming winner is the decision easy. In most tests the data is less definitive. Let’s take a look at an A/B test we recently conducted at bookswim.

In our experiment, we generated the following results.

Test Case    Views     Conversion Rate
Page A       31,500    1.01%
Page B       33,500    1.11%

Strictly based on the conversion numbers it looks like Page B converted about 10% better than Page A, but when we ran the numbers through a standard hypothesis test (also known as a Z test – more on that later) we found that the difference between the two pages was not statistically significant.
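As a preview of that Z test, here is a minimal sketch in Python. The conversion counts are back-computed from the rounded rates in the table (an assumption, since the raw counts aren't shown), so the z value is approximate:

```python
from math import sqrt

# Approximate conversion counts, reconstructed from the rounded rates:
# 31500 views * 1.01% ~= 318, 33500 views * 1.11% ~= 372
n_a, x_a = 31500, 318
n_b, x_b = 33500, 372

p_a, p_b = x_a / n_a, x_b / n_b

# Pooled conversion rate under the null hypothesis (A and B convert equally)
p_pool = (x_a + x_b) / (n_a + n_b)

# Standard error of the difference between the two proportions
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
print(round(z, 2))  # about 1.25, below the 1.96 cutoff for 95% confidence
```

Since |z| falls short of 1.96, we cannot reject the null hypothesis at the 95% level, which is why the apparent 10% lift doesn't hold up.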

So the moral of the story is: use hypothesis testing on all your A/B data; otherwise you may be setting yourself up for a surprise in your conversion optimization efforts.

In my next post I will describe how we came up with the results.