A/B tests are a staple of data analysis and data science. For this project, I analyzed the results of an A/B test for an e-commerce website. I created this notebook to help the company decide whether to implement the new page, keep the old page, or run the experiment longer before making a decision.
Using a bootstrapping technique, I simulated 10,000 random samples of the difference between the two pages' conversion rates under the null hypothesis and calculated the proportion of simulated differences at least as large as the observed difference, yielding a p-value of 0.89. This p-value is well above the preset significance threshold of 5%, so we fail to reject the null hypothesis that the new page converts no better than the old page. A logistic regression supports this conclusion, estimating that users on the old page are about 1.015 times as likely to convert, on average, as users on the new page.
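The bootstrap procedure described above can be sketched as follows. This is a minimal illustration with synthetic data, not the notebook's actual dataset: the conversion rates, sample sizes, and random seed are all assumptions chosen only to make the example runnable.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical conversion data (1 = converted, 0 = not).
# Rates and sample sizes are illustrative assumptions, not the project's data.
old_page = rng.binomial(1, 0.120, size=5000)
new_page = rng.binomial(1, 0.119, size=5000)

obs_diff = new_page.mean() - old_page.mean()

# Under the null hypothesis both pages share one common conversion rate,
# so we pool the data and simulate 10,000 differences from that pooled rate.
pooled_rate = np.concatenate([old_page, new_page]).mean()
n_old, n_new = len(old_page), len(new_page)

null_diffs = (
    rng.binomial(n_new, pooled_rate, size=10_000) / n_new
    - rng.binomial(n_old, pooled_rate, size=10_000) / n_old
)

# p-value: share of simulated differences at least as large as the observed one.
p_value = (null_diffs >= obs_diff).mean()
print(f"observed diff: {obs_diff:.4f}, p-value: {p_value:.3f}")
```

A large p-value here means the observed difference sits comfortably inside the distribution of differences we would expect by chance alone, which is exactly the situation the notebook reports.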