Company: alinc.com (client), Klinke Marketing (agency)
alinc.com uses an ecommerce suite, so much of the layout, design, and page functionality is preset and/or activated by default.
As users, we tend to assume that these suites are designed and tested (!) for maximum effect. Whether that is actually the case, however, is far from certain.
With one of its updates, the suite rolled out a Price-Drop Notification Tool that let visitors request a notification e-mail if the price of an item dropped.
At first, this functionality seemed like a great addition and good customer service. On second thought, however, we wondered whether suggesting a lower future price might actually deter visitors from purchasing.
• Test to (in)validate the usefulness of default functionality in out-of-the-box ecommerce software.
• Our team hypothesized that a Price-Drop Notification Tool shown directly on item detail pages suggested a lower future price and therefore deterred purchases.
• Results showed a 25.6% lift in conversions across all traffic when the tool was removed.
• Big surprise: when segmenting new vs. returning customers, removing the tool from the page INCREASED conversions by 74.9% for returning visitors and DECREASED conversions by 6.6% for new visitors.
We set up a clean A/B test across the entire inventory - all item detail pages - and split traffic 50/50. The original variation showed the Price-Drop Notification Tool and made the e-mail functionality available. The test variation removed the tool and its underlying functionality. See screenshots.
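Optimizely handled the traffic split for us, but conceptually a clean 50/50 split uses deterministic assignment: each visitor is hashed into a bucket so they see the same variation on every visit. A minimal sketch (the function and experiment names here are our own, purely illustrative):

```python
import hashlib

def assign_variation(visitor_id: str, experiment: str = "price-drop-tool") -> str:
    """Deterministic 50/50 bucketing: hash the visitor ID together with the
    experiment name so each visitor always sees the same variation."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    # Even hash value -> control (tool shown); odd -> test (tool removed)
    return "control" if int(digest, 16) % 2 == 0 else "no_tool"
```

Hashing on the experiment name as well keeps bucket assignments independent across concurrent experiments.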
Conversions, conversion rates, and revenue for each variation were measured through Optimizely and the integrated Google Analytics.
Results confirmed our hypothesis: item detail pages WITHOUT the tool saw a 25.6% lift in conversion rate (97% confidence). Before implementing the change, we checked segments and possible changes in AOV as controls.
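For readers who want to reproduce figures like these, the relative lift and the confidence level follow from a standard two-proportion z-test on visitor and conversion counts. A minimal sketch with hypothetical counts (the real counts are not published here):

```python
import math

def conversion_lift(visitors_a, conv_a, visitors_b, conv_b):
    """Two-proportion z-test: relative lift of variation B over control A.

    Returns (relative_lift, confidence), where confidence is the two-sided
    probability that the observed difference is not due to chance.
    """
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    lift = (p_b - p_a) / p_a
    # Pooled standard error under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    confidence = math.erf(abs(z) / math.sqrt(2))  # two-sided
    return lift, confidence

# Hypothetical counts chosen for illustration only
lift, conf = conversion_lift(10_000, 250, 10_000, 314)
print(f"lift: {lift:.1%}, confidence: {conf:.1%}")
```

Note that the confidence depends on sample size as well as lift, which is why a large observed lift on a small sample can still fail to reach significance.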
Control (1) - Segmentation:
(A) The experiment was served to direct, organic, and paid traffic, so we first looked at possible variations in the results by traffic SOURCE. There were none; the numbers matched the overall lift.
(B) Secondly, we segmented by VISITOR TYPE: new vs. returning.
For Returning Visitors, our test variation (no tool) produced a 74.9% LIFT in conversion rate.
For New Visitors, on the other hand, the test variation (no tool) DECREASED the conversion rate by 6.6%.

Control (2) - AOV:
Since this test targeted conversion rate, we wanted to be sure there was no negative effect on AOV: a lift in conversions could coincide with a drop in AOV and actually produce lower overall revenue.
Interestingly (and fortunately), the winning test variation for each segment (new vs. returning) also produced a higher AOV.
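Why can the aggregate show +25.6% while the segments behave so differently? The overall number is simply a traffic-weighted blend of the segment rates, so it depends on how traffic splits between new and returning visitors. A minimal sketch with a hypothetical traffic mix and baseline rates (only the two segment lift percentages come from the test; every other figure is assumed):

```python
# Hypothetical segment mix and conversion rates, for illustration only.
segments = {
    # name: (share_of_traffic, control_rate, variation_rate)
    "returning": (0.30, 0.020, 0.020 * 1.749),  # +74.9% lift (from the test)
    "new":       (0.70, 0.030, 0.030 * 0.934),  # -6.6% drop (from the test)
}

control_overall = sum(share * c for share, c, _ in segments.values())
variation_overall = sum(share * v for share, _, v in segments.values())
overall_lift = variation_overall / control_overall - 1
print(f"blended lift: {overall_lift:.1%}")
```

With this assumed mix the blended lift lands near +11.5%, not +25.6%; the point is that the aggregate alone can hide a segment that the change actively hurts.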
On a more emotional level, we felt validated in continually questioning every element on the site. Just because features are available and turned on by default does not mean we should accept them at face value. As always, keep on testing!