Matching Revenue Goal to Actual Sales - Miles Off
On our e-commerce website we have done some A/B tests on our product page.
However, I have been through the results data from Optimizely (Revenue), and while some of our actual transactions match the figures on the results page, others are not recorded at all, or the totals are incorrect or well off.
Obviously this means that for a lot of our transactions we cannot be sure whether the original or the variation should get the credit, which makes the Optimizely results data useless to us.
Any help would be much appreciated. Is this typical for Optimizely or does VWO handle transactions much better?
Are you linking your experiments to Google Analytics with the custom variables? If you are, this should allow you to match up the revenue in Analytics.
Head to Audience > Custom > Custom Variables and you should be able to dive into your tests and monitor their performance against your eCommerce tracking.
Let me know if this works for you.
1. Are the same visitors in your analytics also subject to your experiment? If you are excluding any visitors from the experiment, the experiment's total revenue will appear lower than in your own analytics.
2. Are you accounting for sales tax, discounts, shipping, etc.?
3. Are you comparing the same statistics? I think Optimizely counts revenue each time a visitor purchases, even repeat purchases, while your analytics report might look at only specific purchases.
It's generally best to look at these things directionally. If the experiment says revenue is trending upwards, it probably is. Another thought: also send along the order confirmation number; then you can compare the transactions one-for-one offline.
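The one-for-one offline comparison suggested above can be sketched in code. Assuming you can export Optimizely's recorded conversions and your own backend orders as lists keyed by a confirmation number (the record shapes and field names here are hypothetical, not from any Optimizely export format), a reconciliation might look like:

```typescript
// Hypothetical record shape; adjust field names to match your actual exports.
interface OrderRecord {
  orderId: string;      // order confirmation number sent with the revenue goal
  revenueCents: number; // revenue in cents
}

interface Discrepancy {
  orderId: string;
  kind: "missing_in_optimizely" | "extra_in_optimizely" | "amount_mismatch";
  backendCents?: number;
  optimizelyCents?: number;
}

// Compare the backend's orders against what Optimizely recorded,
// flagging missing, extra, and mismatched transactions.
function reconcile(backend: OrderRecord[], optimizely: OrderRecord[]): Discrepancy[] {
  const optById = new Map(optimizely.map(o => [o.orderId, o]));
  const backendIds = new Set(backend.map(o => o.orderId));
  const issues: Discrepancy[] = [];

  for (const order of backend) {
    const tracked = optById.get(order.orderId);
    if (!tracked) {
      issues.push({
        orderId: order.orderId,
        kind: "missing_in_optimizely",
        backendCents: order.revenueCents,
      });
    } else if (tracked.revenueCents !== order.revenueCents) {
      issues.push({
        orderId: order.orderId,
        kind: "amount_mismatch",
        backendCents: order.revenueCents,
        optimizelyCents: tracked.revenueCents,
      });
    }
  }

  for (const tracked of optimizely) {
    if (!backendIds.has(tracked.orderId)) {
      issues.push({
        orderId: tracked.orderId,
        kind: "extra_in_optimizely",
        optimizelyCents: tracked.revenueCents,
      });
    }
  }
  return issues;
}
```

A report like this makes the pattern of the discrepancy visible: all-missing orders suggest the goal never fires for some visitors, while extra or doubled entries point to the goal firing more than once.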
Hope this helps.
The previous two commenters definitely raise good points and I think their suggestions are worth exploring. However, I'd also like to mention that if you send a summary of the issue along with the experiment ID to email@example.com, one of our representatives will be able to look into the test for you and double-check that you've got everything set up right. If you can provide dummy details in order to make a purchase for troubleshooting purposes, that would also be fantastic. If there is an error in the test setup, we'll be able to advise and help you to correct this.
My data in MixPanel and Google Analytics confirms that Optimizely is reporting 2-3x more revenue than the actual figure.
Since I rely on Optimizely to calculate my p-values and all the rest of the statistics, this makes the results absolutely misleading.
The only culprit I can suspect is revenue being reported for multiple products across multiple experiments, with visitors being bucketed into the same revenue goal.
I'd like to reiterate what I told the previous commenter: if you're worried there's a flaw in the setup, please send a ticket over to firstname.lastname@example.org with the experiment details and the steps for reaching the page on which the revenue code should fire.
Your revenue goal will only fire for visitors bucketed into a certain experiment, so even though it is the same code, it would not be possible for a visitor in experiment A to trigger the revenue goal for experiment B. However, it would be possible for visitors to enter a certain experiment, go elsewhere on your site and perhaps go through a different conversion funnel but still trigger the revenue goal at the end.
Other issues I have seen in the past are that the revenue code is installed multiple times on the page, resulting in multiple conversions by the same visitor, or that visitors refresh the page on which the revenue code resides, causing it to fire several times. These are effects that can be mitigated, but it depends on exactly what is happening to cause the revenue spikes.
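The refresh and double-install effects mentioned above can be mitigated with a simple guard: record the order ID the first time the goal fires and skip any repeat. Here is a minimal sketch; the storage interface stands in for `window.localStorage` so it can run outside a browser, the event name `purchaseComplete` is a made-up example, and the `['trackEvent', ...]` call shape follows Optimizely's classic snippet API, which you should verify against your own installation:

```typescript
// Minimal key-value store so the guard is testable outside a browser;
// in production, pass window.localStorage here.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Fire the revenue goal at most once per order, even if the confirmation
// page is refreshed or the snippet is accidentally installed twice.
function trackRevenueOnce(
  orderId: string,
  revenueCents: number,
  store: KeyValueStore,
  push: (event: [string, string, { revenue: number }]) => void,
): boolean {
  const key = `revenue_tracked_${orderId}`;
  if (store.getItem(key) !== null) {
    return false; // already tracked for this order; don't double-count
  }
  store.setItem(key, String(Date.now()));
  // Classic Optimizely-style revenue call; "purchaseComplete" is a
  // hypothetical goal name for illustration.
  push(["trackEvent", "purchaseComplete", { revenue: revenueCents }]);
  return true;
}
```

Because the guard keys on the order ID rather than a session flag, a visitor who makes a second, distinct purchase is still counted, while a refresh of the same confirmation page is not.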
Just let me know if I can help out in any way.