Vastly different results in Kissmetrics versus Optimizely: don't know who to trust?
So I've been running a split test for my company for the past few weeks, and I've been confused by the disparity between the analytics in Kissmetrics and in Optimizely.
I've included images of the charts here: http://imgur.com/a/MfTkv
The numbers between Kissmetrics and Optimizely have always differed some, but the conversion rates and such would always average out to be roughly equivalent. In this case however, I'm seeing completely different results. Optimizely is showing Variation #1 as the winner, while Kissmetrics is showing the Original as the winner.
It almost seems like Kissmetrics has flipped the labels on the splits?
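If the labels really were flipped somewhere in the integration, it would produce exactly this symptom: each tool reports the opposite winner on the same underlying traffic. A minimal sketch of that effect (purely hypothetical simulated data, not real Kissmetrics or Optimizely output):

```python
def conversion_rates(events, label_map=None):
    """Compute conversion rate per variation from (variation, converted) events.

    label_map optionally remaps variation labels, simulating an
    integration that attributes events to the wrong split.
    """
    counts = {}
    for variation, converted in events:
        if label_map:
            variation = label_map[variation]
        seen, conv = counts.get(variation, (0, 0))
        counts[variation] = (seen + 1, conv + int(converted))
    return {v: conv / seen for v, (seen, conv) in counts.items()}

# Simulated traffic where Variation #1 genuinely converts better.
events = (
    [("Original", True)] * 10 + [("Original", False)] * 90
    + [("Variation #1", True)] * 15 + [("Variation #1", False)] * 85
)

correct = conversion_rates(events)
flipped = conversion_rates(events, {"Original": "Variation #1",
                                    "Variation #1": "Original"})

print(correct)  # Variation #1 wins (0.15 vs 0.10)
print(flipped)  # Original "wins" (0.15 vs 0.10) -- same data, swapped labels
```

With labels swapped, the two reports disagree on the winner while the overall numbers still look plausible, which matches what the charts above show.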
I'm still pretty new to using these tools and I was just wondering if anyone has seen anything like this or maybe has some ideas on what could be causing it?
Thanks for taking a look!
I would advise getting one of the Optimizely techies to take a look at the integration and your website's conversion tracking to locate where the issue might lie.
I don't have enough experience with Kissmetrics to suggest any reasons why it would cause an issue.
This seems like a rare case, isolated to this particular experiment. I would like to take a closer look at the experiment, and I will need some more information. I am going to create a ticket on your behalf, and someone in support should reach out to you soon.
Solutions Architect | Optimizely
I have gotten in touch with Kissmetrics support, and they offered some suggestions as to what could cause discrepancies between different analytics platforms, but nothing really explains why conversions are being attributed differently to the splits.