Details on latest Stats Engine Improvements? Seeing odd results...
Back in November 2016, we received a Product Release indicating that improvements had been made to the Stats Engine, but it gave little detail about what was actually improved.
Since then, our Strategists have noticed some odd behavior, particularly with the Statistical Significance calculation, that we had not encountered (or perhaps not noticed) previously.
For instance, recently launched tests start showing significance despite an extremely low number of conversions and very short run times. I'm fairly sure the 'old' stats engine was not so generous in calculating significance (in fact, one common gripe was how stingy the old engine was about calling significance; maybe the latest 'improvement' was meant to address that?)
Can someone shed some light on what has changed, so we can better explain to our clients why Optimizely is showing increased significance with such small numbers?
Screenshot attached of a test we launched recently, after 1.5 hours of being live.
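For context on why small numbers can swing the reported figure so much: Optimizely has publicly described the Stats Engine as computing a sequential, always-valid p-value (an mSPRT). The toy sketch below is NOT Optimizely's actual implementation; the normal-mixture form and the `tau` prior scale are assumptions taken from the published description, with illustrative values. It just shows how sensitive that kind of statistic is at low conversion counts: a single extra conversion can move "significance" by double digits.

```python
import math

def msprt_significance(conv_a, n_a, conv_b, n_b, tau=0.1):
    """Toy always-valid significance for a two-sample proportion test,
    sketched from the normal-mixture mSPRT idea. `tau` (the mixture
    prior scale) is an illustrative guess, not Optimizely's tuning."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    # Variance of the difference in proportions under the pooled rate.
    v = pooled * (1 - pooled) * (1 / n_a + 1 / n_b)
    if v == 0:
        return 0.0
    delta = p_b - p_a
    # Mixture likelihood ratio against the null of zero difference.
    lam = math.sqrt(v / (v + tau**2)) * math.exp(
        delta**2 * tau**2 / (2 * v * (v + tau**2)))
    p_value = min(1.0, 1.0 / lam)
    return 1.0 - p_value

# Tiny sample: 3 vs 9 conversions out of 60 visitors per arm.
sig_small = msprt_significance(3, 60, 9, 60)
# One extra conversion in the variation shifts the number substantially.
sig_plus_one = msprt_significance(3, 60, 10, 60)
print(round(sig_small, 3), round(sig_plus_one, 3))
```

With counts this small, each individual conversion carries a lot of weight, which is consistent with large early swings in the dashboard number even if the underlying math is sound.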
Director of Optimization Strategy at WiderFunnel
The best solution is the simplest.
To follow up on Nick's comments, we're now seeing some abnormal and abrupt shifts in confidence levels within the same experiment. I've attached a screenshot demonstrating this below. As you can see, Variation A in particular is regularly jumping between confidence levels of 0% and 85%. As Nick mentioned, we believe this was not happening in past experiments, so it would be great to know whether changes were made that may be causing this behavior.
Optimization Strategist at WiderFunnel