Changing traffic allocation after starting an experiment
Hi, I started a personalization campaign about a week and a half ago and am not seeing as much data as I would like, mostly because I left the traffic allocations at their default levels (95% vs. 5%). I understand that changing those settings midway may skew the results, but I would at least like a reasonably insightful comparison between the variations. What is the recommended course of action? Should I erase the current results and restart the experiment? Thank you!
But... if you want to do it the mathematician's way...
Duplicate the existing experiment, alter the allocations, pause the current experiment, and start the new one.
Analytics and Testing Guru
Changing the traffic allocation midway through the campaign can indeed skew your results, as Stats Engine is not calibrated to account for a change to the holdback during a running campaign.
The recommended approach is to duplicate the existing campaign, change the traffic allocation, and start from scratch, as Jason suggested. You can read more on this topic here: https://help.optimizely.com/Target_Your_Visitors/Holdback%3A_Measure_overall_impact_in_Personalizati...
Ah yes - the stats engine has issues with changing allocations, but if you are measuring your results independently in a separate analytics platform (e.g., a Site Catalyst integration), the normalized results (goals met per visit/visitor in segment) look just fine.
Analytics and Testing Guru
Hey @mluo, this is a fantastic question for our Ask the Expert column with Tom Fuertes, CTO of our three-star solutions partner, CROMetrics.
If you'd like Tom's opinion (and I'd recommend it) feel free to post your question here.
Hi Mei –
- You are absolutely allowed to change traffic allocations midway through the experiment in order to get a more insightful comparison between variations.
- With the new allocation enabled (by going to your settings and adjusting the “holdback” %), Optimizely will automatically recalculate significance with the adjustment in mind. As more data comes into the variation and the holdback, Optimizely’s Stats Engine will re-evaluate the level of significance it can confidently report for that period of time, given the data observed so far.
- A few things to keep in mind if you’re doing this:
- By allowing a larger portion of traffic into your “control” group, you may see your results swing from where they currently stand. This is because the conversion rate currently calculated for the holdback may not be representative of the full population, and the more traffic you let into a bucket, the more confident we can be in the “true” conversion rate or engagement for that particular cohort. Since you only had 5% of the traffic in the baseline, the estimate of its conversion rate is more variable due to the smaller sample size; however, Stats Engine already accounts for this when deciding whether it can confidently call something a winner or loser. (This is just an FYI.)
- You can always change the “date range” of a personalization campaign. You can use this feature to show the conversion rate for the period before you allowed the increased traffic into the “holdback”, for the period after, or for both combined.
- Another thing to consider: if the page your experience runs on generally gets a lot of return visitors, know that we will always show previous visitors the experience they were originally bucketed into. The new bucketing only applies to new traffic that hadn’t previously seen the experiment.
- My recommendation is that changing allocation to deploy a winner is fine, but one final thing to keep in mind: watch for external events around the time of the allocation change. Say the conversion rate “pre-change” was 5%, but “post-change” was 10% (because of a promotion or a different type of visitor); then those higher-converting visitors will suddenly have a much bigger impact on the baseline as the control group goes from 5% of the traffic to 50%. Again, just something to keep in mind in case the conversion rate of your baseline changes dramatically.
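To make the last two points above concrete, here is a quick back-of-the-envelope sketch with entirely made-up numbers (these are not from your campaign): the standard error of a conversion-rate estimate shrinks as the holdback gets more traffic, and a pre/post difference in conversion rate can shift the blended baseline toward whichever period saw more traffic.

```python
import math

def std_error(p, n):
    """Standard error of a conversion-rate estimate from n visitors."""
    return math.sqrt(p * (1 - p) / n)

# Hypothetical numbers: 20,000 visitors in a period, true holdback rate ~5%.
total = 20_000
p = 0.05
se_small = std_error(p, int(total * 0.05))  # 5% holdback  -> 1,000 visitors
se_large = std_error(p, int(total * 0.50))  # 50% holdback -> 10,000 visitors
print(f"SE at  5% allocation: {se_small:.4f}")  # noisier estimate
print(f"SE at 50% allocation: {se_large:.4f}")  # tighter estimate

# Blended baseline when conversion differs pre/post the allocation change
# (e.g., a promotion lifts conversions after the change):
pre_visitors, pre_rate = 1_000, 0.05     # 5% of traffic before the change
post_visitors, post_rate = 10_000, 0.10  # 50% of traffic after the change
blended = (pre_visitors * pre_rate + post_visitors * post_rate) / (
    pre_visitors + post_visitors
)
print(f"Blended baseline rate: {blended:.4f}")  # dominated by post-change data
```

The blended baseline lands much closer to the post-change 10% than the pre-change 5%, which is exactly the “baseline appears to change dramatically” effect described above.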