Conflict from running multiple experiments on same URL
I have read the conflict from running multiple experiments at the same time article, but I had a similar (yet different) question.
If I were to create multiple experiments for the same URL, would Optimizely be wise enough to run them all simultaneously?
Let me provide an example using the dummy URL 'mikeswebsite.com':
- Run an A/B test on the headline
- Launch an exit popup
- A/B test the form (removing 1 or 2 fields to test a boost in conversion)
If I created 3 separate experiments to test all three of those things, what logic would Optimizely's snippet use to actually deliver the three experiments? Would they be able to deploy themselves without any conflict?
Also, I am aware that my scenario above is super convoluted and you likely wouldn't want to test all 3 elements at the same time ... but I'm still curious as to an answer.
Great question! Yes, by default, Optimizely will run all experiments at the same time, even when targeting the same page. In fact, we currently have 5 active experiments running right now on optimizely.com!
The only danger of this setup is when the same element on a page is being manipulated by more than one experiment. So if our 5 experiments all made changes to our headline, for example, we'd be in trouble because there's no telling which sentence the visitor will ultimately see.
As long as the elements that you are modifying in each experiment are distinct from one another, you won't have a problem.
If you wanted to do the opposite, and keep experiments mutually exclusive, you would need custom audience conditions to do so. Check out this article in our KB titled Mutually exclusive experiments for more details on how to do that.
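As a rough sketch of what those custom audience conditions accomplish (my own illustration in Python, with hypothetical helper names, not Optimizely's actual API): each visitor is deterministically hashed into one bucket, and each experiment only targets visitors in its own bucket, so no visitor can land in two mutually exclusive experiments.

```python
import hashlib

NUM_BUCKETS = 3  # one bucket per mutually exclusive experiment (illustrative)

def bucket_for(visitor_id: str) -> int:
    """Deterministically map a visitor to a bucket in 0..NUM_BUCKETS-1."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

def eligible(visitor_id: str, experiment_bucket: int) -> bool:
    """An experiment only activates for visitors in its assigned bucket."""
    return bucket_for(visitor_id) == experiment_bucket
```

Because the hash is deterministic, the same visitor always gets the same bucket, and exactly one experiment's audience condition matches them.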
Is that helpful? Please let me know if you have any follow up questions.
One last follow-up question: if you're running 5 experiments on the Optimizely homepage at the same time, wouldn't you run into analysis issues trying to sort out which experiment actually drove the greatest impact? I understand from the mutually exclusive experiments article that "By default, Optimizely will evaluate each experiment independently," but from an analyst's standpoint that has to put you in a bit of a pickle if there aren't large differences in the results.
You are absolutely correct. I looked into it and realized that these are mostly not creating visual variations for our visitors. We tend to use Optimizely in very creative ways, so we also run experiments to do things like test external snippets, monitor certain activity, run A/A tests or track system performance.
I'll jump into this conversation and point you to this simultaneous testing Support article (different from the previously linked community question). While that article uses the example of two different pages, the same basic concept applies if you're running enough traffic through one page: you'll get an equal proportion of visitors in each combination of experiments, which means you can largely analyze the results of those experiments independently.
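To see why equal proportions fall out of independent assignment, here's a quick simulation (my own illustration, not Optimizely code): with three experiments each splitting traffic 50/50 independently, visitors spread roughly evenly across all 2**3 = 8 combinations once traffic is high enough.

```python
import random
from collections import Counter

random.seed(42)
visitors = 80_000

# For each visitor, independently pick a variation (0 or 1) in each
# of the three experiments, then count how often each combo occurs.
counts = Counter(
    tuple(random.randint(0, 1) for _ in range(3))
    for _ in range(visitors)
)

expected = visitors / 8  # an even split would put 10,000 in each combo
deviations = {combo: abs(n - expected) / expected for combo, n in counts.items()}
```

Every combination lands within a few percent of the even split, which is what lets you read each experiment's results independently.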
That said, to use your original example with the dummy URL 'mikeswebsite.com', that would be best set up as a multivariate test (MVT). The reason is that you're testing three distinctly different parts of the page, and an MVT lets you break your results down, all in one results page, by every possible combination of experiences.
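For concreteness, the combinations an MVT would report for the three changes in the original example can be enumerated like this (the section and variation names below are mine, purely for illustration):

```python
from itertools import product

# Each tested section of the page has two possible experiences.
sections = {
    "headline":   ["original", "new headline"],
    "exit_popup": ["off", "on"],
    "form":       ["all fields", "fewer fields"],
}

# The MVT shows results for every combination: 2 x 2 x 2 = 8 experiences.
combinations = list(product(*sections.values()))
```

Each of those 8 combinations gets its own row on the results page, so you can see how the headline, popup, and form interact rather than measuring each in isolation.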