If I am correct, users are unbucketed from a variation of an experiment when the experiment is 'paused'.
The problem is that when you restart the experiment, they can get bucketed into a different variant than the one they were originally in, making the experiment invalid.
So, my proposal is to keep users bucketed in the same variant even when an experiment is paused and restarted, and only un-bucket users once the experiment gets archived.
(see info in ticket Request #99731).
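To make the proposed behavior concrete, here is a minimal sketch (my own illustration, not Optimizely's actual implementation) of a bucketing rule where an assignment survives pause/restart and is only cleared on archive:

```python
# Illustrative sketch of the proposed bucketing rule: a user's variation
# assignment is sticky across pause/restart and only cleared on archive.
# Class and method names are invented for this example.

class Experiment:
    def __init__(self):
        self.status = "running"           # running | paused | archived
        self.assignments = {}             # user_id -> variation

    def bucket(self, user_id, choose_variation):
        if self.status == "archived":
            return None                   # archived: users are un-bucketed
        if user_id not in self.assignments:
            self.assignments[user_id] = choose_variation(user_id)
        return self.assignments[user_id]  # sticky across pause/restart

    def pause(self):
        self.status = "paused"            # assignments intentionally kept

    def restart(self):
        self.status = "running"           # same users, same variations

    def archive(self):
        self.status = "archived"
        self.assignments.clear()          # only now are users un-bucketed
```

With this rule, restarting a paused experiment returns every returning visitor to their original variant, so the results stay valid.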
This idea was a duplicate, so we have merged the two ideas.
At the moment it is not possible to delete projects.
Once set up, they cannot be removed from our dashboard.
Please give us a simple way to remove obsolete, unwanted, or unused projects
from our account.
I'm at the point with some of my clients where I have 50+ goals set up for them. It gets somewhat tedious having to scroll through that big list every time I want to add a saved goal to a new experiment. It would be really nice if the "Saved Goals" selection dialog had a text box that filtered the list of goals by name as you type. Also, maybe checkboxes/tabs for each of the goal types (pageview, click, event).
It might be cool if you guys had a place for users to tell you about bugs or inaccuracies in Optiverse and the developer portal. That way we can help you keep it up to date :-)
For example: when you use browser search (Ctrl+F) on the developer site (once you are in one of the sections), the header gets messed up. I'd attach a screenshot, but it doesn't look like I can. Essentially, big text appears overlapping the logo area. This happens for me in Chrome and IE, but not Safari. Not a big deal, but something I noticed and thought I would pass on.
This idea has been merged with another post that had the same idea.
Maybe this has been said before.
An option to disable the proxy by default for all of a project's experiments, instead of having to keep appending "&optimizely_disable_proxy=true" to the URL of the experiment editing page.
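For anyone scripting around the current workaround, the manual step above amounts to appending that one query parameter to each editor URL. A small sketch (the example URL is made up; only the `optimizely_disable_proxy=true` parameter comes from the request above):

```python
# Sketch of the manual workaround: append optimizely_disable_proxy=true
# to an experiment-editor URL. The example URL below is hypothetical.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def with_disable_proxy(url):
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["optimizely_disable_proxy"] = "true"   # the parameter in question
    return urlunparse(parts._replace(query=urlencode(query)))
```

A project-level default would remove the need for this per-URL fiddling entirely.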
A lot of my tests use the same URL targeting options, but I have to keep copying them into each test.
It would be good if I could create URL targeting groups/conditions that I could apply to a test with one click, rather than having to copy all the URLs into each test.
Not sure if this is something others would also find useful.
To speed up our development cycle we push experiments and variations through the API. That's all well and good for JS & CSS, but for images we have to go through a few more steps (adding the image to a variation and then getting the URL to use in our code). If there were an API way to add images to a variation, or even a simple UI to upload images to the CDN and get the URL, that would save some steps.
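To illustrate the kind of call we'd like: the endpoint path, field names, and token header below are all invented (no such image-upload endpoint is documented, which is the point of this request); only the overall "POST the image, get a CDN URL back" workflow is what's being proposed.

```python
# Hypothetical sketch only: the URL and headers are invented to show the
# desired workflow (upload an image, receive its CDN URL in the response).
import urllib.request

def build_image_upload_request(project_id, image_bytes, api_token):
    # Hypothetical endpoint; not part of any documented API.
    url = f"https://api.example.com/projects/{project_id}/images"
    return urllib.request.Request(
        url,
        data=image_bytes,
        method="POST",
        headers={
            "Token": api_token,                       # placeholder auth scheme
            "Content-Type": "application/octet-stream",
        },
    )
    # The response body would ideally contain the CDN URL of the image,
    # ready to reference from variation JS/CSS.
```

Something along these lines would let us keep the whole variation (code and assets) in one automated push.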