Share your Idea

Allow for Full Raw Data Export from Tests

Status: Done
by amitch5903 on April 21, 2014

Currently, when we want to match 1-to-1 between Optimizely test results and our database to do additional analysis, we have to request the raw data file from our Optimizely Team.  It would be great if I could export a .csv of the full visitor-level results and avoid this extra step in the future!

Status: Done

This has been completed and you can read more here!

Allow for Comments and Annotations

Status: Done
by Jono on April 17, 2014

It would be great to be able to post notes regarding certain Goals or dates within the 'View Results' section. Currently, we are at the mercy of the Goal Name and other Optimizely terms. Being able to input notes/details about certain goals will help other team members understand what each change is affecting without having to figure it out in QA.

 

Also, the ability to add annotations, as certain events may be worth noting within the results (e.g. a release went out, reasons for traffic shifts, etc.).

Mutually Exclusive Experiments

Status: Done
by RyanPoznikoff on April 16, 2014 (edited April 16, 2014)

It would be great to have an option to make experiments mutually exclusive without writing complex JS. A simple option within targeting would be helpful.

 

For example, if a user is entered into experiment 1, then don't allow the user into experiment 2.  Or, if a user is included in any live experiments, then don't allow the user in the current experiment.

Status: Done
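The exclusion logic the request describes can be sketched as deterministic bucketing: hash the visitor's ID into one of N exclusive groups and only activate the experiment for that group. This is an illustrative sketch, not Optimizely's actual implementation; the `visitorId` and experiment IDs are made up.

```javascript
// Simple deterministic string hash (djb2 variant).
function hashString(str) {
  let hash = 5381;
  for (let i = 0; i < str.length; i++) {
    hash = ((hash << 5) + hash + str.charCodeAt(i)) >>> 0;
  }
  return hash;
}

// Assign the visitor to exactly one of the listed experiments,
// so no visitor can ever be in two of them at once.
function pickExclusiveExperiment(visitorId, experimentIds) {
  const bucket = hashString(visitorId) % experimentIds.length;
  return experimentIds[bucket];
}

// Only activate the experiment this visitor was bucketed into;
// the result is stable across page loads for the same visitor ID.
const chosen = pickExclusiveExperiment('visitor-123', ['exp1', 'exp2']);
console.log(chosen);
```

Because the hash is a pure function of the visitor ID, the same visitor always lands in the same group, which is what makes the exclusion hold across sessions.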

Allow switching between AND / OR in Targeting

Status: Done
by Optimizely on April 16, 2014 (edited April 16, 2014)

It would be great to be able to pick OR conditions between different types of targeting, for example, "visitors who come to this URL OR have this cookie". It would also be good to allow for AND conditions within a category, for example, "visitors who have the query parameter utm_source AND utm_campaign in the URL."

Status: Done
Thanks everyone for the suggestion! I'm excited to announce that switching between AND/OR is now available with our new Audiences launches: http://blog.optimizely.com/2014/07/22/create-personalized-experiences-audiences/
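The kind of mixed AND/OR targeting requested above amounts to evaluating a small boolean condition tree against the visitor. A minimal sketch, assuming a hypothetical condition format (this is not Optimizely's internal representation):

```javascript
// A condition node is either a group ({op, conditions}) or a leaf
// predicate ({test}) over the visitor.
function evaluate(node, visitor) {
  if (node.op === 'and') return node.conditions.every(c => evaluate(c, visitor));
  if (node.op === 'or') return node.conditions.some(c => evaluate(c, visitor));
  return node.test(visitor); // leaf predicate
}

// "visitors who come to this URL OR have this cookie"
const audience = {
  op: 'or',
  conditions: [
    { test: v => v.url.startsWith('https://example.com/pricing') },
    { test: v => v.cookies.includes('beta_user') },
  ],
};

const visitor = { url: 'https://example.com/home', cookies: ['beta_user'] };
console.log(evaluate(audience, visitor)); // true (the cookie condition matched)
```

Switching a group's `op` between `'and'` and `'or'` is exactly the toggle the idea asks for; groups can also be nested to express "AND within a category, OR across categories".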

Experiment data API

Status: Done
by Duncan (Level 2) on April 17, 2014

We as an agency would love to have an experiment data API so that we can integrate Optimizely experiments with all of our other reporting data. This would allow us to show clients the impact of CRO in many more ways

Status: Done
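The reporting integration described above boils down to joining per-variation results from an API with the agency's own data. A sketch under assumed shapes (the row format and revenue lookup are hypothetical, not Optimizely's actual response format):

```javascript
// Merge experiment results rows with internal revenue data keyed by
// variation, producing one combined reporting row per variation.
function joinResults(optimizelyRows, revenueByVariation) {
  return optimizelyRows.map(row => ({
    variation: row.variation,
    visitors: row.visitors,
    conversions: row.conversions,
    conversionRate: row.conversions / row.visitors,
    revenue: revenueByVariation[row.variation] || 0,
  }));
}

const apiRows = [
  { variation: 'original', visitors: 1000, conversions: 50 },
  { variation: 'variant-a', visitors: 1000, conversions: 65 },
];
const joined = joinResults(apiRows, { 'variant-a': 4200 });
console.log(joined[1].conversionRate); // 0.065
```

With results available programmatically, the same join can feed dashboards or client reports alongside analytics and revenue data.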

Use one email address on multiple accounts

Status: Done
by brian-tcr on May 15, 2014

It's frustrating to have to use a new email address for every Optimizely account you get added to. Ideally I would like to use my own work email address across every account. So when I log in with that email I see a list of the accounts I've been added to in a splash screen and can then dive into each of these accounts. Google Analytics has a similar screen on log in.

 

There are likely to be a number of use cases for being a user on multiple accounts:

- agency with multiple clients each with unique account

- multinational with different accounts/business units per market but centralised testing team

- strategy or development freelancer who does ad-hoc consulting on different accounts 

Status: Done

Thanks for the suggestion! We've just released an update that lets you log into multiple accounts with the same email address. Just add that email as a collaborator to each account, and then use the account switcher in the top right corner to jump between them. See this article for more information: https://help.optimizely.com/hc/en-us/articles/200040775#add_multiple

Color Consistency of Variations on Report Graphs

Status: Done
by MJBeisch (Level 2) on October 16, 2014

It would be nice if the colors assigned to variations were consistent between goals on the results page. Here are some screenshots of 2 different goals for an experiment, demonstrating the current inconsistency:

 

[Screenshot: results graph for the first goal]

[Screenshot: results graph for the second goal]

 

On the first goal, the "Original" variation is orange, but on the second goal it is blue. These shots were taken right after the initial page load; no configuration changes or other page manipulation was performed.

 

I could have sworn that these colors were consistent, but a client pointed this out to me today. I went back and looked at the results pages of several other experiments and, sure enough, the colors change from goal to goal. I guess I never noticed because I'm usually looking at the numbers, and the order of the variations is consistent, but this does appear to cause some confusion when doing at-a-glance monitoring or when using screenshots of goal graphs to assemble a report.

Status: Done

Allow Testers to Document Experiment Hypotheses

Status: Done
by Optimizely on July 29, 2014

Could we create an entry field in Optimizely where users can enter an experiment's hypothesis?

Status: Done

Integrate Sample Size Calculator into tests

Status: Done
by Betabrand on July 24, 2014

I just read your article on how long to run a test.

It would be great if you folks would integrate your Sample Size Calculator directly into tests.  Since the number of variations and daily visitors are known by the test, if the initial parameters are entered, the test results page should be able to give you a real-time estimate of how much longer you should run your tests.
 
Status: Done

With our new Stats Engine, we now provide a real-time estimate of how much longer you'll have to wait (in number of visitors) before your test calls a winner or loser, assuming the observed conversion rates were to hold. Read more about Stats Engine in our FAQ.
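The fixed-horizon calculation the original calculator was based on can be sketched as the classic two-proportion sample size formula. This is an illustrative sketch (alpha = 0.05 two-sided, power = 0.8), not Stats Engine's sequential method:

```javascript
// Visitors needed per variation to detect a relative lift over a
// baseline conversion rate with a standard two-proportion test.
function sampleSizePerVariation(baselineRate, relativeLift) {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(Math.pow(zAlpha + zBeta, 2) * variance / Math.pow(p2 - p1, 2));
}

// The "how much longer" estimate the idea asks for: remaining
// visitors divided by daily traffic.
function daysRemaining(neededPerVariation, collected, dailyVisitors, variations) {
  const remaining = Math.max(0, neededPerVariation * variations - collected);
  return Math.ceil(remaining / dailyVisitors);
}

// e.g. 5% baseline, aiming to detect a 20% relative lift:
console.log(sampleSizePerVariation(0.05, 0.2));
```

Since the number of variations and daily visitors are already known to the test, only the baseline rate and minimum detectable effect would need to be entered to surface this estimate on the results page.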

Minor suggestion:

I often find myself wanting to rename experiments so the naming can be consistent across experiments when I see them together on the dashboard. It's a minor annoyance to have to wait for the editor to load for each one just to change the names.

 

So a nice minor improvement would be to add a 'rename' option when clicking on an experiment in the dashboard, so you can rename it without opening the editor.

Status: Done

Hypothesis Tab

Status: Done
by Julita on October 31, 2014

Hi! :)

 

I think it would be very nice if the Experiment Edit Page included a Hypothesis tab.

Then the editor would be able to write the hypothesis or give a short description of the experiment.

This feature would help give a better idea of what the experiment is about, especially if a few people are working on it.

 

It would be useful to grab it later from the API as well.

 

Attached are two pictures of how it could look.

 

[Mockups: proposed Hypothesis tab (hypothesis2.png, hypothesis3.png)]
Status: Done

Optimizely includes tablets within mobile devices. However, if you check Google Analytics, you will see three device types (desktop, tablet, and mobile), and there's a reason for that. We can't target only mobile devices without doing extra work with Optimizely, and even then it's an unreliable workaround. This is a problem if you have a mobile-first strategy like us: we start with the smallest screens (smartphones) and work up to tablet and finally desktop. How can I target these separately? Currently we can't, but we should be able to.

Status: Done
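The "extra work" the post above refers to is typically user-agent sniffing in a custom targeting condition. A rough sketch of separating phone, tablet, and desktop; UA detection is inherently fragile, and these patterns are illustrative only:

```javascript
// Classify a user-agent string as 'phone', 'tablet', or 'desktop'.
// iPads identify as "iPad"; Android tablets include "Android" but
// omit "Mobile", while Android phones include both.
function classifyDevice(userAgent) {
  const ua = userAgent.toLowerCase();
  if (/ipad/.test(ua) || (/android/.test(ua) && !/mobile/.test(ua))) {
    return 'tablet';
  }
  if (/iphone|ipod/.test(ua) || (/android/.test(ua) && /mobile/.test(ua))) {
    return 'phone';
  }
  return 'desktop';
}

// A custom targeting condition could then check, for example:
// classifyDevice(navigator.userAgent) === 'phone'
```

Built-in phone/tablet/desktop targeting would make this kind of brittle workaround unnecessary.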

It would be awesome to have a visual representation to differentiate between whether you are targeting a positive vs. negative attribute. Maybe just have one tab be a different color, as well as along the left nav once the targeting is set.

Status: Done
We made it easier to see and toggle between positive and negative targeting conditions with our new audiences launch. I hope it helps! You can learn more here: http://blog.optimizely.com/2014/07/22/create-personalized-experiences-audiences/

Folders on Dashboard

Status: Done
by salismat on July 3, 2014

Please, please, please can you put some folders on the dashboard? I don't want to have to implement a new code snippet every time I want to put things in a folder. Really simple idea, but it would be fantastically useful.

 

Once you have run over 30 tests, you want to be able to quickly find the ones you are working on creating and the ones that have completed. It's a waste of time to scroll up and down the page. This is particularly acute when you are trying to present your whizzy new tool to the Marketing Director and you can't find the things you need.

 

Thank you

Matt

Status: Done

Ability to Pause Experiment on Results Page

Status: Done
by Allie on April 18, 2014
I would love to be able to pause an experiment from the results page instead of needing to hit "Edit Experiment" and waiting for the full page to load.
Status: Done

Statistical Test Choice & Power Analysis

Status: Done
by jweinstein (Level 2) on April 16, 2014 (edited April 16, 2014)

Choosing Statistical Test: 

It would be helpful if there were different statistical tests to choose from when looking at the results. We generally use two-tailed tests, so we take the numbers of visitors and conversions and run the results in R.

 

Power Analysis: 

This would tell the user what kind of effect size he or she could expect to observe given the traffic so far (or how many additional visitors will be needed to reach a certain confidence level).

Status: Done

With our new Stats Engine, Optimizely now uses sequential hypothesis testing with false discovery rate controls to calculate statistical significance. This means that the p-value that you see is a valid measure of statistical significance at all times that your experiment is running.

 

As for power, there is no longer a need to consciously set statistical power. Instead, the specificity of your results depends only on how long you are willing to wait. The sequential test we implemented is a test of power one, which means it will always detect a non-zero effect size if you wait for enough samples. Waiting longer on any test gives you a better chance of detecting a winner or loser, if one exists. Plus, instead of the effect size you could expect to detect given your traffic so far, we now give an in-product estimate of how much more traffic you need to call your test significant, assuming the currently observed effect size holds.

Archive Legacy Goals

Status: Done
by alphanumerritt on August 21, 2014

Need the ability to archive goals that are outdated or that shouldn't be used for one reason or another. Deleting the goals would be great, except that compromises any past test results that utilized the deleted goal. As it is, I've got a bunch of garbage goals in my goal list that just muddy the water when I need to pick/reuse goals for tests. 

Status: Done

Before the Audiences release, you were able to target based on new vs returning visitors. I know the previous implementation was not ideal, but I would really love to see this general functionality come back in a more robust way. 

Status: Done

It would be nice if the Global JS/CSS sections of the Optimizely editor were code syntax highlighted and supported code editing functions just like the Variation Code box does!

Status: Done

Wordpress blog headline variation testing

Status: Done
by HeatherW on April 14, 2014

I'd love to be able to A/B test different headlines in wordpress easily. Thanks!

Status: Done