
QA in starter packet

AG 03-17-16
Accepted Solution


Hi all,

 

I have a question regarding QA within the Starter packet. We have created a test, but in the Starter packet it is not possible to do QA with cookies. We can see the variants with

/?optimizely_x5157582480=1&optimizely_force_tracking=true, but I don't know how to prove our metrics/goals without the cookie. Furthermore, if I use this parameter, I can see our test variants also on pages I haven't included in the URLs for my test. Is that normal? If I only use the Preview tool, I can only see the variants on the test page, but not on the other included URLs, because I can't open the Preview tool there.

 

It would be very nice if you could tell me an alternative approach to QA for the Starter packet.

 

 

Kind regards

AG


DavidS 03-18-16
 

Re: QA in starter packet

Hi AG,

 

Thanks for reaching out!

I understand that you would like to find out what the best QA option is for your purpose on the Starter plan.

 

If you would like to find out whether your goals are firing correctly, you need to use ?optimizely_x[EXP_ID]=[VARIATION_NUMBER]&optimizely_force_tracking=true like you mentioned. Then head to the Network tab in your browser's developer console and you should be able to see an Optimizely network event there. We have a dedicated Knowledge Base article on this topic that may be of interest to you.
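
For example, here is a minimal console sketch of building that URL; the experiment ID below is just the one from your example, so substitute your own ID and variation number:

// Run in the browser console on the page you want to QA.
// It appends the force-variation and force-tracking parameters and reloads;
// the Network tab (filter for "optimizely") should then show the tracking calls.
var expId = '5157582480';  // example experiment ID from this thread
var variationIndex = 1;    // the variation you want to force
var params = 'optimizely_x' + expId + '=' + variationIndex +
             '&optimizely_force_tracking=true';
var base = window.location.href.split('#')[0];
window.location.href = base + (base.indexOf('?') === -1 ? '?' : '&') + params;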


If you use the Force Variation Parameter on pages that are not targeted by your URL targeting settings, then you will experience issues on your page. We don't recommend you do that as the variation code will be forced to run on a page not targeted by your experiment. 

 

Regarding your issue with Preview Mode, you can try regenerating the preview snippet by adding a space to the variation code and saving your experiment. Head back to Preview Mode and it should now load.

 

I hope that helps. Feel free to reply with any additional questions.

 

Best,
David

 

Optimizely
AG 03-18-16
 

Re: QA in starter packet

Hi David,

 

Thank you very much for your help.

 

I can see the variants in the Console tab and also some events in the Network tab. For clicks it works fine, but for a pageload I am not sure, because I see multiple Optimizely events for the same URL in the Network tab. I can also see pageload events on URLs I haven't included in my test. But I guess this shouldn't be the case.

 

I know that I shouldn't use ?optimizely_x[EXP_ID]=[VARIATION_NUMBER]&optimizely_force_tracking=true on pages which aren't included in my test. But I would like to know: if they aren't included, why am I able to see pageload goals and the variants in the developer tools when I use this parameter?

 

I haven't understood the matter with the Preview Mode correctly. Where exactly should I add the space in the variation code? Do you have a link with an explanation for this one too?

 

Thank you very much for your help David, I really appreciate it.

 

Kind regards

AG

DavidS 03-21-16
 

Re: QA in starter packet

Hi AG,

Thanks for getting back to me.
When you use the Force Variation parameter, Optimizely will not consider any Audience, URL Targeting or Traffic Allocation settings. It will force you into that variation, and thus Optimizely will run the variation code on the page.
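
As a conceptual sketch of that behaviour (not Optimizely's actual code; the function and method names here are made up for illustration):

// Why a forced variation shows up on every page where the snippet is installed:
// the usual targeting checks are simply skipped.
function shouldRunVariation(experiment, forcedVariation) {
  if (forcedVariation !== null) {
    return true; // Force Variation parameter present: audience, URL targeting
                 // and traffic allocation are all ignored
  }
  return experiment.matchesUrlTargeting() &&
         experiment.matchesAudience() &&
         experiment.inTrafficAllocation();
}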

As for Preview Mode, you will need to add a space in the <Edit Code> box (also known as the variation code). You can find it in the lower right corner of the Optimizely Visual Editor. Then click Apply to save the changes. This will force Preview Mode to regenerate the snippet and your page should load fine. We don't have an article in our Knowledge Base for this yet.

Best,
David
Optimizely
AG 03-31-16
 

Re: QA in starter packet

Hi David,

I am sorry for my late reply. I also have a question regarding the pageloads. For clicks it works fine, but for a pageload I see multiple Optimizely events for the same URL in the Network tab. But there should be only one event for this experiment, right?

Thank you for your help.

Kind regards
AG
DavidS 03-31-16
 

Re: QA in starter packet

Hi AG,

 

Because Optimizely de-duplicates goals so that one visitor can only ever record one conversion per goal, it doesn't matter. It will still be recorded as one conversion on the Optimizely Results page.
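
A rough illustration of that counting behaviour (just a sketch, not Optimizely's implementation):

// Per-visitor, per-goal de-duplication: repeated events collapse to one conversion.
var counted = {};

function recordConversion(visitorId, goalId) {
  var key = visitorId + ':' + goalId;
  if (counted[key]) {
    return false; // same visitor, same goal: ignored in the results
  }
  counted[key] = true;
  return true;    // first conversion for this visitor and goal: counted once
}

// Two pageview events from the same visitor still mean one conversion:
recordConversion('visitor-abc', 'pageview-goal'); // true  (counted)
recordConversion('visitor-abc', 'pageview-goal'); // false (de-duplicated)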

 

Best,

David 

Optimizely
AG 03-31-16
 

Re: QA in starter packet

Hi David,

Thank you for your quick reply. So it doesn't matter how often I see this goal when I do QA; it only matters that the goal fires, right? In Maxymiser, for example, it is different: if I see duplicate goals during QA, it would be a bug, and there would also be duplicates in the reports. But if it is different with Optimizely, then that's fine. :) Thank you very much for your help.

Kind regards,
AG
JasonDahlin 03-31-16
 

Re: QA in starter packet


@AG - With the limitations in the "free" version, there are a few ways you can run a test without any chance of a real user entering the experiment.

 

1- Do you have a development environment?  If so, target those URLs instead of your production site.

 

2- Set up a Custom Audience that targets an obscure language preference, such as "Zulu" or "Yiddish".  You can change your browser preferences so that whatever language you select is your first choice for language.  (You will want to change it back before browsing Google or any other site that supports "Zulu" or "Yiddish", but otherwise your chances of having a legitimate real customer entering your experiment are pretty slim.)  There is a quick console check for this below the list.

 

3- Create the experiment as you want it to be but set the "B" group to 0% of traffic - that way no one will accidentally make it into the test group.  Turn the test live (since all users will be in the control group, no one will see any of the changes).  This works well for testing URL targeting.  Also, you can use the force parameter such as "optimizely_x123546789=1" to force yourself into the test variation.  Since your variation allocation persists, you and you alone will see the "B" experience on your site.  When you are satisfied that the experiment is set up properly, end this experiment, duplicate it, adjust the allocation to 50-50, then start the new version.
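
For option 2, a quick way to confirm that the browser preference change took effect (the language audience condition is generally evaluated from the browser's language setting, so treat this as a sanity check only):

// Run in the browser console after changing your preferred language.
// "zu" (Zulu) and "yi" (Yiddish) are the codes for the example languages above.
console.log(navigator.language);   // expected first choice, e.g. "zu" or "yi"
console.log(navigator.languages);  // full preference list in newer browsers, e.g. ["zu", "en-US", "en"]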

--Jason Dahlin
Analytics and Testing Guru :)
AG 04-03-16
 

Re: QA in starter packet

Hi Jason,

Cool, thank you very much for your help. :)

1. Do you mean I should first do QA with the URLs of my development environment, and if everything is fine, replace these URLs with the real ones and set the test live?

3. Do you mean I should do QA for the metrics, e.g. some clicks and pageloads, within the default only? But if I do that, can I be sure that if everything tracks correctly within the default, it is also correct in the challenger variation?

Thank you and Kind regards
AG
JasonDahlin 04-04-16
 

Re: QA in starter packet

@AG - 

1- If you have a development environment, set up the experiment and activate it in that environment so that everyone can see what the experiment is doing and provide feedback (is there any "flashing", are there pages where it should be running but is not, are there pages where the effects are not as desired, etc.).  Once the experiment is verified (e.g., the Business Owner and QA both sign off on it), *duplicate* the experiment and modify the URLs in the new version so that it runs in the production environment.

 

e.g.,

if your QA experiment is:

"QA Make Button Green" and runs on all pages of "http://dev.yoursitehere.com"

make the production version:

"Make Button Green" and run on all pages of "http://www.yoursitehere.com"

 

This prevents your dev environment from interfering with the results in the production experiment and, should you find any issues with the experiment once it is live, it gives you a place to recreate the issues and fix the experiment (adjust the code, adjust the URLs it runs on, etc.)

 

2- You should verify both.  

A page view goal for URLs containing "/thankyoupage.html" wouldn't need to be verified, since that page name is not different between the variations.

But, if you have a click-goal on a button that is on a page that you are modifying, you will want to verify that the click goal works in both the default and the variation.  The easiest way to verify this would be to check the selector directly on the page using your browser's developer console.

 

e.g., if your click-goal is for a button that matches the selector "#submit-button", run the selector using jQuery like

$('#submit-button')

If the button you want to track shows up, then the click goal will work.  If the button you are tracking does not show up as a match, then you will need to modify the click goal so that this button is tracked.
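
Putting that together, a quick check to run on both the default page and the forced variation ("#submit-button" is still the hypothetical selector from the example above):

// Run in the console on the original page and again with the variation forced.
$('#submit-button').length                           // with jQuery available on the page
document.querySelectorAll('#submit-button').length   // plain-DOM equivalent
// A result > 0 means the element the click goal targets exists, so clicks will be tracked;
// 0 means no match on this page, so the click goal's selector needs adjusting.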

--Jason Dahlin
Analytics and Testing Guru :)
AG 04-05-16
 

Re: QA in starter packet

Hi Jason,

I've tried to set the test live on my development site, but I only see the default (even when I set the default to 0%). I don't know what the problem is. The snippet is also integrated.

Kind regards
Adina
JasonDahlin 04-05-16
 

Re: QA in starter packet

@AG - Once you see a specific variation of an experiment, it persists for you so that you always see the same variation.  It sounds like you joined the experiment before pausing the default.

 

To get yourself re-bucketed, you should clear your cookies then refresh the page.

Since your default is paused, you should immediately be placed into Variation 1.
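
If you would rather drop only Optimizely's cookies than clear everything for the site, here is a rough console sketch; the "optimizely" name prefix is an assumption, and cookies set on a parent domain may still need to be cleared through the browser settings:

// Expire every first-party cookie on this page whose name starts with "optimizely",
// then reload so you are bucketed fresh.
document.cookie.split(';').forEach(function (cookie) {
  var name = cookie.split('=')[0].trim();
  if (name.indexOf('optimizely') === 0) {
    document.cookie = name + '=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/';
  }
});
location.reload();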

--Jason Dahlin
Analytics and Testing Guru Smiley Happy
AG 04-06-16
 

Re: QA in starter packet

Hi,

I've tried it, but it doesn't work. Then I saw that I had only changed the targeting URLs but not the Editor URL, so I changed that as well, but then I got an error on my page and I couldn't edit it. Maybe this happens because the URL of my development site is too different from my normal website URL. Maybe I should try it with a language like Yiddish.

I've done sandbox QA as David described, using the Network tab for my click and pageload tracking, and to see the variants I have used the ?optimizely_x[EXP_ID]=[VARIATION_NUMBER]&optimizely_force_tracking=true parameter. So in the sandbox everything is fine.

I hope it will work in production with the language segmentation.
Is there no way to exclude all IPs except mine?

Kind regards
AG