Using Chrome Developer Tools to QA and Test Experiments
Chrome Developer Tools make it simple to QA your live experiments accurately. This walkthrough is for Optimizely users responsible for experiment creation and quality assurance. Coding skills are not required to follow along, but they may help in understanding the information Chrome Developer Tools displays or outputs.
In this post, we will cover three main tools and how they relate to Optimizely experiment QA.
- Elements Panel
- Console Panel
- Network Traffic Panel
If you’re interested, you can also find out more about all the Chrome Developer Tools features and shortcuts in this Google support article. You can also follow along with the slides below for a visual reference to what each step will look like.
When I review experiments, especially on new pages, I typically start with the Elements Panel to verify the Optimizely snippet is correctly added near the opening of the <head> element. To open the Elements Panel, right click on any page element, then click on “Inspect Element” (slide 12); alternatively, you can use the shortcut key Ctrl+Shift+I (or Cmd+Opt+I on Mac) to open the DevTools. Once the Chrome Developer Tools are open and the Elements Panel is in focus, click inside the panel and use the shortcut Ctrl+F (or Cmd+F on Mac) to bring up the search input (slide 12). You can use this search to look for any element in the DOM; the Optimizely snippet is easy to find by searching for the string “cdn.optimizely.com”. Verifying that the Optimizely snippet is as high in the <head> element as possible will minimize any chance of experiment timing or flashing issues.
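If you prefer to check this from the Console rather than by eyeballing the Elements Panel, a small helper can do the same lookup. This is just a sketch (the function name is mine, not part of Optimizely); it reports where the snippet sits among the <head> scripts:

```javascript
// Find the Optimizely snippet among the scripts in <head>.
// Returns the script's position (0 = first script), or -1 if it's missing.
function snippetPosition(doc) {
  const scripts = Array.from(doc.querySelectorAll("head script"));
  return scripts.findIndex((s) =>
    (s.src || "").includes("cdn.optimizely.com"));
}

// In the DevTools Console on the page under test:
if (typeof document !== "undefined") {
  console.log("Optimizely snippet is <head> script #", snippetPosition(document));
}
```

The lower the number, the earlier the snippet loads, which is what you want to avoid flashing.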
Console Panel
The Console Panel lets you run JavaScript against the live page, including Optimizely’s client-side API. Two properties are especially useful:
- optimizely.activeExperiments - (slide 14)
This will return an array of experiments currently running on the page for that visitor. This is the first thing you’ll want to check. If the experiment ID for the experiment you’re reviewing is there, it means it is currently active on the page.
- optimizely.variationNamesMap - (slide 15)
If you find that the experiment is active, you can use this to see which variation you are in. This returns a mapping of experiment IDs to variation names for every experiment the visitor is currently cookied into, whether active or inactive on the page. Unless you have chosen to “Mask descriptive names…” in your Optimizely project settings, you will see the name of the variation displayed.
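Putting the two commands together, a typical console check looks something like the sketch below. The object here is a mock standing in for the real window.optimizely on a live page, and the experiment ID and variation name are made up:

```javascript
// Mock of the classic Optimizely client-side API for illustration;
// on a live page you would read window.optimizely directly.
const optimizely = {
  activeExperiments: ["1234567890"],                 // hypothetical experiment ID
  variationNamesMap: { "1234567890": "Variation #1" },
};

// Step 1: is the experiment you are QAing active on this page?
const experimentId = "1234567890";
const isActive = optimizely.activeExperiments.includes(experimentId);

// Step 2: if so, which variation were you bucketed into?
const variation = optimizely.variationNamesMap[experimentId];

console.log(isActive, variation); // true "Variation #1"
```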
If you aren’t seeing an experiment activate on a page, you can output the log statements that Optimizely keeps track of during execution by entering optimizely.push(["log"]) in the console. This can be really helpful for seeing when goals are triggered and integrations are activated. What’s more, you can find out why an experiment isn’t activating by using the shortcut Ctrl+F (or Cmd+F on Mac) to bring up the search input and looking for the experiment ID. The log can tell you whether you failed to match any Audiences or URL Targeting, whether the experiment is paused, or whether you were excluded by Traffic Allocation.
Network Traffic Panel
The Network panel records information about each network operation on your site, including detailed timing data, HTTP request and response headers, cookies, WebSocket data, and more. Once you’ve confirmed your experiment is activating properly, you can test your goal out with the Network Traffic Panel. As a brief overview, Optimizely goals are asynchronous, non-blocking, and only occur for visitors that have been placed in at least one experiment. They should have no effect on your page load time (read more about that here).
You can filter your Network Panel for Optimizely events by clicking on the filter icon, selecting XHR, then entering “event” in the input box (slide 17). By default, these events are reset and removed on each page load, but you can check the “Preserve log” checkbox to keep them around (slide 18). You can now see the network events sent off by refreshing the page (for the Pageview Goal) or by clicking on the element you are tracking (to trigger the Click Goal). Right click on the element we created a Click Goal on above and look for a new network event (a right click will work because Optimizely uses the mousedown event for Click Goals).
Verify Goal is Working
To verify the proper network event is sent and the goal is triggered, we need to take a look at the “Headers” tab within a specific event in the Network Panel. Within the “Headers” tab, we see the data and parameters passed to Optimizely. The two parameters to look for are:
- “n” parameter - Goal name or page url (slide 20, 21)
- “v” parameter - Revenue in cents
If these match the goals you set up, all is well; if they don’t, we’ll need to figure out why they aren’t sending off (remember, an experiment has to be active and the goal’s targeting has to be met to trigger a Click Goal or Custom Event).
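If you’d rather not scan the Headers tab by eye, you can copy the request URL from the Network Panel and pull those parameters out in the Console. The hostname and values below are made up for illustration, not a real Optimizely endpoint call:

```javascript
// A hypothetical tracking-call URL copied from the Network Panel.
const eventUrl =
  "https://example.optimizely.test/event?a=123&n=clicked_signup&v=4999";

// Parse out the "n" (goal name / page URL) and "v" (revenue in cents) params.
const params = new URL(eventUrl).searchParams;
console.log(params.get("n")); // "clicked_signup"
console.log(params.get("v")); // "4999", i.e. $49.99
```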
Since the experiment we tested is live (whether with a test cookie or not), the visits and conversions will show up on the Results page (slide 22). Results do not show up immediately, but they will appear not long after they are triggered (if you find that you are bucketed into the experiment and triggering the goal but it’s not showing up, check IP Filtering).
Test, Test, Test Again
Use a Chrome Incognito window to get a “fresh” experience. You must close all previously used Incognito windows to fully remove all browser storage and cookies; only then will you be treated as an entirely new monthly unique visitor and have a chance to be re-bucketed. Keep in mind that a Chrome Incognito window treats you as an entirely new site user, so you will show up as another visitor on the Results page and are able to convert on all goals for that user.
If you aren’t ready for your experiment to go live quite yet, you can also read about setting up a test cookie and test cookie audience condition here.
There you have it.
Now you know how to use Chrome Developer Tools to ensure the Optimizely snippet is properly placed, verify the experiment is activated, see which variation you are bucketed into, and check what events Optimizely is logging for that page. These few best practices will make it easier to investigate common experiment issues, and I hope you get as much use out of them as I do every day.
Please add your own best practices and how you use the Chrome Developer Tools in your testing process. I'd love to hear about it!
Great post Derek. These all definitely help when QAing a test, and I've used them all countless times!
In addition to your tips for verifying if a goal is working, you can also do the following:
- If it is a click goal that has been created via the “Create Goal” window, then it will have an "onmousedown" event handler attached to it. This means you can press the mouse button down on the element, move your cursor away, and release it elsewhere to prevent the element from performing its default action. The event will then show up in the Network tab as described above
- You can also paste the following code into your console and this will display an alert every time an event fires:
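A minimal sketch of the idea, assuming events go out as XHRs with "optimizely" and "event" in the URL (both the helper name and the URL check are my assumptions, not Optimizely's API):

```javascript
// Heuristic: does this request URL look like an Optimizely event call?
// (An assumption about how the tracking calls are shaped.)
function isOptimizelyEvent(url) {
  return typeof url === "string" &&
    url.includes("optimizely") &&
    url.includes("event");
}

// Paste something like this into the browser console on the page under test:
// it wraps XMLHttpRequest.prototype.open so an alert pops on each event.
if (typeof XMLHttpRequest !== "undefined") {
  const originalOpen = XMLHttpRequest.prototype.open;
  XMLHttpRequest.prototype.open = function (method, url, ...rest) {
    if (isOptimizelyEvent(url)) {
      alert("Optimizely event fired: " + url);
    }
    return originalOpen.call(this, method, url, ...rest);
  };
}
```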
Also, when checking your test in an incognito browser, I would suggest forcing the experiment using the force parameters. This works even if the test is in the draft/paused stage, provided you have allowed draft and paused tests to be loaded in the snippet in your settings.
I've written a few more tips on my post 6 Essential tips for any developer using Optimizely in regards to the force parameters, log and how Optimizely creates click events.
Head of Development at Conversion.com
Thanks for sharing this! Amanda suggested it would be useful for me to share my post about our Optimizely Chrome Extension in the comments here:
It should add some more tools to the overall toolkit and improve the QA process: it helps with viewing variations and experiments on a page, and also lets you see which events are being tracked on a given experiment or variation.