Best test archiving and tracking?
We have been using Optimizely to run - literally - hundreds of tests on our website. It has been a great tool, but as the results accumulate, we are struggling to find a tool where stakeholders can easily review our upcoming and past tests, and where project managers can archive the versions they tested, results, etc. Does anyone have suggestions on how best to track and archive test results in a comprehensive way over long periods of time?
We have been using Optimizely for hundreds of tests as well. Here is how we keep track of everything that is running and archived:
- Excel document for the list of all A/B tests for the year.
There is a status column (in progress, running, done) to make it easy for stakeholders to see what is currently running and what is under development.
There is also a color-coded column for results (winner/loser/neutral) to make it easy to see past test performance at a glance.
This document is shared on our internal wiki and available to the entire company. Most stakeholders "follow" the page and receive notifications when I update it.
- Each test has a unique wiki page with details, hypothesis, screenshots and detailed results for future reference (the link to this page is included in all communications about this test)
- I send a bi-weekly email with new tests and recent results. This enables stakeholders to stay informed with only one email every two weeks. The email always has three "highlights" and a table listing the new tests. It probably takes them two minutes to get a good picture, and they have access to all the details from this email.
- At the end of each quarter I create a summary of our optimization program with: the total number of tests (with a breakdown per page / conversion funnel / platform / product / etc.), main learnings, and the biggest winners & losers. This page is on our optimization wiki.
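A quarterly summary like the one above can largely be generated automatically if the test log is kept in a simple tabular format. Here is a minimal Python sketch; the row fields and example entries are hypothetical, modeled on the columns of the spreadsheet described above:

```python
from collections import Counter

# Hypothetical test-log rows; the field names and values are assumptions,
# loosely based on the Excel sheet described in this thread.
tests = [
    {"name": "Homepage hero copy", "page": "homepage", "platform": "desktop", "result": "winner"},
    {"name": "Checkout button color", "page": "checkout", "platform": "mobile", "result": "loser"},
    {"name": "Pricing table layout", "page": "pricing", "platform": "desktop", "result": "neutral"},
    {"name": "Checkout trust badges", "page": "checkout", "platform": "desktop", "result": "winner"},
]

def quarterly_summary(rows):
    """Aggregate a test log into the kind of breakdowns used in a quarterly report."""
    return {
        "total": len(rows),
        "by_page": dict(Counter(r["page"] for r in rows)),
        "by_platform": dict(Counter(r["platform"] for r in rows)),
        "by_result": dict(Counter(r["result"] for r in rows)),
    }

summary = quarterly_summary(tests)
print(summary["total"])       # 4
print(summary["by_result"])   # {'winner': 2, 'loser': 1, 'neutral': 1}
```

The same approach works whether the log lives in Excel, a Google Sheet exported as CSV, or a wiki table: as long as each test is one row with consistent columns, the quarterly rollup is a few lines of code rather than manual counting.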
It seems like a lot, but once everything is set up as part of your routine, with templates and a process, it is very easy.
I hate nothing more than having someone tell me "we ran this test last year" when I can't find any details or results. An A/B test is almost worthless if you can't access its learnings later on.
Hope it helps
We have a few different ways of doing this, and it really depends on what our clients use as their knowledge wiki. Some companies like to use Confluence or Jive as their knowledge repositories, and these work really well. You can build a template that has all the categories you need. Dates run, sample size, part of the site being tested, results, and goals are some of the common ones, but I would love to hear other people's suggestions on what to include in a template.
Optimizely provides an easily shareable link to results, or you can export results into an Excel format, which may be a little safer for long-term storage.
Having said all this, I think my favourite tool is a plain old Google Sheet. It is easy to share, there are no issues with versioning, you can configure it the way you want, and you can always save it as a PDF for safekeeping.
Hope that helps
Optimization Co-ordinator at Widerfunnel
This is a great question, Whitney. Pauline and Ben have very helpful answers! I did just want to make sure you were aware of this post from a while back where we had an informal discussion about testing documentation. It's slightly different from what you're asking here, but it may provide some good tips nonetheless! Check it out here.
This is actually something that I have been tackling a lot recently.
With so much information in Google Docs, spread across different folders, I thought it was about time I tried to simplify things. I developed a very lightweight system that allows me to archive all my testing documentation and assign flags, tags, and labels to items, making everything easy to reference going forward.
I will try to share this with people in the next week or so and see if it is something people might find useful.
This is an interesting topic of discussion. Our archival system is similar to Pauline's, but I'll share my methodology:
Google Docs, Drive Folders and Spreadsheets
My organization uses Google Apps for most of its documentation and sharing. As such, I utilize Google Spreadsheets to store all of my data points related to experiments.
Within the optimization spreadsheet we have:
Tab 1: a list of ideas in prioritized order based on a customized scoring formula
Tab 2: a list of archived ideas so I can recall how the ideas were scored and the different values that were assigned
Tab 3: a master archive with all the relevant data points for each experiment, organized in columns:
- Start Date
- Stop Date
- Experiment Type
- Funnel Category
- Page Targeted
- Experiment "name"
- Drive folder shareable link for screenshots (each experiment has a dedicated Drive folder within an experiments tree system)
- Traffic Allocation
- Project Lead
- Hypothesis (inserted as a cell note)
- Optimizely shareable results link
- Results summary (inserted as a cell note)
- Idea score
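The "customized scoring formula" behind Tab 1 isn't spelled out in this thread. As an illustration only, a common approach is a PIE-style score (potential, importance, ease, each rated 1-10 and averaged); the factor names, equal weighting, and example ideas below are assumptions, not the poster's actual formula:

```python
def pie_score(potential, importance, ease):
    """PIE-style prioritization: the average of three 1-10 ratings.
    The factors and equal weighting here are illustrative assumptions,
    not the actual formula used in the spreadsheet described above."""
    for rating in (potential, importance, ease):
        if not 1 <= rating <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return (potential + importance + ease) / 3

# Hypothetical ideas scored and sorted, highest priority first
ideas = [
    ("Simplify checkout form", pie_score(9, 8, 5)),   # high potential, harder to build
    ("New homepage hero", pie_score(6, 7, 8)),        # easy win, lower potential
]
ideas.sort(key=lambda pair: pair[1], reverse=True)
print(ideas[0][0])  # Simplify checkout form
```

Keeping the score in its own column (Idea score, above) means Tab 1 can simply be sorted on that column, and Tab 2's archive preserves the individual factor ratings so old scores can be reconstructed later.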
These data points allow us to run high-level reports on a quarterly basis about the optimization team's progress (something I picked up from Pauline).
Regarding specialized software solutions: in the past we devoted time and resources to building custom project management tools for experiment workflows and archival using enterprise software such as VersionOne, but we found it was underutilized by business stakeholders, reducing visibility and engagement.
As such, with Google Apps being the main mode of correspondence for our organization, we've set up an efficient system within that framework that stakeholders actually use. This provides high visibility into the daily, monthly, quarterly, and yearly activities of the team, which as we all know is very important to the success of any program.
Hope that helps!
@MartijnSch - this sounds fascinating. Would you mind sharing a little more about what the platform is and how it works? It would also be awesome if you could share a screenshot (feel free to gray out the numbers/test names if you are not comfortable sharing). It's the best feeling in the world when you figure out how to automate robust documentation! Congrats.
@Amanda, ok ok. Because you asked ;-)
The ‘platform’ is still in a very early alpha and was built internally by our marketing team, but I'm more than happy to share our idea. Hopefully in a couple of weeks I’ll find some time to blog about why we created this tool in more detail.
A big time saver is connecting the data between platforms. One really small thing is that we link the test names of projects across systems, because I don’t like having to check test names + IDs across our logbook, our testing tool, and our Google Analytics setup. We also set the owner of each test, so if the team starts growing we can see which tests each team member ran.
In the post from July last year we also saved things like project and URL, but we merged those into one template field to make sure we can run tests very fast.
These are really small improvements, and of course on smaller sites this is no magic at all, since building the tool would probably take more time than it saves. But we currently run 5-10 new tests on a weekly basis and will probably increase to 10-15 in a couple of months. At this size, writing the documentation and keeping it up to date currently costs a couple of hours a week. Here is a picture of the tool we have for documentation, with an example: http://take.ms/cGytz (we’ll add 1 or 2 more fields in the upcoming week).
Currently we’re working on pulling metrics and dimensions from our GA account, because we want metrics that we might not always have in our A/B testing tool. Using the Reporting API we’ll pull the metrics and show them to calculate the differences.
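Once the metrics are pulled in, the difference calculation itself is straightforward. A minimal sketch, with made-up numbers standing in for what a reporting API might return, computing conversion rates and the relative lift of a variant over the control:

```python
def conversion_rate(conversions, sessions):
    """Fraction of sessions that converted."""
    return conversions / sessions

def relative_lift(control_rate, variant_rate):
    """Relative improvement of the variant over the control."""
    return (variant_rate - control_rate) / control_rate

# Hypothetical numbers, as they might come back from a reporting API
control = conversion_rate(120, 4000)   # 0.03
variant = conversion_rate(150, 4000)   # 0.0375
print(f"{relative_lift(control, variant):.1%}")  # 25.0%
```

Note this only shows the observed difference; whether that lift is statistically significant is a separate question your testing tool (or a significance calculator) should answer before you record a winner in the archive.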
More than happy to share deeper details on the set up if you’d like to know more!