In your testing program, what data sources could you not live without? Heatmapping, analytics, etc.?
I'm interested to hear what data sources people are using to inform future testing, and how much of that testing comes from this kind of data versus influence from around the business or common sense.
What couldn't you live without: analytics, heatmapping, etc.?
We use a combination of web analytics and heatmapping to inform most experiments. I'd especially highlight heatmapping: once we know which pages to test, it tells us what in particular to look at, which is so helpful.
Heatmapping (and also scrollmapping) shows us where to focus CTAs, and which sections are important and which aren't.
Google Analytics: helps me understand more deeply how each variation within a test is performing.
Crazy Egg! (Heatmapping): Optimizely helps me understand which test variation won and how the goals performed, but by adding heatmapping I can more readily see WHY it won. Did users interact with the part of the site I thought they would? Etc.
Heatmapping: we're using an open-source/in-house solution that saves every click on The Next Web, so we can use it for segmentation and also have access to the raw data.
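The core of a "save every click" setup like that can be sketched in a few lines. This is a minimal illustration with made-up sample records (the field names and pages are assumptions, not The Next Web's actual schema): persist each click as raw data, then aggregate the same records for segmentation.

```python
import json
from collections import Counter

# Hypothetical raw click events, as an in-house logger might capture them.
raw_clicks = [
    {"page": "/home", "x": 120, "y": 340, "user": "u1"},
    {"page": "/home", "x": 125, "y": 338, "user": "u2"},
    {"page": "/pricing", "x": 60, "y": 90, "user": "u1"},
]

# Persist the raw data (JSON Lines), so it stays available for later analysis.
with open("clicks.jsonl", "w") as f:
    for click in raw_clicks:
        f.write(json.dumps(click) + "\n")

# Aggregate the same events for a simple segmentation view: clicks per page.
clicks_per_page = Counter(c["page"] for c in raw_clicks)
print(clicks_per_page)
```

Keeping the raw events (rather than only the aggregates) is what makes both uses possible: the heatmap is just one rollup of data you can re-segment any way you like later.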
Great open question!
We use Crazy Egg and GA for both analysis and insights, as well as an integration between our internal data warehouse and Tableau, which allows us to tag users who saw an experiment and apply custom business events that Optimizely results and GA can't capture. For us this is very valuable.
The data does lead our testing ideas. Some of our ideas come from reactions to the business landscape, aka common sense or market intuition.
I don't think we could live without any of our tools at this point. We always need more data.
Session recording has been a huge eye-opener for us, especially from a user experience point of view; it's also very useful for seeing how people interact with your experiments.
A bunch of tools out there have this feature, and I warmly recommend that everyone give it a try.
Try it once and you will continue to use it forever!
In our own Optiverse testing program (KB, Academy, and Community), we use a combination of analytics, heatmapping, user testing, and qualitative feedback from our Qualaroo surveys.
User testing and Qualaroo help paint a picture of where you may be struggling, or of common confusion points, which we can then investigate quantitatively with analytics and heatmapping.
Whenever we're able, we like to user test before and after to make sure that the human experience matches the data we saw, especially since on many of the Optiverse properties, there isn't a straightforward conversion funnel.
We use a combination of analytics, user testing, and qualitative feedback from surveys.
We typically use user testing before and after a test to uncover the "why" behind a result or to shape our hypothesis. We also try to run customer interviews as much as we can to vet concepts and ideas with real customers.
Mechanical Turk (for everything from qualitative surveys and interviews to sending traffic to tests)
And of course, if you really want to get into it: custom data warehouses using Redshift with Tableau/Looker. Use Optimizely's SDKs to get the experiment/variation data, export data from other sources (like Mixpanel), and import it all into the data warehouse.
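The glue step in that pipeline is a join between experiment assignments and behavioral events, keyed on the user. Here's a minimal sketch of that join, with made-up sample records standing in for an Optimizely SDK export and a Mixpanel event export (all names and fields are illustrative assumptions, not either product's actual schema):

```python
# user_id -> (experiment, variation), as an SDK-side export might record it.
experiment_assignments = {
    "u1": ("checkout_cta", "variation_a"),
    "u2": ("checkout_cta", "variation_b"),
}

# Events exported from another source (e.g. a Mixpanel-style event log).
mixpanel_events = [
    {"user_id": "u1", "event": "purchase", "revenue": 40},
    {"user_id": "u2", "event": "purchase", "revenue": 25},
    {"user_id": "u3", "event": "purchase", "revenue": 10},  # never bucketed
]

# Join the two sources into flat rows, ready to load into the warehouse.
warehouse_rows = []
for ev in mixpanel_events:
    assignment = experiment_assignments.get(ev["user_id"])
    if assignment:  # keep only users who actually saw an experiment
        experiment, variation = assignment
        warehouse_rows.append({
            "user_id": ev["user_id"],
            "experiment": experiment,
            "variation": variation,
            "event": ev["event"],
            "revenue": ev["revenue"],
        })

print(warehouse_rows)
```

Once the rows are flattened like this, Tableau or Looker can slice any business metric by experiment and variation, including metrics that never pass through the testing tool itself.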