How are your teams set up?
I've come across examples of teams set up in a huge variety of ways, so I'd like to hear how everyone else is set up and what you're finding works or doesn't.
Has your setup changed much, and how do you integrate with other teams around the business?
In our case we are responsible for merchandising the site and trading online, but we also own the tool and run testing with it. We link into specific trading teams, UX, design and analytics, and try to take data from all of these. We're currently exploring agile teams, as our current setup isn't optimal and there's some overlap: for example, analytics, UX and development will roll out tests on our actual platform, which also has A/B testing functionality.
What is everyone else doing?
@leepearso - Thanks for posting. I am sure the Optiverse will have some incredible insight here. I also wanted to point out this amazing resource @ShanaR wrote: https://blog.optimizely.com/2015/06/15/optimization-benchmark-report/
It's a benchmark report that highlights the top benefits and key metrics around team size, pacing, and prioritization. Check it out and let us know what you find most helpful.
We recently changed the way our testing team is structured:
- We had one team dedicated to A/B testing. The team had 1 product manager + dev + designers + QA + analytics. It was great for implementing the testing framework, creating a testing culture within the company, having strong optimization advocates and experts in the organization, building good testing velocity, and defining prioritization and best practices.
It was entirely product driven (with inputs from everyone in the company).
- Each product team is now responsible for A/B testing in its own product area. Say we have a Car team: that team is responsible for the entire product development, including A/B testing.
It is still very much product driven and has the support of a full team including dev and designers.
This new structure will help develop a culture of test & learn/optimization among all team members and give clear ownership to each team from ideation phase to deployment.
Hope it helps!
If needed, they can also be supported by the development team to build tests, but for now that isn't needed very often.
We sit in ecommerce and are a centralized team that supports the ecommerce product teams. We have 2 front-end developers and 1 analyst on our A/B team. Each product team has a product owner who defines the feature roadmap and discovery efforts, plus UX, a designer, a dedicated analyst and a dev team.
Our setup hasn't changed yet. We are starting to rely more on the product teams for analytics support and test roadmaps, as our goal as a team is to help the product owners validate the priority of their new ideas and the best UX for roadmap items.
We handle all the strategy around test ideas and hypothesis generation, plus complete delivery, including analysis.
Hope this helps!
Amanda, I did have a read through the blog; again, lots of insight. Most interesting is that while testing tends to sit with marketing, it also sits in a lot of other places within various businesses.
How do you all handle personalisation and optimisation? Do the same teams look after both, or do you handle them in different ways?
Our testing team is made up of three primary members and we support A/B testing, CRO and landing page creation for the company. Two of our team members have design and coding skills which allows us to create most of our tests ourselves, and two members have extensive experience with testing strategy and idea generation.
It's interesting to hear that many teams don't seem to have a dedicated CRO role. I agree that dev, design and analytics support is critical, but having someone with a history of optimization experience is also beneficial.
As for development and design, we have other internal teams that we coordinate with when necessary. For most design needs, the brand teams coordinate that creative before reaching out to us with their test. I usually work directly with development if needed. So far we haven't really needed them, although some more complicated tests (like pop-ups) will require their help.
We are also structured very much like Pauline outlined.
We have product teams that are responsible for their areas of the site. As an example, I lead the Ecommerce team and everything related to testing falls to us. We have both designers and developers who work on our team that assist with the implementation.
All the teams then meet biweekly to discuss results and determine how tests run on one area of the site can be translated to other areas. We also meet to discuss site-wide testing ideas. This meeting also includes our Analytics team, which helps drive ideas from what they are seeing.
In our case we are not a team, it is just me!
I am the product owner and the conversion optimizer!
It is my responsibility to draw up the product strategy, research, plan tests, build tests, report on tests and push through any dev work required post testing!
Obviously this is not a very efficient system, as the product owner side of my role now takes up a lot of my time, so it may be time to build out a team structure!
David, do you class (or would you class) your conversion optimizer role as covering anything that aids conversion? Do you cover onsite personalisation, or is it more geared toward site improvement, UX and design, where your product owner position plays more of a role?
Also if you were building that team out, how would your product owner link with the CRO person?
I would class the conversion optimizer role as anything that aids conversion across the board!
In building out a team the product owner and the CRO role would be tightly linked.
I am currently in the process of implementing a data-driven decision making process for running our product development lifecycle.
The product owner will own the business strategy and the product growth strategy, taking direction from all areas of the business to decide which changes should be considered for our product (website/app). All of these will then be tested to make them quantifiable, and we decide what gets built into the product, prioritized based on this data.
It will make our product development much more efficient and ensure that everything added to the product has a quantifiable benefit and won't degrade the product over time.
Over the years we've really changed the way we approach (and value) A/B testing. Traditionally we used simple in-house A/B testing tools, and each Product Manager was responsible for A/B testing their own products whenever big changes were made. Now we tend to test ideas much faster than our release cycles would allow, and we're more inclined to test even small changes before building them out on our site.

Because testing has become so much more involved, it hasn't been feasible for every product manager to specialize in conversion testing. To help the agile delivery teams get the most out of their products, we've formed a CRO department/team that supports all the product teams with their conversion goals. Product Managers meet with the CRO team to describe their conversion goals, explain why they're important, offer experiment ideas, and agree upon important KPIs. From there the CRO team prioritizes the requests based on level of effort and potential impact for the business overall.

This has been an ad hoc process for a while now, but we're trying to make it more structured going forward. Ideally the CRO team will support the Product Managers much like the UX and Design teams have traditionally done.
Hope this is helpful.
CRO Specialist at HomeAdvisor
We have a smaller team that takes care of all the A/B testing.
Personally, I'm officially the point person for our A/B testing program. I take care of all the setup, coding, timing, reporting, etc. for all the tests. Our Design Lead and I work together on getting the look the way she wants it and on whether I'm able to code it that way.
Our supervisor helps guide us on the analytics side of things. She helps us focus on what we should test and how to test it.
The three of us meet weekly to go over any new mockups our designer has completed, review tests that are ready to go but haven't started yet, go over results of current tests, and brainstorm ideas for upcoming ones.
As for QA, I typically don't do any myself, since I'm the one doing the coding; that way I don't get blinders on and miss anything. I usually hand it back to our designer, but on larger tests we include our supervisor as well.
Testing this way seems to work quite well. Each of us has our own role, but we're all there to help each other out.