
Ask Us Anything -- Khattaab Khan, Optimizely's Strategy Consultant, and Raman Bhatia, Director of Mobile at Fareportal, join us to answer your mobile questions

Amanda 08-10-15


Hi Optiverse! We're here to help you with your mobile testing questions. Ask us anything. 

 

I'm Khattaab Khan (@Khattaab), a Strategy Consultant and mobile lead at Optimizely. I work closely with our enterprise customers to answer the key questions that arise when cultivating a scalable testing program. These considerations include who will be involved (organizational structure and reporting), why we are testing (defining objectives and success metrics), how the iterative testing process will be managed, and what specifically we should test to maximize business value. My customers have gained valuable insights through mobile app testing; examples include supporting Yellow Pages in revising their search results view, Fareportal with their booking funnel, and Scripps Networks with video engagement.

 

 

 

I'm Raman Bhatia (@raman), the Director of Mobile at Fareportal, an online travel agency (OTA) with state-of-the-art e-commerce technology and a strong focus on data analytics and optimization. I am responsible for the delivery and performance of the CheapOAir and OneTravel native apps and lead a global team of 30 app developers, interacting closely with product and design. I have managed teams while remaining hands-on and close to the code, developing apps at Citi, MXM, and SapientNitro. I have a degree in Mechanical Engineering from IIT Kanpur as well as an M.S. in Computer and Systems Engineering from RPI.

 

We've learned a lot about mobile trends and the importance of testing along the way, so please, ask us anything! We’ll be answering your questions in Optiverse all week.

 

----

 

You can ask questions until EOD August 14, at which point this session will be closed.  

 

If you are interested in being featured as an Expert for a specific topic, please email optiverse@optimizely.com


JohnH 08-10-15
 


Thanks @raman and @Khattaab - We have very different product teams for mobile and web. Do you have a unique testing idea funnel and unique testing goals for both web/mobile individually? Or do you bring everyone together in an effort to have unified testing? Thanks.

raman 08-10-15
 


Hi @JohnH, we have different product and development teams for mobile apps and web, and each channel designs and runs tests independently. Many of our app features are developed along the lines of what we have on the web, so we do take cues from them (plus the web team has been testing a lot longer and has more testing expertise than the app team).

 

At the same time, apps obviously offer a distinct experience and many mobile-only features, so apps have their own test idea funnel. App product and design, with input from the app developers, come up with the tests. The app dev team then creates the code blocks, live variables, etc. The app design and product teams then set up the tests and, together with the app tech team, try them out in preview mode; finally, when the app launches with the new tests baked in, the tests are started.

 

Even though there are separate teams and separate execution for web and mobile apps, the teams do get together weekly to discuss test results, share experiences, and advise each other on further testing.

 

There are also some areas where the app and web teams run tests in sync, for example app downloads that originate from ad placements on web/mobile web, and the downloads and conversions that result from them.


whoismattclark


When setting objectives, what are your thoughts on attention often being a zero-sum game? I.e., by increasing interaction with one feature you will reduce interaction with another.

Taking YP for example, if your objective was to increase reviews & ratings of businesses, what other metrics should you watch to make sure you are not stealing from another key metric?

raman 08-11-15
 


 

Hi @whoismattclark, regarding the first question, i.e. features competing for attention: that is one of the central problems to be solved. At Fareportal, we optimize interactions by making sure features are presented contextually. For example, we present ancillary products after the main one is purchased, because presenting everything together would be counterproductive. That, of course, is the simple case in an e-commerce app.

 

For a more complex scenario, such as several equally important features vying for attention, increasing interaction with one could decrease interaction with another, much like a machine with connected parts. So in an e-commerce situation, say, you may have products on a screen competing for attention, and your strategy would be to drive the user to follow a certain order based on that user's context (e.g., the user has purchased an air ticket, so now highlight the hotel cross-sell).

 

Context decreases the cost of attention, i.e. if you design the interface with the right amount of context, the user's attention is 'directed' towards the next target, requiring less attention and thinking on the part of the user. Basically, you have been 'gifted' only X amount of attention by this user, so make sure you use it economically before you run out. Use that attention to drive behaviour with clever design, tweaking, measuring, and optimizing incrementally to get it right.

 

Adding personalization (locale, location, demographics, preferences, etc.) and history (both aggregate and individual) to context helps you squeeze more out of the same attention in terms of getting user actions targeted towards your goals. You also need to be cognizant of the 'quality of attention' factor, so sequence the attention consumption in each session, because the quality of attention progressively wanes over the course of that session.

 

In other words, I agree with your notion of attention being a 'zero-sum game', but that is not necessarily a deal-breaker; it just means you need to find ways to use it more smartly and economically. That's what we constantly attempt to do at Fareportal.

 

For your second question, the one on increasing reviews and ratings, could you explain the context a bit more? I'd be able to answer it better with more detail: what is your business, what does your app do, what are the main business objectives being driven, and what are the key metrics?

 

 

Khattaab 08-11-15
 


@whoismattclark, thanks for your question. Click/tap cannibalization is a common concern, but as Raman noted, moving clicks around is not a bad practice. The first level of analysis is to determine how many competing calls to action and subsequent user paths you have on the page to be tested. Next, evaluate your analytics to determine the most valuable user path; the value of distinct paths should be measured by one of your KPIs such as session time, conversion on a specific action, average order value, etc.

 

It is fine to take clicks away from a CTA that guides users down a less qualified/valuable path; prioritize your most valuable user path, even at the expense of clicks to a less important one. In the YP example you offered, only logged-in registered users can write reviews, so consider what other actions that registered user can take on a business listing page. Additional conversion events you could monitor through Optimizely goals include edits to business details, photo adds, social shares, adds to favorites, clicks on related businesses, or new search initiations.

Khattaab Khan
Director, Experience Optimization | BVAccel
raman 08-13-15
 


While we wait to get the next question, here is an interesting point to ponder and discuss:

 

An interesting area to explore is: how can I run an A/B test for a flow that starts in a native app and has the conversion happening within a web view inside that app?

 

For this we will need to connect an Optimizely mobile app test to a separate Optimizely test for the web application running in the web view, and find a way to match the A and B groups set up in the app with the A and B groups in the web view.

 

For example, consider an e-commerce app where the products are selected in the app, and the checkout and transaction processing is done within the web view. My funnel events are "search", "select product", "proceed-to-checkout", and "complete-transaction". The first three events are in the app and the last event, "complete-transaction", is in the web view.

 

In this case the challenge is getting the app-side funnel to match the web-side funnel, so that we get a consistent view of the conversion rate of users who started on the app and finished on the web.

 

There are two ways I can think of to accomplish this. One is to pass an argument to the web view so that the web application can generate and track the Optimizely event "shopping-cart-from-MyApp", followed by the purchase event. This would connect the app funnel to the web funnel.

 

The second approach would be for the app to set a cookie in the web view, which the web app could look for and use to make the connection in the same way as in the first option.
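To make the first option concrete, here is a minimal sketch of the web-view side of that handoff. It assumes the app appends a couple of hypothetical query parameters (src and optly_variation) to the checkout URL, and that the page runs the classic Optimizely web snippet whose window.optimizely command queue accepts a ["trackEvent", ...] push; treat it as an illustration of the idea rather than a drop-in implementation.

```typescript
// Web-view side of the app-to-web handoff (sketch only; parameter names are hypothetical).
// The app would load something like:
//   https://m.example.com/checkout?src=MyApp&optly_variation=B
const params = new URLSearchParams(window.location.search);

if (params.get("src") === "MyApp") {
  // Record that this checkout session started in the native app, so the
  // web-side purchase funnel can be tied back to the app-side funnel.
  const optly = ((window as any).optimizely = (window as any).optimizely || []);
  optly.push(["trackEvent", "shopping-cart-from-MyApp"]);

  // Optionally remember which app-side variation this visitor saw, so later
  // web events (e.g. the purchase event) can be segmented by it.
  const variation = params.get("optly_variation");
  if (variation) {
    localStorage.setItem("app_variation", variation); // hypothetical storage key
  }
}
```

The cookie approach in the second option would look much the same on the web side, except the page would read document.cookie instead of the query string.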

JohnH 08-13-15
 


One more question for you both. Do you have any thoughts on what to prioritize first to drive engagement? Or do you have suggestions on what data to look at to find the high-impact opportunities? On web, I know how to do this with analytics, bounce rates, etc. I'd be curious to know how you approach this.

MartijnSch 08-14-15
 


For mobile testing, how do you deal with a dozen device sizes?

avaughn 08-14-15
 


@raman and @Khattaab, can you speak a little about a testing strategy for native apps? We often spend a great deal of energy building a testing pipeline for a single release, coming up with iterative tests ahead of the results of the current test, or even ahead of launching a new feature, so that we can be sure to have a series of tests baked into a release and aren't waiting on app store approval, which would delay our testing plans.

raman 08-14-15
 


Whoops! Got behind on these.

 

@JohnH, we usually evaluate each new feature based on KPIs. We rank-order them starting with the ones that could move the needle most on KPIs (conversion ratio, attachment rate, app downloads, etc.) and then look at the "available space" in the upcoming release (by "available space" I mean the number of experiments that can run without interfering with each other). We then schedule them accordingly, which gives us the entire roadmap of releases and Optimizely tests for the next three months or so.

 

Things get pushed out or pulled in as features get delayed, and a test that could not be done in one release has to be moved to the next. The moved test could upset the balance of tests if it interferes with others, in which case the "available space" has to be reallocated, which could cause some other test to be moved out to a later release, and so on.

raman 08-14-15
 


@MartijnSch there are a few ways of doing this.

 

For iOS it is easier as there is a smaller variety of devices, but EVERY TYPE of device MUST be tested, as there is a chance that the simulator may not be enough. We have had cases of crashes or significantly poorer performance on the device than on the simulator. Of course you can't buy every iOS device there is (especially since new ones keep coming out), so you have to test some, especially the older devices, only on the simulator. You will of course need to buy the latest physical devices (right now the iPhone 6 and 6 Plus) with the enhanced hardware: Touch ID, secure element, Apple Pay, Apple Watch connectivity, etc.

 

For Android the considerations are similar but a bit harder. You will need more physical devices and more versions of Android to test with (almost all iOS users upgrade to the latest iOS version right away; Android users are a LOT slower at doing so). So you need to be strategic and smart about the set of Android devices you buy for testing. Usually it would be a set of four or five (a Google Nexus and a Samsung Galaxy Note leading the pack, and then others).

 

I know that sounds like a Kafkaesque struggle with a lot of running and playing catch-up, especially if you have a global presence with multilingual, international apps deployed in several countries, and user bases with different device mixes and habits.

 

Some savvy folks out there have already smelled the opportunity and provide amazing cloud-based services for this (for example, DeviceAnywhere or Keynote), where you can pick any device at any location and deploy your .apk or .ipa to it for testing.

raman 08-14-15
 


@avaughn, we have a roadmap of features out into the future, along with our release schedule and team capacity/bandwidth, and we lay out the tests in an optimal way given the constraints of dev resources and the best use of "available space" in each upcoming release (by "available space" I mean the number of experiments that can run without interfering with each other, as mentioned in a previous answer).

 

So let's say the releases and Optimizely tests are planned out as (R1, O1), (R2, O2), (R3, O3), etc., where R1 is the first release and O1 is the set of Optimizely test features fitted into that first release, R2 is the second release, and so on.

 

We then package and run the O1 tests in R1, and as soon as a test is complete, if it is a winner we set the variation to 100%, and if it is a loser we set it to 0% (for mobile apps, be aware that doing this in Optimizely is a multi-step process). We keep it this way until R2 is deployed.

 

Then in R2 we package the O2 code blocks, as well as remove the code for any failed O1 tests. For the O1 tests that passed in R1, we move the code-blocks to the main code base for R2.

jfx1026 08-14-15
 


We're looking at a top-right-corner navigation scheme that is dynamic and intelligent (changing state based on user action). Does this sound like a good idea, or is it too forward-looking for a consumer products company?

http://johnfreeborn.com
Khattaab 08-14-15
 


@JohnH, re: your question on prioritizing the highest-impact opportunities, look no further than where your app exits most commonly happen. DAU is the ultimate metric for app optimization, so consider what you can test on the screen that experiences the most exits to promote discovery of at least one additional screen or feature. Explore ways to make the primary exit channel a place where the repeatable value of the app is reiterated through a completed task, an underutilized feature, or additional use cases that leverage the app's capabilities to drive return visits to the app.

Khattaab 08-15-15
 


@MartijnSch, the mobile QA process can certainly be lengthy because each device screen size should be QA'd separately. Utilize the screen widths that you have successfully QA'd as targeting conditions. Are there specific types of experiments that you have successfully targeted by screen size?
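For illustration, here is a minimal sketch of that idea: gate the experiment behind the screen widths your QA has already covered. It is not tied to any particular Optimizely targeting API, and the width values are examples only.

```typescript
// Only run an experiment on screen widths that have already passed QA.
// The list of widths is illustrative; use whatever your QA matrix covers.
const qaApprovedWidths = [320, 375, 414, 768]; // CSS-pixel widths

function deviceIsQaApproved(): boolean {
  return qaApprovedWidths.indexOf(window.screen.width) !== -1;
}

if (deviceIsQaApproved()) {
  // Safe to activate the experiment / show the variation on this device.
  console.log("Screen width covered by QA; experiment can run.");
} else {
  console.log("Untested screen width; fall back to the default experience.");
}
```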

Khattaab 08-15-15
 


@avaughn, planning a sequence of iterative tests going into a release is the best strategy to abide by. As Raman described, you want to be in a position to run 100% of traffic through a winning variation while you activate your next experiment. I encourage you to consider sequential tests on sequential screens, ensuring that as you optimize a specific experience at the beginning of a user's session, users are compelled to return to the app and are more qualified to explore additional app functionality on ensuing screens.

Khattaab 08-15-15
 


@jfx1026 by changing state of the navigation scheme, do you mean exposed vs. collapsed? I have seen customers successfully test badge notifications on menu links and toolbar icons to draw attention to these elements.

raman 08-15-15
 


@jfx1026, like Khattaab I'm trying to understand what you mean by "changing state of the navigation scheme." Do you mean the icon changes (hamburger animating to a back arrow, as in Android material design), or do you mean the menu options get updated as the user selects a product on the main screen? Sounds interesting, and I would love to hear more about this idea! In consumer-facing apps, novel ideas may fall flat or may create a sensation, and with A/B testing they can be explored safely.

jfx1026 08-17-15
 


@Khattaab @raman

 

The navigation has two elements. The topmost piece is a standard hamburger element. Beneath that is a more dynamic one that changes based on your activity or location within a page.

 

For example, if you are looking at a product page, that element might be a shopping cart. But if you are looking at something more location-based, that icon might change into a pin and take you to a store finder.

raman 08-17-15
 


 

Hi @jfx1026, thanks for the additional context! At the outset, there appear to be two questions here: (1) is this a design path you want to take? and (2) where does A/B testing fit in, i.e. what should you do to optimize or refine this concept?

 

 

(1) For starters, I'd say take a close look at the Apple Human Interface Guidelines and the Android material design guidelines and ensure that this fits in with the standard interaction patterns on each platform (i.e., is there already a standard way to do this? Think through the value-add of this approach vs. the standard one).

 

(2) If the idea still sounds good (which I'm sure it will, because you likely have already done (1) and are already excited about it ;-) ), then go with multivariate testing. Since this is an experiment plus an innovation (which may need several tweaks), there may be several ways for you to switch the interaction metaphor from one type to another (shopping cart to store finder, plus how to vary the color, animation, sequence of actions before the switch, etc.), so you may want to run an A|B|C|... test to find the variation that works best in terms of conversions. This will of course take longer, as you are splitting your traffic into smaller streams, so the tests will take longer to reach the significance threshold you need; see the rough numbers sketched below.
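To put a rough number on that last point, here is a back-of-the-envelope sketch with made-up traffic and sample-size figures; the point is simply that test duration grows roughly linearly with the number of variations when the eligible traffic is fixed.

```typescript
// Rough illustration (assumed numbers): how long a test runs as traffic is
// split across more variations, holding the per-variation sample size fixed.
const dailyVisitors = 10_000;        // hypothetical eligible visitors per day
const samplePerVariation = 20_000;   // hypothetical sample needed per variation

for (const variations of [2, 3, 4]) {
  const days = (samplePerVariation * variations) / dailyVisitors;
  console.log(`${variations} variations -> roughly ${days} days to reach significance`);
}
```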

 

jfx1026 08-17-15
 


Thanks for the thoughtful response. Nothing is decided just yet, but we are actually leaning against this non-traditional approach. I feel that our audience isn't ready for something like this; they aren't the most forward-thinking, tech-savvy group. It's fun to explore, though, and it helps us think through the process, limitations, and goals, so it's a worthwhile exercise.
Khattaab 08-17-15
 


@jfx1026 Raman had some great guidelines to consider as you iterate on how users interact with a dynamic navigation feature. When approaching navigation tests for a B2C business, it's critical to make the primary task users are supposed to accomplish very clear and accessible. Getting more users to complete a checkout funnel or locate a store by featuring a dynamic navigation link sounds like a valid test case to guide users to these important conversion events.

 

The navigation treatment you describe sounds appealing and should be tested, because it makes the next step required to reach a conversion event very clear, helping to manage expectations through a sequence of screens and thereby encouraging users to return to the app, since they can quickly identify what they can do where in the app.

 

Raman has a good point about using multivariate tests to evaluate how changes complement one another, but that is by no means a requirement to validate the effectiveness of the navigation link you described. For example, a valuable A/B test would be icon links vs. textual CTAs, to see how users respond to visual cues vs. text direction.

Minbie 08-18-15
 


Hi,

 

We are an Australian start-up selling state-of-the-art newborn feeding products.

 

We are looking for a specialist to help us optimise our shop and product pages, assisting us with testing strategy and implementing tools to help increase the conversion rate.

 

Any introductions to specialists in this area would be greatly appreciated.

 

info@minbie.com.au

 

Kind regards

raman 08-18-15
 


@Minbie that is more of a LinkedIn question :-)

 

We are here to answer interesting questions or to provide guidance on any specific mobile app optimization issues you may have. So, with that in mind, go ahead and ask us any specific design/info-architecture optimization issue you are struggling with on your app!

juliofarfan 08-18-15
 


We are currently testing on web with great results and are evaluating starting to test on apps; however, we are not clear on the possibilities of A/B testing on apps with Optimizely.

Are there any case studies in the travel industry for this tool, to help us understand which types of tests we could run to increase our app usage (e.g., check-ins) and revenue as well?

96Mayankjain

Hi Team,

I have a hybrid app based on Ionic/Angular. I need to integrate Optimizely for both the Android and iOS builds. Is there a PhoneGap plugin that covers all the Optimizely functionality that the native code does, and if not, what is the alternative way to implement your SDK in my hybrid app? Please reply ASAP.
Shanann 05-05-16
 


@96Mayankjain I moved this question to the Mobile Apps forum. You should get a reply today!

Shanann
Optimizely
tedroddy 05-06-16
 


@96Mayankjain,

 

PhoneGap is unfortunately something we don't support at the moment! Some of our users have tried using Optimizely and Adobe Test & Target in a PhoneGap app, and it didn't work out of the box for two main reasons:


PhoneGap doesn't support cookies natively, which is a huge issue for our bucketing logic. You'd need to use the localStorage API to persist data across app sessions and restarts (see the sketch below).

You'd need to account for offline states. Optimizely for Web doesn't do a great job handling offline web applications – essentially the experiment just won't run if the snippet can't be downloaded.
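As an illustration of the first point, here is a minimal sketch of persisting bucketing decisions with localStorage so they survive app sessions and restarts. The storage key and helper names are hypothetical and not part of any Optimizely SDK.

```typescript
// Persist experiment bucketing in localStorage inside a PhoneGap/Cordova web view,
// since cookies aren't reliably available there. Keys and shapes are hypothetical.
const BUCKET_KEY = "optly_buckets";

type BucketMap = Record<string, string>; // experimentId -> variationId

function loadBuckets(): BucketMap {
  try {
    return JSON.parse(localStorage.getItem(BUCKET_KEY) || "{}") as BucketMap;
  } catch (e) {
    return {}; // corrupted storage: start over
  }
}

function getOrAssignVariation(experimentId: string, variations: string[]): string {
  const buckets = loadBuckets();
  if (!buckets[experimentId]) {
    // Assign once, then persist, so the user keeps seeing the same variation
    // across app sessions and restarts.
    buckets[experimentId] = variations[Math.floor(Math.random() * variations.length)];
    localStorage.setItem(BUCKET_KEY, JSON.stringify(buckets));
  }
  return buckets[experimentId];
}

// Example: bucket (and remember) a user for a hypothetical checkout-button test.
const variation = getOrAssignVariation("checkout_button_test", ["control", "green_cta"]);
console.log("Showing variation:", variation);
```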

Best,
Ted
Optimizely

96Mayankjain

Hi Team,

Thank you for your quick reply.

 

Does that mean I cannot use Optimizely in my PhoneGap application? I have checked, and the following plugin is available: optimizely-cordova-plugin (https://github.com/optimizely/optimizely-cordova-plugin). Could you please let me know what features are missing from this plugin? Please help me out: can I or can I not integrate Optimizely into my PhoneGap application? I'm completely confused.