Missing Clicks/Conversions vs PPC Reports

URLDigital 04-13-18

I've run 4 separate A/B tests during the last several days. The pages being tested are landing pages that receive PPC traffic only. I'm seeing significant gaps between the volume of clicks and conversions that Optimizely reports and what AdWords and Bing report:

 

                          Landing Page A  Landing Page B  Landing Page C  Landing Page D
Total PPC Clicks                     383             504            1301            2141
Optimizely Clicks                    276             299             908            1407
Missing Clicks                       107             205             393             734
% of Clicks Missing                27.9%           40.7%           30.2%           34.3%

Total PPC Conversions                 27              25             101             121
Optimizely Conversions                23              21              88              93
Missing Conversions                    4               4              13              28
% of Conversions Missing           14.8%           16.0%           12.9%           23.1%
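(For reference, the percentages are simply missing clicks divided by total PPC clicks; a quick TypeScript sketch with the figures copied straight from the table:)

```ts
// The gap math from the table above, with the figures copied verbatim.
const pages = [
  { name: "Landing Page A", ppcClicks: 383, optimizelyClicks: 276 },
  { name: "Landing Page B", ppcClicks: 504, optimizelyClicks: 299 },
  { name: "Landing Page C", ppcClicks: 1301, optimizelyClicks: 908 },
  { name: "Landing Page D", ppcClicks: 2141, optimizelyClicks: 1407 },
];

for (const p of pages) {
  const missing = p.ppcClicks - p.optimizelyClicks;
  const pct = ((missing / p.ppcClicks) * 100).toFixed(1);
  console.log(`${p.name}: ${missing} missing clicks (${pct}%)`);
}
// Landing Page A: 107 missing clicks (27.9%)
// Landing Page B: 205 missing clicks (40.7%)
// Landing Page C: 393 missing clicks (30.2%)
// Landing Page D: 734 missing clicks (34.3%)
```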

 

Our developers have confirmed that Optimizely is configured correctly on the landing pages and on the "thank you" page, and our PPC team has confirmed that the PPC traffic above went to those same landing pages during the experiments' time period.

 

I understand Optimizely de-duplicates users, but our internal data doesn't reflect 30%+ of our PPC traffic coming from repeat visitors.

 

Can you help me understand possible explanations for why I'm seeing a roughly 30% gap in clicks and a roughly 20% gap in conversions between PPC reporting and Optimizely?

JasonDahlin 04-13-18
 

Re: Missing Clicks/Conversions vs PPC Reports


One source of the discrepancy is that the two numbers are measured in different ways.
"Clicks" are usually recorded when a user passes through the advertising partner's tracking URL.
"Page views" require that the user actually see your page.

For example, if your site is blocked by my firewall, Ghostery, or a similar tool, my click would still be recorded but your page would never load.

Also, a user could click and then abort before seeing your page (perhaps the passthrough on the advertiser's network takes too long and they abandon the click-through, or the passthrough actually fails; this happens to me *a lot* on slow connections).

Also, the user could have Optimizely itself blocked (Ghostery, etc.), which prevents the visit from being recorded at all.
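If you want a rough estimate of how common that is on your pages, one approach is to check from your own page whether the snippet's global ever appeared and report the result to a first-party endpoint. `window.optimizely` is the global the web snippet normally defines; the `/beacon` collector below is again hypothetical:

```ts
// Rough estimate of how many visitors block Optimizely: after load, check
// whether the snippet's global ("window.optimizely") was ever defined and
// report the result to a first-party collector. "/beacon" is hypothetical.
window.addEventListener("load", () => {
  const blocked = typeof (window as any).optimizely === "undefined";
  navigator.sendBeacon("/beacon", JSON.stringify({ optimizelyBlocked: blocked }));
});
```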

Also, does your site do any blocking of "bad actors" (bad robots, etc.)? For example, Akamai can block or tarpit requests based on user agent or behavioral indicators; your Site Ops team would know if anything like this is in place. It's unlikely that such requests would click through advertisements, but if they are scraping a site you advertise on, they might.

Also, could the advertising platform be including robots in its reporting? Robots tend to repeat actions, so they could show up as a single user in Optimizely but as many separate click-throughs in the advertising platform.
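If you have raw click logs, a quick way to see both effects (de-duplication and bots) is to compare raw click counts, de-duplicated visitor counts, and a crude user-agent bot filter. The log shape and the `isLikelyBot()` heuristic below are illustrative assumptions, not anything Optimizely or the ad platforms actually do:

```ts
// Sketch of the robot effect: repeated hits from one client inflate raw
// click counts but collapse to one visitor after de-duplication. The log
// shape and the isLikelyBot() heuristic are illustrative assumptions.
type Click = { visitorId: string; userAgent: string };

const BOT_UA = /bot|crawl|spider|scrape/i;
const isLikelyBot = (ua: string): boolean => BOT_UA.test(ua);

function summarize(log: Click[]) {
  const raw = log.length;                                   // ad-platform-style raw clicks
  const unique = new Set(log.map((c) => c.visitorId)).size; // Optimizely-style de-duplicated visitors
  const nonBot = log.filter((c) => !isLikelyBot(c.userAgent)).length;
  return { raw, unique, nonBot };
}

// Example: three clicks, two visitors, one obvious bot.
const demo: Click[] = [
  { visitorId: "v1", userAgent: "Mozilla/5.0" },
  { visitorId: "v1", userAgent: "Mozilla/5.0" }, // repeat click from same visitor
  { visitorId: "v2", userAgent: "ScrapeBot/1.0" },
];
console.log(summarize(demo)); // { raw: 3, unique: 2, nonBot: 2 }
```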

I don't know whether any or all of these would add up to 30% for your advertisements or your site, but it would not be unreasonable. On the main site I work on, "robots" account for about two-thirds of our page hits (they are automatically filtered out of our reporting tool). Quite literally, we record about 4 million "valid" page hits every day, our reporting tool filters out about 6 million more per day, and Akamai blocks about 10 million "bad actor" requests on top of that.

--Jason Dahlin
Analytics and Testing Guru :-)
Experimentation Hero