Disclaimer: As with ANY marketing case study, you should take this with a grain of salt because your marketing will be completely different – your budget, your offer, your funnel, your creatives, etc etc. What I say here applies only to me on this specific campaign.


I wanted to target dentists for a local marketing service I launched recently.

So here are the details:

1) Targeting – Very straightforward

  • Dentists
  • US/Canada
  • 30+ (I wanted to target dentists who were already practicing)

Very simple targeting. On Facebook, this yielded approximately 16k dentists, while there are approximately 300k dentists as reported by Factual.

2) Segmentation

Obviously, there isn’t much demographic segmenting I can do with such a small audience, so I got creative and segmented by channel instead:

  • Mobile feed
  • Desktop feed
  • Right side (on desktops)

3) Creative

This was a multivariate test, consisting of:

  • straight-up squeeze pages – 2 of them
  • pre-sell pages – 2 of them

So there were four possible conversion paths.
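If you're not familiar with how a rotation like this works mechanically, here is a minimal sketch of the idea – an even random split across the four paths, with made-up URLs (in my case CPVLab handled the rotation, so treat this purely as an illustration):

```python
import random

# Hypothetical landing pages: two squeeze pages and two pre-sell pages.
# CPVLab rotates links like these for you; this only illustrates the idea.
PATHS = [
    "https://example.com/squeeze-a",
    "https://example.com/squeeze-b",
    "https://example.com/presell-a",
    "https://example.com/presell-b",
]

def pick_landing_page() -> str:
    """Send each incoming click to one of the four paths (even random split)."""
    return random.choice(PATHS)

if __name__ == "__main__":
    for _ in range(8):  # simulate 8 incoming clicks
        print(pick_landing_page())
```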

4) Tools used

Normally, I would use Google Analytics with Content Experiments, but because of the complexity of the multivariate test, I used CPVLab: it has far better tracking capabilities, is more intuitive to use, and gives you some crazy dynamic link controls DURING the tests. Highly recommended if you’re doing any kind of testing.

Results

As reported by the Facebook ad platform:

[Screenshot: overall campaign results as reported by Facebook]

And if you break it down by channel, it basically translates to:

Channel        Cost     Clicks  CPC    Conversions  Conversion Rate  Cost per Lead
Right          $13.06   19      $0.69  6            31.58%           $2.18
Mobile Feed    $357.34  1164    $0.31  29           2.49%            $12.32
Desktop Feed   $151.03  204     $0.74  21           10.29%           $7.19

Results as reported by CPVLab

Channel        Cost     Clicks  CPC    Conversions  Conversion Rate  Cost per Lead
Right          $13.06   127     $0.10  6            4.72%            $2.18
Mobile Feed    $357.34  775     $0.46  29           3.74%            $12.32
Desktop Feed   $151.03  233     $0.65  21           9.01%            $7.19
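In case the columns aren't obvious: CPC = cost / clicks, conversion rate = conversions / clicks, and cost per lead = cost / conversions. That's also why the cost per lead is identical in both reports while the CPC and conversion rate swing around – the spend and the conversion count don't change, only the click count does. A quick sketch using the mobile feed numbers from the tables above:

```python
def metrics(cost: float, clicks: int, conversions: int) -> dict:
    """Derive CPC, conversion rate, and cost per lead from the raw numbers."""
    return {
        "cpc": cost / clicks,
        "conversion_rate": conversions / clicks,
        "cost_per_lead": cost / conversions,
    }

# Mobile feed: same spend and conversions, different click counts.
facebook_reported = metrics(cost=357.34, clicks=1164, conversions=29)
cpvlab_reported   = metrics(cost=357.34, clicks=775,  conversions=29)

print(facebook_reported)  # CPC ~$0.31, conversion rate ~2.49%, cost per lead ~$12.32
print(cpvlab_reported)    # CPC ~$0.46, conversion rate ~3.74%, cost per lead ~$12.32
```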

What the hell does this mean?

1. There is a huge discrepancy between what’s reported as clicks and the actual number of visitors.

This is because Facebook charges for EVERY click, including clicks to your Facebook page and clicks on the actual “Like” button.

In fact, this is a fairly well-known problem. Another marketer, Phil Anderson, ran into the same thing (src):

On face value, these are pretty good results. 50 cent clickthroughs are cheap for the personal finance space, and a click-through rate for an ad like this of .4% actually seems quite high. Note that Facebook considers any engagement a “Click”, so they say 92 clicks went to my website, while they are counting 123 “clicks”, which includes Likes, etc… It seems the price paid though is only for the website clicks, which is great, because people liking my ad doesn’t pay the rent.

I used Google Analytics campaign tagging to tag the link, so I was certain any traffic came from this campaign and only this campaign. When looking at this campaign in GA, I noticed quite a discrepancy:

Uh oh, only 61 total visits, even though Facebook claimed they sent me 92! That’s 30% less, is this the bot problem that Limited Run was referring to? Do women aged 22-40 disable JavaScript? Luckily, I don’t rely just on Google Analytics, and can turn to my trusty server logs, where data of who actually accessed the site is 100% correct.

80% of traffic sent by Facebook were using Android (excluding traffic from me, it’s an even higher number). Now, I’m not entirely surprised that mobile traffic is a large part of Facebook, although towards the end of last year, TechCrunch reported that only 48% of them were. So why am I receiving such a disproportionately high percentage of mobile? And why all Android clicks as opposed to iPhone?

My thought is this. These are almost all mis-clicks. In a completely unscientific survey of my friends phones, I noticed the Android version of Facebook seems to show more ads. I’m not sure why, and maybe that’s just a coincidence. Android users also repeatedly said things like “sorry, my phone is slow”, maybe everyone feels this way, but my thinking is people are accidentally clicking ads on Android, perhaps because there are more ads and their phones run slower.

In my case, after optimizing the ad, I’ve gotten the CTR upwards of 5% (which is quite high, but not as high as that time I got it up to 11%).

Of course, that doesn’t stop Facebook (a publicly traded company) from charging advertisers for useless clicks. For me, Facebook reported 1,387 clicks while my tracking reported 1,135. That’s off by nearly 20%!!!!!
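To put a number on it (clicks summed across the three channels in the tables above):

```python
# Total clicks across right side, mobile feed, and desktop feed.
facebook_clicks = 19 + 1164 + 204   # 1,387 as reported by Facebook
tracked_clicks  = 127 + 775 + 233   # 1,135 as counted by CPVLab

missing = facebook_clicks - tracked_clicks
print(missing)                             # 252 clicks with no visitor behind them
print(f"{missing / facebook_clicks:.1%}")  # ~18.2% of the clicks I was billed for
```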

(Ugh, I feel my blood pressure rising.)

Does this mean you should not advertise on Facebook?

No. Ultimately, what you care about is cost per lead & volume. Sure, Google clicks might be 99% accurate, but if your cost per lead there is a multiple of what it is on Facebook, you would think twice.

2. CTR & conversion rates vary by channel

Obviously.

The highest CTR I’ve gotten on the desktop feed was 3.8%, while the right side was a pitiful 0.08%.

But a high CTR doesn’t mean you get better results. Notice the cost per lead for the desktop feed & right side is a fraction of the mobile feed’s.

3. Every channel needs to be tested for its own conversion path

I made the mistake of setting the same path for all three channels.

I remember reading somewhere that the desktop feed is friendlier towards lead generation, whereas mobile traffic needs to be “pre-sold”. In other words, desktop traffic could’ve worked well without the pre-sell. But I wouldn’t know, because the data got “dirty” when I let everyone go through the same path.

To test that hypothesis, I should have separated the campaigns and let each channel run its own set of multivariate tests.
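The mechanics of that are simple enough. Here's a rough sketch of per-channel link tagging using standard Google Analytics UTM parameters and a made-up landing page URL (my real links ran through CPVLab, so this is just to show the structure – one campaign per channel, one content tag per page variant):

```python
from urllib.parse import urlencode

BASE_URL = "https://example.com/dentist-offer"  # hypothetical landing page

def tagged_url(channel: str, variant: str) -> str:
    """Build a separately tagged link for each channel x page-variant combination."""
    params = {
        "utm_source": "facebook",
        "utm_medium": "cpc",
        "utm_campaign": f"dentists-{channel}",  # one campaign per channel
        "utm_content": variant,                 # which page in that channel's test
    }
    return f"{BASE_URL}?{urlencode(params)}"

for channel in ["mobile-feed", "desktop-feed", "right-side"]:
    for variant in ["squeeze-a", "squeeze-b", "presell-a", "presell-b"]:
        print(tagged_url(channel, variant))
```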

*banging head against the wall*

4. If you’re doing lead nurturing via autoresponder, use single opt-in

Instead, I would offer the bait in the thank-you/confirmation email.

Even though I had 56 conversions, only 44 of them (about 79%) actually opted in, because it was a double opt-in.

*banging head against the wall harder*
