Five Dollar Failure – Testing a new idea for less than the price of a beer.

December 1, 2014

If there is one thing that I have learned from motivational posters, it’s that there appears to be money in sticking mildly inspirational quotes on largely unrelated, heavily filtered photography. If there are two things that I have learned (luckily for this blog post, there are), it’s that trying and failing is nothing to be ashamed of. Due to limited budgets, the organisation I work for understands the need to innovate in order to provide the best returns. Innovation will bring with it more failures than successes; the important things are to ensure that risk is limited and lessons are learned.

One recent experiment we tried was the idea of priming audiences before an upgrade call to see if we could either get them to give more or respond better. It was an unequivocal failure.  Although it did not work, we learned a great deal. It is also a strong example of how to experiment without betting the farm on a hunch.

Theory

The desired outcome of the project was to improve ROI on our phone upgrade campaign. Upgrades are a very important part of our regular giving strategy. The program we run with our phone agency works really well, and this was not an attempt to fix a problem. This was an attempt to improve the returns on an already successful program.

The concept was to prime the donors with information and actions on the same topic as their upgrade call. Theoretically, this would make them more knowledgeable and care more about the topic before they took the call. If they are more knowledgeable and more engaged, then, we assumed, they are more likely to say yes to an upgrade and/or more likely to upgrade by more.

We would prime the donors in two channels: 1) an email to donors about the topic that the upgrade call would be based on, and 2) Facebook adverts targeted at those donors. Both of these communication types would ask the donor to take an action (specifically, to sign a petition) related to the upgrade call topic.

Execution

It is very important to point out the assistance that our phone agency gave us putting this together and making this happen. (I’m not naming them here, but drop me a line and I’ll happily recommend them.) With their assistance, we pulled two groups of 1000 donors out of the data set for upgrade. These donors would be the group that we experimented on and the control. The groups were designed to be as close a match to each other as possible. All donors had to have email addresses (as this was essential to the communications) and there were an equal number from each of the phone agency’s internal segments.

The email was put together. Although it contained an ask to sign a petition, the email was structured in a way that didn’t put the call to action front and centre. The focus was instead on the problems and our potential to solve them.

For Facebook we uploaded the email addresses of the donors as a custom audience. The Facebook ads were more focussed on the call to action than the email. However, rather than bidding per click as we would normally do, we optimised payment for impressions. We also included adverts on the right hand side of the desktop screen, which literally no one clicks on. As we were more interested in eyeballs on the ad rather than clicks to the petition, these side bar ads were a great opportunity to get cheap exposure.

Managing the Risk

As we were working with a program that was already working well, we wanted to mitigate the risk of alterations making it work less well. At the same time, we needed enough people in the two groups to make the results significant. The 1000 donors (who would be the only ones exposed to the communications) came in at about 7% of the upgrade program, meaning that even if the experiment had a catastrophic effect on those donors’ upgrading, the program as a whole would not be significantly worse off. It’s also worth noting that, because of the nature of the communications stream, we deemed it unlikely to have a negative effect on donors.

Keeping the donors to a small test group also kept the resources and expenditure pretty low. The time taken to write and set up the email was the greatest time resource that came from our side. The data wrangling on both our and the agency’s sides wasn’t too intensive, and putting together a Facebook advert took no time at all. The total expenditure above existing charges was just under $5.

The results

As you already know, the experiment didn’t work. There was no statistically significant effect on the donors in terms of response rate or uplift rate. In fact, if anything, they were slightly less likely to upgrade if they’d been in the test group.

Although this was disappointing (I was hoping I could write an article like this on how to double your upgrade rates for less than $5), we were able to take away a number of important things.

What I learned

When there are multiple parts to a comms stream across multiple platforms, it’s really hard to get people to engage with all or even some of them. From the 1000 emails we had a 26% open rate (this isn’t great, but I didn’t have a big enough group to test email opens). This immediately means that the test group was 260 people rather than 1000, making it much harder to get a statistically significant result as to whether people who opened the email were more likely to upgrade or likely to upgrade by more. As you will have noticed above, we didn’t get this, but let’s continue going through the process as it’s useful.
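To make the sample-size problem concrete, here is a minimal sketch of the standard normal-approximation power calculation for comparing two proportions. The 10% baseline and 12% primed upgrade rates are illustrative assumptions, not figures from this experiment:

```python
from math import sqrt

def n_per_arm(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Donors needed per group to detect a change in upgrade rate
    from p1 to p2 (two-sided alpha = 0.05, power = 0.80)."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p1 - p2) ** 2

# Illustrative only: detecting a lift from a 10% to a 12% upgrade rate
print(round(n_per_arm(0.10, 0.12)))  # roughly 3,800 donors per group
```

Against a requirement of several thousand donors per arm, a test group that has effectively shrunk to 260 openers has very little chance of surfacing a modest real effect.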

In terms of Facebook, I could only find about 60% of the donors through their targeting system, and the advert only reached a total of 262 people. Facebook will not tell me which 600 people it could match, let alone which 262 people saw the ad, so it becomes much more difficult to draw out any findings as to the effects this had on the upgrade call. Again, even if I had been able to identify the people who had seen the ad, the reduced sample size would have made it much harder to draw a significant conclusion.

The next part of the equation that complicates and ultimately devalues the result is that we are only able to have a conversation with about 50% of people on the phone. So the group of 260 that opened the email became a group of around 140 who opened an email and who were asked to upgrade. The number who opened the email, saw the Facebook advert and were asked to upgrade on the phone is unknowable. If exposure to each channel was evenly distributed (unlikely but possible), then it could be that only 35 out of the original cohort of 1000 donors received all three communications.
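The attrition above can be sketched as simple arithmetic. Treating the three channels as independent is an assumption, and with a contact rate of exactly 50% the figures come out slightly lower than the ones quoted in this post:

```python
# Back-of-the-envelope funnel attrition, using the figures from the post.
cohort = 1000
email_open_rate = 0.26        # 26% of the 1000 emails were opened
fb_reach_rate = 262 / cohort  # the Facebook advert reached 262 donors
phone_contact_rate = 0.50     # "about 50%" reached on the phone

opened = cohort * email_open_rate                # 260 donors
opened_and_called = opened * phone_contact_rate  # 130 donors at exactly 50%

# Assuming exposure to each channel is independent and evenly spread
# (unlikely but possible), donors hit by all three communications:
all_three = cohort * email_open_rate * fb_reach_rate * phone_contact_rate

print(round(opened), round(opened_and_called), round(all_three))
```

Each channel multiplies the others down, so the group that actually experienced the full priming journey is a small fraction of the cohort you paid to target.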

I’m actually really happy that there wasn’t a statistically significant response, because the response that we did get was the opposite of my hypothesis. People in the test group were less likely to upgrade and, when they did, they upgraded for less…slightly. Knowing that the sample size was small and the difference in the results was not significant means that I can at least tell myself that my theory might still be correct. This means that my ego and I can sleep well at night.

However, doing this did teach me a valuable lesson. It’s really REALLY hard to move people along a multi-channel fundraising/sales journey by just pushing stuff out to them. Funnels and/or sieves have issues as metaphors because they cast the donor as passive. Successful sales funnels rely on the potential donor responding to an offer, thereby self-identifying as someone who is more interested in the next stage than the people who didn’t respond to that offer. Just because you gave them (or even just attempted to give them) the opportunity to “step up” does not mean they are further along the funnel. Just because you send them an email doesn’t mean they’ll open it, let alone read it, and reacting to it is a step further still. Even if they do all of that, it STILL doesn’t necessarily mean that it will influence their decision on a matter further down the line.

It remains possible that this sort of priming does work. I’ve heard anecdotal feedback from people who have been very supportive of the idea and have said that it’s working well for them.

Things we could do differently that might improve the result include:

  • A greater number of email communications before the call. This would almost certainly increase the number of people who had seen the communication before they received the call. I’m not convinced that this would revolutionise the results, as even if a completely different 30% open each email, we’ll still only be looking at a minority of the audience having read an email and had the call.
  • A more actionable communication. Something centred on getting the donor to do something other than give money. Then finding a way to prioritise calling these people.  I’m not sure we have the CRM technology to achieve this at scale at the moment, but we might be able to run a manual process.
  • Run a longer and hopefully deeper internet campaign, although given the results from this attempt, there is not a strong argument for spending substantially more.
  • Larger test and control groups to allow for the lower than expected penetration of the priming pieces.

So… it wasn’t a conventional success, but it was a successful exercise and only cost $5.


§ One Response to Five Dollar Failure – Testing a new idea for less than the price of a beer.

  • Tom says:

    Now for the customary “apologies for so little updating” comment: since I last updated I’ve bought a house, moved into said house, got a new job (not yet started) and much more. A special thank you to the people who introduced themselves after my last-minute BBCon presentation… it’s nice to know that there are people reading this who I’ve never met.



You are currently reading Five Dollar Failure – Testing a new idea for less than the price of a beer. at Keeps On Giving.
