Split testing content marketing: Slight changes in content yield impressive results

October 23, 2018

Iterative marketing means running several campaigns at once with slight modifications, each of which carries equal weight and provides valuable insight when assessing overall metrics.

But what happens when your testing timeline is suddenly shortened and you need to execute on new ideas?

The team at 83bar recently worked through a campaign where we needed to show variation and provide results – fast. We implemented a high-risk, high-reward model to run a transparent performance test.

  • To get started, we crafted four landing pages for testing across three different geographical markets.
  • Each market was served a set of four paid ads on Facebook.
  • Every ad was deployed in each of the markets to make the varying collateral as visible as possible.
  • Each ad was linked to a single URL, split across four variants using Instapage’s proprietary A/B testing functionality; a simplified sketch of this kind of variant split follows this list. (Disclaimer: We don’t advise running an A/B test this way because the data will be difficult to digest, potentially muddying the lessons you take away. But it provided reliable data when looking at cross-variant performance differentials over a short period of time.)
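
For readers who want a concrete picture of what a single-URL split looks like, here is a minimal sketch. It is illustrative only: Instapage’s split-testing mechanism is proprietary, and the variant names and assignment logic below are assumptions for demonstration, not its actual implementation.

```python
import random

# Hypothetical variant pool: four landing pages sitting behind one campaign URL.
VARIANTS = ["variant_a", "variant_b", "variant_c", "variant_d"]

def assign_variant(visitor_id: str, assignments: dict) -> str:
    """Give each visitor one variant and keep it sticky on repeat visits."""
    if visitor_id not in assignments:
        assignments[visitor_id] = random.choice(VARIANTS)  # even 25/25/25/25 split
    return assignments[visitor_id]

# Example: three visitors hitting the same URL each get routed to a variant.
assignments = {}
for visitor in ["v1", "v2", "v3"]:
    print(visitor, "->", assign_variant(visitor, assignments))
```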

Hypothesis:

A high-velocity test should allow us to quickly identify the highest-performing page variant. Establishing the consistently successful variant early on allows unsatisfactory variants to be eliminated as quickly as possible, while decisions remain driven by sound practice.

Applied Variants

Group 1 pages used content variations to push page visitors to self-educate using content on the page. This targeted method of producing product awareness can often succeed in building a healthy, long-term funnel. However, this approach depends on positive initial traction, because you need to judge the health of your campaign early and often to compete in a high-velocity environment.

Group 2 pages focused on a direct, action-focused call to action, designed to push leads directly into the funnel. This is typically a successful testing model, as it produces the most immediate response. However, the challenge here is that the long-term health of the funnel can suffer if iteration cycles are not maintained.

Results:

Length of test: 6 days
Total visitors: 1,583
Total Conversions: 62
Overall conversion rate: 3.9%

Highest performing: Variant D
Visitors: 392
Conversions: 17
Conversion rate: 4.3%

Lowest performing: Variant C
Visitors: 376
Conversions: 12
Conversion rate: 3.2%

The highest-performing variant converted 35.9% better than the lowest-performing landing page, and at least 20% better than the other contending variants, meeting our criteria of performing both extremely well and consistently.
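
The percentages above can be reproduced directly from the raw counts. Here is a quick sketch of the arithmetic, using the visitor and conversion totals reported above:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors

overall   = conversion_rate(62, 1583)  # ~3.9% across all variants
variant_d = conversion_rate(17, 392)   # ~4.3%, highest performer
variant_c = conversion_rate(12, 376)   # ~3.2%, lowest performer

# Relative lift of the best variant over the worst (~35.9%)
lift = variant_d / variant_c - 1

print(f"Overall {overall:.1%} | D {variant_d:.1%} | C {variant_c:.1%} | lift {lift:.1%}")
```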

Ad Group Performance
The primary variation between ads was their targeting criteria, so the analysis focuses on the ad groups rather than on individual creative.

Ad group 1 – Charlotte, NC specific targeting, using broad lookalike audience targeting

Impressions: 20,073
Reach: 12,992
Leads: 9
Cost per lead: $63.95
Conversion rate: 3.36%

Ad group 1 focused on small, single-city markets with a wide range of lookalike behaviors. Given that ad group 1 targeted a known audience with similar interests, the lead cost is high and the conversion rate too low to qualify as a successful response.

Ad group 2 – National targeting with medium intensity lookalike audience targeting

Impressions: 19,873
Reach: 16,193
Leads: 13
Cost per lead: $44.60
Conversion rate: 2.90%

Ad group 2 focused on a large market (the entire US) with a somewhat narrower audience than that of ad group 1, using fewer lookalike factors. The data shows that there isn’t much of a difference in impressions or reach compared to the single-city targeting of ad group 1. Though the conversion rate is lower, the people Facebook displayed this ad group to were in a higher interest category.

Ad group 3 – National reach with narrow lookalike targeting (top 1%)

Impressions: 16,145
Reach: 12,713
Leads: 31
Lead cost: $18.27
Conversion rate: 5.84%

Ad group 3 performed 73.8% better than ad group 1, and 101% better than ad group 2, on the basis of conversion rate.

On a cost-per-lead basis, ad group 3 saw reductions of 71.43% and 59% relative to ad groups 1 and 2, respectively.
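
The same arithmetic applies to the ad group comparison. A short check using the conversion rates and lead costs reported above:

```python
# Reported conversion rates and cost per lead for each ad group.
rates = {"ad_group_1": 0.0336, "ad_group_2": 0.0290, "ad_group_3": 0.0584}
costs = {"ad_group_1": 63.95, "ad_group_2": 44.60, "ad_group_3": 18.27}

for other in ("ad_group_1", "ad_group_2"):
    lift = rates["ad_group_3"] / rates[other] - 1      # conversion-rate advantage
    savings = 1 - costs["ad_group_3"] / costs[other]   # cost-per-lead reduction
    print(f"ad_group_3 vs {other}: +{lift:.1%} conversion rate, "
          f"{savings:.1%} lower cost per lead")
```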

Conclusion:

This performance is striking given that:

  1. All ad groups used the same ad creative
  2. All ads pointed to the same pool of randomized landing pages

Ad group 3 demonstrates the value in applying lookalike audiences to a larger pool of viewers. Given the potential for both cost reduction and increased lead generation, we will continue experimenting with the same audience selection, with the Variant D landing page as part of the cycle of iteration and improvement.