Optimizing Google Ads: Using A/B Testing to Achieve Better Results

You’ve just finished launching your first Search campaign. You researched your target audience, built a solid list of keywords, and came up with some great ad copy. After running the campaign for a month, you start seeing pretty good performance numbers. You show the results to your boss, who asks the classic question: “Can we get better results without spending more money?”


No need to worry, because there IS a way to get more from your Google Ads campaign besides investing more dollars!

The Pieces of the Puzzle

The Search campaign you just built has several building blocks that influence the performance numbers you see. These include:

  • Bid Strategy
  • Keywords
  • Ad Copy
  • Ad Assets
  • Landing Page

These different components are all dependent on one another and influence the overall performance of your campaign. Have a robust and vetted keyword list but poor ad copy? Your performance will go down. Have great ad copy but a poor Landing Page experience? Your performance will go down.

It’s when you find the optimal combination of these components that you’ll start seeing a steady increase in performance over time.

But how do you find out what a winning combination looks like to set up your campaign for success?

Enter A/B Testing!

The Formal Way vs. The Manual Way

If you need a refresher on the basics of A/B Testing, you can review our previous blog post. In a nutshell, A/B testing involves running two variations of something simultaneously to see which performs better.

There are two ways to conduct A/B testing within Google Ads: using the Experiments page built into the Google Ads platform, or manually creating and tracking your own tests. Each has its benefits and may be better suited depending on what you’re looking to test.

Since I’m a sucker for details and like to have more control over my experiments, I prefer using manual testing, and it’s what we’ll cover today.

Ad Experiments – The Old-Fashioned Way

Let’s say we wanted to conduct an A/B Test for our newly created Responsive Search Ad (RSA) in our Search Campaign. It has a decent Click-through Rate (CTR), but we want to see if changing the ad text makes a noticeable performance improvement. 


A quick refresher on RSAs: they are made up of up to 15 headlines and 4 descriptions that Google can mix and match, resulting in a served ad of 2-3 headlines and 1-2 descriptions.

[Image: Google search results for “Indianapolis hotels” showing a Responsive Search Ad]

It’s easy to think that an ad with so many headline and description choices would have plenty of assets to achieve the highest possible results. But despite the number of possible combinations, Google typically serves a handful of assets in most ad auctions while the rest are hardly used. That’s why it’s important to continuously test new ad copy variations to try to drive higher results.
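Just how many combinations are we talking about? Here’s a quick back-of-the-envelope count (my own illustration, ignoring pinned assets and Google’s own filtering) of the ordered layouts an RSA could theoretically serve:

```python
from math import perm

HEADLINES = 15    # max headlines you can supply
DESCRIPTIONS = 4  # max descriptions you can supply

# Google serves 2-3 headlines and 1-2 descriptions; position matters,
# so we count ordered arrangements (permutations).
headline_layouts = perm(HEADLINES, 2) + perm(HEADLINES, 3)            # 210 + 2730
description_layouts = perm(DESCRIPTIONS, 1) + perm(DESCRIPTIONS, 2)   # 4 + 12

total = headline_layouts * description_layouts
print(f"Possible ad layouts: {total:,}")  # 47,040
```

Even if pinning and policy checks trim that number down substantially, it’s far more layouts than Google will ever meaningfully rotate, which is why a few assets end up doing most of the work.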

Let’s look at this example of a Responsive Search Ad experiment that I conducted for a local whiteboard manufacturer. Looking to improve the Description text, I laid out the current ad text in Excel and created slightly different variations of the higher-performing Descriptions.

[Image: Ad experiment changes – Original Ad A vs. Testing Ad B]

From the ‘Original Ad – A’, the Description text “Build your own whiteboard…” was most popular in Position 1 and “Personalize your own…” was most popular in Position 2. So I created slightly different versions of those Descriptions in my ‘Testing Ad – B’.

Comparing the A & B variations, the tone of the ad is the same but the text is slightly different. I wanted to convey the same message in a more compelling way to the end user.

Keeping all other aspects of the ads the same, I ran them both side by side for a total of 3 weeks and compared the numbers.

The Results

[Image: Results of the ad experiment]

As you can see, the Testing Ad saw a noteworthy increase in CTR (+4.78%) and Conversion Rate (+2.8%). With this new ad variation, we drove more users to the website who were more likely to complete a valuable action.
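A quick note on reading results like these: if you want to sanity-check that a lift is more than random noise, a two-proportion z-test on CTR is one common approach. Below is a minimal sketch using only Python’s standard library; the click and impression counts are hypothetical placeholders, since the raw counts behind the percentages above aren’t shown.

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, imps_a, clicks_b, imps_b):
    """Two-sided z-test: is the CTR difference between A and B significant?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Hypothetical counts for illustration only -- not the campaign's real data.
z, p = two_proportion_ztest(clicks_a=120, imps_a=4000, clicks_b=165, imps_b=4100)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```

If the p-value stays high, the test is too close to call and is worth running longer, which ties into the tips below.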

And remember, I only changed the ad copy in this Search Campaign. There are many other campaign components that we can also experiment with to continuously refine our campaign, giving us more opportunities to make a bigger impact.

At JH, continuously reviewing and improving our SEM campaigns is a cornerstone of our marketing strategy. We have time set aside every month to evaluate our campaigns and make incremental improvements that help steer performance in a more positive direction.

Takeaways

I’ll leave you with a few tips for conducting your A/B experiments in Google Ads:

  • Don’t Expect Success Every Time: Don’t be discouraged if your first few tests don’t give you the results you were looking for. Try to look for insights in every test you run, successful or not, and use what you learn to make your next experiment even better.
  • Give it Time: The longer you let an experiment run, the more data you’ll have to gauge its performance. If an experiment is too close to call, let it run a bit longer.
  • Rotate Ads Indefinitely: To get a more even spread of impressions in an Ad Group with multiple ads running, turn on ‘Rotate ads indefinitely’ in the Campaign or Ad Group settings.
  • Track the Data: Keep a log of every experiment you’ve run along with its results (a minimal sketch follows this list). That way, you can keep track of what you’ve already tested and use that knowledge to build your next experiment.
  • One Thing at a Time: Don’t have multiple experiments running for a Campaign/Ad Group at once, or you’ll have a harder time figuring out which experiments were successful and which weren’t.
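For that experiment log, a spreadsheet works fine, but if you prefer code, here’s one possible approach: a small Python helper that appends each test to a CSV file. The field names are just my suggestion, not a standard.

```python
import csv
from pathlib import Path

LOG_FILE = Path("ab_test_log.csv")  # hypothetical log location
FIELDS = ["start_date", "end_date", "campaign", "change_tested",
          "ctr_a", "ctr_b", "conv_rate_a", "conv_rate_b", "winner", "notes"]

def log_experiment(row: dict) -> None:
    """Append one experiment record, writing the header on first use."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

# Example values only -- not real campaign data.
log_experiment({
    "start_date": "2023-05-01", "end_date": "2023-05-21",
    "campaign": "Whiteboards - Search", "change_tested": "Description copy",
    "ctr_a": "3.0%", "ctr_b": "4.0%", "conv_rate_a": "2.1%",
    "conv_rate_b": "2.4%", "winner": "B", "notes": "Reworded descriptions",
})
```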

Now get out there and start testing!