Boost Your Website Success with A/B Testing

Content Author: Hannah Schultes
CALS/LAS Web Team content editors can now use A/B testing to help optimize their websites and improve marketing results. 

What is A/B testing? 

A/B testing works by randomly dividing your website visitors into two groups. Group A visitors get version A of a page (the original), and group B sees version B (the variant). An analytics system tracks the actions of each visitor, and over time, you can see whether version A or version B is more effective. A/B testing is a data-driven approach that takes the guesswork out of webpage optimizations. 

What are conversions and why do they matter? 

Conversions are one way of measuring a website's effectiveness. A conversion occurs when a website visitor says "Yes!" and takes your desired action. 

Common website conversions include:

  • Filling out a contact form
  • Requesting a campus visit
  • Signing up for email updates
  • Requesting more information

What is conversion optimization?

If you are measuring conversions by the number of people who click a specific button on your web page, and if two visitors out of 100 click the button, this page has a conversion rate of 2%.
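That arithmetic is simple enough to sketch in a few lines of JavaScript (the function name here is ours, purely for illustration):

```javascript
// Conversion rate = (conversions / total visitors) * 100.
// Hypothetical helper, just to make the calculation concrete.
function conversionRate(conversions, visitors) {
  if (visitors === 0) return 0;
  return (conversions / visitors) * 100;
}

// Two conversions out of 100 visitors is a 2% conversion rate.
conversionRate(2, 100); // 2
```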

Conversion optimization is the process of making small changes to try to improve your results. It looks for ways to increase your conversion rate through strategic adjustments to your webpage.

Learn more about website conversions and optimization methods.

A/B testing is a method used to compare webpage variations to help determine which has a better chance of converting, or getting visitors to complete your goal.

What can you test?

You might compare two different images on a landing page to see which inspires more engagement from your visitors. Another great example would be testing two different variations of button text to determine which gets more clicks. Some other common tests to try are:

  • Different headlines - Test which message captures attention better
  • Button text variations - Test if one version gets more clicks than another
  • Less copy vs more copy - Is less more, or do the details do the convincing?
  • Different calls-to-action - Test if one block type performs better than another
  • Form variations - Fewer fields vs a more detailed form
  • Testimonial variations - Different quotes or student stories might resonate
  • Page layout variations - Does moving your call-to-action higher on the page make a difference?

How we tested our own website

Let's look at how we used A/B testing on our own site to show you exactly how it works.

A good test starts with a hypothesis.

Our Hypothesis: "If I change the text of the 'Request a website today' button to 'Start your website project,' more people will click, increasing traffic to our website request page."

We created a new A/B test in our Matomo analytics dashboard, setting parameters like what counts as a successful conversion, how much traffic should be allocated to this test, what should trigger the test, and when the test should end. In our test, "Request a website today" is our A test and "Start your website project" (called "Variation1" below) is our B test.

Then we added JavaScript to alter the button text when the test was triggered, so half our visitors saw the original button and half saw the new version.
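The snippet below is not our production code, just a simplified sketch of the idea: a random draw splits visitors roughly 50/50, and the B group sees the new button text. In a real test, the Matomo A/B Testing plugin handles the assignment and tracking for you; the CSS selector in the final comment is hypothetical.

```javascript
// Button text for each side of the test.
const ORIGINAL_TEXT = 'Request a website today';
const VARIATION_TEXT = 'Start your website project';

// Assign a visitor to a group from a uniform random draw in [0, 1):
// about half land in A (the original), half in B (the variation).
function assignGroup(randomDraw) {
  return randomDraw < 0.5 ? 'A' : 'B';
}

// Pick the button text for the assigned group.
function buttonTextFor(group) {
  return group === 'B' ? VARIATION_TEXT : ORIGINAL_TEXT;
}

// In the browser, the variation would be applied roughly like this
// (the '.request-button' selector is a made-up example):
// document.querySelector('.request-button').textContent =
//   buttonTextFor(assignGroup(Math.random()));
```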

Here's what that looks like on our live website:

A - original test screenshot
The version of the webpage that was shown to visitors for option A.

 

B - variation test screenshot
The version of the webpage that was shown to visitors for option B.

The Results: With variation 1 trending toward a 10% conversion rate and the original at only 3%, we had clear evidence that "Start your website project" resonated better with our audience.

Overview of results

Best practices for accurate results

Test one thing at a time. If you change both the headline and the button color simultaneously, you won't know which change actually improved your results.

Create a strong hypothesis. Use this format: "If I change [specific element] to [specific variation], then [specific metric] will improve because [your reasoning]."

Be patient with your data. Depending on how much traffic your site gets, each variation needs time to collect enough data to produce reliable results. Making decisions too early can lead to misleading conclusions.
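To see why patience matters, here is a rough sketch (not part of our workflow) of a standard two-proportion z-test. A |z| value above roughly 1.96 corresponds to about 95% confidence that the difference between two conversion rates is real; the visitor counts below are made-up numbers for illustration.

```javascript
// Two-proportion z-test: how many standard errors apart are the
// two observed conversion rates?
function twoProportionZ(convA, visitorsA, convB, visitorsB) {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  // Pooled rate, as if both groups shared one underlying rate.
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB)
  );
  return (pB - pA) / se;
}

// 3% vs 10% with only 30 visitors per group: z is about 1.0,
// below the ~1.96 bar, so the gap could still be chance.
twoProportionZ(1, 30, 3, 30);

// The same rates with 1,000 visitors per group: z is well above
// 1.96, so the difference is very unlikely to be noise.
twoProportionZ(30, 1000, 100, 1000);
```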

Ready to start testing?

A/B testing can help you get more value from your existing website traffic, resulting in more successful visits!

If you've got a landing page you want to improve and are interested in setting up an A/B test, contact the CALS/LAS Web Team to schedule a strategy meeting. We'll help you identify the best opportunities for testing and implementation.
