How much will A/B testing cost?
Consider the costs associated with any tools or software you’ll need, as well as the time and resources required to plan and run the test.
Several cost factors determine the level of financial investment A/B testing requires. These range from the cost of the tool or software chosen and the scale of the testing from a content creation standpoint, to developer and designer time, testing duration and the subsequent data analysis.
A/B testing cost is also heavily influenced by the number of monthly visitors your website receives, with many tools offering a pay-as-you-go pricing model at a fixed price per number of unique monthly visitors.
Ultimately, the costs will largely reflect your business size and the extent of the testing done. A/B testing tooling on its own can start as low as $50–100 a month, though it can easily run into six figures annually for enterprise businesses.
Suppose you’re running an initial A/B test on one product page with tooling at $50 a month. Building out another variant of the product page for the test might take a day’s work for a developer and/or designer, plus a few hours’ work for a copywriter. Roll the test out for three months, with the relevant teams spending a similar amount of time assessing results and refining and iterating as the test continues, and you’ll have a rough estimate of the cost per page.
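As a rough worked example under those assumptions (every rate below is a hypothetical placeholder, not a quoted figure):

```ts
// Rough three-month cost estimate for one A/B-tested product page.
// All rates are hypothetical placeholders; substitute your own figures.
const toolingPerMonth = 50;    // entry-level A/B testing tool
const devDesignDayRate = 400;  // one day of developer/designer time
const copywriterHourRate = 50; // copywriter hourly rate
const months = 3;

const variantBuild = devDesignDayRate + copywriterHourRate * 3; // initial build: $550
const ongoingReview = variantBuild * months; // similar effort each month: $1,650
const tooling = toolingPerMonth * months;    // $150

console.log(`~$${variantBuild + ongoingReview + tooling} for the three-month test`); // ~$2350
```

Swap in your own rates and durations; the point the numbers illustrate is that team time, rather than tooling, tends to dominate the bill.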
What is the potential return on investment (ROI) for A/B testing?
Consider how much you stand to gain from optimising your SEO strategy and whether A/B testing can help you achieve those gains more efficiently.
There are many case studies where A/B testing has led to notable percentage increases in conversions on product pages, which in turn have seen a healthy increase in revenue.
For example, an A/B testing variant that includes additional rich media such as FAQs, videos or product testimonials may see a 10% increase in conversions over its alternate or original page, which can well justify the investment, particularly if there’s a fixed monthly spend on tooling and hours spent on the task across departments. Such results can readily be scaled up.
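To make that concrete, here is a back-of-envelope sketch of the arithmetic, using entirely hypothetical traffic, order value and cost figures:

```ts
// Back-of-envelope ROI for a 10% relative uplift in conversions.
// Every figure here is a hypothetical placeholder.
const monthlyVisitors = 10_000;
const baselineConversionRate = 0.02; // 2% of visitors convert
const averageOrderValue = 80;        // revenue per conversion
const uplift = 0.10;                 // 10% relative improvement from the winning variant
const monthlyTestCost = 800;         // tooling plus team time

const extraConversions = monthlyVisitors * baselineConversionRate * uplift; // 20/month
const extraRevenue = extraConversions * averageOrderValue;                  // $1,600/month
const roi = (extraRevenue - monthlyTestCost) / monthlyTestCost;

console.log(`Extra revenue: $${extraRevenue}/month, ROI: ${(roi * 100).toFixed(0)}%`);
// Extra revenue: $1600/month, ROI: 100%
```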
A/B testing can also improve user experience and conversion rate signals, and it can deliver significant resource savings if you’re able to pinpoint a sustainable optimisation formula that yields ongoing results. If clear patterns emerge in the data across a series of A/B tests that result in positive gains for SEO, this also reduces the risk of implementing website or content changes without fully knowing their impact.
The exact ROI of A/B testing will depend on your level of spend and the existing conversion metrics you’re looking to improve. The associated costs can add up, though done correctly and carefully, A/B testing can return significant ROI in terms of organic traffic and revenue. The important thing is to set clear goals, then refine and iterate on an ongoing basis based on the results you’re seeing.
What are the potential risks of A/B testing?
Consider the potential risks associated with making changes to your SEO strategy, such as the possibility of negatively impacting your search engine rankings or user experience.
Your SEO and wider marketing teams do need to take some care when conducting A/B tests to ensure there are no negative impacts on your SEO strategy.
With A/B testing, you are effectively deploying multiple versions of the same page at the same time. While different users may see different iterations, Google may be confused about which is the true version of the page. If the versions differ significantly in content and clear signals aren’t sent to Google (such as canonical URL mapping or 302 redirects implemented by your tech team), there is a risk of Google interpreting this as cloaking.
Cloaking is the practice of presenting different content to users than to search engines with a view to manipulating search rankings. It is a breach of Google’s spam policies and may damage your SEO strategy.
Any versions of a product page deployed on-site for A/B testing purposes need to be checked by your SEO team to ensure the right signals are sent to Google about which version is the “true” page you want indexed. Getting this right also protects any existing rankings and the organic revenue metrics tied to the page while the A/B test continues.
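As a minimal sketch of how those signals can be wired up, assuming a Node/Express stack (the routes, traffic split and URLs below are hypothetical):

```ts
import express from "express";

const app = express();
const VARIANT_SHARE = 0.5; // hypothetical: half of visitors see the variant

// Google's testing guidance favours 302 (temporary) redirects over 301s,
// so crawlers understand the original URL should stay indexed.
app.get("/product-page", (_req, res) => {
  if (Math.random() < VARIANT_SHARE) {
    // Real tooling would persist the bucket in a cookie so each
    // visitor sees a consistent version across visits.
    res.redirect(302, "/product-page-variant");
  } else {
    res.send(renderPage("Original content", "/product-page"));
  }
});

// The variant serves different content but declares the ORIGINAL URL as
// canonical, telling Google which version is the "true" page to index.
app.get("/product-page-variant", (_req, res) => {
  res.send(renderPage("Variant content under test", "/product-page"));
});

// Hypothetical renderer: the key detail is the rel="canonical" element.
function renderPage(body: string, canonicalPath: string): string {
  return `<!doctype html>
<html>
  <head>
    <link rel="canonical" href="https://example.com${canonicalPath}" />
    <title>Product page</title>
  </head>
  <body>${body}</body>
</html>`;
}

app.listen(3000);
```

The temporary redirect and the canonical pointing back at the original are what keep Google treating the test as a test, rather than as cloaking or a permanent content change.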
SEO A/B testing at the edge significantly reduces these risks by enabling real-time optimisations and experiments directly at the content delivery network (CDN) level. This allows webmasters to test various SEO strategies, such as meta tag refinements, structured data changes and header modifications, without altering the core website infrastructure. Conducting the tests at the edge minimises potential disruption to user experience and server performance, and ensures that only a subset of traffic is exposed to experimental changes.
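For illustration, here is a minimal sketch of such an edge experiment written as a Cloudflare Worker using its HTMLRewriter API (the 50/50 split, title and description are placeholders, not recommendations):

```ts
// Minimal Cloudflare Worker sketch of an edge SEO experiment.
export default {
  async fetch(request: Request): Promise<Response> {
    const origin = await fetch(request); // core site infrastructure untouched

    // Expose only a subset of traffic to the experimental changes;
    // production tooling would bucket deterministically via a cookie.
    if (Math.random() >= 0.5) return origin;

    // Rewrite SEO elements on the fly as the HTML streams through the CDN.
    return new HTMLRewriter()
      .on("title", {
        element(el) {
          el.setInnerContent("Experimental page title | Acme");
        },
      })
      .on('meta[name="description"]', {
        element(el) {
          el.setAttribute("content", "Experimental meta description under test.");
        },
      })
      .transform(origin);
  },
};
```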
How will A/B testing fit into your overall SEO strategy?
Consider how A/B testing will complement other aspects of your SEO strategy, such as keyword research, content creation, and link building.
There is arguably never a bad time to consider investing in A/B testing to enhance conversion rates alongside your existing SEO efforts.
However, a good time to look at it as a conversion rate optimisation outlet is once your site’s technical SEO foundations are solid, your team has undertaken and deployed a content optimisation strategy across key commercial pages, and traffic is clearly on the up.
With an increased number of new and existing organic users eyeing your product pages, deploying a series of A/B tests can be a savvy move to enhance conversions, especially if you’re not seeing the desired conversion rates alongside the uplift in traffic.
If you’re also undertaking activation work such as link building, then using your A/B-tested pages as the link destinations in your content placements can be a quick-fire way to drive users to them, assess the effectiveness of the tests and measure conversion rates.
Who will be responsible for planning and running the A/B test?
Consider who within your organisation will be responsible for managing the A/B test, as well as any outside experts or consultants you may need to engage.
A/B testing relies on external tooling and software to enable your team to deploy, iterate and measure for the duration of the test. Creating content for the page variants will require resources across development, design, content, UX and SEO. Since the considerations for A/B testing carry across all these disciplines, you may want to appoint a dedicated marketing or project manager to tie the elements together and drive the tests forward.
If resources across departments are stretched, or simply don’t exist yet, consider hiring an external agency or consultant to facilitate all of this. Many digital marketing agencies offer dedicated A/B testing services that take care of everything for you, enabling you and your team to focus on the final output.
SEO A/B Testing Case Study
Events business deploys an A/B-tested sign-up page to improve user conversions.
Problem: A business that generates its revenue from event sign-ups was struggling to drive registrations from a landing page for an upcoming conference.
Solution: The business experimented with an A/B variant, sending users via paid search, paid social and email marketing to an alternate landing page with streamlined content and a clearer call to action. The original landing page, which was struggling conversion-wise but ranking well in Google Search with the relevant event schema, was preserved by ensuring the canonical URL on the variant pointed back to it. As a result, cloaking risks were averted, existing SEO metrics were maintained, and conversions via the variant were optimised.