
5 Mistakes That Make Split Testing Bad for SEO

You may already know that search engine optimization (SEO) and split testing are two fundamental tools for any website. But did you know that the two affect each other? If your split tests are set up incorrectly, for example, your site's SEO can be damaged.


In this tutorial, we focus on this issue and reveal five split testing mistakes that can hurt your SEO efforts, along with advice on how to avoid each of them.

Mistake 1: Not Setting Canonical URLs for Duplicate Pages

Split testing works like this: first you build two versions of a page with slight differences, then you show the two versions to different visitors at random. Split testing software tracks metrics such as purchases, subscriptions, and click-throughs over a certain period, and these metrics tell you which version converts better.

However, this setup can confuse search engines: they may be unable to determine which version of the page should be indexed and displayed in the search results. If that happens to your pages, you are in trouble, and the work you put into the split test can go to waste because it harms your SEO.

Fortunately, this is easy to avoid: set a canonical URL on both versions of the page you are testing. The canonical URL tells search engines which version of the page is the definitive one.

Normally you will point the canonical URL at your control page while the split test is running. You may change it once you decide on the winning version, but setting it from the very beginning is still the best way to avoid confusing search engines.

You can set the canonical URL from within WordPress, but there is an even simpler way: define it directly in the <head> section of the page's HTML. Alternatively, if you are using Yoast SEO, you can set the page's canonical URL in the Advanced settings, as shown in the screenshot below:

[Screenshot: setting the canonical URL in Yoast SEO's Advanced settings]
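If you prefer to add the tag by hand, it is a single line inside the page's <head>. A minimal sketch, using a hypothetical https://example.com/landing-page/ as the control URL:

    <!-- Add this to the <head> of BOTH the control page and the test variation -->
    <link rel="canonical" href="https://example.com/landing-page/" />

Both versions then point search engines at the same definitive URL, so only that URL gets indexed.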

Setting a canonical URL takes only a moment, but it protects your pages from one of the most common split testing and SEO problems.

Mistake 2: Running a Test for Too Long

It is important not to run a split test for too long. If you do, search engines will see two very similar pages on your site for an extended period and may interpret the situation as an attempt to “game” your position in the SERPs.

Google has stated that such cases can be treated as webmasters duplicating page content in an attempt to manipulate search rankings.

Fortunately, this split testing and SEO problem is easy to avoid: limit the testing time. Run the test only until you get a statistically significant result, and then stop decisively.

You do not have to judge significance yourself, either; most quality split testing tools will tell you when a result is significant. For instance, Divi Leads displays the data like this while an A/B test is running:

[Screenshot: Divi Leads A/B test statistics graph]

Mistake 3: Manually Blocking Search Engine Crawlers from the Duplicate Content

One common way to try to keep search engine crawlers from indexing a duplicate page is to edit your site's robots.txt file, the file at the root of your domain that tells bots which URLs they may and may not crawl.
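Such a rule would look something like the following sketch, where /landing-page-variant-b/ is a hypothetical URL for the test variation:

    # robots.txt at the site root -- this is the approach to AVOID
    User-agent: *
    Disallow: /landing-page-variant-b/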

This does keep crawlers away, but it is a bad solution. First, it can be interpreted as showing different content to search engines than to users. More importantly, Google has explicitly advised against blocking its crawler in robots.txt: if it cannot crawl the pages containing the duplicate content, it cannot automatically detect that those URLs point to the same content, so it treats them as separate pages. Other search engines behave the same way.

Second, anyone can access and view your robots.txt file, including your competitors. By excluding those pages, you are effectively publishing a list of the pages you are testing and trying to optimize. If you want to stay ahead of the competition, that is not information you should give away.

Instead of editing your robots.txt file, Google recommends using 302 (temporary) redirects or canonical URLs, both of which tell crawlers how to handle the similar pages on your site.
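If your test sends visitors from the original URL to a variation, the temporary redirect can be declared in your web server configuration; most testing tools handle this for you. A minimal Apache .htaccess sketch, again using hypothetical /landing-page/ and /landing-page-variant-b/ URLs:

    # .htaccess -- temporarily send visitors to the variation while the test runs
    Redirect 302 /landing-page/ https://example.com/landing-page-variant-b/

Because the redirect is temporary, search engines keep the original URL in their index rather than the variation.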

Mistake 4: Deleting the ‘Losing’ Page After Finishing a Test

When a test finishes, there is a losing page and a winning page. You might think about deleting the losing one; after all, it does not convert as well, so why keep it around? But is deleting it really the better option?

There are two reasons to keep your losing pages, despite their lower conversion rate, and to treat them as a source of learning. First, deleting them erases valuable testing data. Losing pages have plenty to teach you about what does not work, and keeping them means you can refer back to them later.

Second, deleted losing pages can confuse search engines and users. If the test ran for any length of time, search engines have probably already indexed the page in question. Delete it without having it removed from the index and users may still find it in the search results, only to hit a 404 error instead of the page they expected. That certainly harms your SEO.

Of course, you still want visitors to land on the better page. Here are two ways to achieve that:

  • Use a premium split testing tool that can archive test pages, such as Visual Website Optimizer.
  • Use 301 redirects to point visitors from the losing page to the winning one (see the sketch below).
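A minimal sketch of that 301 redirect in an Apache .htaccess file, using hypothetical /losing-page/ and /winning-page/ URLs:

    # .htaccess -- permanently redirect the losing variation to the winner
    Redirect 301 /losing-page/ https://example.com/winning-page/

Because a 301 marks the move as permanent, search engines drop the losing URL from their results over time and pass its ranking signals to the winner, so visitors never see a 404.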

Mistake 5: Not Updating the Control Page

Suppose the new version of your test page performs better than the old one, but you never update your control page. That leads to an awkward situation known as keyword cannibalization, where two of your own pages compete for the same search terms. If that happens, much of the effort you put into improving your conversion rate is wasted.

Fortunately, keyword cannibalization is simple to avoid, and the fix is the same as for Mistake 4: redirect your control page to the split testing winner, as in the sketch above.

Conclusion

Used well, split testing and SEO together can quickly increase both your conversion rate and your traffic. But it is just as important to make sure you run your split tests in a way that does not disrupt your SEO efforts.

To avoid that conflict, set canonical URLs on your duplicate pages, run each test only until a significant result is reached, leave robots.txt alone, and make full use of your losing pages with redirects instead of deletions. We hope this post helps you solve related issues, and we would be glad to hear from readers who have other techniques for keeping split testing and SEO working together.
