By Carrie-ann | Jan 20, 2015 | Do It Yourself, Internet Marketing, SEO, Web Design

How to Update Your Site without Damaging Your SEO

SEO is one of the biggest investments small companies make in their websites. But when the time comes to update the design, little thought is given to how the redesign can disrupt SEO. Depending on the process you use, you can seriously damage your traffic or enhance it.

Boost or Lose Traffic

Here’s where the problems start. Over time, most websites of any size accumulate SEO errors, including structural ones ranging from broken links to duplicate H1 tags. Chances are that by the time you get to a re-design, even if your site’s rankings and traffic are OK, the site is already running below par. Viewed this way, a re-design or update is an opportunity either to remove those errors or to carry them over into the new site.
The possibility of damaging SEO is not something most web designers are preoccupied with. A full audit of your site prior to launching the new one is something you have to do yourself, or employ a savvy SEO professional to do for you. Don’t worry: there are tools available to help, but there is some heavy lifting involved. What I would say is that time spent managing or doing your own audit is one of the best investments you can make in terms of time and effort.
First, let me explain what type of tool you need. The data you need to hand is your site’s structure and metadata. You can capture this for every page on your site using an SEO site crawler. A free crawler, and there are plenty about, should be part of your kit bag of SEO tools. For the audit itself, following data capture, I recommend doing it by hand, but if you must automate part of it, there are tools to help with that too.
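To illustrate, here is a minimal sketch in Python of the kind of data a crawler captures for each page, using the requests and BeautifulSoup libraries. The URL is a placeholder for your own site, and a real crawler would follow the links it finds rather than stop at one page.

```python
# A minimal capture of the SEO data a site crawler collects for one page.
# Assumes the requests and beautifulsoup4 packages are installed;
# https://www.example.com/ is a placeholder for your own site.
import requests
from bs4 import BeautifulSoup

def capture_page(url):
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    return {
        "url": url,
        "status": response.status_code,  # 4xx/5xx means a broken page
        "title": title,
        "meta_description": desc_tag.get("content", "") if desc_tag else "",
        "h1_tags": [h1.get_text(strip=True) for h1 in soup.find_all("h1")],
        "links": [a["href"] for a in soup.find_all("a", href=True)],
    }

print(capture_page("https://www.example.com/"))
```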

SEO Red Flags

If you understand the SEO elements involved in optimising your site, identifying problems in the output of the crawl will be straightforward. If you don’t have this knowledge to hand, I would suggest the following as a good start (a sketch after this list shows how some of these checks can be automated):
Link structure: check that navigation is sound
Meta descriptions that are too long or too short
Missing, duplicate, or multiple H1 tags on the same page
Broken internal and external links
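If you want to automate part of this checking, a rough sketch like the following could flag some of these conditions in the captured crawl data. It assumes pages shaped like the capture_page() output in the earlier sketch, and the 70/160-character description bounds are common rules of thumb, not fixed limits.

```python
# Flags the red-flag conditions listed above in captured crawl data.
# Each page is a dict shaped like capture_page() in the earlier sketch;
# the 70/160-character description bounds are rules of thumb, not limits.
def find_red_flags(pages):
    flags = []
    for page in pages:
        desc_len = len(page["meta_description"])
        if desc_len == 0:
            flags.append((page["url"], "missing meta description"))
        elif desc_len < 70:
            flags.append((page["url"], "meta description too short"))
        elif desc_len > 160:
            flags.append((page["url"], "meta description too long"))
        h1_count = len(page["h1_tags"])
        if h1_count == 0:
            flags.append((page["url"], "missing H1 tag"))
        elif h1_count > 1:
            flags.append((page["url"], "multiple H1 tags"))
        if page["status"] >= 400:
            flags.append((page["url"], "page returns HTTP %d" % page["status"]))
    return flags

# Example with a hypothetical page that has two H1s and no description.
sample = {"url": "https://www.example.com/", "status": 200, "title": "Example",
          "meta_description": "", "h1_tags": ["Welcome", "Welcome"], "links": []}
print(find_red_flags([sample]))
```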
But software can’t do everything. You should check your sitemap and robots.txt by hand, as well as server speed and loading time, site structure, and duplicate content. It’s also useful to know which of your pages Google has indexed.

Testing Environment

Your new site should be set up on a testing server and set to noindex, because you don’t want search engines to crawl it before launch. Then run the crawler over it in exactly the same way you crawled the existing site.
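Before crawling, it is worth verifying that the test server really is blocked from indexing. A quick hypothetical check, assuming the same Python libraries as before and placeholder staging URLs, might look like this:

```python
# Hypothetical check that staging URLs are blocked from indexing, via either
# an X-Robots-Tag response header or a robots meta tag in the page itself.
# The staging URLs are placeholders for the pages on your test server.
import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    response = requests.get(url, timeout=10)
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return True
    soup = BeautifulSoup(response.text, "html.parser")
    robots_tag = soup.find("meta", attrs={"name": "robots"})
    return bool(robots_tag and "noindex" in robots_tag.get("content", "").lower())

for url in ["https://staging.example.com/", "https://staging.example.com/about/"]:
    print(url, "noindex OK" if is_noindexed(url) else "WARNING: indexable")
```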
Once you have captured data from the test site, the name of the game is to compare the output of both crawls and make the necessary changes to the test site before going live.
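Most crawlers can export their output as CSV, so one way to compare the two crawls is a short script like the sketch below. The file names and column names here are assumptions, so adjust them to match your crawler’s export format.

```python
# Compares two crawl exports, keyed by URL path, and reports pages that
# changed or disappeared. Assumes each export is a CSV with "url", "title"
# and "meta_description" columns; the file names here are hypothetical.
import csv
from urllib.parse import urlparse

def load_crawl(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {urlparse(row["url"]).path: row for row in csv.DictReader(f)}

old_site = load_crawl("live_crawl.csv")   # crawl of the current, live site
new_site = load_crawl("test_crawl.csv")   # crawl of the test-server site

for page_path, old_row in old_site.items():
    new_row = new_site.get(page_path)
    if new_row is None:
        print("MISSING on test site:", page_path)
        continue
    for field in ("title", "meta_description"):
        if old_row[field] != new_row[field]:
            print("CHANGED", field, "on", page_path)
```

Pages that exist on the live site but are missing from the test site are the classic way traffic gets lost in a redesign, so each one should either be recreated or redirected (with a 301) to its nearest equivalent.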

Optimising Your Test Site

As you’ve probably gathered, I don’t believe in leaving everything to automated processes! Optimising your new (test) site is an iterative process. An SEO professional can assess the output of your analysis to see whether further gains can be made; there is always more to do. In truth, when sites gain rather than lose traffic after an update, it’s probably because they have done just that. Any gain is worth having, but sites that are 90% optimised or more have a head start, no matter how good your content is.