Losing Organic Traffic After A Redesign? Four Things to Check

Discussion in 'Articles & Tutorials' started by clipping path, Jun 20, 2014.

  1. clipping path

    New Member

    Apr 16, 2014
    While web designers and developers are often strong at designing and programming websites, not all are as well prepared to handle SEO issues, especially when relaunching websites. Just this week, I’ve worked with three separate web design firms who contacted me after a recent site relaunch caused a substantial loss of organic website traffic.

    So, if you just relaunched your website and you’re losing organic traffic, how can you know where to start?

    Here’s my basic list of the most common issues I see that are often overlooked in a relaunch. While there can be many reasons for traffic losses, these are the four areas I recommend checking first.

    1. Check For An Algorithm Update

    First and foremost, make sure you don’t mistakenly attribute a loss in traffic to a relaunch problem when it was actually the result of an algorithm update.

    If you had the misfortune of relaunching at the same time as an algorithm update, you likely will need to check potential issues suspected to be part of the algorithm update as well as relaunch-related issues.

    To identify if the loss is algorithm-update related, first check to see if the organic traffic losses are occurring for multiple search engines.

    If the traffic losses (by percentage) are significant for only one engine, that may indicate an algorithm update. Also check Moz’s Google Algorithm Update History page and blogs like Search Engine Land to see if others are discussing a possible update.
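One quick way to run that comparison is to export per-engine organic session counts from your analytics tool for equal periods before and after the relaunch. A minimal sketch (the engine names and numbers below are made up for illustration):

```python
# Hypothetical organic sessions per engine, for equal periods
# before and after the relaunch (exported from analytics).
before = {"Google": 12000, "Bing": 1500, "Yahoo": 900}
after = {"Google": 6500, "Bing": 1400, "Yahoo": 860}

def percent_change(engine):
    """Percentage change in organic sessions for one engine."""
    return (after[engine] - before[engine]) / before[engine] * 100

for engine in before:
    print(f"{engine}: {percent_change(engine):+.1f}%")

# A steep drop on a single engine (here Google) points toward an
# algorithm update; similar drops across every engine point toward
# a relaunch problem such as missing 301 redirects.
```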

    2. Check 301 Redirects

    This is one of the areas I find most overlooked during a relaunch and often the main culprit of organic traffic loss.

    301 redirects are like a “change of address card” for the search engine robots — they indicate for the engines that the old URL has permanently moved to a new URL.

    If a web page is relaunched with a new URL, the search engine robot will still go to the old URL to index it. Links from other sites, for instance, likely still exist for the old URL, so when the search engine robot follows those links, it follows them to the old URL.

    Without a 301 redirect to tell the robots where the new URL is, the robots will abandon trying to index that page and eventually it will drop from the search engine’s index.

    To figure out if you’re missing 301 redirects (or perhaps they are not programmed correctly), look at the organic traffic to the individual pages of your site both before and after the redesign. I typically run a report that shows the top entry pages from organic search engines before the relaunch and compare that to the traffic after relaunch.

    For pages with major drops, check the URL itself by entering the URL in your browser. Were you redirected? If you received a 404 error, that’s likely what the search engine robots are finding, too.
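To see exactly what the robots see, you can request an old URL and inspect the raw status code instead of letting a browser silently follow the redirect. A sketch using only Python’s standard library — the throwaway local server and the /old-page, /new-page, /gone-page paths are stand-ins for your own site:

```python
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    """Stand-in for a relaunched site: /old-page was moved to
    /new-page, while /gone-page was never given a redirect."""

    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)  # permanent "change of address"
            self.send_header("Location", "/new-page")
            self.end_headers()
        elif self.path == "/new-page":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"relaunched page")
        else:
            self.send_response(404)  # what a forgotten redirect looks like
            self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging for the demo

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def check(path):
    """Return (status code, Location header) without following redirects."""
    conn = http.client.HTTPConnection(*server.server_address)
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read()
    conn.close()
    return resp.status, resp.getheader("Location")

status_old, loc = check("/old-page")     # redirect in place
status_missing, _ = check("/gone-page")  # redirect forgotten
print(status_old, loc)   # 301 /new-page
print(status_missing)    # 404
server.shutdown()
```

Run the same kind of check against each top entry page from your pre-relaunch report: anything answering 404 instead of 301 is a page the engines will eventually drop.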

    Another problem may be the type of redirect used. Be sure to use 301 redirects in this case because a 301 redirect tells the search engines that the move is a permanent one. Other types of redirects, like 302s, shouldn’t be used in most relaunch situations.

    3. Check The Robots.txt

    The robots.txt file serves as a set of instructions for search engine robots, indicating which pages to index and which to avoid.

    It’s not uncommon to have a robots.txt on the test server (prior to website launch) that blocks search engine robots from indexing any of the pages on the test server (since the site is still being developed and approved).

    Occasionally, when a website is moved to the live server, the robots.txt from the test server may be inadvertently copied to the live server.

    If the robots.txt file is not updated to allow search engine robots to index the relaunched site on the live server, the search engines will not be able to visit or view the pages, and those pages will eventually be removed from the search engine index.

    To find out if your site’s robots.txt is blocking search engine robots, open a browser and enter your domain followed by /robots.txt in the address bar. This will show you the robots file. Then look for “disallow” lines. While you may want to hide certain pages, like password-protected pages, only those “protected” pages should appear under a disallow statement. A lone “Disallow: /” blocks the entire site.
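You can also test a robots.txt programmatically. Python’s standard library ships a parser; the sketch below feeds it the blanket “Disallow: /” rule that often sneaks over from a test server, next to a healthier live file (the /members/ path is made up — point the parser at your real file in practice):

```python
from urllib.robotparser import RobotFileParser

# The blanket block often left over from a test server:
test_server_rules = """
User-agent: *
Disallow: /
""".strip().splitlines()

# A healthy live file - only protected areas are blocked:
live_rules = """
User-agent: *
Disallow: /members/
""".strip().splitlines()

blocked = RobotFileParser()
blocked.parse(test_server_rules)

healthy = RobotFileParser()
healthy.parse(live_rules)

print(blocked.can_fetch("*", "/index.html"))     # False - whole site blocked
print(healthy.can_fetch("*", "/index.html"))     # True
print(healthy.can_fetch("*", "/members/login"))  # False - blocked on purpose
```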

    4. Check The Pages Themselves

    In addition to blocking search engine robots through the robots.txt, individual pages can block robots by using noindex in the robots meta tag. As with the robots.txt, there may be reasons you want to block certain pages from search engine indexing.

    However, similar to the situation with a robots file being copied from the test server to the live server, I’ve seen the same mistake made with pages on the test server using the robots meta tag accidentally copied to the live server.

    Check the page for a robots meta tag by viewing the page’s source code. A robots meta tag will resemble this one:

    <meta name="robots" content="noindex">

    … and will reside in the head area of the page.
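If you have many pages to check, viewing source one page at a time gets tedious. As a sketch, Python’s standard-library HTML parser can flag pages whose robots meta tag carries noindex (the two sample pages below are invented; feed in each live page’s source yourself):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "")

def is_noindexed(html):
    """True if the page's robots meta tag asks engines not to index it."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return bool(finder.robots_content) and "noindex" in finder.robots_content.lower()

staging_page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
live_page = '<html><head><title>Products</title></head></html>'

print(is_noindexed(staging_page))  # True - this page will drop from the index
print(is_noindexed(live_page))     # False
```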

    Resolve Issues Quickly

    Regardless of the reason for the traffic loss, once you find the issue, be sure to resolve it as quickly as possible. When search engine robots can’t find pages (or index them), they will soon remove them from the index to avoid sending searchers to invalid pages. So, act quickly to regain your rankings.

    Best of luck :cool:
  2. Ol1v3r

    New Member

    Jul 7, 2014
    Absolutely agree with this point. Great to see some valuable advice in business forums.
  3. SConsultant


    Jun 29, 2014
    Comprehensive checklist, thank you!
    The 301 redirects - usually handled in the .htaccess file - have the greatest impact and are usually forgotten.
    If you move from a dynamic CMS, you need to pay special attention to redirect formatting.
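For an Apache site, the .htaccess entries might look like the sketch below (the paths are made up; a CMS migration usually needs a pattern rule rather than one line per page):

```apache
# One-to-one move: old static page to its new URL
Redirect 301 /old-page.html /new-page.html

# Dynamic CMS URLs usually need a pattern, e.g. mapping
# /index.php?id=123 style addresses onto clean paths:
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^index\.php$ /article/%1? [R=301,L]
```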
  4. Kevin Peter


    Feb 18, 2015
    Most companies still do not feel the need to involve an SEO team while planning and implementing a new website design. The result: loss of organic traffic. They need to reconsider and involve an SEO team in research, strategy, analytics, content, design, frontend and backend web development, marketing, PR, and social media.
