SEO migration

Find out how to perform a perfect migration using Screaming Frog without losing search engine rankings.

How to Migrate a Website

This article shows you how to discover and solve the most common critical issues during the delicate phase of a migration.

Every migration is a major undertaking, and the goal is to avoid any negative impact on the organic rankings of the website being migrated.

Pre-migration

Before diving into the process itself, there are essential steps to put in place in the pre-migration phase, where a crawler can help you and which should not be overlooked.

  • 1. Perform a Crawl of the current site

First, a full crawl of the current website is recommended so you can:

Compare it with the staging site;

Check whether the old URLs have been correctly redirected to the new resources;

Keep a backup to start from in case something goes wrong during the migration.

In short, before migrating you need to prepare a full backup that captures all of the most important elements of the current site.

To get the most out of Screaming Frog at this stage, enable crawling of every resource you may need to refer to later, such as AMP versions, hreflang annotations, structured data, and so on.

On this checklist, don’t forget any client-side JavaScript dependencies that require JavaScript rendering, and include any other domains, subdomains, or mobile and desktop site variants.

Always include the XML Sitemap in the crawl and connect to Google Analytics and Search Console via API to unearth any orphan pages (enable “Crawl New URLs Discovered in GA/GSC”).

Also collect link data from Ahrefs, Majestic, and Moz, and query performance data from GSC, so you can keep an eye on the pages that matter most in terms of revenue, conversions, traffic, impressions, and clicks, and avoid squandering the rankings you have earned.
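If you also want a lightweight, file-based version of this backup alongside the saved crawl, here is a minimal sketch. It assumes you have exported the “Internal: All” tab to internal_all.csv and a Search Console performance export to gsc_performance.csv; both file names, and the column names used below, are assumptions that may differ depending on your SEO Spider version and export.

```python
import pandas as pd

# Hypothetical file names: adjust them to your actual exports.
crawl = pd.read_csv("internal_all.csv")    # "Internal: All" export of the live crawl
gsc = pd.read_csv("gsc_performance.csv")   # GSC performance export (page, clicks, impressions)

# Column names follow the usual SEO Spider export layout,
# but may differ by version and configuration.
key_cols = ["Address", "Status Code", "Indexability", "Title 1",
            "Meta Description 1", "H1-1", "Canonical Link Element 1", "Word Count"]

baseline = crawl[key_cols].merge(
    gsc.rename(columns={"Page": "Address"}),  # the page column name depends on your export
    on="Address", how="left",
)

# Sort by clicks so the pages you can least afford to lose sit at the top.
baseline = baseline.sort_values("Clicks", ascending=False)
baseline.to_csv("pre_migration_baseline.csv", index=False)
```

Sorting by clicks puts your most valuable pages at the top of the baseline, which makes later prioritisation much easier.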

  • 2. Export the URLs to be redirected

Very often during migrations the elements that change most are the URL slugs, so you need to collect a complete list of URLs in order to permanently redirect the old pages to their equivalents on the new website.

This step can be relatively straightforward if it is a domain migration and the URL paths remain the same, but it becomes complex and risky if the entire URL structure of the website is changing, or if strategic areas are being rethought around new search intent.

To get a complete list of existing URLs, simply export them from the crawl run of the live site by clicking on “Export” under the “Internal” tab.

This mapping activity is one of the crucial steps of the migration: it ensures every element is redirected to the most appropriate destination.

If comprehensive mapping is not possible, it is a good idea to prioritize top-performing pages based on the conversions, users, and links collected in step 1.
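Outside the SEO Spider, a rough sketch like the one below can pre-fill the mapping for URLs whose path or slug is unchanged, leaving only the rest for manual work. The files old_urls.txt and new_urls.txt are hypothetical, with one URL per line.

```python
import csv
from urllib.parse import urlparse

# Hypothetical input files: one URL per line.
old_urls = [line.strip() for line in open("old_urls.txt") if line.strip()]
new_urls = [line.strip() for line in open("new_urls.txt") if line.strip()]

# Index the new URLs by full path and by final slug.
new_by_path = {urlparse(u).path.rstrip("/"): u for u in new_urls}
new_by_slug = {urlparse(u).path.rstrip("/").rsplit("/", 1)[-1]: u for u in new_urls}

with open("redirect_map.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["old_url", "new_url", "match"])
    for old in old_urls:
        path = urlparse(old).path.rstrip("/")
        slug = path.rsplit("/", 1)[-1]
        if path in new_by_path:
            writer.writerow([old, new_by_path[path], "same path"])
        elif slug in new_by_slug:
            writer.writerow([old, new_by_slug[slug], "same slug"])
        else:
            writer.writerow([old, "", "needs manual mapping"])
```

Anything flagged as “needs manual mapping” is where the conversion, user, and link data from step 1 helps you prioritise.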

  • 3. Pre-migration testing phase

When the new website is in staging, it is time to start your checks before it goes live for good, and here again the crawler comes to your aid.

The first activity is to crawl the staging site to identify differences, problems, and opportunities.

Normally, sites at this stage are blocked from crawling by the robots.txt file or by a password, so that search engine spiders do not pick them up. With Screaming Frog you can authenticate, ignore the robots.txt file or, by adjusting the configuration, crawl resources left as “noindex” or “nofollow” while they are still under development (see the dedicated section “How to: Scan a password-protected site”).
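Before launching the crawl, a quick illustrative check like the one below, assuming a hypothetical staging hostname and basic-auth credentials, tells you whether the staging site answers, whether it requires authentication, and whether robots.txt or an X-Robots-Tag header would block an ordinary spider.

```python
import requests
from urllib import robotparser
from urllib.parse import urljoin

# Hypothetical staging hostname and basic-auth credentials: replace with your own.
STAGING = "https://staging.example.com/"
AUTH = ("user", "password")

# 1. Is the homepage reachable once authenticated, and does it carry an X-Robots-Tag header?
resp = requests.get(STAGING, auth=AUTH, timeout=10)
print("Status code:", resp.status_code)
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

# 2. Would robots.txt block an ordinary spider?
rp = robotparser.RobotFileParser()
robots_txt = requests.get(urljoin(STAGING, "/robots.txt"), auth=AUTH, timeout=10)
rp.parse(robots_txt.text.splitlines())
print("Homepage allowed by robots.txt:", rp.can_fetch("*", STAGING))
```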

Once the crawl is complete, you can begin the analysis, starting with the “Overview” tab in the sidebar.

Staging site vs. live site

  • 4. Comparison of the staging and live site (URL Mapping)

Now you have all the data you need: on one side the crawl of the site currently online, on the other the crawl of the staging site. All that remains is to compare the two versions to uncover any inconsistencies.

You can do this easily in the SEO Spider by using the “URL Mapping” feature of “Compare” mode and selecting from the top bar the two crawls to compare.

This feature, introduced in recent SEO Spider updates, allows you to compare different URL structures, such as different hostnames or directories, using regex rules.

To use it, simply change the crawl mode to “Compare,” choose the two saved crawls, and click on the “gear” icon in the top bar.

In the “Crawl Comparison Configuration” popup, all you have to do is click on “URL Mapping” and map the old domain to the new one via regex rules.

In the example below, the existing site was mapped to the staging version on the hostname only, since the two share the same URL paths.
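As a minimal illustration of such a rule, assume the live site lives at www.example.com and the staging copy at staging.example.com (hypothetical hostnames). The mapping boils down to a single regex replacement on the hostname, which you can sanity-check in Python before typing it into the URL Mapping dialog (whose exact replacement syntax may differ slightly).

```python
import re

# Hypothetical hostnames: the live site and the staging copy share the same URL paths.
pattern = r"https://staging\.example\.com/(.*)"   # the "From" side of the rule
replacement = r"https://www.example.com/\1"       # the "To" side of the rule

staging_url = "https://staging.example.com/category/product-slug/"
print(re.sub(pattern, replacement, staging_url))
# -> https://www.example.com/category/product-slug/
```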

Through the mapping, equivalent URLs are compared with each other across the Overview tab data, issues and opportunities, the Site Structure tab, and Change Detection.

When the comparison is finished, you can click on the columns to see which URLs have changed, and use the filter in the main window to switch between the current and previous crawls, as well as to check URLs that have been added, are new, have been removed, or are missing.

The SEO Spider includes four columns that segment changed URLs according to the following filters (a small set-based sketch after the list illustrates the logic):

Added: URLs that exist in both crawls but fall into the selected filter only in the current crawl.
For example, with the live site set as “Current” and the staging site as “Previous”, and the focus on the H1 filters, this shows the URLs whose heading appears in the filter only on the live site.

New: URLs that are not in the previous crawl at all, but have been introduced in the new website and fall into the filter.

Removed: URLs that exist in both crawls but fall into the selected filter only in the previous crawl.

Missing: URLs that were in the filter in the previous crawl but are no longer found at all in the current crawl.
For example, a URL that was crawled on the staging site (“Previous”) but no longer exists on the live site (“Current”) would appear here.
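To make those four definitions concrete, here is a purely illustrative, set-based sketch (not SEO Spider code) with made-up URLs, where prev_filter/curr_filter hold the URLs matching a given filter in each crawl and prev_all/curr_all hold every URL found in each crawl.

```python
# Made-up example data: every URL found in each crawl, and the URLs matching a filter.
prev_all = {"/a", "/b", "/c", "/old-page"}
curr_all = {"/a", "/b", "/c", "/new-page"}
prev_filter = {"/a", "/old-page"}    # e.g. URLs with a missing H1 in the previous crawl
curr_filter = {"/b", "/new-page"}    # e.g. URLs with a missing H1 in the current crawl

added   = (curr_filter - prev_filter) & prev_all  # in both crawls, entered the filter
new     = curr_filter - prev_all                  # not in the previous crawl at all
removed = (prev_filter - curr_filter) & curr_all  # in both crawls, left the filter
missing = prev_filter - curr_all                  # no longer found in the current crawl

print(added, new, removed, missing)
# {'/b'} {'/new-page'} {'/a'} {'/old-page'}
```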

  • 5. Identify differences between sites through “Change Detection”

When performing a migration, it is a good idea to minimize the number of changes that could impact a site’s visibility.

To keep track of these changes you can use the “Change Detection” feature in “Compare” mode, which lets you view all of the elements that have changed between the current and staging sites.

To enable “Change Detection”, open the comparison configuration via “Config > Compare” (or the “gear” icon at the top) and select the elements and metrics whose changes you want to examine.

Next, simply click “Compare” to run the crawl. The ‘Change Detection’ tab, not present by default, will appear as the last tab in the sidebar and main view of the SEO Spider.

In the example above, 2 URLs changed their page title and 250 changed their word count, while the meta descriptions and inlinks remained unchanged.

In this screen you are able to click on each item listed to isolate it and see its changes. In the example below, “Meta Title” was isolated to analyze the semantic validity of the meta title.

If the site structure and internal linking are changing considerably during the migration, it is really important to understand which pages have lost (or gained) links and which have changed crawl depth.
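If you want to double-check the same kind of changes outside the interface, a minimal sketch along these lines compares titles, word counts, and crawl depth per URL. It assumes internal_live.csv and internal_staging.csv are “Internal: All” exports of the two crawls sharing the same URL paths (file and column names are assumptions and may vary by version).

```python
import pandas as pd

# Hypothetical "Internal: All" exports of the two crawls; column names may vary by version.
live = pd.read_csv("internal_live.csv")
staging = pd.read_csv("internal_staging.csv")

cols = ["Address", "Title 1", "Word Count", "Crawl Depth"]
merged = live[cols].merge(staging[cols], on="Address", suffixes=("_live", "_staging"))

changed_titles = merged[merged["Title 1_live"] != merged["Title 1_staging"]]
changed_words = merged[merged["Word Count_live"] != merged["Word Count_staging"]]
deeper = merged[merged["Crawl Depth_staging"] > merged["Crawl Depth_live"]]

print(len(changed_titles), "URLs changed page title")
print(len(changed_words), "URLs changed word count")
print(len(deeper), "URLs are now deeper in the site structure")
```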

  • 6. Redirect Mapping (Bonus)

Although not built for this purpose, the “Near Duplicates” feature can be used to map redirects from old URLs to new URLs when their content is similar.

To benefit from this, simply switch to List mode (‘Mode > List’), enable ‘Near Duplicates’ and disable ‘Only Check Indexable Pages For Duplicates’.

Config > Content > Duplicates

Next, you can refine the scan by configuring the content area used for similarity analysis between old and new URLs.

Config > Content > Area

For example, you can “include” the category and product description classes from the old and new sites if they have changed. If the templates are the same, you are unlikely to need to adjust the content area.

To use List mode for mapping, upload both the new and old URLs in List mode and crawl them, remembering to remove the crawl depth limit in ‘Config > Spider > Limits’ and to enter both the existing site and staging site homepages.

With the crawl complete, all you need to do is turn on “Crawl Analysis” and navigate to the “Content” tab to view the details of the near-duplicates and the similarity index.

If the content has a perfect match, each URL will have 1 near duplicate, with close to 100% similarity.

If the content has changed significantly, the similarity scores and the number of near-duplicates will be different. The match between URLs will be viewable in the lower window of the Seo Spider.

At this point, simply export the near duplicates via ‘Bulk Export > Content > Near Duplicates’ to get a solid basis for setting up the redirects properly and protecting the site from the potential organic damage of the migration.
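The same idea can be approximated with a rough, illustrative script if you have already extracted the main content of the old and new pages (the old_pages and new_pages dictionaries below are made-up examples). It pairs each old URL with the most similar new URL using Python’s difflib.

```python
from difflib import SequenceMatcher

# Hypothetical, pre-extracted main content for a handful of pages.
old_pages = {
    "https://www.example.com/old-red-shoes": "Red leather shoes, handmade in Italy...",
    "https://www.example.com/old-blue-bag": "Blue canvas bag with leather straps...",
}
new_pages = {
    "https://www.example.com/shop/red-shoes": "Red leather shoes, handmade in Italy...",
    "https://www.example.com/shop/blue-bag": "Blue canvas bag with leather straps...",
}

def similarity(a: str, b: str) -> float:
    """Ratio between 0 and 1 of how similar two content strings are."""
    return SequenceMatcher(None, a, b).ratio()

# For each old URL, pick the new URL with the highest content similarity.
for old_url, old_text in old_pages.items():
    best_url, best_score = max(
        ((new_url, similarity(old_text, new_text)) for new_url, new_text in new_pages.items()),
        key=lambda pair: pair[1],
    )
    print(f"{old_url} -> {best_url} ({best_score:.0%} similar)")
```

Only pairs with a high similarity score should be trusted as redirect candidates; everything else goes back into the manual mapping from step 2.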

  • 7. Post Migration: scanning the new site

Upon publication, all you have to do is crawl the website to check that all pages are crawlable and indexable, and that no important resources are blocked by robots.txt (Response Codes > Blocked by Robots.txt).

Make sure there are no unwanted noindex, nofollow or none directives in the meta tags or HTTP header (Directives > Noindex).

Verify that every URL you want to rank is indexable.

Once you have completed this checking phase, I recommend one more step: compare the old crawl with the new live site to confirm that everything you optimised on the staging site has carried over to the current version.
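On top of the crawl, a quick illustrative spot check like the one below, assuming a hypothetical urls_to_check.txt with a sample of your most important new URLs, verifies status codes, robots.txt, the X-Robots-Tag header, and (roughly) the meta robots tag.

```python
import re
import requests
from urllib import robotparser
from urllib.parse import urlparse, urljoin

# Hypothetical sample file with the new site's most important URLs, one per line.
urls = [line.strip() for line in open("urls_to_check.txt") if line.strip()]

robots_cache = {}
def allowed(url):
    """Parse robots.txt once per host and check whether Googlebot may fetch the URL."""
    host = "{0.scheme}://{0.netloc}".format(urlparse(url))
    if host not in robots_cache:
        rp = robotparser.RobotFileParser()
        rp.parse(requests.get(urljoin(host, "/robots.txt"), timeout=10).text.splitlines())
        robots_cache[host] = rp
    return robots_cache[host].can_fetch("Googlebot", url)

for url in urls:
    resp = requests.get(url, timeout=10)
    # Rough check for a meta robots noindex; a crawler's parser is more reliable.
    meta_noindex = bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', resp.text, re.I))
    print(url,
          "| status:", resp.status_code,
          "| robots.txt:", "allowed" if allowed(url) else "blocked",
          "| X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"),
          "| meta noindex" if meta_noindex else "| no meta noindex")
```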

  • 8. Verify redirects

If the URLs have changed, it is critical to check all 301 redirects to the new destinations to avoid errors, chains, loops, temporary redirects, or redirects to pages that cannot be indexed.

Switch to list mode (‘Mode > List’) and enable ‘Always Follow Redirects’ in ‘Config > Spider > Advanced’.

Add the old URLs discovered with the first crawl of the old site (step 1 of this article) and start the crawl.

At the end of the crawl, export the “All Redirects” report.

All Redirects report

Through this report you can evaluate the following fields (a quick spot-check script follows this list):

  • Chain Type: the type of redirect (e.g., HTTP, JavaScript, Meta Refresh).
  • Number of Redirects: the number of hops in the chain.
  • Redirect Loop: whether the chain contains a circular redirect, expressed as a True/False Boolean.
  • Temp Redirect in Chain: whether there is a temporary redirect (e.g., a 302) in the chain.
  • Address: the source URL.
  • Final Address: the final destination URL.
  • Indexability: whether the destination (Final Address) is indexable or non-indexable. This field is very useful for spotting old URLs redirected to a destination that cannot be indexed: if you handled a resource with a redirect, it was to pass its equity to a new destination, but if that destination is “Non-Indexable” the work has no effect.
  • Indexability Status: the reason why the destination address is not indexable, for example the presence of a “noindex” tag.
  • Final Content: the content type of the destination (e.g., text/html).
  • Final Status Code: the status code of the final destination, which can be 200, another 3xx, a 4xx or 5xx error, or no response. This field is also very important, so you do not waste redirect equity on a resource that returns a server error, a page not found, or similar. In short, if you do not see a 200 status code you need to stop and take action.
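As an independent, illustrative cross-check of the same information (not the SEO Spider report itself), assuming the redirect_map.csv built in step 2 with an old_url column (a hypothetical name), a short requests-based script can follow each old URL and flag chains, temporary hops, loops, and non-200 final status codes.

```python
import csv
import requests

# Hypothetical file produced in step 2: one row per old URL to redirect.
with open("redirect_map.csv") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    old = row["old_url"]
    try:
        resp = requests.get(old, allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        print(old, "-> redirect loop detected")
        continue
    chain = resp.history                     # every hop before the final response
    temp_hop = any(r.status_code in (302, 303, 307) for r in chain)
    # A final status other than 200 means the redirect needs fixing.
    print(old,
          "| hops:", len(chain),
          "| temporary hop in chain" if temp_hop else "| all hops permanent",
          "| final:", resp.status_code, resp.url)
```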
  • 9. Check Google Analytics and GTM tracking.

When publishing a new website it is easy to forget to include tracking scripts such as GA or GTM.

To quickly verify that everything has been implemented, you can use Custom Search (‘Config > Custom > Search’) with a ‘does not contain’ filter to look for the full tracking tag within the HTML head of each page of the site.

This way you can see which URLs are missing the tracking code in the Custom Search tab of the SEO Spider’s main window.
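You can reproduce the same check with a rough, illustrative script, assuming a sample of URLs in urls_to_check.txt and a GTM container ID of GTM-XXXXXXX (both placeholders). It simply flags pages whose HTML does not contain the container snippet.

```python
import requests

GTM_ID = "GTM-XXXXXXX"  # placeholder: replace with your real container ID
urls = [line.strip() for line in open("urls_to_check.txt") if line.strip()]

missing = []
for url in urls:
    html = requests.get(url, timeout=10).text
    # A page with GTM installed normally references the container ID
    # and loads gtm.js; this is a rough check, not a full tag audit.
    if GTM_ID not in html and "googletagmanager.com/gtm.js" not in html:
        missing.append(url)

print("Pages without the GTM snippet:")
for url in missing:
    print(" -", url)
```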

SEO Migration Video Tutorial
