Scheduled crawl

How to perform a scheduled crawl with Screaming Frog.

“Scheduling” function

Once you understand the potential of the “Scheduling” feature, all you need to do is configure it by following the few steps described below.

File > Scheduling

In this first step you can define some general aspects of the crawl:

  • Name: a descriptive name for the crawl.
  • Project Name: the name of the project in which to place the crawl.
  • Description: useful for recording in text form the peculiarities and options that will characterize the crawl.
  • Date/Time: the date and time of the crawl and its frequency (one-off, daily, weekly or monthly).

SEO Spider Configuration

In the second tab, “Start Options,” you can choose the crawl mode between “Spider” and “List,” enter the root domain or subdomain to crawl, and define which configuration to apply to the crawl by choosing among the available “Crawl Profiles” (if any).

In this tab you can also enrich the crawl, via API, with data from Google Analytics, Search Console, Moz, Majestic and Ahrefs for a more granular view of the project.
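For comparison, the same start options can also be reproduced outside the interface through the SEO Spider’s command-line mode. The sketch below is a minimal Python wrapper around that CLI: the launcher name (screamingfrogseospider), the example URL and the “weekly-audit.seospiderconfig” profile are assumptions, and the flags should be checked against the command-line documentation for your installed version.

    import subprocess

    # Minimal sketch: launch a headless crawl that mirrors the "Start Options" tab.
    # Assumptions: "screamingfrogseospider" is the CLI launcher on this machine and
    # "weekly-audit.seospiderconfig" is a crawl profile previously saved from the UI.
    subprocess.run(
        [
            "screamingfrogseospider",
            "--headless",                                # run without the UI, as scheduled crawls do
            "--crawl", "https://www.example.com/",       # "Spider" mode, starting from the root domain
            "--config", "weekly-audit.seospiderconfig",  # apply a saved configuration profile
            # For "List" mode, replace "--crawl <url>" with "--crawl-list <file of URLs>".
        ],
        check=True,
    )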

Saving Data

Through the “Export” tab you can define which tabs, “Bulk Exports” and reports you want to export. You can choose to download a single tab (e.g., “Internal,” “Meta Description” or “Headings”) or combine multiple tabs and reports by selecting them from the available window. Data is exported by default in “.csv” format and saved locally, or in “gsheet” format: if you choose the latter, you will find the exported documents in the “Screaming Frog SEO Spider” folder in Google Drive (you will need to have linked your Google account).
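To make the mapping concrete, here is a hedged sketch of how the same export choices look when driven from the command line. The URL, output folder and tab names are placeholder assumptions; the --export-tabs and --export-format flags come from the CLI documentation and should be verified against your version.

    import subprocess

    # Sketch: headless crawl that exports two specific tabs as CSV to a local folder,
    # mirroring the choices made in the "Export" tab. Paths and tab names are placeholders.
    subprocess.run(
        [
            "screamingfrogseospider",
            "--headless",
            "--crawl", "https://www.example.com/",
            "--output-folder", "/home/user/crawls",                # where the exports are written
            "--export-format", "csv",                              # "csv" is the default format
            "--export-tabs", "Internal:All,Meta Description:All",  # one or more "Tab:Filter" pairs
        ],
        check=True,
    )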

Another very convenient option of “Scheduling” for periodic crawls is the ability to choose whether, at the end of each scheduled crawl, to automatically overwrite the original document so that only the latest crawl is kept, or to create a “timestamped” folder that adds a new document each time and keeps track of the crawl history.
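In the command-line mode these two behaviours correspond to two switches. The sketch below reuses the export command from the previous example (same placeholder paths); the --overwrite and --timestamped-output flag names follow the CLI documentation, but verify them against your version.

    import subprocess

    # The same export command as above, shown with the two alternative switches.
    base_cmd = [
        "screamingfrogseospider", "--headless",
        "--crawl", "https://www.example.com/",
        "--output-folder", "/home/user/crawls",
        "--export-format", "csv",
        "--export-tabs", "Internal:All",
    ]

    # Keep only the latest crawl: overwrite previous files in the output folder.
    subprocess.run(base_cmd + ["--overwrite"], check=True)

    # Or keep the crawl history: write each run into a new timestamped sub-folder.
    subprocess.run(base_cmd + ["--timestamped-output"], check=True)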

Please note: if you use the “Database Storage” mode, each scheduled crawl is also stored and remains available for you to reopen and reuse at a later time.

The “Scheduling” function runs in headless mode (without the user interface) to avoid unwanted user interactions. If multiple scheduled crawls overlap, Screaming Frog executes them at the same time in separate instances; this demands more resources from the system, crawl times may lengthen, and the computer may slow down.
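If you prefer to drive the headless mode from your own scheduler (cron, Windows Task Scheduler) rather than the built-in dialog, a small wrapper can also guard against the overlap problem described above. This is only a sketch under stated assumptions: the lock-file path, URL and output folder are placeholders, and a production version would need more robust error handling.

    import subprocess
    from pathlib import Path

    # Sketch of a wrapper callable from cron or Windows Task Scheduler. A simple lock
    # file keeps two headless crawls from overlapping and competing for memory and CPU.
    LOCK = Path("/tmp/sf-crawl.lock")

    if LOCK.exists():
        print("A scheduled crawl appears to be running already; skipping this run.")
    else:
        LOCK.touch()
        try:
            subprocess.run(
                [
                    "screamingfrogseospider",
                    "--headless",
                    "--crawl", "https://www.example.com/",
                    "--save-crawl",                          # keep the .seospider crawl file
                    "--output-folder", "/home/user/crawls",
                ],
                check=True,
            )
        finally:
            LOCK.unlink()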

Video: scheduled crawl
