

Advanced technical SEO is not without its challenges, but luckily there are many tools on the market we can use.

And by combining some of these tools, not only can we address the challenges we face, we can also create new solutions and take our SEO to the next level.

In this guide I will be combining three distinct tools, utilizing the power of a major cloud provider (Google Cloud) together with a leading open source operating system and software (Ubuntu) and a crawl analysis tool (Screaming Frog SEO Spider).

Examples of solutions this powerful combination can bring to the table are:

- To create XML Sitemaps using daily scheduled crawls and automatically make these available publicly for search bots to use when crawling and indexing your website (a scheduling sketch follows this list).
- To have your own personal in-house SEO dashboard built from repeat crawls.
- To improve site speed for users and search bots by priming CDNs from different locations, by regularly crawling your most important pages.
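To give a concrete idea of the scheduled-crawl setup, here is a minimal sketch of a daily cron job that runs Screaming Frog SEO Spider headlessly on the remote instance and exports an XML Sitemap. The domain and output folder are placeholders, and the flags shown (--crawl, --headless, --save-crawl, --output-folder, --timestamped-output, --create-sitemap) should be checked against the command line options documented for your installed version of the SEO Spider.

# Edit the crontab of the user that runs the SEO Spider (crontab -e) and add an
# entry like the one below: crawl the site every night at 02:00 and write the
# saved crawl plus an XML Sitemap into a dated subfolder of /home/ubuntu/crawls.
0 2 * * * screamingfrogseospider --crawl https://www.example.com --headless --save-crawl --output-folder /home/ubuntu/crawls --timestamped-output --create-sitemap

Publishing the resulting Sitemap so search bots can actually fetch it (for example by copying it to a web-accessible location) is a separate step on top of this sketch.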

If you already have a remote instance running on Google Cloud, and you just want to download, install and/or update Screaming Frog SEO Spider on the remote instance in a jiffy, then you can skip most of this guide by logging into the remote instance and issuing the following one-line command in the terminal on the remote instance:

wget -O install.sh <URL of the install script> && chmod +x install.sh && source install.sh

If this does not work, or to better understand how to set up the remote instance, transfer data, schedule crawls and keep your crawl running when you are not logged into the remote instance, continue reading.
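As a quick preview of the "keep your crawl running" point: a crawl started from an interactive SSH session normally stops when you disconnect, so a common approach is to detach the process from the terminal with nohup, or to run it inside a screen or tmux session. The commands below are a minimal sketch assuming a headless crawl; the URL and output folder are placeholders.

# Detach the headless crawl from the terminal so it keeps running after logout;
# output and errors are written to crawl.log for later inspection.
nohup screamingfrogseospider --crawl https://www.example.com --headless --save-crawl --output-folder /home/ubuntu/crawls > crawl.log 2>&1 &

# Or start it inside a named screen session, then detach with Ctrl-a d and log out:
screen -S seo-crawl
screamingfrogseospider --crawl https://www.example.com --headless --save-crawl --output-folder /home/ubuntu/crawls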

Dependencies

Before this guide continues, there are a few points that need to be addressed first.

First, the commands in this guide are written as if your primary local operating system is a Linux distribution. However, most of the commands work the same locally, with maybe minor tweaks on Windows and/or macOS.
