Troubleshooting Setting Up a Tracked Site

What’s Covered?

This guide covers troubleshooting tips for resolving errors encountered while setting up a site to track in a Campaign.

Troubleshooting Errors Encountered When Setting Up a Moz Pro Campaign

Our Crawlers Were Unable to Access That URL

Some folks receive this response when they try to set up a Campaign. Let's go through some steps to ensure you've entered the correct details and to check that your site is crawlable.

[Image: an example of the "Our crawlers were unable to access that URL" error message sometimes seen during Campaign creation.]

First of all, please check the following:

  1. Check that you've entered the correct site
  2. Check that there aren't any extra blank spaces at the beginning or end of your URL
  3. Enter your URL into your browser and check that it's publicly accessible (a quick scripted check is sketched after this list)
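
If you'd like to automate the whitespace and format checks above, here is a minimal sketch in Python (assuming Python 3 and only the standard library; the moz.com URL is just an illustration):

    from urllib.parse import urlparse

    def clean_and_validate(raw_url):
        # Strip stray whitespace that often sneaks in when copy-pasting a URL.
        url = raw_url.strip()
        parsed = urlparse(url)
        # A crawlable Campaign URL needs an http/https scheme and a host.
        if parsed.scheme not in ("http", "https"):
            raise ValueError(f"Unexpected scheme: {parsed.scheme!r}")
        if not parsed.netloc:
            raise ValueError("No host found in URL")
        return url

    print(clean_and_validate("  https://moz.com \n"))  # -> https://moz.com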

Still Stuck? Let's Go Through Some Other Troubleshooting Steps

To create a Campaign, we need to receive a valid HTTP response from the web server hosting your site.

That may sound rather complicated, but it basically means that our crawler, Rogerbot, must receive a response from your website's server that allows us to crawl your site. So even if your site is accessible in the browser, there may be something blocking our crawler behind the scenes.

1. Test your site by sending an HTTP request using any online HTTP status tool

There are many ways to run a test. One third-party tool that we use is httpstatus. Enter your site, for example moz.com, and click Submit. A valid URL will return a 200 OK or 301 response. If you receive any other status code, report it to your website administrator or hosting provider to investigate and resolve.
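
If you prefer to script this check, here is a rough Python 3 equivalent of the httpstatus test (standard library only). It reports the status code of the first response without following redirects, so a 301 shows up as a 301; the moz.com URL is just an example:

    import http.client
    from urllib.parse import urlparse

    def http_status(url):
        # Return the status code of the first response, without following redirects.
        parsed = urlparse(url)
        conn_cls = http.client.HTTPSConnection if parsed.scheme == "https" else http.client.HTTPConnection
        conn = conn_cls(parsed.netloc, timeout=10)
        try:
            conn.request("GET", parsed.path or "/")
            return conn.getresponse().status
        finally:
            conn.close()

    status = http_status("https://moz.com")
    # 200 or 301 indicates a response our tools can work with.
    print(status)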

2. Check you're not blocking AWS servers

Our Campaign crawler, Rogerbot, is hosted with AWS. In some cases, sites may have firewalled AWS servers, which would block Rogerbot from accessing your site. The best way to fix this is to ask your website administrator or hosting provider to investigate and resolve the block.
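
If your administrator wants to see which addresses might be affected, AWS publishes its current IP ranges at a documented endpoint. The following is only a sketch (Python 3, standard library): it fetches those ranges so they can be cross-referenced against firewall deny rules; it lists the ranges, it does not test whether your firewall actually blocks them.

    import json
    import urllib.request

    # AWS publishes its current IP ranges at this endpoint.
    AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

    with urllib.request.urlopen(AWS_RANGES_URL, timeout=10) as resp:
        data = json.load(resp)

    # EC2 ranges are the ones most relevant to crawler traffic.
    ec2_cidrs = sorted({p["ip_prefix"] for p in data["prefixes"] if p["service"] == "EC2"})
    print(f"{len(ec2_cidrs)} EC2 IPv4 CIDR blocks, e.g. {ec2_cidrs[:3]}")
    # Cross-reference these CIDR blocks against your firewall's deny rules.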

3. Check you’re not blocking our crawler

In some instances, the HTTP tool will return a valid HTTP status and we will still be unable to crawl a site. In this case, you'll need to ask your website administrator or hosting provider to check their server logs for responses to requests with User-agent: rogerbot. Please ensure your site and server are not blocking Rogerbot. Blocking our crawler will result in a 403 Forbidden response from the server, and you will not be able to move forward with Campaign creation.
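
One quick way to spot user-agent-based blocking is to request your site twice, once with a browser-like user-agent and once identifying as rogerbot, and compare the status codes. This is only a sketch (Python 3, standard library): it uses a simplified rogerbot user-agent string rather than the full token our crawler sends, the example.com URL is a placeholder, and it won't catch blocks applied by IP address rather than by user-agent.

    import urllib.error
    import urllib.request

    def status_for(url, user_agent):
        # Fetch the URL with a given User-Agent and return the status code.
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.getcode()
        except urllib.error.HTTPError as err:
            return err.code

    site = "https://example.com/"  # placeholder: replace with your site
    print("Browser-like UA:", status_for(site, "Mozilla/5.0"))
    print("rogerbot UA:    ", status_for(site, "rogerbot"))
    # A 403 for the rogerbot user-agent alongside a 200 for the browser-like
    # one suggests the server is blocking our crawler by user-agent.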
