How to Fix Googlebot cannot access CSS and JS files

Today, I received a number of messages from Google Webmaster Tools, just like the one below, because I have blocked quite a few resources from search engine bots on many of my sites.

I didn’t realize I was blocking Google’s ability to render the page, because previously, when you fetched a page with Google’s “fetch as Googlebot” tool, it would render properly. So Google has changed something and is now notifying webmasters who have verified their properties in Google Webmaster Tools of this error.

If you just want to get to what I did to fix it, scroll down.

Here’s an Example Email I got from Google:

Googlebot Cannot Fetch CSS & JavaScript Files

 

The Simple Fix:

There are a number of ways to fix this problem, and the right one will depend on what you want to do and on the complexity of your robots.txt file.

If you’re not very technically savvy, this should fix it for you very easily: just add the following lines to the very end of your robots.txt file. The reason you want to add them at the end is so they override any previous restrictions:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*

The reason I added the asterisk after the .js and .css extensions is that some of my files are called with version numbers, such as file.js?v=123, and I want to make sure they are allowed too.
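If you’re curious how those wildcard patterns behave, here is a minimal Python sketch of Google-style pattern matching (it is not Google’s actual parser, and the file paths are made-up examples):

import re

def robots_pattern_matches(pattern, path):
    # Translate a Google-style robots.txt path pattern into a regex:
    # '*' matches any run of characters, and a trailing '$' anchors the end.
    anchored = pattern.endswith('$')
    if anchored:
        pattern = pattern[:-1]
    regex = '.*'.join(re.escape(part) for part in pattern.split('*'))
    return re.match(regex + ('$' if anchored else ''), path) is not None

# The two Allow rules from above, tested against versioned and plain file paths:
print(robots_pattern_matches('/*.js*', '/assets/file.js?v=123'))   # True
print(robots_pattern_matches('/*.css*', '/css/style.css'))         # True
print(robots_pattern_matches('/*.js$', '/assets/file.js?v=123'))   # False - a '$' rule would exclude versioned URLs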

If you already have a section of your robots.txt that specifies Googlebot, you can just add the two “Allow” lines from above to the bottom of that section.
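For example, if your file already had a Googlebot section with its own rules (the Disallow line below is just a placeholder, not something from my file), the end of that section would look like this:

User-Agent: Googlebot
Disallow: /wp-admin/
Allow: /*.js*
Allow: /*.css*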

How to Test That It Works

To make sure it works, use the Fetch as Googlebot tool and click the “fetch and render” button. Then you will see the result; in mine, it says “partial”.

Fetch as Googlebot

Click on the result line where it says “/” and it will display what Googlebot sees, what visitors see (Google also fetches without respecting the robots.txt rules in order to weed out spam sites that use trick tactics, so I don’t see why this is suddenly an issue when they do it anyway, but that’s another subject), and a list of any errors.

What my render and errors result looks like:

There are still a few lines of blocked CSS. If you click the “robots.txt Tester” link it will let you test each one individually.

Here’s what that looks like, when I click the first one that was blocked:

Google will show you the line that is blocking the file it is trying to access, and let you update the robots.txt rules right there and test them against the URL.

Below, after making changes to the robots.txt file right in the tester, you can see that when I add the 3 lines of code I mentioned earlier, Googlebot can now access that file.

Yay, the Test Worked, But There’s More to Do!

To actually update the file, you will need to change it on your server. As you can see, the testing tool also has a “submit” button and a “last version seen” date and time.

You will need to follow these steps to let Googlebot know that you’ve fixed this access error.

  1. Upload your fixed robots.txt file to your server.
  2. If you use Cloudflare, or any other caching system or caching plugin, make sure you clear the cache and verify that you can see the updated robots.txt (a quick check like the sketch after this list works well).
  3. Click the “submit” button, and Google will prompt you through the remaining steps.
  4. When you confirm the submission, you are telling Googlebot to get the newest version of the file instead of its cached version.
  5. It may take a few minutes, but you can reload the tester to see your new version of the file and confirm that Googlebot can now access your .js and .css files.
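If you want a quick way to confirm that the live file (and not a cached copy) contains the new rules, a small Python sketch like this will do it; example.com is just a placeholder for your own domain:

from urllib.request import urlopen

# Fetch the live robots.txt and check that the new Allow rules are actually in it.
# 'https://example.com/robots.txt' is a placeholder - use your own site's URL.
robots = urlopen('https://example.com/robots.txt').read().decode('utf-8')

for rule in ('Allow: /*.js*', 'Allow: /*.css*'):
    status = 'found' if rule in robots else 'MISSING'
    print(status + ': ' + rule)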

How’d you do?

Do you have any other tips on how to fix this? Please share below in the comment area if you do.
