If you are running an online business, I am sure search engine traffic is quite important to you, and that is where the giant Google comes into the picture. Getting an error notification like “Googlebot cannot access CSS or JS files on your site http://www.yourdomain.com” is not a fun thing to look at.
When someone searches for something relevant to your website, Google presents a list of websites as search results, and your site could be in that list if you do certain things properly.
SEO is all about this, and it starts with giving Googlebot access. You allow Googlebot to crawl your site so it knows what your site is about.
It crawls every page of your website to learn what each page is about, and if someone looks for information that is relevant to what you have on your website, your site could show up in the search results (although this depends on various factors).
Earlier, the CSS and JS files that determine the design and visual presentation of a website did not need to be accessible to Google. It was enough if Google had access to the content on the website.
But now Google wants to “see” how your website is structured and designed so it can make sure that it is giving a nice user experience to its (Google’s) users.
Here’s where the error “Googlebot cannot access CSS and JS files” comes into the picture. Since Google wants to give good rankings only to sites that are user friendly, it wants to see whether your site is user friendly or not.
So if you block access to CSS and JS files, Googlebot won’t be able to see the structure and user-friendliness of your website.
And since Google does not know for sure how user friendly your site is, it will not give your site optimal rankings.
The “Googlebot cannot access CSS and JS files on …” error in Search Console
If you have added your site as a property in Google Search Console (formerly Google Webmaster Tools), you will see this error message, “Googlebot cannot access CSS and JS files on http://www.yoursite.com”, in your Search Console dashboard.
If you’ve got this message (go check your Search Console), it is time to take action immediately. Even if you have not got this error message yet, it is still wise to act now, because this directly affects your rankings, as Google itself says!
How to fix the “Googlebot cannot access CSS and JS files” error? Give Google access to the CSS and JS files!
In order to fix this issue, you should first know (and confirm) which files are inaccessible to Googlebot. To find out, log in to your Google Search Console (formerly Google Webmaster Tools) and see how Google currently renders your website.
You should also repeat this for Mobile by clicking the “Desktop” dropdown (marked as arrow 4 in the image above).
Once it is done (it currently shows “Pending” in the image above), you will see either “Complete” or “Partial” under “Status” for that item.
Clicking on that item will show a side-by-side comparison of how your site appears in Google’s eyes and how it appears to your visitors.
Ideally, these two should look the same. If the two versions are different, as shown below, it is time to take action!
Watch this video to learn about fixing this error:
If you prefer to read, please continue!
The Search Console makes your job even easier. If you scroll down on this page, you will be presented with a list of links that are blocked, as shown below:
This is not a complete list, though, but you get the idea.
You should also look for a “pattern” in the increase or decrease of blocked pages on your site, so you can get an idea of when this happened and what could have triggered it (say, the installation of a new plugin, a change in the code, etc.).
This view will also help you see the improvement after you’ve taken action.
For this, go to Google Index and click on Blocked Resources:
Googlebot cannot access CSS and JS files – Let’s get to action
Our task now is to edit the robots.txt file, because that is the file on your website that tells robots what they have access to and what is hidden from their eyes.
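To make the problem concrete, here is a hypothetical robots.txt of the kind that commonly triggers this error on WordPress sites (your actual file will differ):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

The Disallow: /wp-includes/ line blocks Googlebot from the folder where WordPress keeps many of its core JS files (jQuery, for example).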
First, open your robots.txt file. You can do this from your control panel by going to File Manager > public_html and finding robots.txt. Single-click it and then choose View from the top menu.
Or you could access the file via FTP, using an FTP client like FileZilla (available for Mac and Windows), or an FTP browser extension like ShiftEdit for Chrome or FireFTP for Firefox.
Or (just for viewing purposes, not editing) you could simply click the “See Live Robots.txt” link in the robots.txt tester inside Search Console (shown below).
Once you see the content of your robots.txt file, copy all of it and paste it into the tester box.
So you have to make sure you have given Googlebot access to the CSS and JS files. To refine your investigation further, go to your website, right-click, and click “View page source”.
You will see the raw source code of your website. Press Ctrl+F (Windows) or Cmd+F (Mac) and type .js. This will highlight the URL of every JS file on the page. You can do the same with .css to find the links to CSS files.
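If you prefer, you can also pull these links out with a few lines of code. Here is a quick Python sketch; the HTML snippet and the example.com URLs are just placeholders for whatever your own page source contains:

```python
import re

# A snippet of page source, the kind you'd see via "View page source"
# (example.com and the file paths are placeholders):
html = '''
<link rel="stylesheet" href="https://www.example.com/wp-content/themes/mytheme/style.css">
<script src="https://www.example.com/wp-includes/js/jquery/jquery.js"></script>
'''

# Pull out the URLs of .js and .css files from src= and href= attributes:
urls = re.findall(r'(?:src|href)="([^"]+\.(?:js|css)[^"]*)"', html)
print(urls)
```

Each URL this prints is a candidate to paste into the robots.txt tester.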
Grab the URL of any JS (or CSS) file and paste it into the robots.txt tester in your Search Console (under Crawl). You need to delete the part where your main-domain.com appears, since it is pre-filled. Then click Test.
Or you could simply grab the URL of any web page that is listed in Google Search Console rendering mode (3 images above).
The tester will point out the line in the robots.txt file that is responsible for the block, if there is one (see the robots.txt tester image above).
If that file is blocked, you will see “BLOCKED”, as shown in the image above. If it is allowed, the same box will say “ALLOWED”.
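You can also simulate this check offline with Python’s standard urllib.robotparser module. It is only a rough stand-in for Google’s tester (it does not support Googlebot’s wildcard syntax), but for simple path rules like the hypothetical ones below it behaves the same way:

```python
from urllib import robotparser

# Hypothetical rules that block a typical WordPress folder:
blocking_rules = """\
User-agent: *
Disallow: /wp-includes/
"""

rp = robotparser.RobotFileParser()
rp.parse(blocking_rules.splitlines())

# A typical WordPress script URL (example.com is a placeholder):
js_url = "https://www.example.com/wp-includes/js/jquery/jquery.js"
blocked = rp.can_fetch("Googlebot", js_url)
print(blocked)   # False: Googlebot may not fetch this JS file

# The same check after replacing the rule with an Allow line
# (a fresh parser, since RobotFileParser keeps the first ruleset it saw):
fixed_rules = """\
User-agent: *
Allow: /wp-includes/
"""
rp_fixed = robotparser.RobotFileParser()
rp_fixed.parse(fixed_rules.splitlines())
allowed = rp_fixed.can_fetch("Googlebot", js_url)
print(allowed)   # True: the file is crawlable again
```

This mirrors what the Search Console tester tells you: the Disallow rule yields BLOCKED, the Allow rule yields ALLOWED.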
The tester also shows the exact line in your robots.txt file that is responsible. Remove that line; if it is a Disallow rule, you can instead replace it with an appropriate Allow line.
For instance, in the robots.txt tester image above, the error is caused by the rule Disallow: /wp-includes/
So I should just replace it with Allow: /wp-includes/
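So, assuming the culprit was a Disallow: /wp-includes/ line, the relevant part of the file changes like this:

```text
# Before (blocks Googlebot from core WordPress JS files):
Disallow: /wp-includes/

# After:
Allow: /wp-includes/
```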
You might want to test again with a few other JS and CSS file links that you grabbed from the page source or from the list presented in Search Console.
If all is well, grab this content and paste it into your site’s robots.txt file. You’re done!
If you are using the Yoast SEO plugin (which I highly recommend, by the way), you can edit the robots.txt file right from your WordPress dashboard.
If you have this plugin installed, go to your WP dashboard and click “Tools” under “SEO”.
Then click on File Editor where you will have the option to edit your robots.txt file.
Use the same procedure as above: rectify the error in your robots.txt file using the Search Console tester, then grab the corrected lines and paste them into your actual robots.txt file.
Now, go fix the “Googlebot cannot access CSS and JS files” error on your site!
I hope you can do it yourself now. It all boils down to this:
- Open the robots.txt file on your website.
- Copy its content and paste it in the robots.txt tester inside Google Search Console.
- Find out a few links to js and css files either from the source code of your web page or from the rendering page in Search Console.
- Paste them in the tester to see if they are blocked; if so, the tester will point out the exact line in the robots.txt file that’s the culprit.
- Replace that rule with an appropriate rule. If it is Disallow: /wp-includes/ you might want to replace it with Allow: /wp-includes/
- After enough testing, if all is well, grab the content from the robots.txt tester and paste it into your website’s robots.txt file (there are various ways to edit your site’s robots.txt file; I’ve covered them in this post).
Hope you can do it! Try it out and let me know how it went in the comments below.