Google Search Console, previously Google Webmaster Tools, has been sending a warning to thousands of webmasters that Googlebot cannot access the CSS and JS files on their sites.
You may be wondering why Google needs to access the CSS and JS files of your site. To put it briefly: when crawling and indexing your site, Google reads not only the content but also the visual layout. If Google can fully understand your site's layout and structure, you are more likely to obtain higher rankings in search results. In short, it is all about your SEO performance.
WordPress itself does not block Googlebot from CSS and JS files, but you may have blocked them accidentally while trying to improve site speed or security. WordPress security plugins can also cause the problem and trigger the warning email.
The warning email includes some instructions for fixing the blocking issue, but you may find the information a little hard to follow. Below is a simple but detailed tutorial.
Identify the Blocked Resources
First, you need to find the files that are blocked from Googlebot so that you can take the appropriate action later. You can do this in two ways, both of which require access to your site's Google Search Console account.
Check the blocked resources
In the Search Console dashboard, expand the Google Index menu and click on Blocked Resources. If you have resources blocked, you will see a list of them and how many pages they have affected.
Clicking on the URLs under the "Host" column shows you the locations of the files that Googlebot is not allowed to access. Check through the results manually, and if you find JS or CSS files added by themes or plugins, you will need to edit the robots.txt file to make some modifications.
Use the Fetch as Google feature
You may not have seen any blocked resources in the section discussed above. In that case, you can use the fetching feature in Google Search Console to identify those resources and see how the blocking affects your website layout.
To do this, go to Fetch as Google under the Crawl menu, and then choose to fetch and render the homepage. Remember to do this for both Desktop and Mobile.
Once the fetching is successful, the result appears as a new row in the table. Click on it to see a side-by-side comparison of how your website displays to visitors and to Googlebot.
If there is any difference between the two, it means some CSS/JS files have been blocked from Googlebot. The blocked resources are listed at the bottom of the page. Clicking on the Robots.txt Tester link after each URL will show you the lines of your robots.txt that block Googlebot. Note these lines down.
Modify the robots.txt File
Now that you have located the blocked resources, you can correct the robots.txt file to grant Googlebot the appropriate access. Generally speaking, there are three easy ways to edit the file.
- You can connect to your site with an FTP client. Once connected, navigate to the root directory of your site, where you will find the robots.txt file.
- You can access the file by using a web-based file manager. The popular cPanel includes an easy-to-use one, which is also our choice.
- If you have installed WordPress SEO by Yoast plugin on your site, then you are able to edit the file directly in the WordPress dashboard by going to SEO > Tools > File editor.
When you open the file, you will find some lines that disallow access to certain directories of your site. Below is an example.
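The exact contents vary from site to site, but a restrictive WordPress robots.txt often looks something like the following. The directory paths shown here are common examples, not necessarily the ones in your own file:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
```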
Remove the restrictions
If you took notes while checking the blocked resources, you can now remove the corresponding lines to give Googlebot access to your CSS and JS files.
Typically, many of these files are located in the themes and plugins folders, so you may need to remove the lines blocking those directories. Blocking the wp-includes folder can also be a source of the problem, because some themes and plugins call scripts in this folder.
Another easy way out
If you are uncertain which lines to remove, or don't want to lift the existing restrictions, you can simply add the following lines to the robots.txt file to allow Googlebot to access all CSS and JS files. These lines will override the "Disallow" rules for those files without affecting the rules for any other file.
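A widely shared version of this fix uses wildcard rules like the ones below. Note that the * wildcard is an extension supported by Googlebot rather than part of the original robots.txt convention, and a more specific Allow rule takes precedence over a matching Disallow for Googlebot:

```
User-agent: Googlebot
Allow: /*.css
Allow: /*.js
```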
Don’t have a robots.txt file?
Another, less likely, scenario is that your robots.txt is empty or does not exist, which means all files are crawled and indexed automatically, yet you still receive a warning email.
If this is the case, you may have to contact your hosting provider to find out whether they are blocking some folders by default. You can either ask them to unblock the folders, or create a robots.txt file yourself and add a line like "Allow: /wp-includes/js/" to override the default configuration.
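A minimal robots.txt for this purpose could look like the following; adjust the path to match whichever folder your host is actually blocking:

```
User-agent: *
Allow: /wp-includes/js/
```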
Confirm the Modification
After editing and saving the robots.txt file, verify that the changes work by performing "Fetch as Google" once more. If the two screenshots look exactly the same and no blocked resources are listed, the problem has been resolved.