
How To Fix: “Googlebot cannot access CSS and JS files”

This week saw Google send out warnings to many sites via Search Console (aka Webmaster Tools) advising them about an issue on their site which affects how the algorithms render and index content.

If you want my opinion on this course of action, keep on reading; if you just want a quick fix, click here to skip my rant and get straight to the solution.

If you are reading this post, chances are you’ve read this message:

[Image: the Google Search Console warning message]

Personally, I found this message to be badly worded, which has served to instil panic in many a site owner who isn't technical-SEO or web-dev savvy – i.e. your client.

The use of phrases such as "detected an issue" and "suboptimal rankings" isn't entirely accurate.

Take one site we received the warning for: a WordPress site whose robots.txt disallowed /wp-includes/

All of the CSS and JS that affects rendering lived in the /wp-themes/ folder, which wasn't blocked. The only blocked file was one that the latest WordPress update now injects into the header automatically for emoji support.

More on how to fix that here:

https://wordpress.org/support/topic/emoji-and-smiley-js-and-css-added-to-head

But on its own, this file had no effect on how the site rendered, so there wasn't actually an issue.
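If you hit the same WordPress situation and would rather not unblock /wp-includes/ wholesale, one narrower option is to allow only its script and style subdirectories. This is a sketch – the subfolder paths assume a default WordPress install:

```text
User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/
```

Bear in mind that Allow is an extension to the original robots.txt convention; Googlebot honours it, but some older crawlers only understand Disallow.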

But if that robots.txt file was "detected" as an issue and triggered the message above, how many other sites are being warned about "suboptimal rankings" they were never actually at risk of?

Whilst I applaud Google's efforts to make people aware that CSS and JS need to be crawlable, and to get site owners to check how their pages render, I'm not convinced they are yet in a position to claim that failing to do so will harm your site.

It reminds me of the time everyone went into a panic thinking they had to migrate to https because Google suggested it would be in your interests to do so, from a ranking point of view (and user security/trust/etc. of course).

The Solution

So, it's fairly easy – go to Search Console and check how your site renders in the "Fetch as Google" section. Be sure to check both "Desktop" and "Mobile: Smartphone" from the drop-down:

[Image: the Fetch as Google screen in Search Console]

Once the fetch completes, click on its status to see how Google views your site – you'll know there is an issue if the rendered view on the left looks totally different to the visitor view on the right. Note that for the purposes of this experiment I blocked the JS and CSS folders on our own site using:

Disallow: /js/
Disallow: /css/
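You can reproduce the effect of those two Disallow lines with Python's stdlib `urllib.robotparser` – a quick sanity check rather than anything Google-specific (the domain and file paths below are placeholders):

```python
from urllib import robotparser

# Parse the same rules we added to robots.txt for the experiment.
rp = robotparser.RobotFileParser()
rp.parse("""User-agent: *
Disallow: /js/
Disallow: /css/
""".splitlines())

# Anything under the blocked folders is off-limits to crawlers...
print(rp.can_fetch("Googlebot", "https://example.com/js/main.js"))    # False
print(rp.can_fetch("Googlebot", "https://example.com/css/style.css")) # False

# ...while the pages themselves remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))    # True
```

This is exactly the situation that produces the half-rendered "Fetch as Google" result: the HTML is reachable, but the assets that style it are not.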

[Image: the broken, unstyled render produced by blocking the JS and CSS folders]

The solution is either to remove the robots.txt blocks on any folder containing JS or CSS that affects your layout, or – if you don't want to open those folders up for every file to be crawled – to add the following rules:

Allow: /*.js$
Allow: /*.css$
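To see why these Allow rules beat the broader Disallows, here's a minimal sketch in Python of the precedence logic Google documents (and RFC 9309 later standardised): the longest matching pattern wins, and Allow wins ties. The function names are my own, and it's deliberately simplified – Python's built-in `urllib.robotparser` won't do here because it doesn't understand the `*` and `$` wildcard extensions:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """True if a robots.txt pattern matches a URL path.
    '*' matches any run of characters; a trailing '$' anchors the end;
    otherwise a pattern is a prefix match."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if not anchored:
        regex += ".*"  # prefix match by default
    return re.fullmatch(regex, path) is not None

def allowed(rules, path):
    """rules: list of ("allow"|"disallow", pattern).
    Longest matching pattern wins; Allow wins ties; no match => allowed."""
    best = ("allow", "")
    for kind, pat in rules:
        if rule_matches(pat, path) and len(pat) >= len(best[1]):
            if len(pat) > len(best[1]) or kind == "allow":
                best = (kind, pat)
    return best[0] == "allow"

rules = [("disallow", "/js/"), ("allow", "/*.js$"),
         ("disallow", "/css/"), ("allow", "/*.css$")]

print(allowed(rules, "/js/app.js"))      # True: "/*.js$" (6 chars) beats "/js/" (4)
print(allowed(rules, "/js/readme.txt"))  # False: only "/js/" matches
print(allowed(rules, "/css/style.css"))  # True
```

So the stylesheet and script files themselves become crawlable again, while everything else in those folders stays blocked.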

This basically tells search engine robots that they can look at whatever CSS and JS files they want to, wherever they live.

Once added, test your site again and all should appear well:

[Image: the fixed render, now matching the visitor view]

You may find a number of third-party resources still blocked, but there's not much you can do about those – you don't control their robots.txt files.

And that’s it.
