If you’ve been keeping an eye on your Search Console (née Webmaster Tools) notifications – and if you haven’t, you really should start; do it now, I’ll wait – you may have noticed that Google is now sending out direct notifications about blocked resources that Googlebot is unable to access. Crucially, the message notes that this may hinder the way Google is able to render and index your content, which can subsequently lead to ‘suboptimal rankings’.
(We’ve got some work to do!)
Whether you’ve received a message regarding blocked resources or not, this information is accessible elsewhere – both within Search Console and other Google resources. The first place to check is the relatively new Blocked Resources Report, accessible under the ‘Google Index’ tab of Search Console.
If you want to check for blocked resources on a particular page, you can either use the Fetch as Google tool, under the ‘Crawl’ tab (now updated to show a side-by-side comparison of how Google views your site versus a typical user), or use Google’s Mobile-Friendly Test – both will give you a list of the resources utilised by the page but inaccessible to Googlebot.
Once you’ve established which resources (if any) are blocked, it’s relatively simple to check the robots.txt file on your domain and identify which rules are causing the blockage. Chances are it’ll either be the directory where plugins and other resources sit (e.g. Disallow: /wp-content/plugins/ on a plugin-heavy WordPress site), or that rules have been included to block access to specific filetypes (e.g. Disallow: /*.js$, using the wildcard syntax Googlebot supports).
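For illustration, here’s a sketch of what that might look like on a WordPress site – the paths are examples, so adapt them to whatever your Blocked Resources Report actually flags:

```
# Before: these rules block Googlebot from the CSS/JS it needs to render pages
User-agent: *
Disallow: /wp-content/plugins/
Disallow: /*.js$

# After: rendering assets are crawlable again, while genuinely
# private areas (e.g. the admin) stay blocked
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/plugins/
```

Note that Allow is more specific than a competing Disallow here, so Googlebot will follow it; the * and $ wildcards in the “before” example are supported by Googlebot but not by every crawler.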
Remove or amend these rules, re-fetch the URL and see if the warnings disappear. If you’re still concerned about any of these assets being indexed by Google you can look into utilising X-robots-tag HTTP headers to specifically disallow the files/URLs from being indexed.
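The X-robots-tag approach keeps files crawlable (so rendering still works) while telling Google not to index them. As a sketch, assuming an Apache server with mod_headers enabled and that it’s PDFs you want kept out of the index, you could add something like this to your server config or .htaccess:

```
# Hypothetical example: serve an X-Robots-Tag header on all PDF files
# Requires mod_headers to be enabled
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Unlike a robots.txt Disallow, this lets Googlebot fetch the file but instructs it not to index it – which is exactly the combination you want for assets that are needed for rendering but shouldn’t appear in search results.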