Google Rolls out "Blocked Resources" Notifications

David Hellowell

If you’ve been keeping an eye on your Search Console (née Webmaster Tools) notifications – and if you haven’t, you really should start; do it now, I’ll wait – you may have noticed that Google is now sending out direct notifications about blocked resources that Googlebot is unable to access. Crucially, the message mentions that this may hinder the way Google is able to render and index your content, which can subsequently lead to ‘suboptimal rankings’.

[Screenshot: “Googlebot cannot access CSS and JS files” notification]

The culprit behind all this is an over-zealous robots.txt file configured to block the crawling of JavaScript and CSS files, typically by disallowing the folder in which these external resources are housed. Until recently this was a commonly implemented tweak on many sites, the theory being that it’s undesirable for these files/URLs to be crawled or indexed as they offer no utility to a user. Google’s ability to parse JavaScript has historically been quite patchy, but this move, along with other recent updates, suggests that may be in the process of changing.
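As a rough illustration, the sort of robots.txt configuration that triggers these warnings usually looks something like the following – the directory names here are hypothetical placeholders, the principle is simply that the folders holding your scripts and stylesheets (or the file types themselves) are disallowed wholesale:

    User-agent: *
    # Block crawling of the folders that hold external resources
    Disallow: /includes/
    Disallow: /assets/
    # Or block the file types directly
    Disallow: /*.js$
    Disallow: /*.css$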

[Screenshot: blocked resources report]

(We’ve got some work to do!)

Whether you’ve received a message about blocked resources or not, this information is accessible elsewhere, both within Search Console and via other Google tools. The first place to check is the relatively new Blocked Resources Report, found under the ‘Google Index’ tab of Search Console.

[Screenshot: Fetch as Google tool]

If you want to check for blocked resources on a particular page, you can either use the Fetch as Google tool under the ‘Crawl’ tab (now updated to show a side-by-side comparison of how Google renders your page versus how a typical user sees it) or use Google’s Mobile-Friendly Test – both will give you a list of the resources the page uses that are inaccessible to Googlebot.

[Screenshot: Mobile-Friendly Test]

Once you’ve established which resources (if any) are blocked, it’s relatively simple to check the robots.txt file on your domain and identify which rules are causing the blockage. Chances are it’ll either be the directory where plugins and other resources sit (e.g. Disallow: /wp-content/plugins/ on a plugin-heavy WordPress site), or rules that block access to specific file types (e.g. Disallow: /*.js).
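If you’d rather not open up a whole directory, one option is to allow just the file types Googlebot needs in order to render the page. This is a sketch based on the fact that Google’s crawler supports the Allow directive and wildcards, and applies the most specific (longest) matching rule – so the Allow lines below win for CSS and JS files while the rest of the plugins folder stays blocked:

    User-agent: *
    Disallow: /wp-content/plugins/
    # More specific rules, so these take precedence for CSS and JS files
    Allow: /wp-content/plugins/*.css
    Allow: /wp-content/plugins/*.js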

Remove or amend these rules, re-fetch the URL and see if the warnings disappear. If you’re still concerned about these assets being indexed by Google, you can look into using X-Robots-Tag HTTP headers to specifically prevent the files/URLs from being indexed, without blocking Googlebot from crawling them.
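For instance, on an Apache server with mod_headers enabled, a noindex header could be applied to script and stylesheet files along these lines – a sketch only, and nginx and other servers have equivalent directives:

    <FilesMatch "\.(js|css)$">
      # Tell search engines not to index these files, while still allowing them to be crawled
      Header set X-Robots-Tag "noindex"
    </FilesMatch>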
