SEO is a fast-moving discipline at times, but even when there aren’t rumblings from Google and SERP-watchers about algorithm changes and penalties, there are always snippets of insight that highlight how the world of search is evolving on a weekly basis – hence this regular feature.
Last week’s Seven Days in Search roundup is here.
Here’s the Blueclaw roundup of the three top stories of the previous Seven Days in Search:
Google Plans to Release Mobile First Index, Slowly
On a recent Hangout, Google’s John Mueller discussed the Mobile First Index, saying it could be rolled out in batches depending on which sites are ready.
“My guess is what will happen is, we will provide you with more information about the type of issues that we’ve run across so far in our tests. And based on that we’ll give you more guidance on when we expect some things to happen.
It’s also possible that we’ll say well this batch of sites works perfectly fine on mobile first indexing so we’ll just switch that over and wait with the next batch until we’re certain that they’ve been able to solve these problems.
But that’s something where we’ll have more information kind of as time goes by.”
So, what will this mean for your website? When Google makes the switch, you should expect to see some ranking and traffic fluctuations. If you haven’t yet rolled out a website that is optimised for mobile SERP users, you need to do so sooner rather than later. Other considerations such as HTTPS and AMP will help massively, and of course providing content on your pages that ends the user’s search should be your end goal.
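As a quick self-check ahead of mobile-first indexing, you can at least confirm that your pages declare a responsive viewport (and, if you use AMP, that the AMP attribute is present). Below is a minimal sketch in Python using only the standard library; this is a rough heuristic of our own devising, not Google’s actual mobile-first criteria, and the sample HTML is a placeholder.

```python
from html.parser import HTMLParser

class MobileSignalParser(HTMLParser):
    """Rough heuristic: look for a responsive viewport meta tag and AMP markup."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.has_amp = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # A responsive page normally declares width=device-width in its viewport.
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = "width=device-width" in (attrs.get("content") or "")
        # AMP pages mark the <html> tag with an "amp" (or lightning bolt) attribute.
        if tag == "html" and ("amp" in attrs or "⚡" in attrs):
            self.has_amp = True

# Placeholder markup standing in for a fetched page.
sample = ('<html><head>'
          '<meta name="viewport" content="width=device-width, initial-scale=1">'
          '</head><body></body></html>')

parser = MobileSignalParser()
parser.feed(sample)
print(parser.has_viewport)  # True for this responsive sample
print(parser.has_amp)       # False - no AMP markup here
```

In practice you would feed this the HTML of your live pages; a missing viewport declaration is one of the simplest signs a template was never optimised for mobile.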
RankBrain Is Irrelevant So Stop Trying to Optimise for It
John Mueller also put out a recent tweet talking about RankBrain.
The key point John was trying to get across is that if you’re providing a great site with great content that answers your users’ wants and needs, then you’ll naturally hit the right signals for RankBrain and your site will do well.
Google is using machine learning to develop algorithms that rank search results according to how well they answer the search query. It is essentially trying to understand human behaviour rather than pure keyword relevancy.
So, keep building great sites that provide the best and most useful information, optimised and organised around your users’ intent and behaviour, and aim at getting visitors to keep coming back to you.
Good Links, Bad Links – The Debate Continues
Firstly, in its most basic form for SEO, a natural or good link is one created by a webmaster or website publisher who has linked to a source of information because they genuinely want to reference something relevant that provides further insight or information around the topic they are publishing content about. They have also done this without being coerced or incentivised to do so. In summary, they have written about a topic, done some further research, and linked to the source that helped them complete their piece of work and offers additional insight or information.
A bad link is the exact opposite of this!
We know what a bad link looks like; however, one thing that confuses a lot of SEOs is the difference between a good and a bad site. A good link can sit on a bad site, and a bad link can easily be placed on what could be deemed a good site (a good site is typically measured by Moz’s Domain Authority metric).
Google’s John Mueller recently waded in on this conversation, saying that if a good link is found on an obviously spammy site, you can simply ignore it.
One grey area that a lot of SEOs get wrong is how to attract links to their sites. Content marketing is the number one strategy for gaining links to a site. Simply promoting your own website’s content and telling people that it exists is the best way of building good natural links; however, you must be confident that what you are publishing is good and link-worthy. Incentivising people to link to you is something you should avoid.
If a site is known to be spammy, especially to Google, then the site and any links from it should already have been devalued by Google, so unless those sites could send you meaningful traffic, our recommendation is to avoid them and concentrate on implementing a long-term content marketing strategy.
Do You Want to Be in My Gang?
So, Gary Illyes (aka @methode on Twitter, aka House Elf and Chief of Sunshine and Happiness over at Google) recently put out a tweet saying that if you have an SEO theory, you should have someone else from the industry review it.
Note that this was a two-part tweet; he then went on to recommend a list of helpful, long-time SEOs.
Which, well err, stirred up a bit of a discussion on the thread over on Twitter.
What’s interesting here is that a public face of Google is recommending non-Google people for SEO advice. Google has always had a history of sending out conflicting messaging around ranking factors, obviously to protect how its algorithms work, so it’s a bit strange that Google is now doing this and appears to have a list of ‘SEO favourites’. Remember Gary’s recent rant about why sites don’t link any more, where he seemed oblivious to the fact that he and his team created a fear around linking between sites.
Is this a new strategy to keep SEOs guessing about Google’s algorithms? What criteria went into Gary’s selection of this list of ‘industry experts’? It just seems a strange move all in all.
The plot thickens.