A glance at the SEO blog scene will reveal dozens of ‘experts’ offering advice which essentially parrots Google’s guidelines. This is fine, in most cases. However, if you’re a natural sceptic with a ‘Question Everything’ attitude, you might wonder: is Google always right? Furthermore: does Google always tell the truth when it makes its recommendations?

Read on for some examples of Google guidance which shouldn’t necessarily be taken as gospel.

1. Manually building links to your website does more harm than good

At a conference, Google’s John Mueller was once asked if link building is “good in any way.” In response, John offered this famously controversial tidbit:

“We do use links as part of our algorithm but we use lots and lots of other factors as well. So only focusing on links is probably going to cause more problems for your web site than it actually helps.”

Google’s dislike for link building is even enshrined in the Webmaster quality guidelines. They state: “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.”

There’s nothing wrong with the idea that websites with valuable, shareable content should gather links steadily and naturally over time. Yet there is evidence that links remain one of Google's strongest ranking signals, regardless of their supposed legitimacy.

First, think for a moment: if this guidance on ‘link schemes’ were consistently enforced, would there still be a thriving market for black-hat link building and guest posting aimed squarely at increasing PageRank? Moreover, what would happen to the vast numbers of popular publications running ‘advertorials’ and ‘sponsored content’ which conveniently include targeted links back to corporate websites?

Even recently, Google’s ranking standards were dodged by parties working together to hijack the ‘trust’ passed on by hyperlinks. See this Reddit SEO community post from two years ago. It highlights how a US media network pushed a low-quality sales site, littered with Amazon affiliate links, to the top of search rankings for a range of competitive niches. How was this achieved? The sites were using footer links - long regarded as among the lowest forms of manipulative linking practice!

There can be no doubt: having brands such as Cosmopolitan, Esquire and Elle link from their websites to yours can help you to outrank others, even if your competition invested significant toil to earn their organic position. This, of course, is a quirk of how big brands are inherently trusted by Google’s algorithms - ironically, due to their healthy and ‘natural’ backlink profiles.

Even Google’s guide to link schemes states that alongside quality and relevance, the quantity of links is a key factor in your rankings.

‘Content is king’ - but is it really?

The importance of strategic digital content has been emphasised endlessly in recent years. There’s no lie here, either: having share-worthy content is a boon to any online business.

However, the idea that content will generate links, likes and shares all by itself is plainly fantasy - unless you already enjoy the benefits of a broad public profile. Even if your article is objectively the best in its niche, it’s very unlikely to perform better in organic search than content backed by links with relevant anchor text.

All evidence suggests that Google’s robots are still fairly literal-minded, preferring to look at on-page text before things like community engagement. The fact is: if you don’t build links in some form, the SEO victories will go to the competitors who do.

About anchor text

Another aspect of Google’s guidance on links is that anchor text - the visible, clickable text of a hyperlink - shouldn’t be ‘over-optimised’ to target your main keywords.

A comprehensive study of Google’s Penguin update by the now-defunct Microsite Masters revealed that the websites punished by Penguin all had keyword-optimised anchor text in at least 65% of their backlinks, while sites with a ratio of 50% or under saw no rankings hit.
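To make that ratio concrete, here’s a minimal Python sketch of the kind of check involved. The anchors, brand names and keywords below are entirely hypothetical - they are not data from the Microsite Masters study - and a serious audit would pull anchors from a backlink tool’s export instead.

```python
# Rough sketch: estimate what share of a backlink profile uses
# keyword-optimised ("money") anchor text. All data here is made up.

backlink_anchors = [
    "cheap blue widgets",    # exact-match keyword anchor
    "Acme Widgets",          # brand anchor
    "click here",            # generic anchor
    "https://example.com",   # naked URL anchor
    "cheap blue widgets",
    "this guide",
]

target_keywords = {"cheap blue widgets", "buy blue widgets"}

optimised = sum(1 for a in backlink_anchors if a.lower() in target_keywords)
ratio = optimised / len(backlink_anchors) * 100

print(f"Keyword-optimised anchors: {ratio:.1f}% of {len(backlink_anchors)} backlinks")
# Read against the study's rough bands: profiles above ~65% were hit by
# Penguin, profiles at ~50% or under were not.
```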

Our ‘click here’ search experiment

If you needed proof that anchor text in links remains an important signal used by Google to understand content on a webpage, here it is. We had the Selesti team search Google for the term ‘click here’. While the top three results were split between some deserving sites, positions four and five were more interesting.

The website in fourth position not only used ‘click here’ as anchor text more than any other phrase on its homepage (12.77% of its links); a backlink check also showed 437 referring domains linking to the page with ‘click here’ anchor text. The same was true of the fifth-position site, which - although it had no ‘click here’ anchors on the page itself - was linked to with ‘click here’ anchor text from 296 domains.
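If you want to reproduce the on-page half of that check yourself, a short script along these lines will do it. This is only a sketch using the requests and BeautifulSoup libraries; the URL is a placeholder, and a real audit would also need to handle relative links, redirects and pages rendered with JavaScript.

```python
# Sketch: measure what share of a page's links use a given anchor text.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"   # placeholder - swap in the page you want to audit
TARGET_ANCHOR = "click here"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

links = soup.find_all("a")
matches = [a for a in links if a.get_text(strip=True).lower() == TARGET_ANCHOR]

share = (len(matches) / len(links) * 100) if links else 0
print(f"{len(matches)} of {len(links)} links ({share:.2f}%) use the anchor '{TARGET_ANCHOR}'")
```

The off-page half - how many referring domains link with a given anchor - needs export data from a backlink index, since that information isn’t visible on the page itself.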

This clearly supports the idea that you can and should use some keyword-targeted anchor text where possible. To be safe, we recommend using it sparingly.

Important note: we’re definitely not suggesting that you ignore Google’s advice around links. Natural links are the best links, and something we encourage wherever possible! On the other hand, we are definitely hoping to inspire a healthy scepticism - especially when what Google says contradicts what is proven to work.

2. Page speed is a major ranking signal

Since 2010, it’s been largely accepted as fact that site speed is a key ranking factor. Look around for blogs on trends and focus areas - site speed is something digital marketers have become a bit obsessed with.

That’s not without reason. Data definitely suggests that a slow site speed can contribute to reduced user satisfaction, conversions and revenue. But is site speed actually an important factor in whether your website appears in the top ranking position?

When Google released an update to their PageSpeed Insights tool last year, the emphasis on new dimensions - like First Contentful Paint (FCP) and DOM Content Loaded (DCL) - seemed to confirm to many digital marketers just how important site speed must be.
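Those metrics are easy enough to pull programmatically. Below is a minimal Python sketch against the public PageSpeed Insights v5 API; the URL is a placeholder, and the response field paths reflect our reading of the current API and may change, so check Google’s API reference before building anything on top of them.

```python
# Sketch: fetch speed metrics for a page from the PageSpeed Insights v5 API.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/", "strategy": "mobile"}  # placeholder URL

data = requests.get(API, params=params, timeout=60).json()

lighthouse = data["lighthouseResult"]
fcp = lighthouse["audits"]["first-contentful-paint"]["displayValue"]
perf_score = lighthouse["categories"]["performance"]["score"]  # 0.0 - 1.0

print(f"First Contentful Paint: {fcp}")
print(f"Lighthouse performance score: {perf_score * 100:.0f}/100")
```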

Google has been fairly clear about the implications of 2018’s ‘Speed Update’. So it should come as no surprise that only ‘superslow’ sites see any ranking downgrade.

The hypothesis that speed leaves most websites’ rankings untouched was borne out by data from Aleh Barysevich of SEO PowerSuite, which showed that metrics like FCP and DCL had zero correlation with search rankings. This, despite months of scrambling in the industry to put site speed at the centre of SEO strategies.
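For anyone who wants to sanity-check that kind of claim in their own niche, the underlying test is straightforward: take a sample of ranking URLs, record a speed metric for each, and measure rank correlation. The sketch below uses SciPy’s Spearman correlation with made-up illustrative numbers, not Barysevich’s data.

```python
# Sketch: test whether a speed metric lines up with ranking position.
# The figures below are invented purely for illustration.
from scipy.stats import spearmanr

ranking_positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
fcp_seconds = [2.1, 3.4, 1.2, 2.8, 4.0, 1.9, 3.1, 2.2, 2.7, 3.6]

rho, p_value = spearmanr(ranking_positions, fcp_seconds)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
# A rho near zero (with a high p-value) means faster FCP does not
# coincide with better positions in this sample.
```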

To reiterate: in the months surrounding Google’s so-called ‘Speed Update’, the vast majority of websites were unaffected by their actual load speed. It turned out that what mattered most for Google rankings was the level of speed optimisation achieved on the website. Optimisation suggestions, of course, have always been on offer within the PageSpeed Insights tool.

That means agency SEOs and developers who spent months viciously stripping code and plugins from their websites - just to shave a few milliseconds off load times for their users - could simply have ticked the speed optimisation boxes we’ve always known about and focused their efforts elsewhere.

Now, there’s a good chance that Google’s algorithm will become more sophisticated in its measurement of site speed over time. However, this is one example of how a Google directive - saying ‘site speed matters’ - was taken as total truth, leading industry professionals to pursue the world’s fastest website in the name of a non-existent SEO benefit.

To be clear: a fast site is good for user experience. But nobody, in the name of a page one Google ranking, should be revolutionising their website’s look and feel in a speed improvement initiative.

3. Clicks and user behaviour from search results pages aren’t ranking factors

The idea of measuring users’ click data to decide rankings - recording their navigation through search results - remains controversial to this day. Part of the reason for the controversy lies in Google’s lack of consistency: when pressed, Google’s official position has usually been avoidance, if not outright denial.

To quote Google’s Gary Illyes: "CTR (click-through rate) is too easily manipulated for it to be used for ranking purposes".

The idea that CTR can be manipulated seems self-evident. At the same time, the notion that Google’s algorithm doesn’t include click data in its calculations - or subsequent behaviour on landing pages - seems counter-intuitive to many digital marketers. After all, Google is famous for emphasising relevance and accuracy in their search results. How on earth do you measure relevance without looking at:

1) Which pages users click on for each search query?

2) Whether users engaged more with some pages than others (with time on page as a metric, for example)?

For Google’s ranking factors not to include some measurement of these statistics seems illogical, at best.
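Neither signal is hard to compute, either, which is part of why the denial rings hollow. A back-of-the-envelope Python sketch, using entirely hypothetical numbers:

```python
# Sketch: the two click signals discussed above, from hypothetical log data.

impressions = 12_400           # times a result appeared for a query
clicks = 980                   # times searchers clicked that result
total_dwell_seconds = 68_600   # cumulative time those clickers spent on the page

ctr = clicks / impressions * 100
avg_dwell = total_dwell_seconds / clicks

print(f"Click-through rate: {ctr:.1f}%")
print(f"Average dwell time: {avg_dwell:.0f} seconds per click")
```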

Interestingly enough, public patents for Google products and services seem to indicate that click data is used in ranking calculations. Tellingly, some of these refer to methods for filtering out the spammy and manipulative click data which Google cites as the very reason it can’t use clicks for rankings! That’s fishy, to say the least.

Furthermore, the famous Rand Fishkin - formerly of Moz - performed several experiments which appeared to prove that click data is a ranking factor. In his tests, Rand asked his followers to search for a specific keyword, click on the #1 result and bounce back (leave without interacting). The users would then click on the #4 result and stay on the page a while. Within days, the lower result at #4 had moved into the #1 position.

Both Google’s community managers and expert bloggers have repudiated the results of this experiment. However, it’s easy to see why Google has to insist that CTR isn’t important: to say otherwise would only encourage people to manipulate the results.

While Google may continue to insist it doesn’t use clicks and CTR in ranking considerations, a bit of scepticism around Google’s guidance seems more than reasonable in this instance.

Reasons not to always trust Google

Now, surely going against Google’s guidelines is a risky path to pursue?

Absolutely. It’s not something that should be taken lightly, or without careful consideration. Yet it’s noteworthy that a decent proportion of the digital community has come to be more than a bit wary of Google’s advice over the years.

A perception among some industry players that “Google doesn’t like SEOs” goes back to 2013, when Google moved to secure search and began withholding organic keyword data from marketers’ analytics reports. To many, this was clear confirmation of the corporation’s aim to force users into its paid-for AdWords (now Google Ads) platform.

Many now argue that Google’s data should be treated as suspect. They suggest the company has no reason not to nudge users toward bidding on more expensive search terms, especially if doing so raises more revenue without raising questions. In recent years, the rise of ‘top position’ results and other extra SERP features has convinced even more SEOs that Google is moving aggressively to make them, and their clients’ websites, redundant.

This, for the most part, is conjecture and conspiracy. Yet there is evidently good reason to question Google’s digital marketing guidance, especially when common myths conflict so often with proven reality.

We’re not suggesting rules are made to be broken. At Selesti we always stick to best practices and proven industry standards. We also know when to dig into the data and question authority, without accepting recommendations blindly. Can we help with your digital marketing strategy? Get in touch and let us know!