The old days of gaming Google's ranking algorithm are over, but many SEO professionals haven't realized it yet.
Once upon a time, the world was simple. There was a paper – “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” by Sergey Brin and Larry Page – that told us how Google worked. And while Google quickly evolved beyond the concepts in that document, it still told us what we needed to know to rank highly in search.
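At the heart of that paper was the PageRank formula: a page's rank is the damped sum of the ranks of the pages linking to it, each divided by that linking page's outbound link count. Here is a minimal sketch of that calculation – the tiny three-page graph is invented for illustration, and d = 0.85 is the damping factor the paper suggests:

```python
# A minimal sketch of PageRank as described in the original paper.
# The three-page graph below is invented for illustration.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # initial rank for every page
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page T linking to p contributes PR(T) / C(T),
            # where C(T) is T's count of outbound links.
            incoming = sum(pr[t] / len(links[t])
                           for t in pages if p in links[t])
            new[p] = (1 - d) + d * incoming
        pr = new
    return pr

print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))
```

Because rank flows through links, every additional inbound link raises a page's score – which is exactly what made link buying so attractive.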
As a community, we abused this – and many made large sums of money simply by buying links to their sites. How could you expect any other result? Offer people a way to spend $2 and earn $10, and obviously many of them are going to sign up for that program.
But our friends at Google knew that providing the best search results would increase their market share and revenue, so they continually made changes to improve search quality and protect against spammer attacks. In large part what made this effort successful was hiding the details of Google's ranking algorithm.
When reading the PageRank paper was all you needed to do to formulate your SEO strategy, the world was simple. But Google has since published hundreds of patents, most of which probably have not been implemented and never will be. There may even be trade-secret ranking factors at Google for which patent applications have never been filed.
However, as search marketers, we still want to keep things simple: “Let's optimize our site for one factor and get rich.” In today's world, that is no longer realistic. There is so much money in search that any single factor has been thoroughly tested by many people. If there were a factor that could be exploited for guaranteed SEO success, someone would have published it by now.
“Many different signals” contribute to Google rankings
Despite the fact that there is no magic formula for obtaining high rankings, SEO professionals often look for quick and easy fixes when a site's rankings take a dive. In a recent Google Webmaster Central office-hours hangout, a participant asked Google Webmaster Trends Analyst John Mueller how to improve his site's content to reverse a drop in traffic that he believed was due to the Panda update, dating back to May 2014.
The webmaster told Mueller that he and his team were reviewing the site category by category to improve the content; he wanted to know whether rankings would improve category by category, or whether an overall score is applied to the entire site.
Here's what Mueller said in response (emphasis mine):
“For the most part, we've moved more and more toward understanding sections of the site better and understanding the quality of those sections. So if you are... reviewing your site step by step, then I would expect to see... a gradual change in the way we view your site. But I also assume... so yeah... you've had a low-quality site since 2014, that's a long time to... maintain a low-quality site, and that's something where I suspect **there are a lot of different signals that are... telling us that it's probably not a great site**.”
I want to draw your attention to the bolded part of the quote above. Doesn't it make you wonder: what are those “many different signals”?
While it's important not to overanalyze every statement from Googlers, this certainly sounds like the related signals involve some form of cumulative user engagement metrics. However, if it were as simple as improving user engagement, it probably wouldn't take long for a site hit by Panda to recover – as soon as users started reacting better to the site, the problem would quickly be fixed.
What about CTR?
Larry Kim is passionate about the possibility of Google using CTR directly as an SEO ranking factor. By the way, his article on the topic is a great read, as it gives lots of advice on how to improve your CTR – which is clearly a good thing regardless of any impact on rankings.
That said, I don't think Google's algorithm is as simple as measuring the CTR on a search result and moving the high-CTR items up the SERP. For one thing, such a signal would be too easy to game, and industries well known for aggressive SEO testing would already have used it to grab higher rankings and make millions of dollars. For another, a high CTR says nothing about the quality of the page you will land on – it speaks only to how good a job you did writing the title tag and meta description, along with the strength of your branding.
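That argument is easier to see when you remember what CTR actually measures: clicks divided by impressions, both recorded before the searcher ever experiences the landing page. A quick illustrative calculation (the numbers are invented):

```python
# CTR is just clicks divided by impressions -- both are recorded
# before the searcher ever experiences the landing page.
impressions = 1_000  # times the result was shown (invented number)
clicks = 42          # times it was clicked (invented number)
ctr = clicks / impressions
print(f"CTR = {ctr:.1%}")  # CTR = 4.2%
```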
We also have statements from Paul Haahr (a Google ranking engineer) about how Google works. He gave a presentation related to this at SMX West in March 2016. In it, he discusses how Google uses a variety of user engagement metrics in ranking. The upshot is that, according to him, they are not used as a direct ranking factor, but instead feed periodic quality-control checks of the other ranking factors Google uses.
Here's a summary of what his statements imply:
- CTR, and similar signals, are not a direct Google ranking factor.
- Signals like content quality, links, and algorithms like Panda, Penguin, and probably hundreds of others are what they use instead (the “Core Signal Set”).
- Google runs a series of quality-control tests on its search results. These include CTR and other direct measurements of user engagement.
- Based on the results of these tests, Google will adjust the core signal set to improve test results.
The reason for this process is that it allows Google to run its QA tests in a controlled environment where they are not easily subject to gaming, which makes the algorithm much harder for black-hat SEOs to manipulate.
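To make the distinction concrete, here is a minimal sketch of that feedback loop. The signal names, weights, and CTR numbers are all invented for illustration – this is emphatically not Google's actual code:

```python
# Invented sketch contrasting "CTR as a direct ranking factor" with
# "CTR as an offline quality-control check."

core_weights = {"links": 0.6, "content_quality": 0.4}  # the "Core Signal Set"

def rank(pages, weights):
    """Order pages using core signals only -- CTR is never an input here."""
    return sorted(pages,
                  key=lambda p: sum(weights[s] * p[s] for s in weights),
                  reverse=True)

def qa_passes(ranking, observed_ctr):
    """Offline QA test: does the top-ranked page match what aggregate
    engagement data says users actually prefer?"""
    most_clicked = max(observed_ctr, key=observed_ctr.get)
    return ranking[0]["name"] == most_clicked

pages = [
    {"name": "a", "links": 0.9, "content_quality": 0.1},
    {"name": "b", "links": 0.3, "content_quality": 0.7},
]
observed_ctr = {"a": 0.02, "b": 0.11}  # gathered in aggregate, offline

if not qa_passes(rank(pages, core_weights), observed_ctr):
    # Engineers retune the core weights and re-run the tests. CTR shapes
    # the weights; it never changes an individual page's live score.
    core_weights = {"links": 0.3, "content_quality": 0.7}
```

The key design point is that engagement data only influences the weights during offline testing; an individual page can't raise its own score simply by attracting clicks.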
So who is right: Larry Kim or Paul Haahr? I don't know.
Returning to John Mueller's comments
Looking at the John Mueller statement I shared above, it strongly implies that some effect accumulates over time, to the point where “there are a lot of different signals that are... telling us it's probably not a great site.”
In other words, my guess is that if your site has generated a lot of negative signals for a long time, it is harder to recover, since you need to generate new positive signals over a long period to compensate for the bad track record you have accumulated. Mueller also makes it sound like a graduated scale, where improving a site will be “a long-term project where you will likely see gradual changes over time.”
However, let's consider for a moment that the signals we are talking about may be links. Shortly after the hangout, on May 11, John Mueller also tweeted that you can get an unnatural link from a good site and a natural link from a spammy site. When you think about it, that makes complete sense.
How does this relate to the hangout conversation? I don't know that it does. But it is entirely possible that the signals John Mueller is talking about are links – in which case, analyzing your link profile and disavowing the unnatural links would probably speed up the recovery process dramatically. Is that the case? If so, why wouldn't he have said that? I don't know.
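For anyone who goes down that road, the disavow file that Google's Disavow Links tool accepts is a plain text file: one domain or URL per line, with `#` marking comments. The domains below are placeholders:

```
# Example disavow file (placeholder domains).
# Disavow every link from an entire domain:
domain:spammy-link-network.example

# Disavow links from a single page:
http://low-quality-directory.example/paid-links.html
```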
We all try to oversimplify how the Google algorithm works
As an industry, we grew up in a world where we could read the original PageRank paper by Sergey Brin and Larry Page and more or less understand the Google algorithm. Although Google's initial release had already deviated significantly from that document, we knew that links were a big deal.
This made it easier for many of us to be successful on Google, so much so that you could take a really bad site and get it to rank high with little effort. You just got tons of links (in those days, you could just buy them), and you'd be set. But in today's world, while links still matter a lot, there are many other factors at play. Google has a vested interest in keeping the algorithms they use vague and unclear, as this is a primary way of fighting spam.
As an industry, we need to change the way we think about Google. Yet we seem desperate to keep the algorithm simple. We want to say, “Oh, it's this one factor that really drives rankings,” but the days when that was true are long gone. We are no longer in a PageRank world, where we are handed a patent or paper that explains everything, know it is the fundamental basis of Google's algorithm, and then simply know what to do.
The company with the second-largest market capitalization on planet Earth has spent nearly two decades improving its ranking algorithm to ensure high-quality search results, and maintaining that quality requires, in part, that the algorithm be too complex for spammers to game easily. That means there won't be one or two dominant Google ranking factors.
That's why I continue to encourage marketers to understand Google's goals and learn how to thrive in an environment where the search giant continues to inch closer to those goals.
We're also approaching a highly volatile moment in the market, with the rise of voice search, new devices like the Amazon Echo and Google Home, and the imminent rise of personal assistants. This is a disruptive market event: Google's position as the number one player in search may be secure, but search as we know it may become a less important activity. People are going to shift to voice commands and a centralized personal assistant, and traditional search will play a smaller role in that world.
What this means is that Google needs its results to be of the highest possible quality, while continuing to fight spammers at the same time. The result? A dynamic, ever-changing algorithm that keeps improving overall search quality as much as it can, maintains Google's stranglehold on market share and, if possible, establishes an advantage in the world of voice search and personal assistants.
What does this mean for us?
The old days of gaming Google's ranking algorithm are over. Instead, we need to work on a few core things:
- Make our content and site experience as outstanding as possible.
- Prepare for the world of voice search and personal assistants.
- Connect with new technologies and opportunities as they become available.
- Promote our products and services in a highly effective way.
In short, make sure your products and services are in high demand. The best defense in a rapidly changing market is to make sure consumers want to buy from you. This way, if any future platform doesn't provide access to you, your potential customers will let them know.
Source: Search Engine Land