Google changes its search algorithm up to 600 times a year. Most of these are minor changes, but once every few months the search giant implements an important algorithmic update that determines how search results will appear until the next major update.
Without knowing Google's timeline for these changes, search marketers and SEOs are left operating blind, bracing for a sudden decline in their clients' rankings and organic website traffic.
Thankfully, Google has been somewhat open and usually announces when a major change will happen.
Matt Cutts, head of webspam at Google, recently taped a round of webmaster videos on what to expect in the next few months of SEO for Google.
Some of these upcoming changes will continue the fight against webspam and black hat SEO, while others will help webmasters find solutions for hacked websites.
Cutts said in the video that Penguin 2.0, the next-generation update to Penguin, will launch within the next few weeks. It will dig deeper and eventually have more impact than Penguin 1.0.
He said native advertorials, ads written and presented in journalistic style, can violate Google's quality guidelines. The rule is simple: if a company pays for a placement, the links in that placement must not pass PageRank. Yet some websites have been passing PageRank through paid links anyway. There is nothing wrong with the advertorial itself, but its links must not pass PageRank, and search marketers must disclose clearly so readers know whether the content is paid or editorial.
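In practice, the standard way to keep a paid link from passing PageRank is the rel="nofollow" link attribute. A minimal sketch of the distinction Cutts describes (the URLs and anchor text are placeholders for illustration):

```html
<!-- Paid/advertorial link: rel="nofollow" tells Google not to pass PageRank -->
<p>Sponsored content:
  <a href="https://example-advertiser.com" rel="nofollow">Example Advertiser</a>
</p>

<!-- Normal editorial link: no attribute needed, PageRank flows as usual -->
<p>As covered in the
  <a href="https://example-news-site.com/report">original report</a>.
</p>
```

Note that nofollow only addresses the PageRank side; the disclosure requirement is separate and must be visible to human readers, not just to crawlers.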
When it comes to queries, Google plans to clamp down on spammy results for searches such as "payday loans" and some pornographic terms.
Penguin 2.0 will demote pages whose high rankings are attributable to irrelevant or manipulative links. Cutts said they have new data to experiment with in the coming months.
Google will use the update to improve the detection of hacked websites and will try to provide a single place where webmasters can find the resources needed to fix the problem. Cutts didn't discuss, however, how Google distinguishes a hacked site from a site that unknowingly serves malware.
Penguin 2.0 will boost the rankings of authority websites because Google wants an easier way to identify the authorities in a given space. "Authority" here is subjective, and it may mean that a small site, even one with higher-quality content, will struggle to rank against a well-known "authority" site.
Cutts said they are looking at additional quality signals to soften the effect of Panda, Penguin's predecessor, on sites in the "grey zone" or "on the border." Google wants to reduce the update's impact on sites that exhibit those signals.
Penguin 2.0 will cut down on redundant results from a single website. Penguin 1.0 was supposed to have addressed this on the first page of search results, but some people report duplicates appearing as far as the third page.
Cutts assured, as he did at the beginning of the video, that websites offering high-quality content have nothing to fear; none of these changes will affect them.
He issued a challenge to black hat SEOs, and advised people who frequent black hat forums to stop trading tricks designed to fool the algorithm, or they will suffer the consequences.
The changes are designed to thin out the ranks of webmasters who rely on black hat tactics. Finally, Cutts said Google wants to give better chances to startups and small business owners who play by white hat rules than to larger competitors who do it the wrong way.