Algorithm Update – the next leap!?
Called Penguin 2.0 internally at Google, the latest update, rolling out in the next week or so, aims to have a significant impact on web spam tactics and the value these methods impart to SERPs and a site's authority, even more so than the last major update.
Will this mean a rise of authentic business as a result? Here’s hoping!
In response to the question “What should we expect in the next few months in terms of SEO for Google?”, Matt Cutts of Google's Webspam Team responds in the following video.
At the base of any business's approach online should be a real site with real content: the site users want to visit, use and keep coming back to; the sort of site people want to share, bookmark and tell their friends about.
If this is the core starting goal for any company, Google wants to support this approach, and support your efforts to serve your users, by reflecting your valuable site in the SERPs. We hope this will be adequately reflected in the latest updates over the next few weeks (based on current information). The coming updates include refinements to the search algorithm which aim to have an impact on the following items.
Items that violate Google's quality guidelines are being looked at, to remove any opportunity for these instances to flow PageRank to another site. An example of this sort of tactic is when someone pays for coverage, references or native advertising space on a website. The new update looks at removing the ability for this type of tactic to pass PageRank (pass value).
Matt suggests that these sorts of ads or inclusions need to be clear and conspicuous, disclosing to users that they are indeed a paid message on the site, and that they should not intentionally pass value.
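Matt doesn't spell out the mechanism in the video, but Google's quality guidelines already document how a paid placement can be disclosed without passing value: label it for users and mark the link with rel="nofollow". A minimal sketch (the URL and anchor text here are illustrative only):

```html
<!-- A disclosed paid placement. rel="nofollow" is Google's documented
     way to tell its crawler that a link should not pass PageRank. -->
<p>Sponsored content:</p>
<a href="https://example.com/partner-offer" rel="nofollow">Partner offer</a>
```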
Two different changes are coming to deal with search queries in areas that have traditionally seen more spammy practices employed to generate ideal rankings in SERP listings. Queries such as “pay day loans”, along with niches highly contested by spammers such as the pornographic, gambling and pharmaceutical industries, to name a few, are the intended targets.
We will have to see how this roll-out truly affects the SERPs once the cat is out of the bag, as Google are not saying much more about this element of the update right now.
Devaluing upstream spam links
Denying the passing of value from links ‘upstream’ of the spammers' practices is intended to roll out as part of the algorithm update over the next few months. Looking at the sites and links ‘juiced up’ at the top of the spammers' ‘food chain’ will have a knock-on effect, devaluing that which sits under it.
Google are in the early days of a more sophisticated link analysis methodology to better determine the value of links, and are yet to see whether it will ‘bear fruit’ or not. Stay tuned.
Google are looking at detecting hacked sites more effectively, and at communicating better with webmasters about hacked sites by providing better information, notifications and tools through Google Webmaster Tools.
Been hanging out on web spam and black hat forums, discussing a range of methods to game Google over time? Well then you might find it “a more eventful summer for you” – Matt Cutts.
Google aim to display better results based on authority when it comes to a particular industry niche or sector. As an example, the medical, travel and finance fields will have reputable companies and commentators within them. Google aim to give more value to these sites and rank them higher, in turn serving results more in line with the authorities in these spaces.
Additional signals for Panda
Google are looking to add some additional signals to the Panda algorithm, to assist with refining the treatment of sites that sit within the ‘grey area’ but do show genuine signals of quality. This may help some of these sites achieve better results.
Based on feedback from users, Google are ‘potentially’ rolling out an update to clustered results (when you see a group of results from one site in the SERPs). Matt suggests that once a cluster from a given site has appeared on a search engine results page, further instances of that site would be unlikely to appear as a user pages deeper through the results. Hopefully the aim here is to stop any one site saturating the SERPs, and to offer more realistic access to information on any given topic.
More information to webmasters
Providing further information via Google Webmaster Tools on how to better understand the performance of a site is another part of the update. Offering more concrete details, example URLs, and pointers on where to go to diagnose a site, along with other tools, is all part of Google's attempt to refine the information available so webmasters can build better sites for better results.
Apparently some good improvements are coming with the Penguin 2.0 update, with more updates ‘queued up’ to roll out in the not too distant future. Will this help authentic brands, including small and medium businesses and those who have done things in line with best practice guidelines? Will it mean a significantly negative result for those going against the guidelines and gaming the system?
We hope so. Time will tell.
Dan Petrovic, the managing director of DEJAN, is Australia’s best-known name in the field of search engine optimisation. Dan is a web author, innovator and a highly regarded search industry event speaker.
ORCID iD: https://orcid.org/0000-0002-6886-3211