Your Go-To Guide to Google Core Updates

The reach of Google Core Updates
Google Core Updates can represent massive shifts not only in SERPs but also in the search behavior of users, reflecting and reshaping the way organic search evolves over time.

Google Search is a household name around the world. There was a time when Google was unknown and had minimal effect on the world of search technology; today, Google handles, on average, 5.6 billion searches per day, 228 million searches per hour, 3.8 million searches per minute, and 63,000 searches per second. That massive amount of activity adds up to more than 2 trillion searches a year. All this searching requires a fast, intelligent algorithm to help sort and categorize all the content Google finds online, and the process continuously evolves to stay ahead of human search behavior and better understand how to find the perfect result. The search algorithm is updated multiple times a day, but sometimes a significant change is made to the process itself: enter the Google Core Update.

What is a Google Core Update?

To understand what a Google Core Update is, we must first understand how Google Search works on a fundamental level. When a user searches on Google.com or any of its affiliated search partners, the query gets passed to Google's servers. After matching the query to a SERP (Search Engine Results Page), Google returns the result to the user. Google uses a proprietary algorithm to build each SERP it serves as a response to a search query, and we see evidence of this process when the SERP displays both the number of results and the time it took to process the request.

Google updates happen routinely throughout the day, with some periods averaging around nine updates per day. These updates are not straightforward in their effort “to present relevant and authoritative content to searchers,” because Google must also balance openness with stability. That goal can be a challenge on a platform that handled 63.2 percent of the US search market as of July 2018.

This leads us to the Google Core Update. The smaller updates that occur regularly are intermittently broken up by sweeping changes to the nature of Google’s search algorithm. These updates change a fundamental aspect of the algorithm and generally receive names as they enter the history of the SEO world; famous examples in recent history include the BERT, Medic, Panda, and Penguin updates. The newest addition to this family of Core Updates is the May 2020 Core Update, launched on May 4th, 2020. Some SEO professionals have dubbed this update a monster, though the community has yet to agree on a name for this particular monster.

Why is This Update Considered a “Monster” for its Effect on Search Results?

The Google algorithm is complex
The complex algorithms that power Google's search process (https://www.google.com/search/howsearchworks/algorithms) are constantly updated and monitored through live testing and by trained external Search Quality Raters from all over the world.

The May 4th, 2020 Core Update, like similar updates before it, affects rankings across a broad range of queries; SEO professionals call these swings “rankings volatility.” Several sources track volatility, including RankRanger and Moz. The more industry verticals an update affects and the larger the rankings shuffle, the greater the overall extent and size of a Google Core Update.

According to RankRanger, the May 4th Update produced a desktop SERP risk level of 87 on May 5th and 116 on May 6th, the highest desktop-based risk assessment RankRanger had announced in the preceding 90-day window.

Similarly, Moz assigns SERP volatility a score between 1 and 10, in a nod to the Richter magnitude scale used for earthquakes. After the confirmed update on May 4th, Moz reported a 9.3 on May 5th, 9.4 on the 6th, and 9.0 on the 7th, the longest consistent stretch of 9+ volatility within the previous 30 days.
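To make the idea of a volatility score more concrete, here is a minimal sketch of one way day-over-day churn could be measured from top-10 rank snapshots. This is not RankRanger's or Moz's actual methodology, and the keywords and URLs below are made up; it only illustrates the general principle of scoring how much the results shuffle from one day to the next.

```python
# Simplified illustration of a SERP churn score computed from daily
# rank snapshots. Hypothetical data; not any commercial tracker's formula.

from typing import Dict, List

def volatility_score(yesterday: Dict[str, List[str]],
                     today: Dict[str, List[str]]) -> float:
    """Average fraction of top-10 URLs that changed per tracked keyword.

    `yesterday` and `today` map a keyword to its ordered list of ranking
    URLs. Returns 0.0 (no churn) through 1.0 (complete turnover).
    """
    churn_total = 0.0
    keywords = yesterday.keys() & today.keys()
    for kw in keywords:
        old_urls = set(yesterday[kw][:10])
        new_urls = set(today[kw][:10])
        if old_urls:
            churn_total += len(old_urls - new_urls) / len(old_urls)
    return churn_total / len(keywords) if keywords else 0.0

# Example with two tracked keywords (made-up URLs):
may_5 = {"core update": ["a.com", "b.com", "c.com"],
         "seo tools":   ["x.com", "y.com", "z.com"]}
may_6 = {"core update": ["a.com", "d.com", "e.com"],
         "seo tools":   ["x.com", "y.com", "z.com"]}
print(round(volatility_score(may_5, may_6), 2))  # 0.33 -> a third of yesterday's results were replaced
```

Aggregated over thousands of tracked keywords, spikes in a churn measure like this are the kind of signal commercial trackers surface as “high volatility” days, though their weighting and scales are far more sophisticated.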

Consistent with data from other SEO professionals, the volatility experienced after the May 4th Core Update indicates that it is extensive and defining. In fact, the last time the internet saw volatility on this level was during the earlier Panda and Penguin updates, when Panda 4.1 affected 3-5% of all search queries and Penguin affected 1-4%. Although the update is still fresh, early data shows average ranking visibility gains and losses of 6-11% across industry verticals. That means a large number of keywords are fluctuating, and websites must conduct position audits to realign their SEO strategy accordingly.
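As a sketch of what such a position audit might look like in practice, the snippet below compares average keyword positions before and after the update date. The CSV file name and column names are hypothetical stand-ins for whatever export a rank tracker actually provides.

```python
# Minimal position-audit sketch. Assumes a hypothetical CSV export from a
# rank tracker with columns: date, keyword, position.

import pandas as pd

UPDATE_DATE = "2020-05-04"

ranks = pd.read_csv("keyword_positions.csv", parse_dates=["date"])

# Average position per keyword before and after the update date.
before = ranks[ranks["date"] < UPDATE_DATE].groupby("keyword")["position"].mean()
after = ranks[ranks["date"] >= UPDATE_DATE].groupby("keyword")["position"].mean()

audit = pd.DataFrame({"before": before, "after": after}).dropna()
audit["change"] = audit["after"] - audit["before"]  # positive = lost ground

# Keywords that dropped more than three positions on average after the update.
print(audit[audit["change"] > 3].sort_values("change", ascending=False))
```

The same before/after framing works for any confirmed update date; only the constant changes.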

Why Does Google Make Core Updates and When Do They Occur?

Google rolls out core updates in response to broader global trends in users' search habits. Core updates help results stay relevant as human nature, events, and technology change the way we search. These changes can arrive unannounced, but Google usually confirms an update publicly within the first 48 hours.

SEO professionals use several channels of communication to keep up to date with Google Updates and stay responsive to changes. These channels include Twitter updates from Danny Sullivan, Google's official public liaison for search, as well as Search Engine Land and Moz. Both websites act as hubs where SEO professionals submit data and build consensus on the reach and depth of a particular update by combining insights from across the search sphere. In the process, SEO professionals provide valuable data that helps the whole community mitigate and react to changes.

How to Determine if a Website is Affected by a Google Update

Best practices for SEO experts
Over the years, best practices and industry standards have emerged through trial and error that help SEO experts monitor, prepare for, and, more importantly, react to Google Updates when they occur.

Determining the effect of a Google Core Update on a website requires a bit of legwork. When approaching the issue, SEO professionals start by examining historical data if it's available. Thankfully, Google Analytics has become something of a standard throughout the web development world, giving digital marketers the ability to review past traffic and spot trends in gains and losses. These trends, when combined with various traffic, engagement, and behavior metrics, reveal how and why visitors are coming to the site.

Adding another tool to the mix, Google Search Console can help digital marketers discover organic traffic patterns, indexing errors, manual actions, organic CTR, and crawling factors. Identifying lost traffic, keyword positions, and CTR for specific date ranges allows SEO specialists to react to specific algorithm changes.
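As a rough sketch of that kind of date-range comparison, the snippet below aggregates clicks, impressions, CTR, and average position for equal windows before and after the update date. The file name and column names are assumptions for illustration, not a guaranteed export format from Search Console.

```python
# Compare organic performance before vs. after an update date.
# Assumes a hypothetical daily export with columns: date, clicks, impressions, position.

import pandas as pd

UPDATE_DATE = pd.Timestamp("2020-05-04")
WINDOW = pd.Timedelta(days=14)

daily = pd.read_csv("search_performance.csv", parse_dates=["date"])

before = daily[(daily["date"] >= UPDATE_DATE - WINDOW) & (daily["date"] < UPDATE_DATE)]
after = daily[(daily["date"] >= UPDATE_DATE) & (daily["date"] < UPDATE_DATE + WINDOW)]

def summarize(frame: pd.DataFrame) -> pd.Series:
    """Total clicks/impressions, derived CTR, and a simple (unweighted) mean position."""
    totals = frame[["clicks", "impressions"]].sum()
    totals["ctr"] = totals["clicks"] / totals["impressions"]
    totals["avg_position"] = frame["position"].mean()
    return totals

comparison = pd.DataFrame({"before": summarize(before), "after": summarize(after)})
comparison["pct_change"] = (comparison["after"] - comparison["before"]) / comparison["before"] * 100
print(comparison.round(2))
```

A sharp drop in clicks or CTR concentrated in the window after the update date, with impressions holding steady, is the kind of pattern that warrants a deeper keyword-level look.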

Finally, third-party tools such as SEMrush, Majestic, Ahrefs, Moz, and RankRanger can provide extremely detailed insights into website stability, traffic, and growth opportunities when configured and managed correctly. These tools also provide industry-specific material that can help even non-SEO professionals ask the right questions.

Ultimately, if a marketer uses the tools provided by Google and third parties to review traffic patterns, backlink growth and loss, position changes, and crawl reports, they can reliably determine whether a website has been affected by an update and how exposed it is to future ones.

What Tools and Best Practices Can Help Avoid SERP Volatility?

“Content is king” is a phrase thrown around by many SEO professionals, and the exact meaning and worth of those words are debated throughout the SEO landscape. However, content is the primary factor in whether a website can ride out SERP volatility, with backlinks a close second in terms of performance. Good content, in turn, earns backlinks, so in the end, content is still king.

Google designs the SERP around the idea of providing relevant and authoritative content to searchers. Historically, Google has been secretive about its algorithm, but it has offered some help when it comes to Core Updates. After the Panda update, Google published advice on the Google Webmaster Central Blog on May 6th, 2011, that the SEO community still considers relevant today. Google said it, so why not take them at their word?

Analyze your website to understand the changes
The Google algorithm is complex, powerful, and one of the biggest secrets in search engine optimization. Don't get lost, though; there are ways to approach understanding it.

Begin by asking the following questions.

  • Is the information trustworthy or authoritative?
  • Is the content shallow, or does it show expertise and substance?
  • How objective is the content? Is it in depth? Does it answer the entire question?
  • Does it show both sides of a story and bring value to the information it seeks to present?
  • Is the content edited? Or is it hastily produced with grammar and spelling errors?
  • Does the article provide further reading or more in-depth articles?
  • Does it provide a comprehensive snapshot of the data?
  • Does the content give any reason to bookmark or share the page?
  • Would a user complain when they see pages from the site hosting the content?
  • Does the content fit the niche of the website? Alternatively, is the content merely presented with no relation to the theme of the site?
  • Is the content mass-produced or outsourced to an extensive network of sites with no real care or attention given to the final result?

Answering these questions can help a website owner, digital marketer, or SEO professional determine whether content lives up to the standards Google is seeking. Examine the answers honestly and correct the negatives. Make thought-provoking content that demands to be read, shared, and used as a resource by others with the same search interest as the targeted query.

It’s recommended to utilize all channels available in a digital marketing strategy and create content that can be shared quickly and effectively. Implementing interactive elements and video also helps increase engagement from avid readers, image-based viewers, and the video-watching audience. Also, think of backlinks as endorsements of the authority and respect the content on the page has earned. Create content, display data, and present facts that gather links without actively demanding them. To adapt a famous line from Field of Dreams: “build the right content, and the links will come.”

Some free tools that can help a site weather SERP volatility are SEOquake and MozBar. These browser plugins let marketers see performance metrics at a glance while viewing a page. SEOquake provides on-page SEO audits, internal and external linking data, keyword density reports, and URL comparisons; MozBar provides authority scores, overlay tools for on-page metrics, custom searches, and a URL comparison tool. SEO professionals can use these tools to spot pain points and address them before they become issues in future updates.

Search habits constantly change; stay ahead
Human search patterns are constantly changing and evolving, and with those changes Google continues to monitor and grow to meet the demand for better, more relevant search results. Your first priority as an SEO specialist or website owner is to stay informed of these changes while creating great content.

Google Core Updates will never stop as long as human search habits and technology continue to evolve. It is the responsibility of SEO professionals and digital marketers to work within the guidelines laid out by Google so that their sites are insulated from update volatility in the first place. By creating and demanding quality content for a website, SEO professionals can stop merely reacting to each Google Core Update and instead proactively prepare for the advantages they will enjoy when the next one goes live.