A Broad Core Algorithm Update Recovery Example: Gurmebebek SEO Case Study

This brief article was written by a friend of Holistic SEO & Digital to show an example of Broad Core Algorithm Update recovery. As Holistic SEO & Digital, we can’t say that we agree with or support every argument in this article, but we do find it interesting and valuable as an SEO case study for analyzing Google’s core algorithm updates. In the future, this article will be turned into a Broad Core Algorithm Update Analysis and Recovery Guideline with more concrete results. We thank Atakan Erdoğan for creating this SEO case study as a starting point for core update recovery and analysis.

Hello, everyone. I am Atakan, the founder of Doktorunla Konuş and Talkurdoctor. This is my first SEO case study written in English, and I look forward to your comments. What will I cover in this article? Through a real SEO case:

  • How can you identify a website at risk of traffic loss?
  • How do you analyze the lost traffic?
  • What actions can be taken to regain the traffic?

First, let’s look at the website: Gurmebebek.com is a website that has received over 500,000 unique organic visits in the last 6 months.

SEO Case Study's Site
SEO Case Study’s Site, Gurmebebek.com’s homepage.

Now, let’s move on to the analysis. First of all, I want to start this case with the following words: Attention! If traffic has been stable for a long time, that itself may be an abnormal situation. This phrase is a personal principle derived from my experience. I think that when a website isn’t tested by the search engine for a long time, the risks it carries can keep growing.

While Gurmebebek.com’s traffic was flowing as usual, it lost 25% of its organic traffic in just one night. That confirmed my principle once again.

Core Algorithm Update Effect
Core Algorithm Update’s Effect on the SEO Case Study Site (gurmebebek.com)

Post-Core Algorithm Update Analysis

I checked the number of valid web pages under the Coverage menu in Search Console. Then, on the Crawl Stats page, I looked at how many crawl requests Googlebot makes per day and compared these two numbers with each other. Initially, I didn’t think it was a crawl budget problem, because the numbers were close to each other.

Crawl Stats for SEO Analysis from Google Search Console
SEO Crawl Stats from Google Search Console for Gurmebebek.

Note: Google Search Console’s crawling activity report is not definitive, and it doesn’t reflect the full reality of the current situation. In this example, though, the partial data from the GSC Crawl Stats report is used to extract insights because of the crawl velocity anomaly.

P.S. If there is a large difference between these two numbers, there may be a crawl budget problem on your website.

Coverage Report
Coverage Report of Google Search Console for Gurmebebek.
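To make this comparison concrete, here is a minimal sketch of the same sanity check in Python, assuming hypothetical numbers typed in from the Coverage and Crawl Stats reports (the figures and the 20% threshold are illustrative assumptions, not values from this case):

```python
# A minimal sketch of the crawl budget sanity check, using hypothetical
# numbers read manually from the Coverage and Crawl Stats reports.
valid_indexed_pages = 1_850        # "Valid" pages in the Coverage report (example value)
avg_daily_crawl_requests = 1_600   # average crawl requests per day in Crawl Stats (example value)

ratio = avg_daily_crawl_requests / valid_indexed_pages
print(f"Googlebot requests roughly {ratio:.0%} of the indexed page count per day.")

# If daily crawl requests are only a small fraction of the indexed pages,
# a crawl budget problem becomes more likely.
if ratio < 0.2:  # illustrative threshold
    print("Large gap between indexed pages and daily crawls: possible crawl budget issue.")
else:
    print("The numbers are close to each other: crawl budget is probably not the main issue.")
```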

To understand the problem more clearly, in the Site Content section under the Behavior menu in Google Analytics, I compared all pages for the day the traffic loss happened against the previous day. I realized that the main reason for the traffic loss was not all web pages, but the most valuable web pages the website offers its users. One of these pages was the Percentile Calculation page.

Organic Traffic Change after Core Algorithm Update
Gurmebebek’s Organic Traffic Change after December 2020 Google Core Algorithm Update.
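As a hedged illustration of this day-over-day comparison, the sketch below assumes two CSV exports of the Google Analytics Site Content report (the file names and column labels are assumptions) and lists the pages that lost the most pageviews:

```python
import pandas as pd

# Hypothetical exports of the Site Content / All Pages report for the day of
# the traffic drop and for the previous day; column names are assumptions.
before = pd.read_csv("pages_previous_day.csv")   # columns: Page, Pageviews
after = pd.read_csv("pages_drop_day.csv")        # columns: Page, Pageviews

merged = before.merge(after, on="Page", suffixes=("_before", "_after"))
merged["change"] = merged["Pageviews_after"] - merged["Pageviews_before"]

# Pages with the largest losses float to the top of a list like this;
# in this case, the Percentile Calculation page stood out.
print(merged.sort_values("change").head(10)[["Page", "Pageviews_before", "Pageviews_after", "change"]])
```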

Wasn’t Googlebot crawling the most valuable web pages? I thought about that question. You can get a detailed answer to it by doing a log analysis. However, I wanted to run a little test first. I made various changes to the Percentile Calculation web page and watched the SERP for 2-3 days. Based on the results I saw, I realized that Googlebot either didn’t crawl my most valuable web page at all or crawled it very rarely. That was the first finding in my traffic loss research.
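A log analysis version of the same question could look like the sketch below: it counts Googlebot hits per URL in a combined-format access log. The log format, file name, and the example URL path are assumptions; a production check should also verify Googlebot by reverse DNS rather than trusting the user agent string alone.

```python
import re
from collections import Counter

# Rough pattern for a combined-format access log line; this format is an
# assumption about the server, not something taken from the case study.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

googlebot_hits = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:  # hypothetical file
    for line in log:
        match = LOG_LINE.search(line)
        if match and "Googlebot" in match.group("agent"):
            googlebot_hits[match.group("path")] += 1

# How often is the most valuable page actually crawled? (hypothetical URL path)
print(googlebot_hits["/percentile-calculation/"], "Googlebot hits on the key page")
print(googlebot_hits.most_common(10))
```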

I continued my analysis by looking at other problems. Gurmebebek.com is a vertical website about mothers and babies. It has been serving users for years, and its social engagement was very high. However, the website was outdated. If your website is directly or indirectly about health, you know that it’s crucial to master the E.A.T. rules. E.A.T. has simple but comprehensive rules.

I recommend reviewing Google’s detailed guidance on E.A.T. in the Search Quality Evaluator Guidelines:

(https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf)

On the recipe and pediatrics pages of Gurmebebek.com, I realized that none of the E.A.T. rules were applied. That was my second finding. Now it was time for action.

Index Bloating and Crawl Cost Optimization

I had to delete low-value web pages to optimize the crawl budget. Therefore, in the detailed Site Content report of Google Analytics, I set the date range to start from the website’s launch date.

In the results, I clicked once on the Pageviews column and sorted it from low to high.

Analytics Data Sorting
Sorting Landing Pages according to pageviews.

While doing this, you can exclude your URL parameters with the Advanced Filter.

Analytics Data Filtering
Filtering landing pages with URL characters.

By setting a threshold of 750 visits in 2 years, I listed the pages below 750 and gave the list to the development team to delete them permanently. If you ask why 750 visits in 2 years: the site administrator and I jointly decided to treat a web page that hasn’t even received an average of one visit per day as unsuccessful.
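Below is a minimal sketch of how such a threshold could be applied to a Google Analytics export, assuming a hypothetical CSV with Page and Pageviews columns covering the two-year range:

```python
import pandas as pd

# Hypothetical export of the Site Content report for the full 2-year range.
pages = pd.read_csv("all_pages_2_years.csv")  # columns: Page, Pageviews (assumed)

THRESHOLD = 750  # fewer than ~1 visit per day over 2 years was treated as unsuccessful

low_value = pages[pages["Pageviews"] < THRESHOLD].sort_values("Pageviews")
low_value.to_csv("pages_to_remove.csv", index=False)

print(f"{len(low_value)} pages fell below {THRESHOLD} pageviews in 2 years.")
```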

The point to note here: whenever development teams want to delete a page, they usually prefer to return a 404 status code for it. However, 404 doesn’t mean “deleted”; it means “not found”. If a 410 status code is returned instead, the page gives Googlebot a permanently-removed signal. With that, I completed the work on my first finding and moved on to my second one.
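To confirm that the removed URLs really return 410 rather than 404, a quick check like the sketch below can be run against the list handed to the development team; the requests library usage is standard, but the input file name is an assumption:

```python
import requests

# Hypothetical list of removed URLs, one absolute URL per line.
with open("pages_to_remove.txt", encoding="utf-8") as handle:
    removed_urls = [line.strip() for line in handle if line.strip()]

for url in removed_urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 410:
        # A 404 still tells Googlebot "not found", not "permanently removed".
        print(f"{url} returns {status}, expected 410")
```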

There was no information about the author on the recipe pages. Besides, there was no separately presented user-oriented information. After requesting that the recipes be updated in more detail, we started to present some details separately, such as cooking time and ingredients.

Gurmebebek Landing Page for recipe
An example of a recipe landing page on Gurmebebek.com.

Adding Structured Data for Better Search Engine Communication

I found the most appropriate structured data type for the food pages on Schema.org. After that, I prepared a structured data example, selecting only the properties covered by the information we actually provide, and sent it to the development team to apply it to all pages. For the health pages that need to follow the E.A.T. rules, such as the food pages, I wanted the medical content to be approved by a specialist doctor and the user to be informed about that doctor.

The health content was expanded and approved by the doctor, and the doctor’s information was added to the web page. We also presented all these improvements to Googlebot with the medical web page structured data.

Article Landing Page
An example article view from the SEO Case Study site, Gurmebebek.com.

As a suggestion: there are many websites that give you ready-made structured data templates. These templates are a shortcut, but each web page may have different technical details of its own. Because of these differences, my suggestion is to manually build the structured data of your web pages based on the definitions on Schema.org.

Schema for Recipe
Schema for Recipe
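To keep everything in one language here, the sketch below builds a Recipe JSON-LD block with Python’s json module; the property values are invented placeholders, and only the properties backed by the information visible on the page should be kept, as described above:

```python
import json

# Hypothetical Recipe structured data; all values are placeholders and should
# mirror what the recipe page actually shows (ingredients, times, author).
recipe_schema = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Example baby-friendly vegetable soup",
    "author": {"@type": "Person", "name": "Example Author"},
    "recipeIngredient": ["1 carrot", "1 potato", "500 ml water"],
    "recipeInstructions": [
        {"@type": "HowToStep", "text": "Chop the vegetables."},
        {"@type": "HowToStep", "text": "Simmer for 20 minutes and blend."},
    ],
    "prepTime": "PT10M",
    "cookTime": "PT20M",
}

# json.dumps guarantees syntactically valid JSON, which helps avoid the
# single-missing-bracket problem mentioned below.
json_ld = f'<script type="application/ld+json">{json.dumps(recipe_schema, ensure_ascii=False)}</script>'
print(json_ld)
```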

With the Google Structured Data Testing Tool, you can check the structured data you have set up and applied. I strongly recommend checking it, because a single misplaced parenthesis or bracket can render the JSON code invalid. Apart from these actions, many large and small tasks were completed, but these were our main problems. By working in coordination with the development and content teams, I can say that we regained all of the traffic. To put it in numbers: our website, which had lost 56% of its organic traffic in 3 months, achieved a 158% organic traffic increase over the following 4 months.

SEO Case Study Graphic
The Recovery Process and its graphic from Gurmebebek’s Analytics Account.

In the next period, we will continue to serve visitors with new web pages by developing new opportunities on the website. Until now, we have tried to regain what we lost; from now on, we aim to increase our traffic. Thank you for reading. I hope you stay well.

Thoughts on this SEO Case Study from Holistic SEO & Digital

I believe that this SEO Case Study leaves many things to be covered further, including the search engines’ decision trees and their communication style with content publishers. In the past (before 2015), Google said that they had more than 200 ranking factors, but now their blog posts mention more than 2,000 ranking factors. Thus, examining a website’s recovery after a Google Core Algorithm Update is challenging due to the excessive number of variables and the search engines’ internal agenda.

Because of the search engines’ black-box nature, SEOs tend to learn data science and automate SEO-related tasks so that they can gather more data and more chances to observe the search engine. Python for SEO is one solution to this situation. Another is listening to the personal experiences of the SEOs behind individual projects. This SEO Case Study presents the perspective of a project’s SEO Manager, and it is valuable for understanding what can be done and what to focus on after a core algorithm update.

Our Core Algorithm Update Analysis and Recovery Guideline will include more case studies over time, gradually growing into a long Broad Core Algorithm Update SEO Tutorial.
