Holistic SEO Case Study: The Importance of a Holistic Approach and Clear Communication in SEO

Digital PR and Local SEO worked together to improve the authority of the site and bring a significant increase in relevant traffic to Encazip.com within 3 to 6 months.

The general precept of this SEO project was:

“Every pixel, millisecond, byte, letter, and user matters for SEO”

Koray Tuğberk GÜBÜR

The project was led by Koray Tuğberk GÜBÜR of Holistic Digital who is the author of this case study. 

Last 3 Months Comparison of Encazip.com, taken on 6th February 2021.

This case study will demonstrate how clear lines of communication between software development and marketing teams can positively impact SEO results. The Encazip SEO Project achieved: 

  • 155% Organic Traffic increase in 6 months
  • 110% Organic Traffic increase in a YoY comparison for the same three months (November to January)

To achieve this increase in Organic Traffic, the only targeted keywords were "relevant and possible search activities", in line with the Semantic SEO concept.

Comparison between August and January 2021 for Encazip.com
Organic Search View from Google Analytics with Average Session Duration.
The last state of the Holistic SEO Case Study.

Holistic SEO Tutorial and Case Study Step by Step

Encazip.com is an energy tariff comparison, switching, and savings brokerage and affiliate company. Encazip.com is a Turkish company, founded by localizing the know-how of the founders of uSwitch.com.

This is a YoY Comparison for the Encazip.com SEO Project’s GSC Data using DataBox

The idea of "Encazip.com" was realized by a consortium under the leadership of Çağada KIRIM and backed by British investors, including Henry Mountbatten, The Earl of Medina. Electricity pricing and industrial and personal energy consumption are the primary focus of the company. At the beginning of the SEO project, the design, web page loading performance, branding, and content structure, along with the education of the customer, were all looked at with a fresh perspective.

The “Everything matters in SEO” perspective was adopted by the customer.

“We acknowledge the importance of SEO with no doubt, however, we would rather approach this subject to be one of the foundations of our corporate culture. I believe we have achieved the establishment of SEO culture in encazip.com thanks to Koray, and I am looking forward to being a part of more case studies on this very important subject.”

Çağada Kırım, CEO, Encazip.com

Background of the Holistic SEO Case Study and Research for Search Engines

If Encazip.com’s team were not passionate about SEO, then it would not be possible to perform this case study at such a detailed level. Thanks and acknowledgments go to the whole team, especially Miss Yağmur Akyar, Mr. Erman Aydınlık, Mr. Nedim Taş, Mr. Oktay Kılınç, and Mr. Can Sayan. CEO Mr. Çağada Kırım, who brought the entire team together and organized it for SEO, was also a major factor in the success of the project.

Some parts of this SEO Case Study have been written by the Customer’s Team and will cover Technical SEO, Branding, Entitization, Content Marketing, Digital PR, and Web Page Loading Performance.


This screenshot has been taken on 9 June 2021 which is 7 days after the 2021 June Broad Core Algorithm Update of Google.

3.0 Page Speed Improvements: Every Millisecond Matters

According to Google's RAIL model ("Response", "Animate", "Idle", "Load"), a developer has only 10 milliseconds to move a pixel. To achieve 60 frames per second, each frame has a budget of 16 milliseconds, but the browser needs about 6 milliseconds to composite one frame over another, which leaves only 10 milliseconds of work time.

And page speed is actually a "health" and "trust" issue. If a frame doesn't move or change within 16 milliseconds, users will notice motion glitches. If a page stays unresponsive for more than 100 milliseconds, users sense that something is wrong and start to feel stress. According to Google, a slow web page can create more stress for a human being than a fight, and it affects daily life negatively.

This is the “core of my statements” for any first meeting with my clients’ development teams.

Response, animation, idle, and load (RAIL).

Below is a quote from Nedim Taş, the developer responsible for the front end of Encazip.com:

“The more important a brand’s image, the more important SEO and Code Performance are. SEO and Front-end Development can work together to create a better brand image by taking users to the most usable and accessible website for their intents.”

Nedim Taş

There is another "core lesson" here. I have many clients whose IT teams have more than 15 members, yet these crowded IT and developer teams can't match the productivity and effectiveness of a two-person team. If you ask me what the difference is, I would say the company's "SEO passion and culture" and a "growth hacking instinct".


Now, we can look at what we have done for improving the web page loading performance of Encazip.com during the SEO Case Study and Project.

3.1 HTML Minification for Gaining 15 Kilobytes per Web Page

HTML Minification means deleting the inline comments and unnecessary whitespace from an HTML document. This is one of the most essential page speed improvements, and it has further benefits:

  • It helps search engine crawlers explore the "link path" faster.
  • It helps search engine crawlers and their indexing systems digest the HTML.
  • It lets users' devices build the Document Object Model faster.
  • It lessens the burden on the website's server and lets users' devices consume less bandwidth.
  • A complex and large HTML document might prevent search engine crawlers from loading the document or evaluating it for ranking purposes.

For this last point, we can look at the old warning message from the previous version of Google Search Console: "HTML Size is too large". This warning was valid only for news sites. Google didn't move this warning to the new Google Search Console because, I guess, it didn't want to share the "weak points" or "deficits" of its indexing system. But we still have the same warning in Bing Webmaster Tools.

“HTML is too Large Warning” from Old Google Search Console’s News Website Section:


In the old Google Search Console, the “extraction failed” error happened if the HTML was larger than 450 KB.

“HTML is too Large” warning from Microsoft Bing’s Webmaster Tools


As a note, Microsoft Bing appears to be more open with information sharing about its algorithms and “desires” from webmasters than Google’s “BlackBox” attitude.

HTML minification for Encazip.com was done in the first month of the project. But further down the line, we had to shelve this work because of some "server incapabilities" during the migration from .NET to .NET Core.

So, even the simplest Pagespeed Improvement has lots of aspects and value for SEO, also bear in mind that it might not be as simple as you may think!
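To make the idea concrete, here is a deliberately naive minifier sketch, not the Encazip.com implementation (which ran on the .NET side); it only shows the two core steps named above, stripping comments and whitespace:

```javascript
// Naive HTML minifier sketch: strips inline comments and collapses
// whitespace. Real minifiers (e.g. html-minifier) also protect the
// contents of <pre>, <textarea>, and <script>, and handle edge cases.
function minifyHtml(html) {
  return html
    .replace(/<!--[\s\S]*?-->/g, "") // remove inline HTML comments
    .replace(/>\s+</g, "><")         // remove whitespace between tags
    .replace(/\s{2,}/g, " ")         // collapse remaining whitespace runs
    .trim();
}

const raw = "<div>\n  <!-- promo banner -->\n  <p>  Merhaba  </p>\n</div>";
console.log(minifyHtml(raw)); // "<div><p> Merhaba </p></div>"
```

Even on this tiny snippet the document shrinks; across thousands of real pages those savings add up to the kilobytes discussed above.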

3.2 CSS and JavaScript Refactoring

All developers know that refactoring a CSS file is actually harder than writing one from scratch, because the existing styles are often tangled "spaghetti code". If you experience this, you should create a clean and efficient new CSS and JavaScript file for the same layout and functionality.


A screenshot from “CSS specificity calculation”.

In this example, the “CSS and JavaScript Minification” took place at the same time as “CSS and JavaScript Refactoring” as there was already a plan for the website redesign.

(Editor’s Note: For more on this, see Koray’s article on “Advanced Page Speed Metrics”.  He covers what to know and what to do to achieve an efficient Rendering Tree).

Before the JavaScript and CSS Refactoring process, there were more than 8 JS and CSS Files for only “page layout and functionality”. And, most of this code was not used during the web page loading process.

Test-driven development Methodology (TDD)


The total size of these web page assets was more than 550 KB.

At the end of this process, the development team decreased the total number of CSS and JS files to 3.

  1. Two of these 3 files were CSS and their total size was 14 KB.
  2. The JavaScript file that is being used for functionality was only 7 KB.

3.2.1 Why Have Two Different CSS Files?

Having two different CSS files lets Google cache and use just the necessary web page resources during its crawling routine. Googlebot and other search engine crawlers use "aggressive caching", which means that even if you don't set caching headers, Googlebot stores the resources it needs.


YoY Comparison for November, December, and January.

3.2.2 How does a Search Engine crawler know what to cache?

Thanks to aggressive caching of web page resources, a search engine uses less bandwidth from the website’s server and makes fewer requests, thus the Search Engine Crawler can crawl a website more quickly. 

To determine how to crawl a site, search engines rely on machine-learned crawl scheduling. If a website uses all of its "CSS and JavaScript files" site-wide, the search engine will have to cache all of these resources.


But, in this case, only the necessary CSS File for the necessary web pages was needed, thus we can facilitate “fewer requests” for the search engine crawler and also help it use only the essential files.

In our example, only three different types of web pages were used, according to their functionality and importance:

  1. Home Page
  2. Product and Service Pages
  3. Blog Pages

Thanks to the intelligent website design, the code requirements were unified for the product, service, and blog pages. Thus, fewer CSS files and less code were used for managing the site.

  1. Encazip.com has “headfoot.css” which is only for the header and footer area of the website.
  2. A “homepage.css” file for just the homepage.
  3. “Subpage.css” for only the blog and product pages.

As you can see, Googlebot and other search engine crawlers can cache "headfoot.css" easily because it affects the site-wide sections of every web page. "homepage.css" is used only on the homepage, so for most crawling activity, crawlers can rely on the "subpage.css" file and skip the homepage CSS entirely.

In short, the effect of this strategy was:

  1. We decreased the CSS-JS File size from 550+ KB to only 22-25 KB per web page.
  2. We decreased the request count for CSS-JS files from 8+ to only 3 per web page.

And lastly, CSS and JS minification was applied on top of this. You can see again that every millisecond and byte really does matter.


With Authoritas’ Google Search Console Module, you can examine all the queries and their traffic productivity as above.

3.3 Text Compression on the Server Side and the Advantage of Brotli

Originally, the company was using the Gzip algorithm for server-side compression, as many brands still do today. Brotli was invented by Google; you can examine Brotli's code on Google's GitHub profile.

Brotli uses a variant of the LZ77 algorithm for lossless compression and has been standardized by the IETF (RFC 7932). Brotli can perform up to 36% better than Gzip!

And, in the fourth month of the SEO Project, the team started to use Brotli for the text compression on the server side.


Compression technologies and file size.

3.3.1 Why is Server-side Compression Important?

Here is a breakdown of the importance of Brotli usage.

  1. The hardest thing about loading and rendering a web page is transmitting the files from a server over an internet connection.
  2. If you check the Chrome DevTools Network Panel, you will see that for every web page resource, the longest part is the “requesting a file” and “downloading the file” over a network connection.
  3. Text compression means compressing the files on the server side and conveying these files to the requestor.
  4. Since the file sizes are being decreased by compression, conveying these resources over a network connection is simpler.
  5. Thus, every web page resource will be loaded faster by the requestor which is the search engine crawler or the user.
  6. After loading the web page resources, the resources will be decompressed by the browser to use for rendering, parsing, and compiling the web page.

Basically, most of these steps translate into "Time to First Byte" improvements. By implementing Brotli, we improved the user experience, crawlability, crawl delay time, and crawl efficiency of the website.

3.4 HTTP/2 Usage Instead of HTTP/1.1

The essential difference between HTTP/1.1 and HTTP/2 is how many resources can be delivered per round trip over a single connection. What is the reason? HTTP/1.1 transmits every request and response in plain text, while HTTP/2 uses a binary framing layer. Thanks to the binary framing and its multiplexing, HTTP/2 can convey many web page resources concurrently over a single TCP connection.

During web page rendering over HTTP/1.1, a browser typically fetches at most 6 resources from a server in parallel. If those first 6 requests don't include all the critical resources for the above-the-fold section, the client will need a second round trip to render the initial contact section of the web page.


A schema showing HTTP/2's working principle.

HTTP/2 adoption was a critical change for Encazip.com's SEO project. Since the count and size of the critical resources for the above-the-fold section had already been decreased, the need for request round trips decreased further with HTTP/2.

Furthermore, Googlebot started to use HTTP/2 for its crawling purposes, so our crawl efficiency improved as well.

3.4.1 HTTP/2 Server Push

HTTP/2 Server Push was the first reason for Encazip.com's migration from HTTP/1.1. After creating the "subpage.css" and "headfoot.css" files, we used HTTP/2 Server Push to create a faster initial contact with users.

HTTP/2 Server Push lets a server push a web page resource to the client before the client has even asked for it. Because the server sends the resource ahead of the request, the connection and download of the resource can happen faster.


Server Push’s working principle.

For HTTP/2 Server Push, the team decided to include some resources from the above-the-fold section of the page, such as the logo, headfoot.css, subpage.css, and main.js.
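A common way to advertise such a push list (the exact Encazip.com server configuration is not shown in this article) is the standard Link response header: fronts that support HTTP/2 Server Push can act on its rel=preload entries, and browsers otherwise still treat them as preload hints. A small sketch:

```javascript
// Build a standard "Link" response header from a list of critical
// resources. The href/as pairs below mirror the files named above.
function buildLinkHeader(resources) {
  return resources
    .map((r) => `<${r.href}>; rel=preload; as=${r.as}`)
    .join(", ");
}

const header = buildLinkHeader([
  { href: "/content/assets/css/headfoot.css", as: "style" },
  { href: "/content/assets/css/subpage.css", as: "style" },
  { href: "/content/assets/script/main.js", as: "script" },
]);
console.log(header);
```

On a Node server you would attach this with res.setHeader("Link", header) before sending the HTML.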

Please bear in mind that HTTP/2 Server Push also has some side effects:

  • If you use HTTP/2 Server Push for too many resources, it loses its efficacy.
  • The main purpose of HTTP/2 Server Push is to use it only for certain, critical resources.
  • Pushed resources ignore the client's cache, so a resource the browser has already cached may be pushed again.
  • HTTP/2 Server Push can create a little more server overhead than usual.

Because of the final point alone, we ended up using HTTP/2 Server Push for only a short time. Once we have made some back-end structure changes, Can Sayan plans to use it again.

“Lack of a caching system or a strong server… Long queries that slow down the response times… All of these affect the User Experience and also SEO. Thus, we are racing against milliseconds.”

Can Sayan, Backend Developer

3.5 Resource Loading Order and Prioritization

TCP Slow Start means that only about the first 14 KB of an HTML document (ten segments of 1,460 bytes each, with the common initial congestion window) can reach the client in the first round trip. This behavior is designed to protect servers from congestion. Thus, the most critical resources need to be referenced at the top of the HTML document.
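A back-of-envelope calculation shows where this first-round-trip budget comes from, assuming the typical maximum segment size and the common initial congestion window of ten segments (RFC 6928):

```javascript
// First-round-trip budget under TCP slow start.
const MSS = 1460;     // typical maximum segment size, in bytes
const INIT_CWND = 10; // common initial congestion window, in segments

const firstFlightBytes = MSS * INIT_CWND;
console.log(firstFlightBytes, "bytes =", (firstFlightBytes / 1024).toFixed(1), "KB");
// 14600 bytes = 14.3 KB
```

This is why the critical CSS links and preload hints belong in the first kilobytes of the HTML response.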

Our Resource Load Order is below with the HTML Tags and Browser Hints.

  1. <link rel="preload" href="/content/assets/image/promo/banner.avif" as="image">
  2. <link rel="preload" href="/content/assets/css/headfoot.css" as="style">
  3. <link rel="preload" href="/content/assets/font/NunitoVFBeta.woff2" as="font" crossorigin="anonymous">
  4. <link rel="preload" href="/content/assets/script/main.js" as="script">
  5. <link rel="stylesheet" href="/content/assets/css/headfoot.css">
  6. <link rel="stylesheet" href="/content/assets/css/subpage.css?v=23578923562">

You can see the logic of this resource load order below.

  1. "Banner.avif" is for the Largest Contentful Paint.
  2. "Headfoot.css" is for the First Contentful Paint.
  3. "NunitoVFBeta.woff2" is for preventing the "FOUT" and "FOIT" effects.
  4. "Main.js" is for the functionality of the web page.
  5. "Subpage.css" is for the general layout of the product and service web pages.

And, you can see the profile of resource load prioritization below.

Encazip.com Resource Load Order Prioritization.

3.5.1 Cross-Browser Compatibility for Preload Usage

In the image above you will see some "duplicate requests"; these have been left in on purpose. Until January 2021, Firefox didn't let SEOs, developers, or Holistic SEOs use "preload". Thus, if the user agent included "Firefox", it couldn't use the preload hints, so we also included normal request links without preload.

Don’t worry, Google Chrome won’t request the same file twice!
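The standard runtime check for preload support is the relList.supports pattern; this sketch (not the team's exact check) takes the document as a parameter so the logic can also be exercised outside a browser:

```javascript
// Feature-detect <link rel="preload"> support. "doc" is injected so the
// check can be tested with a mock document as well as the real one.
function supportsPreload(doc) {
  const link = doc.createElement("link");
  return !!(
    link.relList &&
    typeof link.relList.supports === "function" &&
    link.relList.supports("preload")
  );
}

// In a real page you would call supportsPreload(document) and fall back
// to plain <link>/<script> tags when it returns false.
```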

An updated view of Preload browser compatibility profile.

3.5.2 What You Should Know About Preload

Preload did not work in Firefox for a long time, but Firefox has lately started to support preload without any flag configuration. Here are other things you need to know while using preload.

  1. You can’t use preload for the resources that you have already “pushed with HTTP 2 Server Push” feature.
  2. If you try to use “preload” for everything, it will lose its meaning.
  3. Preload also “caches” the file for the browser so that returning clients can open the next pages faster.
  4. If you preload too many things, it can create a CPU bottleneck at the beginning of the web page loading process.
  5. A CPU bottleneck also can increase the Total Blocking Time, First Input Delay, and lastly “Time to Interactive”. Not an ideal scenario!

That's why discussions with your development team are important. It can sound easy to say "okay, we will just put the 'preload' value into the 'rel' attribute", but it's not that easy: everything needs to be examined repeatedly on a poor internet connection and a mid-range mobile device.

Preload’s effect on resource loading.

3.5.3 Beyond Preload: Preconnect for Third-Party Trackers

Why didn't we use DNS-Prefetch? Simply put, DNS-Prefetch performs only the DNS resolution for a third-party resource's server, while "Preconnect" performs the DNS resolution, the TCP handshake, and the TLS negotiation. The TCP handshake and TLS negotiation are essential processes for loading a resource from a server. For this reason, we implemented "Preconnect" instead of DNS-Prefetch for third-party trackers.


Preconnect’s working principle

Some of the “preconnect” requests for Encazip.com are below.

  • <link rel="preconnect" href="https://polyfill.io">
  • <link rel="preconnect" href="https://cdnjs.cloudflare.com">
  • <link rel="preconnect" href="https://unpkg.com">
  • <link rel="preconnect" href="https://www.googletagmanager.com">

3.5.4 Loading CSS Files as Async

Loading CSS files asynchronously is an important but mostly ignored topic, so it is worth reading a "Loading CSS Async" tutorial. All CSS files are render-blocking: while they are being loaded, a browser cannot render the web page, just as with synchronous JavaScript files. For JavaScript we have the "async" attribute, but there is no equivalent for CSS files. To load a CSS file asynchronously, the browser has to be manipulated through the "media" attribute, for example <link rel="stylesheet" href="subpage.css" media="print" onload="this.media='all'">: the stylesheet is fetched with a low, non-blocking priority as a "print" stylesheet and then switched to "all" once it has loaded.

We wanted to use CSS async loading for Encazip.com, but at first it wasn't necessary, because the total CSS amount was only about 16 KB per web page; that is so small that it couldn't meaningfully block rendering.


CSS Async prevents blocking.

But when we started to load more resources with the "preload" browser hint, the CSS files started to block rendering. Thus, we wanted to re-plan it, even though we were only gaining 15-20 milliseconds per web page loading event.

In the end, we didn't implement it, because using the "CSS async" technique for these CSS files created a "flicker effect": the browser rendered the web page without CSS first and applied the CSS afterwards, which produced a turbulent page loading experience, in other words, a flash of unstyled content.

Just for gaining 15-20 Milliseconds, we didn’t want to cause such stress for the user. That’s why as an SEO and developer, you need to balance things while making a website more crawlable for search engines and usable for users.

So, at the moment, we have left this topic for further discussion; maybe we can use CSS async for only one of the resources. But remember that CSS async loading relies on JavaScript rendering, and this can also affect SEO. (Editor's Note: I recommend you check the section on "Image Placeholders".)

3.5.5 Deferring All the Third Party non-Content Related Trackers

All the third-party and non-content-relevant JavaScript files were deferred. A deferred JavaScript file is not executed until the HTML document has been parsed (around the "domInteractive" event).

And, there are two important things to consider when using the defer browser hint:

  1. If you use defer on the main JS file, you will probably not see the functionality it initiates until it executes.
  2. If you use defer too much, you may cause a CPU bottleneck at the end of the page load.

Be careful whilst using “defer” to ensure that you do not block the user!

JavaScript loading methods and their effects.

In Encazip.com, before the SEO Project’s launch, all the non-important JavaScript files were being loaded before the important content-relevant CSS and JavaScript files.

We have changed the loading order for the resources, so the most important web page resources load first and have deferred the non-important ones.

N.B.: Deferring third-party trackers can produce slightly different user-tracking reports, since the trackers can't observe the user from the very first moment.

3.5.6 Using Async for Only the Necessary JavaScript File

As with using the “defer” attribute, using the “async” attribute is an important weapon at your disposal for creating the best possible user experience. In the case of Encazip.com, we have used the “async” feature for only the “Main.js” file since it was the only file that was focused on the “content” and “functionality”. 

And, from the previous section, you can remember that you shouldn’t defer the most important and functional JS file.

With Puppeteer, you can block any resource, so you can see which JavaScript file affects which part of the above-the-fold section.
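A sketch of such an experiment (assuming the real "puppeteer" package; the blocking rule here is an illustrative choice, not the team's script). The decision logic is kept as a pure function so it can be tested without launching a browser:

```javascript
// Decide whether a request should be blocked during the experiment:
// here, block only script requests whose URL contains the target file.
function shouldBlock(resourceType, url, blockedFile) {
  return resourceType === "script" && url.includes(blockedFile);
}

// Usage with Puppeteer (requires the "puppeteer" package; not run here):
// const browser = await puppeteer.launch();
// const page = await browser.newPage();
// await page.setRequestInterception(true);
// page.on("request", (req) =>
//   shouldBlock(req.resourceType(), req.url(), "main.js")
//     ? req.abort()
//     : req.continue()
// );
// await page.goto("https://www.encazip.com/");
// await page.screenshot({ path: "without-main-js.png" });
```

Comparing screenshots with and without each file quickly shows which script is responsible for which above-the-fold behavior.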

3.5.7 Loading the Polyfill JS Only for Legacy Browsers

Polyfills ensure the compatibility of modern JavaScript methods and APIs with legacy browsers such as Internet Explorer. And, as an SEO, you always need to think about the full user profile, since there are still millions of people who use Internet Explorer (I don't know why, but they are using it!).

Since we don't have "Dynamic Rendering" or "Dynamic Serving" technology for Encazip.com at the moment, Nedim Taş added another step to prevent modern browsers from downloading the Polyfill JS content. If the browser is a modern browser, the request is still performed but the Polyfill JS is served without its content; if the browser is a legacy one, it is served with its content.

Thus, for most of the users, we have saved them tens of KB.
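The client-side variant of this idea can be sketched as follows (the team's actual implementation, which works on the server side per user agent, may differ; the feature checks and the URL here are illustrative assumptions):

```javascript
// Load the polyfill bundle only when a needed modern API is missing.
function needsPolyfill(win) {
  return !("fetch" in win) || !("IntersectionObserver" in win);
}

// Browser wiring (skipped outside a browser environment):
if (typeof window !== "undefined" && needsPolyfill(window)) {
  const s = document.createElement("script");
  s.src = "https://polyfill.io/v3/polyfill.min.js"; // illustrative URL
  document.head.appendChild(s);
}
```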

From a Chrome session, you can see that “polyfill.min.js” is empty.

3.6 Aggressive Image Optimization with SRCSet and AVIF Extension

First, let me explain what aggressive image optimization is. How is it different from regular image compression? There are four different aspects of image optimization; “pixels, extensions, resolutions, and EXIF data.”

Pixel optimization in terms of "image capping" is actually a new term. Medium and Twitter have implemented image capping before, decreasing image size by 35% while decreasing request latency by 32%. Image capping means serving images at 1x scale, in other words, one image pixel per dot on the screen. Since 2010, "super retina" devices have become more and more popular; a "super retina" device packs more than one physical pixel per dot on the screen, which gives it a chance to show more detailed images with higher pixel density.


Color and pixel differences based on devices. Above, you see “1 Pixel’s color profile”.

So, what is wrong with super-retina devices and 2x Scale or 3x Scale images?

  1. The human eye can’t actually see the details in 2x resolution or 3x resolution images.
  2. 2x resolution or 3x resolution images are bigger in terms of size.

For pixel optimization, you can use different types of image resampling algorithms, such as "NEAREST" or "BILINEAR" from PILLOW.

So, let me introduce you to other sections of aggressive image optimization for web page loading performance briefly.

Like pixel optimization, extensions are also important. This is common knowledge now, but most SEOs, developers, and Holistic SEOs are not aware of "AVIF". Most people know WebP, but I can say that WebP is already outdated and outperformed by AVIF.


AVIF was developed by the Alliance for Open Media (AOMedia).

EXIF Data (Exchangeable Image File) is an important aspect of SEO. I won’t go deep into this aspect, but you can watch the video of Matt Cutts from 2012.

As an aside: before the John Mueller era, Google shared much more information about its internal systems. As an SEO, I can't say that I like this change!

Exchangeable Image File (EXIF) data includes the light, camera, lens, geolocation, image title, description, ISO number, image owner, and license information. A closely related standard is IPTC metadata (from the International Press Telecommunications Council).

For relevance, I recommend my clients use IPTC Metadata with minimum dimensions. But for performance, you need to clean them.

We have chosen “600px” as the fixed-width since it is compatible with both mobile and desktop view screens.

Resolution: for image optimization, unnecessarily large resolutions shouldn't be used. If the website is not in the news niche, you probably won't need large resolutions.

To use the best possible image extension based on user-agent (browser) differences and the best possible resolution based on the device differences, we have used “srcset”.

Below, you will see an example.

An example of the "srcset" usage from Encazip.com.

Remember, we already "preloaded" the AVIF image, and now we are just telling the browser where to show it if it can do so. And "<figure>" is used for Semantic HTML, as we will see in later sections of this SEO case study.

P.S.: Do I really need to talk about “alt” tags? Or, Image URLs?

3.6.1 Intersection Observer for Image Lazy Loading

Intersection Observer is an API that loads images only when they are inside or close to the viewport. With the Intersection Observer, you can determine when and where to load and show an image; in practice, the Intersection Observer API is how custom lazy loading is implemented. But why didn't we use Chrome's "loading=lazy" attribute? And why didn't we use a third-party library for lazy loading?

  1. We didn’t use Chrome’s “loading” attribute and “lazy” value because it is not compatible with every browser. For cross-browser compatibility, we need to use Intersection Observer.
  2. We didn't use third-party libraries for lazy loading because they would also load unnecessary code from another third-party domain, and maintaining your own small custom implementation is much better than adding another dependency to your toolset.

At the time of writing, 91.98% of users' browsers supported the Intersection Observer API.


On the other hand, only 69.39% of browsers support the "loading" attribute with the "lazy" value.

I recommend you check out Mozilla’s How to Create an Intersection Observer API tutorial.
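A minimal lazy-loading sketch in the spirit described above (the Encazip.com main.js is not public; attribute names such as data-src are illustrative): images carry the real URL in a data attribute, and the observer callback swaps it in as the image approaches the viewport.

```javascript
// Swap data-src into src for every image entering the observed area.
// The callback is a pure function, so it is easy to unit-test with
// mock entries outside a browser.
function onIntersect(entries, observer) {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target;
    img.src = img.dataset.src;             // start the real download
    if (observer) observer.unobserve(img); // each image needs only one swap
  }
}

// Browser wiring (skipped outside a browser environment):
if (typeof IntersectionObserver !== "undefined") {
  const io = new IntersectionObserver(onIntersect, { rootMargin: "200px" });
  document.querySelectorAll("img[data-src]").forEach((img) => io.observe(img));
}
```

The rootMargin value controls how far ahead of the viewport loading begins; 200px is a common starting point to tune against scroll speed.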

Thanks to the Intersection Observer API, we increased the size of the "main.js" file only slightly, but we gained control of lazy loading's default behavior without any other dependencies and, of course, improved the initial loading time by more than 50% thanks to lazy loading. This 50% improvement was measured with the relevant page speed metrics: First Paint, First Contentful Paint, Largest Contentful Paint, and Time to Interactive.

3.6.2 Image Placeholder for Better Speed Index and Largest Contentful Paint

Image placeholders are important for the speed of “visual progress”. While the above-the-fold section of a web page reaches “visual completeness”, placeholders provide a “smooth” and more “interactive” experience. Image placeholders are now used on Encazip.com to improve the “Speed Index” and “Largest Contentful Paint” timings.

But since image placeholders rely on JavaScript, they also have side effects: the non-rendering crawl passes of Googlebot see only the placeholder, not the actual image. You can see the effect of this JavaScript-based nature of image placeholders on Google’s SERPs.

As you can see, without the “rendering phase”, Google’s indexing engine can’t see the actual image; it doesn’t understand that the image is only a placeholder, so it shows the placeholder on the SERP instead!

Thus, Google decided not to show the image after all, because it was not in the “initial HTML”. Google concluded that the image wasn’t important enough, and since it doesn’t render JavaScript on every crawl, it couldn’t see the actual LCP image consistently between crawl round-trips.

After a while, Google will show the image placeholder again, then the actual image, then remove it again… This continues as a loop. And you should also consider this a possible cause of “ranking fluctuation”.


And this will continue…

After 15 days, Googlebot fixed it. But, if you know Google, “fresh data” is always more important than “old data”. You should think of Google’s crawling behavior as a “loop”. 

The “loop” of the Search Engine’s decision tree.

3.6.3 Using Image Height and Width Attributes for Cumulative Layout Shift

Image height and width are important for Image SEO and Visual Search. But beyond Image SEO, they also matter for User Experience and thus for SEO. I won’t dive deep into Cumulative Layout Shift here; suffice it to say that every unexpected “layout shift” or “moving web page component” contributes to Cumulative Layout Shift for the user.

To prevent this situation there are certain rules:

  1. Give images explicit height and width values.
  2. Do not inject dynamic content above existing content.
  3. Avoid late-loading web fonts.
  4. Reserve space for content that waits on a network response before the DOM finishes loading.

In this context, we have given height and width values for images so that the Cumulative Layout Shift can be decreased and Encazip.com can be ready for Google’s Page Experience Algorithm.
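For illustration, reserving the image’s space in the markup looks like this (the file name and dimensions are made up):

```html
<!-- Explicit width/height let the browser reserve the box before the
     image loads, so the surrounding content does not shift -->
<figure>
  <img src="/images/tariff-comparison.avif" width="640" height="360"
       alt="Energy tariff comparison chart">
  <figcaption>Energy tariff comparison</figcaption>
</figure>
```

Modern browsers compute the aspect ratio from these two attributes, even when CSS later scales the image responsively.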

An example of CLS.

3.7 Web App Manifest Usage for Progressive Web Apps

Web App Manifest is the gateway for Progressive Web Applications. Simply put, a Web App Manifest is a file that defines the website as an application and lets a device download the website to local storage with certain icons, shortcuts, colors, and definitions. Thanks to Web App Manifest, a website can be opened without a browser like an app. That’s why it is called a Web App Manifest.

On Encazip.com, we have started to use a Web App Manifest; the names and “shortcuts” in the Web App Manifest can also increase user retention while reinforcing your brand entity to Google.
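A minimal manifest sketch (the names, colors, and icon paths here are illustrative, not Encazip.com’s actual file):

```json
{
  "name": "Encazip – Energy Tariff Comparison",
  "short_name": "Encazip",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#005eb8",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ],
  "shortcuts": [
    { "name": "Compare Tariffs", "url": "/compare" }
  ]
}
```

The file is referenced from the document head with `<link rel="manifest" href="/manifest.json">`.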


You will now see a “prompt pop-up” for installing Encazip.com as a local app. Below, you can see Encazip.com installed as a local app on my desktop.


3.8 Using Service Workers for Better HTML Payloads

Service Workers are another step toward Progressive Web Applications. Thanks to service workers, a website can work offline. A service worker is a script the browser runs in the background; it can intercept requests and answer them from a local cache. A service worker can create a named cache and register certain URLs into it. After these URLs and their resources are cached, the client no longer sends requests to the server for them, and since they are served from the local cache, loading performance improves for returning visitors.
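A sketch of this caching pattern inside a service worker (the cache name and URL list are illustrative):

```javascript
// sw.js — pre-cache the resources needed for the "initial contact",
// then answer matching requests from the cache instead of the network.
const CACHE_NAME = 'static-v1';
const PRECACHE_URLS = ['/', '/css/main.css', '/js/main.js'];

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});
```

Bumping the cache name (for example to `static-v2`) is the usual way to invalidate old entries on deploy.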

A screenshot from the Service Worker prototype of Encazip.com

With Service Workers, we have cached the most important resources for the “initial contact” with the user. But, to be honest, the storage limit for a service worker is not obvious, so we have tried to use it carefully and sparingly.

3.9 Cleaning Unused Code from Third-party Trackers by Localizing

This section is actually debatable. In Technical SEO and page speed improvements, to strike the best possible time-and-cost balance for an SEO project, I always focus on the points that will make the biggest difference.

In the short animation below, you can see which third-party dependencies generally cost internet users the most in data usage and page speed. “Cleaning and localizing” third-party trackers can therefore decrease the page’s size enormously and also remove the need to connect to another outsourced service for the client.

Average Cost and Impact of third party scripts on the web.

But cleaning and localizing third-party resources also has some side effects:

  1. If you localize a third-party tracker, you won’t get the updates automatically.
  2. Localized third-party dependencies might not work perfectly due to sloppy cleaning.
  3. If the marketing team wants to use another feature from the dependent script, the process might need to be repeated.

Below, you will see the positive effects:

  1. Removing the Single Point of Failure possibility.
  2. You will only use the necessary portions of the dependent script, resulting in less code.
  3. You won’t need to connect to another outsourced service to complete the web page loading.
  4. It is sustainable if the development team can make this a habit.
You can track queries and your performance for these queries along with the SERP Features and queries’ search intent with Authoritas.

On Encazip.com, because of these side effects, we haven’t implemented this yet, but it is in the future scope of the project. I’ve included it to show the lengths we will go to in order to get the best results, and to demonstrate the true “vision” and “perspective” of this case study.

3.10 Conflicting Document Type with Response Headers and HTML Files

Unfortunately, the majority of SEOs do not care about the Response Headers—and they should!

Response headers and their messages are actually more important than the actual HTML. So no message in the response headers should conflict with the information in the tags of the HTML document.

Content type within the Response Header.

In our case, Encazip.com was using the “Windows-1258” encoding in the “content-type” response header while using “UTF-8” in the HTML document. This gave a mixed signal to the browser, and indirectly to search engine crawlers, about the web page’s content type. To remove this mixed signal, we standardized on “UTF-8”.
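For illustration, the corrected state means the header and the document declare the same encoding:

```http
Content-Type: text/html; charset=UTF-8
```

matched by `<meta charset="UTF-8">` in the document head, so neither the browser nor the crawler has to guess.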


With Authoritas, you can add tasks and also solve OnPage and Technical SEO problems.

3.11 HTML Digestion and HTML-Based Improvements

“HTML Digestion” is a term from the “Search Off the Record” podcast series, created and published by the Googlers Danny Sullivan, Gary Illyes, Martin Splitt, and John Mueller. They also call this “HTML Normalization”. According to Google, the “actual HTML” and the “indexed HTML” are not the same. Googlebot and Google’s Caffeine indexing system extract the HTML structure from the actual document using the signals they collect.

A headline from Barry Schwartz for the specific topic.

There is a short quote below from Gary Illyes about “HTML Normalization”.

If you have really broken HTML, then that’s kind of hard. So we push all the HTML through an HTML lexer. Again, search for the name. You can figure out what that is. But, basically, we normalize the HTML. And then, it’s much easier to process it. And then, there comes the hot stepper: h1, h2, h3, h4.

I know. All these header tags are also normalized through rendering. We try to understand the styling that was applied on the h tags, so we can determine the relative importance of the h tags compared to each other. Let’s see, what else we do there?

Do we also convert things, like PDFs or… Oh, yeah. Google Search can index many formats, not just text HTML, we can index PDFs, we can index spreadsheets, we can index Word document files, we can index… What else? Lotus files, for some reason.

Gary Illyes,

You can listen to the Search off the Record Podcast Series, a related episode.

Remember what happened during August, September, and November 2020 in Google’s indexing system? Everything went awry! Google removed the Request Indexing function, it mixed up canonicalized URLs, and so on…

Google’s confirmation for a series of bugs. Even the Google Search Console’s Coverage Report was not refreshed for days.

So, having simple, error-free, understandable HTML is helpful. While redesigning the website, we also cleaned all of the HTML code errors from Encazip.com.

P.S: HTML code errors can make a browser work in “quirks mode”, which also harms web page loading performance, even if only as a small factor.

3.12 Semantic HTML Usage

Semantic HTML means that HTML tags have context and meaning within a hierarchy. It gives search engine crawlers more hints and makes a web page easier to understand. Semantic HTML is also useful for screen readers and web users with disabilities.


A schema for Semantic HTML’s logic.

On Encazip.com, at the beginning of the SEO project, Semantic HTML was not used. But, following certain rules, we introduced it. You can see some of the tags we prefer to use within the website.

  1. Header
  2. Footer
  3. Nav
  4. Main
  5. Headings
  6. Article
  7. Aside
  8. Section
  9. Ol and Li
  10. Picture
  11. Figure
  12. Quote
  13. Table
  14. Paragraph

Every “section” had exactly one “heading 2”. And every “visual transition” came at the end of a section. In other words, the “visual design elements” of the website and the Semantic HTML are compatible with each other. This helps to “align the signals” instead of sending bad and mixed signals.
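A sketch of the kind of page skeleton this produces (the headings and content are illustrative):

```html
<body>
  <header>…logo and top bar…</header>
  <nav>…primary navigation…</nav>
  <main>
    <article>
      <h1>Electricity Tariff Comparison</h1>
      <section>
        <h2>How tariff comparison works</h2>
        <p>…</p>
      </section>
      <section>
        <h2>How to switch suppliers</h2>
        <ol><li>…</li></ol>
      </section>
    </article>
    <aside>…related links…</aside>
  </main>
  <footer>…company information…</footer>
</body>
```

One `<h2>` per `<section>`, with the visual breaks in the design falling exactly on the `</section>` boundaries.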

With Authoritas, you can measure referring domains and their relevance and importance for SEO performance.

3.13 Decreasing the HTML DOM Size

DOM size is an important factor in “reflow and repaint cost”. The Document Object Model is built from objects, or nodes, and every additional node adds one to the DOM size. Google suggests having fewer than 1,500 nodes in the Document Object Model, because a large DOM makes the browser’s layout, paint, and render processes harder.


You can see the DOM-Tree Analysis for Encazip.com’s Homepage.

On Encazip.com, we ended up with 570 nodes in the DOM. That is much better than Google’s suggested limit, but our main competitor averages 640 nodes. So we are better, though not by much, at least for now.


With Authoritas, you can find the best experts and authors for your industry, for your PR and Marketing campaigns.

3.14 Font File Count and Size Decreasing

Font file optimization is a discipline of its own within page speed work, so I will give only a short summary here.

On Encazip.com there were more than 5 font files per web page. Most of these fonts were not used on every page, and even when they were used, it was only for a small portion of the page.

I always recommend brands use “less color” and “fewer fonts”, because they are not really critical, yet they are still costly for users and for search engine crawlers.

We have only one font file at the moment.

The first major issue was that none of the font files were in the WOFF2 format, which meant their sizes were unnecessarily large. The total size of the font files was more than 200 KB per page.

  1. At the end of the day, we decreased the font file count to one.
  2. We decreased the font file size to 44 KB.
  3. We saved 4 requests and an average of 150 KB per web page.

With Authoritas, you can find the best expert Authors based also on “domains”.

3.15 Using Font Variables

Variable fonts (“font variables”) are one of the more advanced page speed topics. Imagine unifying the “bold”, “italic”, and “regular” versions of a font into a single file. Thanks to variable fonts, we could use different font variations with only one request.

The “font-variation-settings” CSS property selects a specific variation within a variable font.

Thus, we managed to stick to only one font with different styles. You can see the “font-variation-settings” code below from our CSS file. (Thanks again to Mr. Nedim Taş for this!).
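A sketch of what such CSS can look like (the font name, path, and axis values are illustrative, not Encazip.com’s actual stylesheet):

```css
/* One variable font file covers the whole weight range */
@font-face {
  font-family: "SiteFont";
  src: url("/fonts/sitefont-variable.woff2") format("woff2");
  font-weight: 100 900; /* the full "wght" axis in a single file */
}

body        { font-family: "SiteFont", sans-serif; }
.body-text  { font-variation-settings: "wght" 400; }
.bold-title { font-variation-settings: "wght" 700; }
```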

3.16 FOUT and FOIT

Flash of Unstyled Text and Flash of Invisible Text are two other important terms in web font optimization. FOUT and FOIT also matter for Cumulative Layout Shift and, if the LCP element is textual content, sometimes for Largest Contentful Paint. To prevent FOUT and FOIT, we preloaded the font file while using the “font-display: swap” descriptor in our CSS. Below, you can see the necessary code block.
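A sketch of that combination (the file path is illustrative):

```html
<!-- Fetch the font early, before the CSS that references it is parsed -->
<link rel="preload" href="/fonts/sitefont-variable.woff2"
      as="font" type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: "SiteFont";
    src: url("/fonts/sitefont-variable.woff2") format("woff2");
    font-display: swap; /* show fallback text immediately, swap in when ready */
  }
</style>
```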

Font display is from our CSS Files.

3.17 Using Browser-Side Caching for Static Resources

Browser-side caching is for the static resources of web pages. If a resource on a web page doesn’t change frequently, it can be stored in the browser’s cache. To achieve this, the “Cache-Control: max-age” directive and the “ETag” (Entity Tag) HTTP header should be used.


Working principle of browser-side cache.

On Encazip.com, we have used browser-side caching for some static resources, but for others it has been delayed by back-end infrastructure work. So there are still some incremental improvements to make here.

4.0 Structured Data Usage for Holistic SEO

Structured data is another important signal for search engines. It shows them the entities on a page, their profiles, and their connections with other entities. Structured data can affect the relevance, SERP appearance, and perceived main intent of a web page in the eyes of search engines.


Encazip’s Organization structured data visualization.

In Encazip.com, the structured data had not been implemented correctly, so we adopted three different structured data types for Encazip.com.

  1. Organization
  2. FAQ
  3. AggregateRating

Why did we use these types of structured data?

  1. Organization structured data was used to create an entity reputation and definition for Encazip.com. Soon after, Google started to show Encazip.com’s social media profiles on the SERP.
  2. FAQ structured data has been used on the blog and service/product pages. In the future, we plan to add more sections to the FAQ structured data within Schema.org’s guidelines.
  3. AggregateRating covers Encazip.com’s business partners, and it was merged into the Organization structured data. The main purpose was to show the web page’s activity on the SERP with reviews and stars.
  4. Images in the Largest Contentful Paint HTML element were added to the FAQ structured data to send better web page element and layout signals.
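A sketch of an Organization block with a nested AggregateRating (the URLs and figures are illustrative, not Encazip.com’s actual markup):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Encazip",
  "url": "https://www.encazip.com",
  "logo": "https://www.encazip.com/images/logo.png",
  "sameAs": [
    "https://twitter.com/encazip",
    "https://www.linkedin.com/company/encazip"
  ],
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "850"
  }
}
```

The block is embedded in the page inside `<script type="application/ld+json">…</script>`.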

5.0 Website Accessibility: Every User Matters

Website accessibility is one of the most important things for SEO, UX, and, most importantly, humanity. As an SEO, I must say that an accessible website is actually a human right. Thus, I believe that making websites accessible is one of the best sides of SEO. (And, as a color-blind person, I pay extra-special attention to this area).

5.1 Using Accessible Rich Internet Applications for being a Better Brand and Holistic SEO

Encazip.com is a ‘mostly’ accessible website. I say “mostly” because, to be honest, learning and implementing “Accessible Rich Internet Applications” is not easy. But we have implemented the “role”, “aria-labelledby”, and “aria-describedby” attributes with proper values.
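For illustration, this is roughly what those attributes look like in markup (the ids and text are made up):

```html
<section role="region" aria-labelledby="tariff-heading"
         aria-describedby="tariff-summary">
  <h2 id="tariff-heading">Compare Electricity Tariffs</h2>
  <p id="tariff-summary">
    Pick your region and consumption to list the cheapest tariffs.
  </p>
</section>
```

A screen reader announces the region by its heading and can read the description on request.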

Furthermore, pages that are legible to a screen reader can also be understood more easily by a search engine, since this approach leaves nothing to chance and connects every web page component to the others. (I recommend thinking about this with Semantic HTML in mind, too).

Lastly, we also cared about “light and color differences” between web page components.

PS: Do we really need to talk about alt tags, in 2021?

6.0 Website Redesign Process for Holistic SEO: Every Pixel Matters

Designing a website has many dimensions. The layout of the web pages, the order of components, the style of components, texts, images, links, the categorization of pages, and more all affect rankings.

If a web page cannot satisfy the search intent, it cannot be ranked well by Search Engines. If a website’s layout is not understandable and requires “learning” by users and also Search Engine’s Quality Evaluation Algorithms, it can harm the SEO Performance.


With Authoritas, you can filter the branded queries and non-brand queries for CTR and keyword profile analysis.

Google can understand a website’s quality and expertise from its design, layout, and web page components. I have previously overcome SEO performance plateaus just by changing the design, including changes as minor as color palettes. Google also has patents in this area that show the detailed insights it might seek to extract from a website’s layout and design. I have chosen four search engine patents: one from Microsoft and three from Google.

6.1 Website Representation Vectors for SEO


Website representation vectors cluster websites according to their layout and design quality along with expertise signals. According to their percentage similarity, Google labels sites as expert, practitioner, or beginner by looking at texts, links, images, layout, and a combination of this and more.

6.2 Read Time Calculation


Detection and utilization of document reading speed

Google might use “markers” to understand how a user reads a document and how long it would take them to find the right portion of the document for a specific piece of information or query. It also tries to understand the language of the content and its layout relative to users’ needs. This patent is from 2005, but it shows that at some point in the mid-2000s, Google’s Search Quality team cared about this. And we all know that the above-the-fold section, and the phrases and entities in the upper section of the content, are more important than the middle and bottom sections.

6.3 Visual Segmentation of Web Pages based on Gaps and Text Blocks


Document segmentation based on visual gaps

Google can use visual gaps, text blocks, headings, and certain marks to understand how different blocks relate to each other. But if you leave too much gap between blocks, it can hurt the “completeness of the document” while increasing the scroll depth and the “read time”. So, having each “complete visual block” follow the next in hierarchy and harmony is important.

6.4 VIPS: a Vision-based Page Segmentation Algorithm


Another patent, but this time from Microsoft. The Vision-based Page Segmentation algorithm uses the Document Object Model along with visual signals to analyze how different web page segments relate to each other. Here are some rules from VIPS:

  1. If the DOM node is not a text node and it has no valid children, then this node cannot be divided and will be cut.
  2. If the DOM node has only one valid child and the child is not a text node, then divide this node.
  3. If the DOM node is the root node of the sub-DOM tree (corresponding to the block), and there is only one sub-DOM tree corresponding to this block, divide this node.

Why do you think I have shared these rules? Because all of them resemble what Google’s Lighthouse does to determine the Largest Contentful Paint. The LCP element can also be used to understand the actual purpose of a web page. Of course, it is not a “directive”, just a hint, but that’s why LCP matters to search engines: a page whose initial contact section has a fast LCP can satisfy the search intent faster.

Device-based CTR Model by Authoritas.

I won’t go further in this section, but know this: Google also has patents for page segmentation based on “function blocks and linguistic features”. It checks the code blocks to understand which section serves which purpose, whilst annotating the language style of these blocks.


Classifying functions of web blocks based on linguistic features

In Encazip.com, while designing the new website, the designers created a modern, useful web page layout and visual aesthetic for web users. During the design process, we also talked about the implications of mobile-only indexing, mobile-first indexing, search intent, visual consistency, and Chris Goward’s “LIFT Model” for page layouts.

LIFT Model of Chris Goward

The LIFT Model is also important to me because it lets me optimize a web page for the dominant search intent along with its sub-intents. With a proper hierarchy, everything coalesces nicely into a clear signal. While designing the website, we also discussed DOM size, the amount of CSS needed, the styling of heading elements, and semantic HTML usage along with the need for ARIA. (Again, in this section, Mr. Nedim did an excellent job!).


This image shows Google’s Indexing System’s working style with “aligning ranking signals”.

So, web page layout is an important ranking factor. Along with E-A-T, it affects the user experience and conversion rate, and it is directly related to the web page’s loading performance. To be a well-rounded SEO, one should be able to manage these different aspects of an SEO project harmoniously.

7.0 Kibana and ElasticSearch Usage for Log Analysis

At the beginning of the Encazip.com SEO Case Study, we didn’t perform any kind of log analysis and actually, we didn’t need it. But for the future stages of the SEO Project, we plan to use Kibana and ElasticSearch.

If you are looking for more on this, Jean Christopher has a great article in the Search Engine Journal about how to use Kibana and ElasticSearch for SEO Log Analysis.

You can crawl a website with Authoritas and blend the Google Analytics, and Google Search Console Data for the specific pages along with Technical SEO attributes.

With this in mind, we have started to prepare our log analysis environment. Of course, you can also read log files with a custom Python script or a paid service such as JetOctopus or OnCrawl; it is up to you.

Authoritas also has a nice real-time log analysis tool in Alpha that uses a small JS snippet you insert into your website. It detects bots and sends the data to the Authoritas servers, where it is analyzed in Kibana and ElasticSearch. It’s very fast to load (page impact 10–20 ms) as it loads over UDP rather than TCP/IP. At the moment, however, it only works on sites running PHP. And as it’s loaded with the page, it won’t pick up any 5XX server errors, but it will help you track bots in real time and find 3XX and 4XX status code issues and bad bots hitting your site. If you have difficulty getting access to your server logs, this could be a simple and easy first step.

8.0 Branding, Digital PR, and Entitization of Encazip.com: Every Mention Matters

Encazip.com is also a good example of an entity. I won’t go deep into entities here, but there are four key differences between an entity and a phrase.

  1. An entity has a meaning; a keyword does not.
  2. An entity does not have a sound; a keyword does.
  3. Entities have attributes; keywords do not.
  4. Entities are about understanding concepts; keywords are about matching strings.

Entitization means the process of giving a brand an actual meaning, a vision, attributes, and connections with other concepts in the eyes of a search engine. Being an entity helps improve your rankings, because Google can evaluate a “source” beyond its own domain. To achieve this, you need to implement entity-based Search Engine Optimization.


Device-based Position and CTR Relationship.

But in the Encazip.com SEO project, my general strategy here failed. The best way to become an entity and get your entity ID is to open a profile page on a source that is popular with the Google Knowledge Panel, such as Wikipedia or Wiki Fandom.


Knowledge Panel Source Sites for Google.
Source: Kalicube.com – Data: Authoritas SERPs API.

In the Encazip.com SEO project, “entitization” was not just about the brand; the sponsor, founder, and managers should be entities too. Their reliability, news coverage, mentions, and relevance within the energy industry also matter to a search engine. Imagine Google suggesting a website on the SERP whose founder, owner, or brand manager is actually a criminal. It wouldn’t be a reliable “source”, right?

To create more E-A-T, we also used Mr. Çağada Kırım’s scientific articles and background in the energy field, and we opened Wikipedia pages for members of the Mountbatten family, because Encazip.com is also partly owned by Mountbatten family members.

But the Wikipedia route did not pay off for the brand itself. Normally, if you open a Wikipedia page for a brand, you will get an entity ID in a short time. I have done this before for VavaCars, and you can see its entity ID and Knowledge Graph search result below.


The image above is a visual description of how to query Google’s Knowledge Graph for entities with Python. You can see the entity ID of VavaCars with the entity definition and a link to the Wikipedia page that I created.

For Encazip.com, we did not manage to create a Wikipedia page, so how did we solve the problem of becoming an entity? By increasing the volume of mentions, news stories, and third-party definitive articles about the site.

Going deeper into this process is beyond the scope of this case study, but in doing so, we also increased the latent search demand for the brand’s site. It was also useful in increasing the SEO performance and organic rankings since it is a direct “ranking factor”.

If users search for you, Google will promote you on the SERP for the related concepts and terms. Look at the 22nd of December from Google Trends for the search trends related to Encazip.com:


It was 100. And, look how December 22nd affected the “Average Position” and also how it was a cornerstone for this SEO Case Study. After December 22nd, Google decided to use Encazip.com for broader queries with more solid expertise, relevance, and authority.


And, thanks to all this heavy branding and the intensive news, mentions, and search demand, Google recorded Encazip as an entity in its Knowledge Base. The screenshot below is from Google Trends. If you can see a search term also as a "topic", it means that it is an entity.


Google Trends shows the topic and phrase for “encazip”.

If you choose a topic in Google Trends, you will see all related search activity for the entity. And during all the branding work, press releases, and more, we always cared about context. We always used the phrase "Encazip.com" together with the most relevant and industry-centric concepts. Google calls this "annotation text" within its patents: the sentence's sentiment and annotations create relevance between concepts.

Becoming an entity is not enough, but it’s a good start!  You should also create relevant annotations and connections between your brand entity and the industry so that you can become an authority.

Encazip's entity ID for Google's Knowledge Base appears as "2F11cmtxkff9" in the URL of Google Trends (the "2F" is the remainder of "%2F", a URL-encoded "/"). An entity's ID can be read from the Google Trends URL.
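Reading the entity ID out of a Google Trends URL can be done with the standard library alone. A minimal sketch, assuming a Trends explore URL whose `q` parameter carries the percent-encoded entity ID (the URL below is illustrative):

```python
from urllib.parse import urlsplit, parse_qs

def entity_id_from_trends_url(url: str) -> str:
    """Extract the (decoded) entity ID from a Google Trends explore URL."""
    # parse_qs percent-decodes values, so "%2Fg%2F11..." comes back as "/g/11...".
    params = parse_qs(urlsplit(url).query)
    return params.get("q", [""])[0]

# entity_id_from_trends_url("https://trends.google.com/trends/explore?q=%2Fg%2F11cmtxkff9")
# returns "/g/11cmtxkff9"
```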

8.1 Social Media’s Effect on Becoming an Entity and Entity-based SEO

During the Encazip.com SEO Project, we also used social media actively. My general principle for social media is to use hashtags, images, and the mainstream social media platforms to give constant activity signals to search engines.


We know that Google Discover acts largely in parallel with social media activity, even without an official statement. We also know that Google indexes social media posts by dividing them into hashtags, videos, posts, and images.

Furthermore, we know that Google wants to see a brand's social media accounts in the organization's structured data, that Google places these links in the knowledge panels of entities, and that it even gives social media links a dedicated place in the "Update the Knowledge Panel" section.


You can compare the SERP for two competitors’ social media searches. Result count and SERP features are signals for prominence and activity.

We have some old explanations from Matt Cutts about social media links and how Google tried to interpret them for search quality, and from Google's old changelogs we know that they scrape social media accounts and posts to understand the web better. Between 2010 and 2015, social media activity was an important ranking factor; even back then there were "post services" as a black-hat method. (Editor's Note: Not that anyone we know used them, of course ;-)).

Below, you can see my general ten rules and suggestions for Encazip.com for social media activity.

  1. Always be more active than the competitors.
  2. Always have more followers and connections than your competitors.
  3. Create new hashtags with long-tail keywords.
  4. Use hashtags within a hierarchy, such as "#brandname, #maintopic, #subtopic".
  5. Always try to use the latest and most popular hashtags on every mainstream social media platform.
  6. Try to appear in Google’s Twitter, TikTok, or Instagram short video carousels.
  7. Have more indexed content on Google and Bing within the social media mainstream platforms than your competitors.
  8. Use original images with links to the main content.
  9. Syndicate the content distribution with social media platforms along with content-sharing platforms such as Quora, Reddit, and Medium.
  10. Consolidate the ranking signals of the social media post and platforms with the brand entity’s main source which is the website.
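Rule 4's hashtag hierarchy can be sketched as a small helper; the brand and topic names passed in are just examples, not the project's actual tag list:

```python
def hierarchical_hashtags(brand: str, main_topic: str, subtopics: list) -> list:
    """Build a '#brand, #maintopic, #subtopic' hierarchy of hashtags."""
    def tag(text: str) -> str:
        # Hashtags cannot contain spaces, so lowercase and collapse them.
        return "#" + text.lower().replace(" ", "")
    tags = [tag(brand), tag(main_topic)]
    tags.extend(tag(s) for s in subtopics)
    return tags

# hierarchical_hashtags("Encazip", "Electricity", ["Electricity Prices"])
# returns ["#encazip", "#electricity", "#electricityprices"]
```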

The tenth point is actually the main purpose of my social media activity across all SEO projects, and it can be achieved via links, mentions, image logos, and entity-based connections.

During the SEO Case Study, Encazip.com was active on Instagram, Facebook, LinkedIn, Quora, Reddit, Medium, Twitter, and YouTube with hierarchical, keyword-derived hashtags.

The same is also valid for LinkedIn, Twitter, YouTube, Reddit, Quora, Medium, and more…

8.2 Local Search and GMB Listing’s Effect on Entities in the SERPs

For a holistic SEO, it's not just the technical, coding side or the content side that matters; local search activity is as important as the social media arena. Google unifies every ranking and relevance signal, along with quality signals, from the different verticals of search and the web.

In this context, we can clearly say that the local search quality of an entity is also an active SEO factor for web search results. Thus, Encazip.com ran a "review marketing" campaign with honest reviews requested from customers. The company's custom-designed social media images were also published as Google My Business posts.

You may see the review count, questions, GMB Posts, and how Google relates Encazip.com with the most authoritative entities unlike the beginning phase of the SEO Project.

After many positive reviews and lots of related questions answered by Encazip.com's experts, Encazip.com started to be grouped with the biggest energy companies in the Google My Business panel within the "People Also Search" feature. This was clearly an increase in quality score and relevance signals, which was great.

In other words, across all search verticals and web platforms, we created a high activity level with strong quality and authority signals while consolidating them for the search engines' algorithms.

9.0 Protecting the Site Migration Route

Encazip.com's old domain was "Cazipenerji.com". It had been migrated to Encazip.com, but during the project the old 301 redirects stopped working because the registration for the old domain expired! (Fortunately, Mr. Çağada Kırım noticed this problem before me; he bought the old domain again and redirected it to Encazip.com.)

A screenshot from 2013 for Cazipenerji.com. You can imagine the “historical data and relevance” for the industry.

This is important for taking advantage of the old domain's brand authority and relevance for the queries it has historical data for. This section is also related to the "Uncertainty Principle of Search Engines": it takes time to convince search engines' algorithms, and if you cancel a site migration, it can hurt your brand's reliability. Thus, this is another important step to get right.

10.0 Authoritative, Semantic SEO for Content Marketing: Every Letter Matters

For Encazip.com, I implemented Semantic SEO principles, using ontology and taxonomy for all the relevant topical graphs under a logical hierarchy and structure. For content creation, we educated the authors and taught them Natural Language Processing rules and terms and their importance. In this process, I should also highlight the importance of educating the customer's team.

Brand and Non-brand query comparison based on position.

If you don't educate your customer, you will exhaust yourself and compromise the quality of the content. To prevent this unwanted situation, a holistic SEO should educate their customer. In this context, I recommend reading some important Google research papers:

  1. Translating Web Search Queries into Natural Language Questions
  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  3. Siamese Multi-depth Transformer-based Hierarchical Encoder for Long-Form Document Matching

And, lastly, I recommend reading anything Bill Slawski and Shaun Anderson write 😉

Lastly, if you are a true SEO nerd, you can read my Topical Authority Lecture covering 4 SEO Projects, with a summary of Search Engine concepts and theories.

With Authoritas, you can generate FAQ questions for different queries, languages, Search Engines, and regions. This is actually a unique feature for detailed semantic content marketing.

During the SEO Project, we wrote, redesigned, repurposed, reformatted, and republished more than 60 articles. We also started to create new content hubs that comply with Google's entity taxonomies.

Since the day we started working with Mr. Koray, his work has always had positive effects on encazip.com. 

He also taught our team a lot about both SEO and technical SEO. Our work, with the guidance of Mr. Koray, has always ended well. 

He contributed a lot to encazip.com to become what it is today.

Yağmur Akyar

10.1 Different Contexts within Semantic Topical Graphs

An entity has different contexts; Google calls this the Dynamic Organization of Content. A brand can be an authoritative source for one context of an entity, such as "electricity prices". But there is also a connection between electricity production and electricity prices, and electricity production is in turn connected to the definition and science of electricity. Thus, for calculation, definition, production, consumption, science, and scientists, Encazip.com started a comprehensive content production process based on semantic search features.


In the image above, you can see a Dynamic Content Organization for the entity of electricity.

P.S. Do I really need to talk about keyword gaps and the like in 2021?

10.2 Image Search and Visual Content Creation

In images, there are two types of entities according to Google's patents: the object entity and the "attribution entity". We also used EXIF and IPTC data for image SEO. Images were either designed to be unique or chosen for their uniqueness, and the brand's logo was used as a watermark. We determined search-engine-friendly URLs and alt tags for the images. I also specified how an image should be selected for an article or a subsection of an article; to determine this, I used Google's Vision AI and other search engines' image search features and tools.

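To make the markup side of this concrete, here is a hedged sketch of a search-engine-friendly image element; the path, file name, alt text, and dimensions are hypothetical, not taken from the project:

```html
<!-- Hypothetical example: a descriptive, search-engine-friendly image URL
     plus an alt attribute naming the object entity shown in the image. -->
<img src="/images/electricity-prices/electricity-bill-example.jpg"
     alt="Sample electricity bill showing unit prices per kWh"
     width="800" height="450" loading="lazy">
```

The descriptive folder and file name support the URL signal, while the alt text names the entity the image depicts.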

In this section, I must also say that Microsoft Bing's advanced image search capabilities helped me. Unlike Google, Bing indexes every image on a web page, and it has a faster "snap and search" infrastructure for image search.

I recommend reading the documents below to understand this section better.

  1. Ranking Image Search Results Using Machine Learning Models
  2. Facial Recognition with Social Network Aiding

Below, you can see why I speak positively about Bing's image search capabilities; if you read the second document above, you will see this more clearly.


10.3 Image Sitemap, Image Structured Data and Representative Images for Landing Pages

Google announced its sitemap syntax in June 2005 and has improved sitemap understanding, syntax, and tag structure over time; for instance, in 2011 Google announced that it can understand hreflang from sitemap files. In this context, image sitemap files can have different tags such as "caption", "geo_location", "title", and "license". Image sitemaps, or images within sitemap files, are a useful communication surface with the search engine, helping its algorithms understand the role, content, and meaning of an image for a web page.

In this context, all the representative images, as well as the Largest Contentful Paint elements of the web pages, were added to the sitemap files. In other words, a regular sitemap file was turned into a complex sitemap containing both URLs and images. Below, you can see a complex sitemap example for Encazip.com that includes images and URLs in the same file.

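A minimal sketch of such a combined URL-and-image sitemap entry, using Google's image sitemap extension namespace; the URLs are hypothetical, not Encazip.com's actual entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/electricity-prices/</loc>
    <image:image>
      <image:loc>https://www.example.com/images/electricity-prices-table.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each `<url>` entry can carry several `<image:image>` blocks, one per image on the page.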

As a second step toward better communication with the search engine, the images on the web page were added to the FAQ structured data. The object and subject entities within the images, texts, colors, and other visual communication elements can strengthen the context of the content within the FAQ structured data. Thus, not just the first image but all images were added to the FAQ structured data as JSON-LD. Below, you can see an example.

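A hedged JSON-LD sketch of FAQ structured data carrying an image: the question, answer text, and URL here are hypothetical. The `image` property is usable on `Answer` because `Answer` inherits it from schema.org's `CreativeWork`.

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How are electricity prices calculated?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Electricity prices are calculated from the unit price per kWh plus taxes and distribution fees.",
      "image": "https://www.example.com/images/electricity-price-calculation.jpg"
    }
  }]
}
```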

To differentiate these two image sections from each other, different subfolder names and paths were used. As a final step, to support web search via image search and to increase quality, usability, and click-satisfaction signals, we studied the search engine's overall image selections for specific queries. In other words, if someone searches for a query and Google shows certain types of images, we analyzed those images' object and subject entities and used them within our featured images too.

Below, you can see an example. For the query "electricity prices", you can use a "table with prices" as an image (I used an HTML table, which is clearer for a search engine) or you can add a "bill that shows the electricity prices". As a result, the web page went from ranking beyond 60th to 6th in image search.
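A price table like the one mentioned can be marked up directly as HTML, which is clearer to a search engine than prices baked into an image; the tariffs and figures below are purely hypothetical placeholders:

```html
<!-- Hypothetical figures, for illustration only -->
<table>
  <caption>Electricity Prices per kWh</caption>
  <thead>
    <tr><th>Tariff</th><th>Price (TL/kWh)</th></tr>
  </thead>
  <tbody>
    <tr><td>Daytime</td><td>1.20</td></tr>
    <tr><td>Night</td><td>0.85</td></tr>
  </tbody>
</table>
```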

The image design and color choices also stand out in Google's SERP design. The search engine's image n-grams were used to understand its perspective and to tune the content during the Semantic SEO optimization process.

And topical authority analysis is not just about "text content"; it is about "all of the content". Thus, every gain from every vertical of search, whether textual, visual, or vocal, contributes to winning broad core algorithm updates and dominating a topic, a network of search intents, and queries from a certain type of context.

Encazip.com’s Image Impression has changed over the last 12 months.

11.0 Importance of Clear Communication & Passion for SEO on the Customer-side

To be honest, Encazip.com is the most easy-going, problem-free SEO Case Study that I have performed so far, because the customer's team is very positive and passionate about SEO. I know that I have spent double or triple the energy I spent on Encazip.com on other SEO projects that produced poorer results. But what was the difference?

You can connect Google Search Console to your Authoritas account for quick analysis and SEO insights.

The difference is the mindset. As the proverb says, "You can lead a horse to water, but you can't make him drink." SEO project management is on the same page as this proverb. That's why I always try to be careful while choosing my customers. SEO is not a "one-person job" anymore; it needs to pervade the company culture.

In my opinion, the customer’s character, mindset, and perspective on SEO are the main factors that govern an SEO Project’s success.

(As a confession: in the old days, before 1 August 2018, aka the Medic update, I was a black-hat SEO, so I didn't need my customers to love or know SEO. But after a while, Google fixed lots of gaps in its algorithm and changed my perspective too. That's why I have learned to code, understand UX and web design, and much more!)

With Authoritas, you can create SEO tasks and assign them to your team members.

And that's why creating an SEO Case Study with an uneducated customer is like "making a camel jump over a ditch", as the Turkish saying goes ("deveye hendek atlatmak").

11.1 Importance of Educating the Customer in Advanced SEO Concepts

How can you educate the customer? If you only talk about simple and easy SEO terms, it won't help you create SEO success stories. That's why my biggest priority for improving the customer's comprehension of SEO is paying attention to the "smallest details".


You can track the share of voice with Authoritas.

That’s why the main headline of this SEO Case Study is “Every Pixel, Millisecond, Byte, Letter, and User Matters for SEO”.

From the technical side, you should focus on “bytes” and “milliseconds” with the IT and Developer team, while focusing on “UX, Content, and Branding” with your marketing and editorial teams. 

“Our work with the Holistic SEO Approach was at a new level of difficulty for us. Every SEO meeting was like an education and ended with a to-do list that included a lot of hard work.”

Erman Aydınlık

12.0 Importance of Broad Core Algorithm Update Strategy for SEO

Broad Core Algorithm Updates are algorithmic updates to the core features of the Google Search Engine. Google announces its broad core algorithm updates officially, with extra details such as roll-out start and finish times. Before the Medic Update (the 1 August 2018 Google update), broad core algorithm updates were not officially announced; the SEO community called them "Phantom Updates" while Google called them "Quality Updates". Since broad core algorithm updates affect the crawl budget, authority, and quality assessment of a source (domain) in the eyes of the Google Search Engine, a solid broad core algorithm update strategy helps an SEO manage a project more effectively and time-efficiently.


To use Broad Core Algorithm Updates as an SEO Strategy, I have written a concrete SEO Case Study with Hangikredi.com.

During the Encazip.com SEO Case Study and Project, there were two broad core algorithm updates: the December 2020 Broad Core Algorithm Update and the June 2021 Broad Core Algorithm Update. Encazip.com won both, and in the next two sections you will see how Google compares competing sources on the web while deciding which one should rank for a given topic and niche.

12.1 Effects of December Broad Core Algorithm Update of Google on Encazip.com and its Competitors

When you put so much effort into planning and executing such a comprehensive SEO program, then you need to ensure you have a variety of SEO tools and software at your disposal to help you coordinate teams and activity, and to manage and report on SEO performance.  I use a combination of tools including Authoritas and Ahrefs.

Encazip.com was impacted by the December Broad Core Algorithm Update of Google. In this section, I will show a comparative analysis based on Ahrefs data charts covering Encazip and its competitors.

Its first competitor lost most of its traffic.


Below, you will see the second competitor’s graphic.

And, this is the last year’s trend for Encazip.com


Every pixel, millisecond, byte, letter, and user is behind this difference!

Koray Tuğberk GÜBÜR

12.2 Effects of June 2021 Broad Core Algorithm Update of Google on Encazip.com and Competitors

Google recently announced another Broad Core Algorithm Update on June 2nd, 2021. Before the June Broad Core Algorithm Update was announced, Google was switching between sources in its SERPs and this was affecting the traffic of Encazip.com. 

During these "source switching periods", I tried to publish and update more content while supporting the site with press releases and social media, and by accelerating the delayed improvements. Search engines always try to separate the noise from the data, and while they gather meaningful data from the SERP, feeding them more positive trust, activity, and quality signals helps. In this context, you can check the effects of the June 2021 Broad Core Algorithm Update and its consistency with the December 2020 update in terms of the direction of the search engine's decisions.

The traffic of the first competitor, Akillitarife.com, can be seen below. They increased their overall query count, but their traffic continued to decrease, which indicates that there is not enough contextual relevance between the queries and the source.

In terms of traffic, Akillitarife.com targets a broader industry than Encazip.com, which also requires more attention in terms of topical authority. The broader the topics of a source, the harder it is to become a real authority and be perceived as a prominent source for related queries.

The second competitor, Gazelektrik.com, also increased its overall query count, but its traffic continued to decrease. You can see how close these two main competitors' graphs are to each other, which means that they have been clustered together by the search engine.


Below, you will see Encazip.com’s organic traffic change for the June 2021 Google Broad Core Algorithm Update which includes a 100% organic traffic increase.  (Editor’s Note: Don’t graphs like this make your day? ;-))


A broad core algorithm update strategy should be improved and reinforced for every SEO project. Brands and organizations tend to forget the strong effects of a broad core algorithm update after two or three months. In this case, an SEO should remind every member of the client's team how important it is, and how to create trust, quality, and activity signals for the search engine.

12.3 Effects of July 2021 Broad Core Algorithm Update of Google on Encazip.com and Competitors

Google finished rolling out the Broad Core Algorithm Update on the 13th of July. Another spam update, focused on affiliate links, took place on the 27th of July. Encazip.com tripled its organic traffic after the July 2021 Broad Core Algorithm Update; in other words, the consistent signals from the search engine became more obvious and strongly reflected. Below, you will see the change graphics for Encazip.com, Akillitarife.com, and Gazelektrik.com during the July 2021 Broad Core Algorithm Update.

Encazip.com’s organic performance change after the 2021 July Broad Core Algorithm Update:

Source: SEMRush

The same change can be seen from Ahrefs too.


Akillitarife.com's organic performance change during Google's July 2021 Broad Core Algorithm Update, with its negative impact, can be seen below.


Gazelektrik.com’s change can be seen below.


Organic performance change graphic of Gazelektrik.com during the July 2021 Broad Core Algorithm Update, from Ahrefs. The latest state of Encazip.com's organic performance, with positive changes thanks to the reliability of the brand, can be seen below.


From 700 daily clicks to 10,000 daily clicks.

Every broad core algorithm update, trending query span, and unconfirmed, link, spam, or pagespeed-related update improved the organic performance of Encazip.com, because for every update the SEO project was improved holistically. Through two broad core algorithm updates, two spam updates, and the page experience update, along with countless unconfirmed and unannounced changes, the search engine favored Encazip.com on the SERP with higher confidence in its click satisfaction, thanks to the always-on, multi-angled SEO improvements.

12.4 Effects of November 2021 Broad Core Algorithm Update and Content Spam Update on Encazip.com and Competitors

The November 2021 Broad Core Algorithm Update, the November 2021 Content Spam Update, the following November 2021 Local Search Update, and many other changes continue to affect Encazip.com's SEO performance along with its competitors'. Since August, in these three months, lots of things have changed at Encazip.com. In this chapter, these SEO-related changes and the effects of the ongoing search engine updates will be discussed.

Below, Encazip.com’s SEO Performance change during and after the November 2021 Broad Core Algorithm Update can be seen from SEMRush.


The Ahrefs organic search performance graphic for Encazip.com can be seen below.


Encazip.com was affected positively by the November 2021 Broad Core Algorithm Update and the Content Spam Update, along with the Local Search Update. The website reached its maximum query count and organic search performance.

The changes that have been done during this timeline can be found below.

Website Migration to ReactJS and NextJS

During the last 3 months, Encazip.com has performed two different types of site migrations. A site migration can be done in four different ways.

  1. Site migration without URL change.
  2. Site migration with URL change.
  3. Site migration with a framework or back-end structure change.
  4. Site migration with a design change.

The migration to ReactJS and NextJS represents a framework and back-end structure change. During the site migration, I determined basic terms and rules for the development and project management teams.

  1. Do not change the website structure, design, or URL tree before a core algorithm update.
  2. Do not change the content, design, and framework at the same time.
  3. Perform the migration during a “non-trending” season without risks.
  4. Be sure that image, text, and link elements are visible on the web page even if the JS is not rendered.
  5. Be sure that the request count, size, and request origins are fewer than before.
  6. Do not increase the size of DOM elements.
  7. Do not lose the previous improvements.
  8. Do not lose the proper Structured Data implementation.
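Rule 4 above, making sure image, text, and link elements survive without JS rendering, can be spot-checked against the raw HTML response. A minimal standard-library sketch; the required links and phrases you pass in are your own assumptions about the page, not anything Google mandates:

```python
from html.parser import HTMLParser

class LinkAndTextCollector(HTMLParser):
    """Collect anchor hrefs and visible text from raw (unrendered) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

def check_prerender_html(html: str, required_links: list, required_phrases: list) -> bool:
    """Return True if all required links and phrases are present before JS runs."""
    collector = LinkAndTextCollector()
    collector.feed(html)
    full_text = " ".join(collector.text)
    return (all(link in collector.links for link in required_links)
            and all(phrase in full_text for phrase in required_phrases))
```

Feed it the server response body (before any client-side rendering) and assert that the critical links and copy are already there.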

After the July 2021 Broad Core Algorithm Update, I gave the green light for the website migration. And since search volumes are lower during the summer, it was a safe window for a migration. During the framework migration, the mistakes and obstacles below were experienced.

  • The Virtual DOM was not used, despite the NextJS advantages.
  • The framework migration wasn't performed on time due to technical problems.
  • The request size and count were bigger than before.
  • The code-splitting gains were lost.
  • The image loading prioritization and placeholders were lost.
  • The DOM size was larger than before.

The biggest potential benefit of the NextJS and ReactJS migration is using the Virtual DOM. You can see below how fast a Virtual DOM exercise is.


And, this is from Encazip.com.


Basically, with the Virtual DOM, I was able to open 4 different web pages in 6 seconds; it was only 1 for Encazip.com. Another problem is that during the framework migration some of the technical SEO gains were lost. Thus, another technical SEO sprint was started.

Thus, you can see how these mistakes and neglects negatively affected the organic search performance.


Since the development team lost time on the framework migration, the URL property migration was rushed. Because of those errors, the Core Web Vitals of the website were negatively impacted. You can see the change in Core Web Vitals below.


Website Migration to the WWW URL Property from the Non-WWW Version

During the URL Property Migration, the most important parts of the checklist are as below.

  1. Fix all the internal links; do not use the non-"www" versions.
  2. Fix all of the URLs within the sitemap.
  3. Fix all the social media, and GMB (GBP) links with the new version.
  4. Create a new GSC Account, and submit the new URLs.
  5. Remove the sitemap from the previous GSC Account.
  6. Put at least 45 days between the Framework and URL Migration.
  7. Do not include the legacy URLs or old web server files.
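Items 1 and 2 of the checklist above can be verified with a small audit script that flags absolute URLs pointing at a non-canonical host. A minimal sketch; the canonical host below is a hypothetical placeholder:

```python
from urllib.parse import urlsplit

CANONICAL_HOST = "www.example.com"  # hypothetical canonical "www" host

def non_canonical_urls(urls, canonical_host=CANONICAL_HOST):
    """Return absolute URLs whose host does not match the canonical host.

    Relative URLs have no host, so they are skipped.
    """
    bad = []
    for url in urls:
        host = urlsplit(url).netloc
        if host and host != canonical_host:
            bad.append(url)
    return bad
```

Run it over the internal links extracted from your templates and over every `<loc>` in the sitemap; the returned list is what still needs fixing.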

During the URL property migration, the internal links were cleaned up efficiently, but legacy URLs were resurrected with 404 status codes. In other words, old, deleted URLs were put back into the internal links, and Google started to crawl, and even index, some of those old URLs.

[Image: Resurrected legacy URLs in Google Search Console]

Some of those can be seen above. Because of these errors, Google couldn’t determine the canonical version of the website properly, and it continued to keep both versions indexed. Below, you will find the “www” URL Property performance report.

[Image: “www” URL Property performance report]

As you see, there is a sudden increase, followed by a gradual one. Due to the problematic framework and URL changes, Google couldn’t find the canonical URLs or canonicalize the proper URL Property. Thus, about 25% of the website's traffic came from the non-“www” version for nearly 2 months.

[Image: Traffic taken by the non-“www” URL Property]

To fix these types of problems, a “301” redirect might not be good enough, because Google crawl hits might use the “previous content” to update only some resources on the web page. If there are URLs with little traffic but some external links, it can be harder for Google to see the canonical version: since the traffic is low, Google might not update the content with new crawl hits, and since there are external links, the canonicalization might favor the legacy URL Property.

Even after 3 months, some of the old URLs were still being used.

[Image: Old URLs still in use after 3 months]

And, we see that Google has indexed the old URL Property’s “sitemap” file. This might happen because of the deindexation of other URLs: Google might choose a “left-out” URL for the index since the others have disappeared. It still shows that there is a canonicalization problem for the website.

Some URLs were indexed with both versions, despite the 301 redirects. For these types of situations, good practice is to use additional URL hints in the HTTP response headers. In a response header, the canonical version of a content’s URL can be given as below.

Link: <http://www.example.com/>; rel="canonical"

By declaring the canonical in the HTTP headers, you can send an extra signal to Googlebot about the content’s actual new place. Adding multiple, consistent, and meaningfully aligned signals is always beneficial for alerting a search engine’s different algorithms to the same thing. During the URL Property Migration, external references from Digital PR, and even from news outlets and YouTube channels, were used for further canonicalization.

[Image: External references used to support canonicalization]

One more change during the URL Property Migration was starting to use a CDN subdomain for the images.

After a point, Google started to drop the “non-www” version from the SERPs. In this context, the external references, mentions, social media references, and external links from digital PR acted as a protective layer for the branding and canonicalization signals over time.

Due to the URL Property change for the image resources, the Google Image Search results were also affected.

If you check both of the URLs below, you can see the “deindexation” and new “crawl pattern creation” needed by the search engine.


And, if you check below, you will see that Google still tries to find the old resource URL. 

[Image: Google still requesting the old resource URL]

The effect of the image deindexation can be seen below.

[Image: Effect of the image deindexation]

As a representative query, “Jeotermal Enerji” used to take thousands of impressions from image search, while now it takes just a few. And, there are many queries in the same situation.

[Image: Image search performance for the “Jeotermal Enerji” query]

The “Gauss” query, for example.

[Image: Image search performance for the “Gauss” query]

Thus, when the URL Property changes, the old image URLs should be redirected to the new image URLs. If the URL extension would need to change for image optimization, try to optimize the image format without a URL change instead.
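Generating such an image redirect map can be sketched as below. The hostnames are hypothetical stand-ins, not Encazip.com's real ones; the point is that the path, file name, and extension stay unchanged so that image rankings can transfer:

```python
from urllib.parse import urlparse

CDN_HOST = "cdn.example.com"  # hypothetical CDN subdomain

def image_redirect_map(old_urls: list[str]) -> dict[str, str]:
    """Map each old image URL to its CDN counterpart, keeping the path
    (and therefore the file name and extension) unchanged."""
    return {url: f"https://{CDN_HOST}{urlparse(url).path}" for url in old_urls}

if __name__ == "__main__":
    mapping = image_redirect_map(["https://example.com/images/jeotermal-enerji.jpg"])
    for old, new in mapping.items():
        print(f"301: {old} -> {new}")
```

The resulting mapping can then be exported into whatever redirect configuration the web server uses.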

After all of these differences and new improvements, I can demonstrate some of the positive changes.

New Content Sprints and URL Count Restrictions

Most of these chaotic SEO errors happened because the client changed team members. When team members change, the SEO education given to them is essentially lost. Thus, in SEO consultancy, keeping the SEO culture alive within a company is a must. In this context, the URL Count Restriction is a new rule I introduced for the client, because “press releases” were being added to the website by creating tens of new URLs.

These new URLs didn’t provide any information about the brand, and they diluted the ranking signals and the PageRank distribution by making the website bigger. Thus, I told the client not to add or change a URL without the agency’s approval. Despite these chaotic things, five things were still good for the website.

  • Content Sprints with Semantic SEO continued to be completed.
  • New Content Sprints were launched, especially for new industries and localized search behaviors.
  • The brand reputation improved with the new digital PR and press releases.
  • The trending searches and events in Turkey fed the website’s authority for the best-ranking web pages.
  • The social media activity, signals, and subscriber count increased along with the engagement rate.

Lastly, most of those problems did not happen long before the Broad Core Algorithm Update. In other words, they couldn’t change the website’s evaluation, because they hadn’t created enough historical data. In this context, the Content Spam, Local Search, and Broad Core Algorithm Updates during November were highly positive, and most of those errors have been fixed, or are planned to be fixed, before future updates.

[Image: Encazip.com’s latest SEO performance graphic from Ahrefs]

Encazip.com’s latest SEO performance graphic from Ahrefs can be seen above.

[Image: Last 6 months’ growth comparison for Encazip.com]

The last 6 months’ growth comparison for Encazip.com can be seen above.

In the next sections, you can check the competitors’ changes for the November 2021 Broad Core Algorithm Update, the Content Spam Update, and the Local Search Update.

Competitor Akillitarife.com continued to lose traffic, based on Ahrefs data.

[Image: Akillitarife.com organic traffic from Ahrefs]

Akillitarife.com’s SEMRush report can be seen below.

[Image: Akillitarife.com SEMrush report]

Elektriksepeti’s continued traffic loss, based on Ahrefs data, can be seen below.

[Image: Elektriksepeti organic traffic from Ahrefs]

The Elektriksepeti SEMRush report can be seen below.

[Image: Elektriksepeti SEMrush report]

During the November 2021 Broad Core Algorithm Update, the effect and prominence of Holistic SEO can be seen even better. Taking support from every SEO practice helps secure success despite every kind of organizational problem or unexpected individual error.

Effects of May 25, 2022, Broad Core Algorithm Update

The May 25, 2022 Google Broad Core Algorithm Update heavily affected finance, insurance, credit, news, and affiliate marketing websites. The May 25, 2022 Broad Core Algorithm Update is perceived as an affiliate marketing inhibitor, since it decreases the click distance between the product purchase and the SERP. The same “directly actionable” SERP design impacted the aggregators for different types of services such as finance, insurance, and credit. Encazip.com was slightly negatively impacted by the May 25, 2022 Broad Core Algorithm Update, since it is an aggregator and had lost most of its technical SEO improvements during the website migration.

You can see the SEMrush organic search performance graphic for Encazip.com below.

[Image: SEMrush organic search performance graphic for Encazip.com]

The marked point demonstrates that the website started to lose queries and, correspondingly, organic traffic. Below, you can see the Ahrefs organic search performance graphic for Encazip.com.

[Image: Ahrefs organic search performance graphic for Encazip.com]

In a connected and correlated way, Ahrefs demonstrates the lost query count and organic traffic after the 25 May 2022 Broad Core Algorithm Update. In the next sections, you can compare Encazip.com’s situation with its competitors.

Gazelektrik lost heavier traffic than Encazip.com during the May 2022 BCAU (Broad Core Algorithm Update).

[Image: Gazelektrik organic traffic during the May 2022 BCAU]

Sepas gained slightly more traffic, but it didn’t last. After the June 2022 Product Review Update, it started to lose traffic again. These types of short-term BCAU effects might signal that the Micro Core Updates, and later Product Review-like updates, reverse the prioritization of the web source.

[Image: Sepas organic traffic after the June 2022 Product Review Update]

Enerjiatlasi continued to lose traffic with the May 2022 BCAU, as the other competitors did.

[Image: Enerjiatlasi organic traffic during the May 2022 BCAU]

Below, you can check CKBogazici, a service provider in the electricity industry, for the May 2022 BCAU.

[Image: CKBogazici organic search performance during the May 2022 BCAU]

It increased its traffic slightly. The performance graph of EPDK, a government website for the electricity industry, is below.

[Image: EPDK organic search performance graph]

EPDK (an official government website) increased its traffic with a big jump, along with its query count.

Conclusions for the 2022 May BCAU of Google for Holistic SEO Case Study

The summary of the conclusions for 2022 May BCAU of Google for Encazip.com is below.

  • Encazip.com lost its technical SEO improvements during the website migration.
  • The indexed URLs were changed over a long period of time.
  • Google spent nearly 3 months removing the non-“www” version of the website from Google.com.
  • The re-indexing and 301 evaluation caused the search engine to decrease its confidence in the indexed URLs: their URL IDs were different, they were new in the index, and “processing the text” and comparing the redirection source to the redirection target take a longer time. This is why most site migrations show positive effects only in the long term, once the search engine is ready to believe the migration, and why the “Change of Address” tool of GSC works with a 6-month waiting timeline while the 301 redirections are kept alive.
  • The May 2022 BCAU of Google targeted affiliate marketers without expertise while giving more weight to the actual service and product providers.
  • Encazip.com and its competitors lost organic search traffic together.
  • The May 2022 BCAU affected websites according to their classification by type, rather than by individual quality scores or authority.
  • It means that even if you had improved your website, the improvements would only keep the traffic neutral, or decrease the level of traffic loss.
  • The May 2022 BCAU of Google was more related to the query-web source distance than to a web source quality comparison.
  • The results of the May 2022 BCAU show a connection with the Helpful Content Update, because they enforce the “who you are, and why you should rank for the query” connection more strictly.

Effects of Expanding the Brand Identity and Service Area – Connectedness of Topics, and Propagation of Expertise from a Topic to Another

In addition to the conclusions above, Encazip.com has expanded its identity and services further. This is important because a brand might heavily hinder its trustworthiness if it doesn’t provide quality, unique service, information, and conceptual expertise for all the verticals in which it exists. Encazip.com focused on and proved its authority mainly in the electricity industry; now, the project also covers Credit, Insurance, and even Natural Gas. Thus, the newly published content makes a search engine examine the questions below.

  • Did this website rank for these queries before?
  • Is this the first time that this web page appears for this query?
  • Did web sources define this concept before?
  • Did web sources explain new services and products within their brand identity?
  • What is the valid brand-related reason for businesses to dive into this vertical?
  • What are the historical data, and user behaviors that overlap between the new and old query networks?
  • Are these new topics related to each other?

During this time, due to heavy investment pressure on the company, the articles and content for the insurance, credit, and other market verticals were not optimized to fully serve users. They were created mainly for publication frequency, indexation, historical data, and a higher chance of brand relevance for the industry, not for competitive ranking. This loss in brand identity and in connectedness between the topics might have affected its situation in the May 2022 Broad Core Algorithm results, and later.

Effects of September 2022 Broad Core Algorithm Update, Helpful Content Update, and Fifth Product Review Update

Encazip.com and its competitors’ overall situation for the 2022 September Broad Core Algorithm Update, Helpful Content Update, and Fifth Product Review Update is explained in the context of the “Importance of Pixels, Milliseconds, Bytes, Letters and Users for SEO” case study and research.

Encazip.com has become the main winner of the 2022 September Broad Core Algorithm Update.

You can see the results of the SEMrush organic search performance increase graphic for 2022 September for Encazip.com, below.

400 newly gained top-3 queries and 600 new queries ranking between positions 4 and 10, along with 5,000 new organic queries in total, helped return Encazip.com to the top of the industry again.

[Image: SEMrush organic search performance increase for Encazip.com, September 2022]

The 3 months’ comparison year over year shows that the organic growth is over 650%. A quick summary of the organic traffic increase for the September 2022 Broad Core Algorithm Update is below.

  • a 670.88% organic click increase, which equals more than 600,000 additional organic clicks,
  • a 765.26% organic impression increase, which equals more than 13,500,000 additional organic impressions,
  • a 25.89% average position improvement, which equals more than 2.9 positions gained
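The year-over-year percentages above follow the usual relative-change formula. The sketch below reproduces the click figure; the baseline value is a hypothetical round number chosen only to make the arithmetic visible, not Encazip.com's real click count:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase from `old` to `new`: (new - old) / old * 100."""
    return (new - old) / old * 100

# With a hypothetical baseline of 100,000 clicks, a 670.88% increase
# corresponds to 770,880 clicks in the comparison period.
print(round(pct_increase(100_000, 770_880), 2))  # → 670.88
```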

The last 28 days’ organic search performance comparison, year over year, is below.

[Image: Last 28 days’ YoY organic search performance comparison]

The SEMrush view of Encazip.com three weeks later is below.

[Image: Encazip.com from SEMrush, three weeks later]

The organic click increase here is over 174.99%, and the organic impression increase is over 245.66%.

Achieving this difference, dominating the niche while introducing the brand to the new industries and fixing the technical SEO tasks as before, required tight publishing and intensive semantic SEO research and effort. The competitors’ results for the September 2022 BCAU are below.

Sepas.com.tr started to lose its traffic slightly.

Enerjiatlasi regained a positive ranking state by increasing its visibility slightly.

[Image: Enerjiatlasi visibility after the September 2022 BCAU]

EPDK continued its positive ranking state with new gains during the September 2022 Broad Core Algorithm Update.

[Image: EPDK gains during the September 2022 BCAU]

Zorluenerji started to lose traffic, reversing its results from the May 2022 BCAU of Google.

[Image: Zorluenerji traffic after the September 2022 BCAU]

CKBogazicienstitu started to lose its traffic after the September 2022 BCAU, as the effects of the May 2022 BCAU of Google were reversed.

[Image: CKBogazicienstitu traffic after the September 2022 BCAU]

Topical Consolidation for the New Sectors, and Initial Ranking

Topical consolidation is covered in the updated version of the Entity-oriented Search SEO Case Study. Topical consolidation was used for both projects, Encazip.com and BKMKitap.com. It represents increasing the topical relevance of a web source by using more central knowledge-domain terms, together with semantically related entities, in the important information retrieval zones. It helps save web sources from irrelevant queries while increasing their rankings for relevant queries. The algorithmic hierarchy works with the output of one algorithm as the input of another.

In other words, if a web source ranks for queries that are irrelevant to its identity, those irrelevant queries might make the web source less relevant to its core topic. For example, if a web source mainly ranks for “electric consumption” and “electric price fluctuations”, a query like “electric vibes” represents a social context rather than an economic or scientific one. But due to some common phrases, and the phrase lists used for index construction, web sources might still rank for these types of queries. And these irrelevant queries might dilute the overall relevance of the web source for the topic.
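The idea can be illustrated with a toy filter. This is not how a search engine computes relevance; it is a hypothetical sketch (the vocabulary and the function name are my own) of how a query like “electric vibes” can be flagged for review even though it shares the word “electric” with the core queries:

```python
# Hypothetical domain vocabulary for an electricity-market web source.
DOMAIN_VOCAB = {
    "electric", "electricity", "consumption", "price", "prices",
    "fluctuations", "tariff", "comparison", "energy", "natural", "gas",
}

def flag_off_topic(queries: list[str]) -> list[str]:
    """Flag queries containing any term outside the domain vocabulary,
    even when they share a head term such as 'electric' with core queries."""
    return [q for q in queries if not set(q.lower().split()) <= DOMAIN_VOCAB]
```

For instance, out of `["electric consumption", "electric price fluctuations", "electric vibes"]`, only `"electric vibes"` is flagged, because “vibes” falls outside the domain vocabulary.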

Encazip.com started its publications for the Credit and Insurance industries from zero. And a new technical SEO sprint was started to fix the existing errors, along with the design problems and the layout component order problems, for macro context and search intent matching.

The initial rankings for the Insurance sector for Encazip.com are below.

[Image: Initial rankings for the Insurance sector]

The Insurance subfolder of Encazip.com with the increased organic traffic from SEMrush is below.

[Image: Insurance subfolder organic traffic from SEMrush]

The Google Search Console graphic showing the initial rankings for the Insurance industry is below. It relates to the topics of the “How Google Ranks” SEO case study.

[Image: Google Search Console initial rankings for the Insurance industry]

The Credit Industry subfolder’s organic search performance graphic for Encazip.com from Ahrefs is below.

[Image: Credit subfolder organic search performance from Ahrefs]

The same subfolder’s organic search performance from SEMrush for the same time period is below.

[Image: Credit subfolder organic search performance from SEMrush]

The credit industry organic search graphic for Encazip.com is below.

[Image: Credit industry organic search graphic for Encazip.com]

Both industries now generate 10% of the daily clicks for the entire web source. It means that the website has started to unite different topics under its base identity, which helps further rankings.

Summary of Conclusions for 2022 September BCAU Effects of Google

The summary of conclusions for the 2022 September BCAU Effects of Google is below.

  • The 2022 September Broad Core Algorithm Update of Google reversed some effects of the May 2022 Broad Core Algorithm Update among the aggregators, affiliates, and actual service and product owners.
  • The sources with higher brand authority, popularity, and trust signals ranked higher even without a solid historical background. Despite being new to insurance and credit, Encazip.com outranked many other sources thanks to the depth of the information it provided and its high level of trust signals.
  • Web sources started to be judged by the helpfulness of their content, information literacy, quality, and content advisories.
  • Search engines started to diversify the search results further with different types of SERP tests, such as multi-answer featured snippets, “things to know”, and “popular products” SERP features. Encazip.com started to rank for more featured snippets than its competitors by dominating a good deal of the Information Extraction-focused SERP features.
  • The publication frequency was kept higher than the competitors’, and it continued to help gain further queries continuously.
  • The brand mentions from digital PR, and publicizing the CEO of the company through high-authority sources, helped even further.
  • The authoritative brands increased their organic traffic while losing traffic from certain queries. The service providers couldn’t rank better for the definitional queries, while they ranked higher for their brand names, including comparative queries for certain regions. It signals that brand search demand might bring a source closer to its own brand-related topics, while not leveraging its rankability for non-brand-related queries to the same degree.
  • The ranking state and rankability of web sources continue in the same direction, negatively or positively, until they get another ranking confidence from a core update. Thus, Encazip.com continues to rank higher thanks to its historical data and continuous quality increase. The September 2022 Broad Core Algorithm Update helped Encazip.com gain further queries and rankings thanks to a long-term positive ranking state and a constant quality increase, which requires staying an active source.

Google’s quality and SERP-serving tests, announcements, and updates are explained in more detail in the “Entity-oriented Search SEO Case Study”.

Notes for Helpful Content Update and Product Review Update

The Helpful Content Update and the Fifth Product Review Update try to understand first-person experience and expertise in different topics, in order to rank real-world authorities higher in the search results. The Helpful Content Update focuses on information responsiveness, to understand whether the information on the website is responsive to the query needs or not. The Fifth Product Review Update tries to understand whether a product review is responsive to a possible product seeker with a certain need. Both updates bracketed the 2022 September Broad Core Algorithm Update.

For a fast launch, Encazip.com focused on publication frequency rather than the informativeness of individual articles. This is an advantage because processing the text takes a search engine a longer time. The headings of the articles and the context vectors are optimized for higher relevance and responsiveness signals, but the actual information inside the articles is comprehensive and good only for the root pages, not for the individual definitional concept pages.

Thus, in the coming months, the organic search gains might slowly change. To prevent this situation, and to support the gained rankings, a content revision and configuration campaign will be started. In this case, the historical data, the positive ranking state, and the search engine’s trust will be racing against the search engine’s text processing capabilities. To understand the effect of historical data and search engine trust, you can watch the related video SEO case study.

Encazip.com serves product reviews too, reviewing brands, banks, and electricity distributors, which relates to the Product Review Updates. These reviews need to be strengthened further. At the moment, they are above the thresholds; but, as I stated in the Quality Thresholds SEO Case Study, the understanding of quality depends on the competitors’ quality.

Last Thoughts on Holistic SEO and Case Studies

Most SEO case studies focus on just one perspective or angle. Holistic SEO tutorials, guidelines, and case studies focus on every vertical of search engine optimization, including image, video, web documents, PDFs, GIFs, user experience optimization, web page loading performance optimization, and local and technical SEO, along with semantic SEO. The growth of the website that is the subject of this Holistic SEO case study was announced by Koray Tugberk GUBUR as below.

[Image: Announcement of the subject website’s growth]

Ahrefs shows over 400,000 organic clicks for the source; the actual organic traffic and growth for the Holistic SEO case study are higher.


SEMrush shows half a million organic clicks per month for the Holistic SEO case study.

Updated Version of the Holistic SEO Case Study with Organic Growth

If a project focuses on only one vertical of Search or SEO, the scope of its findings will naturally be limited. SEO is even being impacted by weather changes, search behavior changes, and the agenda of countries—so take a big-picture view but pay attention to every important detail!

A search engine might not always evaluate different data dimensions and their relationship to each other as it should. That’s why trying to understand the search engine’s perspective and rules/guidelines is important, whilst keeping a strategic eye across all aspects of the SEO Project.
