Search algorithms. How to avoid being penalized

Author: dinara
21.01.2023, 10:27

Search algorithms are the basis of an SEO specialist's work. You have developed a website and launched it on the Internet, but there are no visitors, or there are too few of them. Or perhaps your site has existed on the Web for a long time, and you notice a sharp drop in traffic. What could be the reason?

What is SERP (search engine result page)?

Imagine: you come to a cafe and ask the waiter: “Do you have ice cream?” The waiter replies: “Yes, we have cream, chocolate, strawberry, banana, pistachio, kiwi, and raspberry. What kind of ice cream do you want?” Here you are in the place of an Internet user, and the waiter plays the role of a search engine. “Ice cream” is your query, and the waiter’s reply is the search results, or SERP.

Search results are what a user receives from a search engine after entering a key query into the search bar.

A few of the most popular search engines in the world are Google, Yahoo, Bing, and Yandex. In this article we will talk mostly about Google and Yandex, as the most popular ones in the CIS countries. The user writes a query in the search bar, for example, “top rank tracker for Google”, and receives relevant results: a list of different rank tracker services.

The TOP-10 is the first page with the 10 websites that answer the user’s query as fully as possible. Every webmaster engaged in promotion strives to get their website onto the first page of search results (hereinafter referred to as the SERP), since it guarantees the maximum number of visits.

SEO promotion is optimizing websites so that they take the highest possible positions in the SERP. If you do not see your website there, it may still be in the search engine’s database, or it may appear in the results for other queries.

How the search engine works

A search engine consists of several interrelated elements, each of which performs its functions.

  • Crawlers (also called spiders, robots, bots, or agents) are programs that follow links and send web documents to the database. Some spiders study websites, others images or videos; there are also fast bots that instantly collect fresh information.
  • A database (index) is an ordered library of copies of web documents, consisting of URLs, the date the agent visited, server response codes, and HTML code.
  • Search algorithms — programs that form the SERP.

We have presented the work of a search engine in a simplified form; the real mechanisms are far more complex and also involve filtering, the fight against spam, SERP features, assessors, and other systems.
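
To make the crawler-and-index picture concrete, here is a minimal Python sketch of how a spider might follow links and store copies of documents. The seed URL, the link regex, and the data layout are illustrative assumptions only; real search engines use distributed storage, robots.txt handling, deduplication, and far more.

```python
# Toy crawler: follows links breadth-first and stores copies of pages
# in an in-memory "index" keyed by URL.
import re
from collections import deque
from datetime import datetime, timezone

import requests

LINK_RE = re.compile(r'href="(https?://[^"#]+)"')

def crawl(seed_url: str, max_pages: int = 10) -> dict:
    index = {}                      # url -> stored copy of the document
    queue = deque([seed_url])
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        index[url] = {
            "visited": datetime.now(timezone.utc).isoformat(),  # date the agent visited
            "status": resp.status_code,                         # server response code
            "html": resp.text,                                  # copy of the HTML
        }
        queue.extend(LINK_RE.findall(resp.text))   # follow outgoing links
    return index

if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=3)
    print(f"Indexed {len(pages)} documents")
```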

What is a search algorithm?

A search engine algorithm is a set of parameters that determines the quality of a snippet or web page and its relevance to a user’s query and search intent.

When a popular query arrives at the search engine, the results are in many cases returned to the user from a cache. When you enter a low-frequency query, the algorithm generates a unique response.
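
A hedged illustration of that caching idea: frequent queries can be answered from a stored result, while a query seen for the first time triggers the full (expensive) ranking step. The ranking function here is a placeholder, not Google's.

```python
# Toy query cache: repeated (popular) queries are served from memory,
# new (low-frequency) queries fall through to the ranking step.
from functools import lru_cache

def rank_documents(query: str) -> list[str]:
    # Stand-in for the expensive part: index lookup, scoring, ranking.
    return [f"https://example.com/result-for-{query.replace(' ', '-')}"]

@lru_cache(maxsize=1024)
def serp_for(query: str) -> tuple[str, ...]:
    return tuple(rank_documents(query))

print(serp_for("top rank tracker for google"))   # computed once...
print(serp_for("top rank tracker for google"))   # ...then answered from the cache
```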

Search engines change their algorithms hundreds of times each year, but only major updates noticeably affect the results. These are often niche updates or updates designed to combat one particular problem, such as E-A-T (for YMYL topics) or the Google October 2022 Spam Update (anti-spam).

How to get into organic SERP

  • The website should be indexed.
  • Your page matches a specific search term.
  • The page is better than the competitor’s.
  • The text is structured (headings, subheadings, lists).
  • Meta tags and keywords are correctly filled in (a quick on-page check is sketched after this list).
  • Visitors should get acquainted with the web resource in detail, spending a sufficient amount of time on it.
  • The page is referenced by reputable donor websites.
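
As promised above, a small on-page check: it fetches a page and reports whether a title, meta description, and headings are present. The URL and the specific checks are assumptions for illustration; a real audit looks at far more signals.

```python
# Quick on-page check: does the page expose the basic SEO elements
# (title, meta description, heading structure)?
import requests
from bs4 import BeautifulSoup

def check_page(url: str) -> dict:
    html = requests.get(url, timeout=5).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.find("title")
    description = soup.find("meta", attrs={"name": "description"})
    headings = soup.find_all(["h1", "h2", "h3"])
    return {
        "has_title": title is not None and bool(title.get_text(strip=True)),
        "has_meta_description": description is not None
            and bool(description.get("content", "").strip()),
        "has_h1": any(h.name == "h1" for h in headings),
        "heading_count": len(headings),
    }

if __name__ == "__main__":
    for check, result in check_page("https://example.com").items():
        print(f"{check}: {result}")
```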

What are search algorithm updates?

An update is the process of refreshing the search engine's database and mechanisms. Website positions change with every update, so update tracking is part of SEO.

Several times a year Google makes significant adjustments to its mechanisms. In May 2022, the Google SERP was badly shaken by an update called “Core update 2022”. Western SEO experts called it one of the most serious and damaging in recent years, and some webmasters reported traffic drops of up to 80%.

Google is a smart search engine that learns on its own. When its employees say “think about the user”, they do not wish webmasters harm; they genuinely do not know what the result will be after the system is updated. Of course, they know which recommendations they “fed” to their artificial intelligence, but predicting in advance which websites will rise or fall in the SERP is like pointing a finger at the sky. Still, don’t be too scared.

  • SEO work cannot stop. It moves endlessly in the direction of common sense, convenience, and usefulness.
  • Good SEO always pays off and brings traffic steadily.

You need to know about updates, always keep an eye on your website’s positions, and if you notice a drop, tighten the screws and improve your SEO, whether with the help of materials on the Internet or empirically.

Major search algorithm updates (Google)

Panda

Panda is one of the first Google algorithms aimed at optimizing the quality of text content.

According to this algorithm, content should be relevant, high quality, original, not duplicated across the pages of the site, precisely on the site’s topic, and well structured. In short, the entire algorithm is designed to improve the quality of content in the search index.
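
One generic way to spot duplicated content across a site's pages, in the spirit of what Panda penalizes, is to compare word shingles with Jaccard similarity. This is a textbook technique sketch, not Google's implementation; the sample texts are invented.

```python
# Duplicate-content check: compare two texts by the overlap of their
# word shingles (n-grams). High Jaccard similarity suggests duplication.
def shingles(text: str, n: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "panda is one of the first google algorithms aimed at content quality"
page_b = "panda is one of the first google algorithms aimed at quality of text content"

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"Shingle similarity: {similarity:.2f}")   # values near 1.0 suggest duplicates
```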

Penguin

Penguin was the first update aimed at combating promotion to the top of the search results through manipulative link building. Thanks to this algorithm, it is now necessary to choose linking resources more carefully: links should be relevant to the subject and come from quality donor sites.

Hummingbird

Hummingbird is a Google search algorithm that uses the concept of semantic search and is able to determine the “meaning” of a search query in context while returning the most relevant results. The search engine thus understands queries better, giving results not with an exact phrase match, but with an exact thematic match. This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms, and synonyms.
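
A rough way to feel the difference between exact phrase matching and thematic matching is to count a query term as covered when the document contains either the term or one of its synonyms. This toy sketch is far simpler than latent semantic indexing; the synonym map and example strings are invented for illustration.

```python
# Toy "thematic match": a query term is covered if the document contains
# the term itself or a synonym, so an exact-phrase miss can still match.
SYNONYMS = {
    "buy": {"order", "purchase"},
    "cheap": {"affordable", "inexpensive"},
}

def coverage(query: str, document: str, use_synonyms: bool = True) -> float:
    """Fraction of query terms covered by the document."""
    q_terms = query.lower().split()
    d_terms = set(document.lower().split())
    covered = 0
    for term in q_terms:
        alternatives = {term} | (SYNONYMS.get(term, set()) if use_synonyms else set())
        if alternatives & d_terms:
            covered += 1
    return covered / len(q_terms) if q_terms else 0.0

query = "buy cheap running shoes"
document = "order affordable running shoes online today"

print(coverage(query, document, use_synonyms=False))  # 0.5: literal match only
print(coverage(query, document, use_synonyms=True))   # 1.0: thematic match via synonyms
```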

RankBrain

RankBrain is a machine learning system that allows Google to better decipher and understand the meaning of user requests and provide more relevant results depending on the context. The point is that artificial intelligence recognizes not just the different grammatical forms of one word, but also synonyms and even the semantic meaning of a word.

The search engine is now able to distinguish the semantics of words thanks to the Knowledge Graph database, which contains information about the possible meanings of words and the various semantic relationships between them. Thus, artificial intelligence links even queries that are not directly related and produces the most appropriate results.

E-A-T

The E-A-T algorithm aims to fight for quality, accurate, and meaningful content. The abbreviation E-A-T stands for “Expertise, Authoritativeness, Trustworthiness”. According to this algorithm, specialized content for YMYL articles should be created only by specialists with sufficient expertise in the field. YMYL (Your Money or Your Life) is aimed at protecting users in matters that concern their health and money; it applies above all to legal, medical, financial, and security topics.

Google Passage Ranking

Using a special algorithm called SMITH (Siamese Multi-depth Transformer-based Hierarchical), Google is now able to understand the meaning of individual passages of text on a page and thus provide the most relevant results. Even a small text fragment on an otherwise unfocused page can rank quite high if it precisely answers the user’s query.
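
A hedged sketch of the passage idea: split a page into fragments, score each fragment against the query separately, and let the best fragment represent the page. The scoring here is simple word overlap, nothing like SMITH, and the sample page text is invented.

```python
# Passage-level scoring: even if the page as a whole is unfocused,
# a single passage that answers the query can carry the ranking.
import string

def tokens(text: str) -> set[str]:
    return {w.strip(string.punctuation) for w in text.lower().split()}

def score(query: str, text: str) -> float:
    q, t = tokens(query), tokens(text)
    return len(q & t) / len(q) if q else 0.0

def best_passage(query: str, page_text: str, passage_len: int = 15) -> tuple[float, str]:
    words = page_text.split()
    passages = [" ".join(words[i:i + passage_len])
                for i in range(0, len(words), passage_len)]
    return max(((score(query, p), p) for p in passages), default=(0.0, ""))

page = (
    "Our company history goes back thirty years and we value our partners. "
    "To check if a site is under a filter, compare its traffic before and "
    "after the update and watch the positions of its key queries. "
    "We also sell branded merchandise and office supplies."
)
print(best_passage("check if site is under a filter", page))  # the middle passage wins
```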

Core Web Vitals

Core Web Vitals (CWV) is a component of the Google Page Experience search algorithm. The purpose of this filter is to single out the resources that users find comfortable to interact with, and it is based on the real experience of site visitors. The components of the algorithm are metrics of loading speed, interactivity, and layout stability. A number of studies have shown that sites that take longer than 4 seconds to load fall lower in the rankings.
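
Real Core Web Vitals (largest contentful paint, interactivity, layout shift) are measured in the browser from real user data, so a server-side script cannot reproduce them. The sketch below only times the raw HTML download and flags the 4-second threshold mentioned above as a rough proxy; the URL is an assumption.

```python
# Rough load-time probe: times the HTML download and flags pages slower
# than 4 seconds. This is NOT a Core Web Vitals measurement.
import time

import requests

def fetch_time(url: str) -> float:
    start = time.perf_counter()
    requests.get(url, timeout=10)
    return time.perf_counter() - start

if __name__ == "__main__":
    url = "https://example.com"
    seconds = fetch_time(url)
    verdict = "slow (over 4 s)" if seconds > 4 else "ok"
    print(f"{url}: {seconds:.2f} s -> {verdict}")
```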

Are filters as scary as SEOs paint them?

Filters are a negative reaction to violations of search engine rules, especially of the algorithms we wrote about earlier. They are a kind of sanction: they prevent you from rising high in the TOP and interfere with promotion. Previously, a filter was considered an “acute illness” that required immediate treatment; now filters are more like chronic diseases. The symptoms of these “diseases” are blurred, filters are difficult to identify, and they are also not easy to get rid of. For example, a few years ago, treating a “patient” with an excessive incoming link mass simply required removing those links; now the whole marketing strategy has to be revised. Therefore, a website should be promoted competently, so that colossal efforts are not needed later to get rid of filters.

Types of filters

  • Manual (imposed with the help of human assessors).
  • Algorithmic (the search engine detects violations on its own and imposes a filter on the website).

Filters are imposed for text spam and low-quality content, purchased incoming links, black-hat SEO, manipulation of behavioral factors, the lack of a high-quality mobile version of the web project, slow page loading, broken functionality, and so on.

How to identify a filter

A clear indicator of sanctions is a lack of traffic and a drop in search positions. You will learn about SERP problems quickly if you regularly monitor positions in the SEOGUN service: its notification system and regular reports let you spot suspicious changes instantly.
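
If you also track visits yourself, a very simple way to flag a suspicious drop is to compare the latest value against a recent average. This sketch is generic and has nothing to do with SEOGUN's internal mechanisms; the 40% threshold and the sample numbers are illustrative assumptions.

```python
# Naive drop detector: flag a day whose traffic falls well below
# the average of the previous week.
def traffic_drop(daily_visits: list[int], window: int = 7, threshold: float = 0.4) -> bool:
    if len(daily_visits) <= window:
        return False
    baseline = sum(daily_visits[-window - 1:-1]) / window
    return daily_visits[-1] < baseline * (1 - threshold)

visits = [980, 1010, 995, 1020, 990, 1005, 1000, 430]   # sharp drop on the last day
print(traffic_drop(visits))   # True -> time to check positions and recent updates
```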
