Technical SEO. Why is technical SEO important?
Technical SEO is an important stage in the promotion of any website, aimed at making its functionality work correctly. Even with great content, a website risks failing in search engine rankings if it is built incorrectly from a technical point of view. What basic requirements a website should meet so that search engines favor it over similar pages and raise it in the search results, and what technical SEO should focus on, read on.
What is technical SEO optimization of a website?
Technical SEO optimization is a set of measures that improve how a website interacts with search engine algorithms.
For a website to reach high positions in the SERPs, it is not enough to have high-quality content filled with keywords and anchor links and relevant to search queries. It is also important to work on the key elements that make up technical optimization:
- website structure.
Bringing these elements to perfection will improve the quality and speed of crawling and indexing of the website's pages, improve the user experience, and increase conversions and website authority.
Why do you need technical SEO for your website?
Search engines are configured to show users the most useful content, in their estimation. There is a whole algorithm by which Yahoo, Bing, Google, Yandex, and other systems find websites relevant to a query, first crawling and evaluating web pages against certain parameters. The better the technical settings are tuned, the easier it is for the search engine to evaluate the website. The result of excellent optimization is a place in the TOP-10 of the search results.
Technical optimization is aimed at making users more comfortable on the web; its main goal is to make the website fast, reliable, and simple. Neglect, like excessive fanaticism, can be too costly. It is better to stick to the golden mean and focus on the user's needs and the search engines' criteria.
Criteria of a high-quality website
The main criteria for the technical quality of the website are:
- website loading speed;
- structure and indexing;
- optimization of the website structure;
- optimization of website content.
Next, we will analyze these criteria in more detail.
Page loading speed
Page loading speed is one of the first significant technical SEO components that affect the ranking of a website. This parameter is part of Google's recently published Core Web Vitals ranking factors. Loading speed is sometimes compared to face-to-face conversation: a delay in loading is equivalent to a long pause in a dialogue. If a page takes more than three seconds to load, users lose interest in it. For most websites, this means a clear loss of customers and a decrease in profits.
Nevertheless, this indicator remains a "sore spot" for optimizers, since an abundance of images or developer errors can lead to a critical loss of website speed. Before adjusting this indicator, it is important to understand which factors affect page loading speed. First of all, these are:
- server response speed;
- time to first byte (TTFB);
- page code processing time and content loading time;
- start of rendering (page visibility).
These indicators depend on various criteria (the cleanliness of the page code, the correctness of the website's structure and layout, the weight of the files) and directly affect ranking. The lower the TTFB, the higher the page in the Google SERP. When choosing hosting, pick the option with the best access speed to the website and the shortest downtime in case of technical problems.
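As a rough illustration, the TTFB guidance above can be expressed as a tiny classifier. The 0.8 s / 1.8 s cut-offs follow Google's commonly cited thresholds and should be treated as assumptions rather than hard limits:

```python
def classify_ttfb(seconds: float) -> str:
    """Classify Time to First Byte against commonly cited thresholds.

    The cut-offs (0.8 s / 1.8 s) follow Google's published guidance
    for TTFB; they are illustrative assumptions, not hard limits.
    """
    if seconds <= 0.8:
        return "good"
    if seconds <= 1.8:
        return "needs improvement"
    return "poor"

print(classify_ttfb(0.4))  # good
print(classify_ttfb(2.5))  # poor
```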
You can measure the website's loading speed with the free Google PageSpeed Insights tool: https://pagespeed.web.dev/
Adaptability for mobile devices
Most users value their time and prefer to access websites from any device. Therefore, the mobile version is another important criterion for holding positions at the top. If a page's interface is "lame" when loaded on a mobile device, or it is inconvenient to read, the user will most likely leave.
Special tools, such as Yandex Metrica and Google Search Console, let you calculate and compare the percentage of bounces from a page for desktop PCs and mobile devices. If there are more of the latter, this is a signal that you need to get serious about adapting the website.
But even with excellent adaptability, loading speed remains an important technical SEO indicator. On mobile it can differ significantly from the PC version and affect the ranking result. Therefore, when launching the website, it must be checked in popular mobile browsers.
Keep in mind that search results for the mobile and desktop versions of a website differ. Google separates the results, with priority given to the mobile version; its absence affects ranking.
Website security
Security is a basic technical SEO parameter. Back in 2014, Google identified the use of the SSL protocol as a ranking factor. An SSL (Secure Sockets Layer) certificate authenticates a website. It confirms that your domain has an individual key with which the information between the server and the user is encrypted, which means a secure connection is established.
Determining whether a secure protocol is used is quite simple: just look at the address bar. An SSL certificate is indicated by a domain that begins with "https://" rather than "http://" and by a padlock icon next to the URL.
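The same scheme check can be scripted when auditing a long list of pages. A minimal sketch using only the Python standard library; the example URLs are hypothetical:

```python
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Return True when the URL declares the secure https:// scheme."""
    return urlparse(url).scheme == "https"

# Hypothetical audit list: report pages still served over plain HTTP.
pages = ["https://example.com/", "http://example.com/old-page/"]
insecure = [url for url in pages if not uses_https(url)]
print(insecure)  # ['http://example.com/old-page/']
```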
Crawling and Indexing: Basic Principles and Impact on technical SEO Result
For a website to appear in search engine results, it must be correctly indexed. This means that the search engine has evaluated and remembered the page and will show it when it matches a query.
Crawling is the initial stage, when the system "sends" its robots (crawlers or spiders) to get acquainted with the website. "Running" through the pages, they read the data and index it. After that, the search engine analyzes the information, determines the keywords for each page, and saves the data in the search index. Each search engine indexes according to its own principles.
For example, Google indexes the mobile version of a website, and its index is constantly updated. The quality and relevance of pages is determined by more than 200 ranking factors; the highest-quality pages are selected and appear in the results for a query. Low-quality pages are lowered in the ranking but are not removed from the index.
Website architecture matters a great deal for the efficiency of crawling and indexing. The deeper the hierarchy and the farther pages are from the home page, the harder it is for the system to find them. A large number of internal links makes it even more difficult for search engines.
Therefore, it is better to stick to a flat structure, which is more convenient for both robots and users. In this case, the "three clicks" rule holds: the desired page can be reached from the main page in three clicks.
Another point that helps search engines in indexing is properly configured internal links. They should lead from the homepage to the most relevant pages. Robots detect this connection faster, and nothing is lost during scanning.
To tell search robots about the pages and their priority, and to open or close certain pages to crawling, you need to add an XML sitemap and a robots.txt file.
XML sitemap
This tool contains the list of pages by which search engines crawl and index the website. For robots, the map is created in XML format and includes the links the system will rely on when indexing and take into account when ranking. The system can also read the map by posts, tags, images, and the date of the last change.
And although the XML sitemap is less important for SEO optimization than the mobile version of the website, it can contain important information for robots, such as:
- the date the page was last modified;
- the frequency of page updates;
- the priority of the website's pages.
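A minimal sitemap carrying exactly these three optional fields can be generated with the Python standard library. The URLs, dates, and priorities below are placeholders, not real data:

```python
import xml.etree.ElementTree as ET

# Hypothetical page data: (URL, last-modified date, change frequency, priority).
PAGES = [
    ("https://example.com/", "2024-01-15", "weekly", "1.0"),
    ("https://example.com/blog/", "2024-01-10", "daily", "0.8"),
]

def build_sitemap(pages) -> str:
    """Build a sitemap.xml document string per the sitemaps.org 0.9 schema."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(PAGES)
```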
You can check the correctness of the XML sitemap using Google Search Console for webmasters, which shows how the search engine sees the website when crawling it. Sometimes the indexing report shows that Google Search Console cannot fully visualize the content because full crawling is for some reason not possible.
In this case, the SEOs launch a “frog” on the website — the Screaming Frog program that finds and shows almost all common problems or errors, for example, broken links. This allows SEOs to quickly make corrections or schedule more extensive work, such as correcting meta descriptions.
Indexing a website is often not as fast as the optimizer would like. The speed of the process is influenced by various factors, from the number of pages to the crawl budget. And while the SEO cannot influence the process itself, it is quite possible to build a crawling strategy for the search robot.
Robots.txt is a file in which the SEO specifies various information, for example, instructions for search robots, or disallows crawling of certain pages. This helps the system not waste time on unnecessary operations.
The mandatory information specified in robots.txt includes the address of the sitemap, so that search robots can find it faster. You can also check the correctness of this file using Google Search Console.
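Python's standard library can parse robots.txt rules, which is handy for checking a file before deploying it. The rules and URLs below are a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block the cart, allow everything else,
# and point crawlers at the sitemap.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/"))          # True
print(parser.can_fetch("*", "https://example.com/cart/checkout"))  # False
print(parser.site_maps())  # sitemap URLs declared in the file
```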
Optimization of the website structure
This criterion also gets attention during a technical SEO audit of a website. In addition to the sitemap for search engines, you need to add further attributes of the website's structure that are understandable not only to robots but also to users. The structure shows the hierarchy of pages and their relationships. A well-thought-out structure lets the user easily navigate between pages and quickly find content, and helps search engines understand pages better and index them more fully. In addition, a competent structure makes it easy to add new elements.
For SEO, the right structure matters from several angles:
- influence on the behavioral reactions of users (time spent on the website and depth of page viewing);
- simplicity and accuracy of indexing;
- better interaction of search engines with the website and its understanding;
- elimination of errors that affect ranking;
- additional links in the search results.
To competently build a structure, you need to understand what you should pay attention to first of all.
Breadcrumbs
This tool shows the navigation path from the homepage to the page the user is currently viewing. It lets you collect information about the most frequently visited pages and the paths to them, in order to add internal links. This helps users and search engine crawlers find the pages they need faster.
“Breadcrumbs” help not only to improve usability and make the layout of pages more understandable for users, but also participate in linking and distribution of reference weight.
In order for the crumbs to work correctly, it is necessary to comply with certain requirements.
- All intermediate pages should have breadcrumbs.
- The current page, reflected last in the navigation path, should not contain a link to itself.
- For the entire chain to appear on the search results page, you need breadcrumb markup, which is done using HTML tags. They give search engines information about the type of content.
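One common way to mark up the chain is JSON-LD using the schema.org BreadcrumbList type. A minimal sketch for a hypothetical three-level path; note that, per the rule above, the last item carries no link to itself:

```python
import json

# Hypothetical breadcrumb trail in schema.org JSON-LD.
breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Blog",
         "item": "https://example.com/blog/"},
        # The current page: name only, no link to itself.
        {"@type": "ListItem", "position": 3, "name": "Technical SEO"},
    ],
}

# Embed in the page as a structured-data script block.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(breadcrumbs)
```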
URL structure. Proper Link Building
The URL is always displayed in the search result, and for the search engine this indicator is important for quickly evaluating the content of a page. A URL looks more attractive and clickable when it does not consist of an abstract set of numbers and letters but has descriptive categories that make it convenient to find the desired content.
Despite the fact that search engines are able to recognize any URLs, it is better not to be lazy and make them correctly, according to all optimization criteria.
- Keywords in the URL help improve rankings.
- It is better not to use a hash in the URL, since such page addresses are most often not indexed.
- To separate words, use hyphens — this will tell the search engine that these are separate words. Such addresses are better perceived and ranked.
- Short URLs are preferable because the system determines their content faster and more accurately.
- The clickability of a URL depends on its accessibility to users. They are much more likely to click on links when they can read where they will lead.
- Build URLs only from lowercase letters so as not to create confusion for search engines.
Thus, short and understandable addresses will help improve rankings and the website's position in the search results.
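A simple slug generator following these rules (lowercase, hyphens as separators, no special characters) might look like the sketch below; it is illustrative only and does not handle transliteration of non-Latin titles:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Why Is Technical SEO Important?"))  # why-is-technical-seo-important
```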
Internal linking is the linking of pages to each other by creating hyperlinks to related pages of the optimized website. With the help of linking, the weight from the pages is distributed throughout the website. This technique allows you to raise the resource in the ranking for low-frequency queries, as well as increase the number of transitions within the website.
Internal linking is extremely useful for optimization, as it not only allows users to move quickly and conveniently within the resource, but also helps robots recognize the most relevant pages.
Set up a redirect
When changing a website's domain after rebranding, many owners face a drop in rankings and a huge loss of traffic. This is due to an incorrectly configured redirect (automatic forwarding of a visitor from one address to another). With a properly configured redirect, a user who lands on a page with the old address is simply forwarded automatically to the new URL, most often without even noticing it. The reference weight of the original page (the donor) is preserved and transferred to the acceptor page. Redirection happens without violating search engine rules or harming the user. Thus, a redirect takes users from pages that are no longer relevant to the active ones.
Redirect is used in the following cases:
- when redirecting from http to https;
- when switching from a www address to one without www;
- when moving the website to another domain in order to exclude the loss of traffic;
- if you need to redirect visitors to another page;
- when redirecting to the mobile version of the website.
In each case, the redirect has its own HTTP status code, by which robots and search engines understand what kind of redirect is required. For example, a 301 redirect indicates that a page has been permanently moved to a new address. In this case, the browser quickly takes the user to the current page.
With a 302 redirect, visitors are temporarily redirected to new pages. Online stores do this, sending potential buyers to pages with a relevant product. Be extremely careful with this code: the reference weight of the original page is not transferred to the acceptor, and search engines perceive the pages as duplicates. As a result, one of the pages is excluded from the search results.
303 and 307 redirects are also temporary and redirect traffic to a new page that partially duplicates the requested one.
You can check the correctness of the redirect in the browser, after clearing the cache or using special services.
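The chain-following logic such checkers rely on can be sketched with a toy redirect map standing in for real server configuration; all URLs and status codes below are a made-up example:

```python
# Toy redirect map: old URL -> (HTTP status code, target URL).
REDIRECTS = {
    "http://example.com/old": (301, "https://example.com/old"),
    "https://example.com/old": (301, "https://example.com/new"),
}

def resolve(url: str, max_hops: int = 5):
    """Follow the redirect chain and return (final_url, number_of_hops).

    Long chains waste crawl budget, so a hop limit guards against loops.
    """
    hops = 0
    while url in REDIRECTS and hops < max_hops:
        status, url = REDIRECTS[url]
        hops += 1
    return url, hops

final, hops = resolve("http://example.com/old")
print(final, hops)  # https://example.com/new 2
```

A two-hop chain like this (http to https, then old to new) is a common finding; collapsing it into a single 301 is usually the fix.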
Multilingual website versions
It is highly important to learn how to make multilingual website versions when building a technical SEO strategy. To improve the user experience and the ability to rank in different regions, you must set up different language versions. That is, for the same content translated into different languages on the website, you need to indicate the region and language it belongs to. This makes it easier for search engines to understand a multilingual architecture. To show the user content relevant to their language, use hreflang markup, that is, the link attribute rel="alternate" hreflang="x".
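In practice this means emitting one alternate link tag per language version in the page's head. A small generator; the language codes and URLs are hypothetical:

```python
# Hypothetical language/region versions of the same page.
# "x-default" marks the fallback for users matching no listed language.
VERSIONS = {
    "en": "https://example.com/en/",
    "de": "https://example.com/de/",
    "x-default": "https://example.com/",
}

def hreflang_tags(versions) -> str:
    """Render the <link rel="alternate" hreflang="..."> tags for <head>."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{href}" />'
        for lang, href in versions.items()
    )

tags = hreflang_tags(VERSIONS)
print(tags)
```

Each language version should list all the others (including itself), so in a real build this generator would run once per page variant.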
Optimization of website content
It is necessary to technically optimize both the website as a whole and individual pages and parts of the content. Content quality is not a defining indicator for SEO in terms of a technical audit, but it is of great importance for users. No matter how well the resource is configured, if its content does not interest users, it will lose conversions to more interesting competitors. Therefore, this indicator deserves attention.
When creating content, you need to adhere to certain criteria.
- Visual appeal. The user decides whether he wants to engage with the content in the first 5 seconds, and within this time he should be hooked. The text should be divided into blocks, and images should sit along the Z-shaped trajectory of the scanning eye.
- Usefulness. Content should carry as much information relevant to the query as possible.
- Uniqueness. With a large number of similar texts, it is important to find something to hook the user with.
- Volume. Texts that are too large, not separated by subheadings, look unreadable and most likely will not keep the user on the page.
In addition, the content should be understandable for any user, which is primarily worth focusing on.
Meta tags
This tool helps structure data about a page so that search robots quickly recognize the subject of the content. Meta tags should be understandable primarily to the search engine, so that it can easily determine the page's priority for the user's query.
The main tags the system uses most often are Title, Description, and the H1 heading. The search engine compares them with key queries, and the website's position in the results depends on how correctly they are written. Tag optimization is about making them quickly recognizable by search engines and quick to edit.
We have already highlighted Breadcrumbs above, as they are important structured data for determining the structure of the website. But the markup of structured data can be much more diverse.
There is no direct correlation between the presence of micromarkup and a page's position in search engines, but it is still an important factor that gets users to visit the promoted resource: when you use Schema.org, Open Graph, or JSON-LD markup, separately or together, rich snippets appear.
Rich snippets are a visual representation of the promoted page on the SERP. They attract more attention and increase click-through rates, and therefore the revenues of SEOs and website owners.
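As a sketch of how such markup is assembled, here is a generator for Open Graph meta tags (og:title, og:type, og:url, and og:image are the protocol's four required properties); the page data is invented for illustration:

```python
# Hypothetical Open Graph properties for an article page.
OG = {
    "og:title": "Why Technical SEO Matters",
    "og:type": "article",
    "og:url": "https://example.com/technical-seo/",
    "og:image": "https://example.com/images/cover.jpg",
}

# Each property becomes one <meta> tag in the page's <head>.
meta = "\n".join(
    f'<meta property="{prop}" content="{content}" />'
    for prop, content in OG.items()
)
print(meta)
```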
Micromarkup can be done using one of a variety of services, such as Google's Structured Data Markup Helper or Data Highlighter.
You can check structured data for errors using Google Structured Data Testing Tool.
You can determine how structured data works in search results using the SEOGUN search position verification service.
The service is convenient because determining their presence requires no extra actions; it is enough to do your routine work: enter keywords into the service to check positions, and on the keyword and overview pages you can automatically see the most important SERP features, determine which of them are present in the results, and see which pages of your website are ranked.
SEO optimization of images and large files
Images are an integral part of any website and also require optimization. The advantage of proper image placement and search engine optimization is not only an increase in the page's relevance and position in the results: pictures from the page also appear in Yandex and Google image search, and this will definitely attract traffic.
When optimizing, special attention is paid to the following image characteristics:
1. Image size.
Here it is better to give preference to large images: if duplicate images are detected, search engines choose the larger one. But it is important not to overdo it. Very large photos or pictures are better placed on a separate page, reached by a link from the main one, where a preview of the image sits. Small pictures should not be too small either, otherwise the search engine will take them for a design element that does not get into the results. Therefore, it is better not to use images smaller than 150 pixels.
A common mistake of inept optimization is compressing large images using styles. The image is distorted, and it still downloads slowly, which increases the overall page load time.
2. Image placement.
Where the image sits in the text determines how search engines evaluate it, since they cannot yet accurately recognize the essence of a picture without a textual hint. Placing the image close to the relevant text helps search engines evaluate its content and determine its relevance. One way to flesh out a picture is to caption it. This technique is especially relevant when there is little text on the page. The caption also requires SEO optimization, since it is by the keywords that search engines evaluate the subject of the image and its match to user queries.
3. Image attributes.
Attributes describe pictures when for some reason they cannot be displayed. For example, the alt attribute reveals the essence of an image to a visitor whose browser has image display disabled. Optimizing the alt attribute positively affects the optimization of the entire page, so do not neglect to fill in this description field.
Another attribute, title, which is displayed as a pop-up description, also helps with SEO optimization of images.
4. Friendly URL.
A small but significant nuance that positively affects optimization is matching the image file name to its key element. If the main element of the image is a tree, name the file accordingly: tree.jpg. Search engines perceive such naming better than a picture assigned a number.
The basic working rule: the lower the weight, the faster the loading. Among image formats, .jpg, .png, and .gif are considered the most optimized, offering the best combination of image size and weight.
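The size rules above can be folded into a simple audit helper. The 150 px floor comes from the text, while the 300 KB weight budget is an illustrative assumption:

```python
# Thresholds: the 150 px minimum follows the text above; the weight
# budget is an illustrative assumption, not an official limit.
MIN_SIDE_PX = 150
MAX_WEIGHT_KB = 300

def audit_image(width: int, height: int, weight_kb: int) -> list:
    """Return a list of human-readable problems found for one image."""
    problems = []
    if min(width, height) < MIN_SIDE_PX:
        problems.append("too small: may be treated as a design element")
    if weight_kb > MAX_WEIGHT_KB:
        problems.append("too heavy: will slow down page loading")
    return problems

print(audit_image(120, 900, 500))  # flags both problems
print(audit_image(800, 600, 120))  # []
```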
As for large text files, here the main point is to highlight their structuring using H1-H6 headings, bulleted and numbered lists, highlighting paragraphs. You also need to remember about meta tags that affect rankings, getting into the snippets of search engines.
Services for verification
To conduct a technical SEO audit, search engine optimizers use various tools that complement each other.
Depending on the amount of data, both paid and free services may be suitable for you.
Probably the most popular among such services is Screaming Frog SEO Spider https://www.screamingfrog.co.uk/SEO-spider/pricing/ , its free version will be enough for a small website containing up to 500 pages.
Similar in functionality is Netpeak Spider https://netpeaksoftware.com/ru/spider .
In addition, there are separate programs that allow you to evaluate and analyze a website on individual indicators; many have already been mentioned above. Note that to judge the quality of technical SEO work, it is important not only to change the website's technical parameters but also to track changes in its positions. You can monitor website positions using the SEOGUN position tracker service.