Forming the site structure
Structure is a logical diagram that shows the relative location of pages on a website and the relationships between them. The first thing you need to do is make sure it is correct. Why? Because if a site starts out with an incorrect structure, further work on it makes little sense: you will have to redo it anyway.
What a correct structure affects:
Indexing – Search robots are more adept at indexing sites that have the right structure.
Visibility – A well-designed structure, combined with keywords and meta tags, increases the site’s visibility for targeted queries.
Usability – Site navigation becomes intuitive and simple, which improves behavioral factors.
A good example of a well-formed structure
Schema of the correct site structure
SEO is something every programmer encounters sooner or later. The higher a page sits in the tree, the more visible it is to search engines. This does not mean that every page should be moved up a level (where the sections sit in the diagram), as this would only confuse the navigation.
Do not misuse redirects. For example, a user who clicks a link in “Subsection 1” should not suddenly land on a page in “Subsection 2”.
Additional navigation elements include menus, breadcrumbs, and blocks with recommended products, related materials, popular topics, and the like. All of these are internal links, and, roughly speaking, they should be looped back. What does this mean? Suppose the menu at the top of a page links to Section 1. For better indexing, there should also be a link back to Section 1, for example in the site’s logo or header. Likewise, every page on the site should link back to the main page.
Technical audit of the website
This analysis reveals weaknesses in the technical side of the site. SEO specialists usually deliver it as a report, but you can also analyze the site yourself.
Website loading speed
This is an important metric in SEO: search engines take it into account when ranking websites. Programmers should keep track of site speed. PageSpeed Insights lets you check how fast your website loads on both desktop and mobile devices.
These indicators can also be accessed through the service:
FCP – First Contentful Paint;
LCP – Largest Contentful Paint;
FID – First Input Delay (the delay before the page responds to the first interaction);
CLS – Cumulative Layout Shift (read our article to learn how to improve CLS).
Validity is the conformance of a site’s source code to the rules and standards set by the World Wide Web Consortium (W3C). Search engines process valid code more efficiently, and, all other things being equal, search robots give preference to sites with valid code. This means the site will appear higher in search results.
Validate your site with the W3C Markup Validation Service, which checks HTML and XHTML documents, and take any errors in the results into account.
Pay particular attention to the title and description tags, the H1, H2, and H3 subheadings (for example, each page should contain only one H1), and the alt attributes of images. All of these should be optimized for each page’s key queries. The SEO Meta Tags in 1 Click extension for Google Chrome is a convenient tool for quickly checking meta tags.
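As an illustration, a page optimized this way might look as follows (the page names and texts are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- One unique, keyword-bearing title per page -->
  <title>Blue Widgets – Example Shop</title>
  <!-- Description shown in search result snippets -->
  <meta name="description" content="Buy blue widgets with free delivery.">
</head>
<body>
  <!-- Exactly one H1 per page -->
  <h1>Blue Widgets</h1>
  <h2>Why choose our widgets</h2>
  <!-- Every image gets a descriptive alt attribute -->
  <img src="widget.jpg" alt="A blue widget on a white background">
</body>
</html>
```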
Write a robots.txt File
robots.txt contains the site indexing parameters for search robots. The file holds special instructions (directives) that let you control how the site is indexed: with robots.txt you tell a search robot which pages it is allowed to index.
Two directives are required in this file: User-agent and Allow or Disallow. Sitemap is optional but very common. Let’s look at them in more detail.
User-agent names the search robot the directives apply to. Examples: User-agent: Yandex, User-agent: Googlebot, and User-agent: * (applies to all bots).
Disallow prohibits indexing of individual pages or entire sections of the site. Examples: Disallow: /catalog/, Disallow: /catalog/page.html.
Allow permits indexing of pages and sections of the site. Allow applies to all pages of the site except where the Disallow directive specifies otherwise. Allow is most often used together with Disallow, when one section must be closed from indexing while another inside it must stay open.
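For instance, closing a catalog section while keeping one of its subsections open can be described like this (the paths are placeholders):

```
User-agent: *
Disallow: /catalog/
Allow: /catalog/public/
```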
Sitemap points to the sitemap file, which describes the site’s entire structure. Example: Sitemap: https://example.com/sitemap.xml (the value should be the full URL of the file).
A compiled robots.txt example for all search engines with all pages open and a sitemap.
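Assuming the site lives at example.com (a placeholder domain), such a file might look like this:

```
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```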
Make sure your sitemap is complete and that every page address starts with the protocol appropriate for your resource (most often https://). Pages that matter to search engines should return server response code 200 and be open for indexing.
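A minimal sitemap.xml sketch, again using the placeholder domain example.com:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/catalog/</loc></url>
  <url><loc>https://example.com/catalog/page.html</loc></url>
</urlset>
```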
Screaming Frog for SEO
For comprehensive site analysis, look at the Screaming Frog SEO Spider desktop program: it makes it easy to check meta tags, page canonicity, security, and much more.
Use a uniform URL format for links. This applies to trailing slashes and to section names: use transliteration or English translation consistently throughout.
Cross-browser compatibility matters, and the site should be adapted for mobile devices.
System files should be excluded from indexing.
Verify that the 404 page works correctly and is returned wherever it is needed.
External links can be made to open in a new tab by adding target="_blank", which positively affects behavioral factors.
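For example (the URL is a placeholder; rel="noopener" is added here as a common safety practice for target="_blank" links, not something the text above requires):

```html
<a href="https://example.com/article" target="_blank" rel="noopener">Read the article</a>
```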
Repair broken links promptly and fix unclosed paired HTML tags.
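Finding broken links can be partly automated. Below is a minimal sketch in Python: extract_links and check_link are hypothetical helper names, and HEAD requests are used under the assumption that the server answers them.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list[str]:
    """Returns all href values found in the given HTML fragment."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def check_link(url: str, timeout: float = 5.0) -> int:
    """Returns the HTTP status code for url, or 0 if the request failed."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # e.g. 404 for a broken link
    except URLError:
        return 0


if __name__ == "__main__":
    page = '<a href="https://example.com/">home</a> <a href="/catalog/">catalog</a>'
    print(extract_links(page))  # → ['https://example.com/', '/catalog/']
```

In a real crawl, relative links such as /catalog/ would first need to be resolved against the site’s base URL (for example with urllib.parse.urljoin) before being passed to check_link.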
SEO for programmers also includes placing capture points – retention elements that encourage users to keep interacting with the site.