Technical elements at both the website and page level can affect search engine rankings. Some of these elements are typically handled by a website administrator or developer, while others live at the page level and may be handled by content authors.
Search engines work by using crawlers to read all the content on a website. That content is then indexed so it can be served to users on a search engine results page, or SERP. The sections below describe the technical elements of a website that enable these processes.
A sitemap should be well structured and include all the important pages on a website; it conveys the hierarchy and organization of the site's content. The sitemap can be submitted through tools such as Google Search Console to help search engines crawl the site more efficiently.
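For example, a minimal XML sitemap in the sitemaps.org format lists each important page along with optional details such as the last modified date (the URLs and dates below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.edu/</loc>
        <lastmod>2024-05-02</lastmod>
      </url>
      <url>
        <loc>https://www.example.edu/admissions/</loc>
        <lastmod>2024-04-15</lastmod>
      </url>
    </urlset>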
A robots.txt file tells search engines which pages they can and cannot crawl, which allows site owners to keep specific pages out of a crawl. It is important to make sure the robots.txt file is not blocking search engines from content that should be crawled and indexed.
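As an illustration, a simple robots.txt file might allow crawling of most of the site while blocking a few sections and pointing crawlers to the sitemap (the /search/ and /drafts/ paths and the sitemap URL are placeholders):

    User-agent: *
    Disallow: /search/
    Disallow: /drafts/
    Allow: /

    Sitemap: https://www.example.edu/sitemap.xml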
Website health is an indication of the overall quality of a website and its content. Search engines can weigh many factors when evaluating website health; a few key factors are explored below.
Many, if not most, users search using mobile devices. For this reason, it is vital that websites be mobile-friendly if they are to rank well in SERPs. Websites should look current and function well on mobile devices.
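A basic prerequisite for mobile-friendliness is a responsive layout. For example, responsive pages normally include a viewport meta tag in the page head so the browser scales content to the device width:

    <meta name="viewport" content="width=device-width, initial-scale=1">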
A slow page load will frustrate users, so search engines take page load speed into account for their page rankings. Optimizing code, images and caching mechanisms can speed up load times and improve ranking.
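Some of these optimizations can be expressed directly in the page markup. As one example, lazy-loading images that sit below the fold and declaring their dimensions (the path and sizes here are placeholders) lets the browser skip unnecessary downloads and avoid reflowing the page:

    <!-- Defer offscreen images and reserve space for them in the layout -->
    <img src="/images/campus.jpg" alt="Campus quad"
         width="800" height="450" loading="lazy">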
Broken links lead to errors and frustration for users, and websites with high rates of broken links can be penalized by search engines. Website administrators should run scans regularly to identify and fix broken links. When changes to a website will break existing links, set up redirects so the user experience is preserved.
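How redirects are configured depends on the web server or CMS. As one sketch, on an Apache server a permanent (301) redirect for a moved page can be added to the site's .htaccess file (the paths shown are placeholders):

    # Send visitors and search engines from the old URL to the new one
    Redirect 301 /old-program-page https://www.example.edu/programs/new-page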
Search engines such as Google prioritize secure websites. Savvy users have also learned to look for HTTPS as an indication that a website is trustworthy.
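Serving every page over HTTPS usually also means redirecting plain HTTP requests. One common approach on an Apache server, shown here as a sketch, uses mod_rewrite rules in .htaccess:

    # Redirect all HTTP requests to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]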
Structured data refers to the way content is presented to search engines with labels they can understand, and schema markup is the code added to a website to implement it. Schema.org offers many different schemas depending on the type of content: website administrators choose the schema most appropriate for the webpage and implement it with code snippets, and search engines can then read the markup. Structured data can lead to rich results on the SERP and can improve the crawling and indexing of the website's content. It is important to use accurate markup, focus it on the most relevant information on a page and test it with tools from Google or other search engines.
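For example, an events page could describe itself with schema.org Event markup embedded in the page as JSON-LD (the event details below are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Event",
      "name": "Spring Open House",
      "startDate": "2024-05-18T10:00",
      "location": {
        "@type": "Place",
        "name": "Main Campus",
        "address": "123 University Ave, Example City"
      }
    }
    </script>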
Google's Core Web Vitals are a specific set of metrics related to user experience on a webpage, and they are used as part of its search ranking algorithm. The data is pulled from real, anonymized users over a 28-day period.
The three vitals focus on three key areas of user experience: loading performance, interactivity and visual stability.
Website administrators can use these three metrics to gauge website performance and to identify areas for optimization and improvement. To be considered a “good” page, all three Core Web Vitals need to meet their thresholds for at least 75% of visits over the 28-day period.
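The current metrics are Largest Contentful Paint (loading), Interaction to Next Paint (interactivity) and Cumulative Layout Shift (visual stability), with “good” thresholds of roughly 2.5 seconds, 200 milliseconds and 0.1 respectively. One way to observe these values from real visits is Google's open-source web-vitals JavaScript library; the sketch below assumes the web-vitals package is installed from npm and simply logs each metric to the console:

    import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

    // Log each Core Web Vital as it becomes available during a real page view
    function report(metric: Metric) {
      // metric.rating is 'good', 'needs-improvement' or 'poor',
      // based on Google's published thresholds
      console.log(metric.name, metric.value, metric.rating);
    }

    onCLS(report);
    onINP(report);
    onLCP(report);

In practice the callback would usually send the values to an analytics endpoint rather than the console, so trends can be compared against the 75th-percentile targets described above.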
Google Lighthouse is a free developer tool that audits webpages for performance, accessibility, SEO and adherence to best practices, and it provides recommendations in each area. The data is based on a one-time audit of the website at a specific point in time and does not use real-world data.
Website administrators can use these scores to identify possible areas for improvement, but the scores should not be treated as representative of every visitor's experience on the website. Scores range from zero to 100, and the goal is to get the highest score possible in each category. The Semrush blog has a helpful explanatory article on how to use Google Lighthouse.
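Lighthouse is available in the Chrome DevTools Lighthouse panel and can also be run from the command line. A typical invocation, shown here with a placeholder URL and requiring Node.js, saves an HTML report for review:

    # Audit a page and save an HTML report
    npx lighthouse https://www.example.edu/ --output=html --output-path=./lighthouse-report.html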
Metadata SEO elements are found at the page level of the site tree in Content Editor.
Metadata SEO elements are found at the page level under the Page Info tab on the Details menu.
Documentation updated: May 2, 2024