Technical SEO

Technical elements at both the website and webpage level can affect search engine rankings. Some of these elements will likely be handled by a website administrator or developer; others live at the page level and may be handled by content authors.

Crawlability and Indexability

Search engines work by using crawlers to consume all the content on a website. The data is then indexed so it is available to serve up to users on a search engine results page, or SERP. Explore the sections below to learn more about the technical elements of a website that enable these processes.

Sitemaps

A sitemap should be well structured and include all the important pages within a website. It explains the hierarchy and organization of the content on the website. The sitemap can be submitted to search consoles (e.g., Google Search Console) to improve crawling efficiency.

Benefit for Sitecore users: MSU IT manages the sitemap submission for Sitecore XM Cloud users as part of the site launch process.
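For illustration, XML sitemaps follow the sitemaps.org protocol. A minimal sketch, with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per important page on the site -->
      <url>
        <loc>https://example.msu.edu/</loc>
        <lastmod>2024-05-02</lastmod>
      </url>
      <url>
        <loc>https://example.msu.edu/about</loc>
        <lastmod>2024-04-15</lastmod>
      </url>
    </urlset>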

Robots.txt

A robots.txt file provides search engines with information on which pages they can and cannot crawl, which allows website owners to prevent crawls of specific pages. It is important to make sure that the robots.txt file is not blocking a search engine from finding vital information that the unit does want crawled.

Benefit for Sitecore users: The Sitecore XM Cloud interface provides a means for preventing crawls on specific webpages. Website administrators and content authors can use this function to easily manage the robots.txt file associated with the website. 
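As an illustration, a simple robots.txt file might look like the sketch below (the staging path is hypothetical). It lets all crawlers visit everything except one directory and points them to the sitemap:

    User-agent: *
    Disallow: /staging/

    Sitemap: https://example.msu.edu/sitemap.xml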

Website Health

Website health is an indication of the overall quality of the website and the content on the website. There are many factors that a search engine can use in evaluating website health. A few key factors are explored in the sections below.

Mobile-Friendliness

Many, if not most, users search using mobile devices. For this reason, it is vital that websites be mobile-friendly if they are to be ranked in SERPs. Websites should look current and function well on mobile devices.

Page Load Speed

A slow page load will frustrate users, so search engines take page load speed into account in their rankings. Optimizing code, images and caching mechanisms can speed up load times and improve ranking.

Benefit for Sitecore users: MSU’s Sitecore XM Cloud system is edge-based. It is cached on servers around the country and served up to users from the server closest to their location. This speeds up page load times.
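As one small example of the image optimizations mentioned above, images can be compressed, sized explicitly and lazy-loaded in HTML (the file name is a placeholder):

    <!-- Explicit width and height reserve space and prevent layout shifts;
         loading="lazy" defers off-screen images until the user scrolls near them -->
    <img src="spartan-statue-800w.webp"
         width="800" height="533"
         loading="lazy"
         alt="The Spartan statue on MSU's campus">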

Broken Links

Broken links lead to errors and frustration for users, and websites with high rates of broken links can be penalized by search engines. Website administrators should run scans on a regular basis to identify and fix broken links. When changes to a website will break existing links, set up redirects so that visitors and crawlers still land on the right pages.
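When a page moves permanently, the web server or the CMS redirect manager answers requests for the old URL with a 301 response, as in this sketch (URLs are placeholders):

    HTTP/1.1 301 Moved Permanently
    Location: https://example.msu.edu/new-page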

HTTPS

Search engines such as Google prioritize secure (HTTPS) websites. Savvy users have also learned to look for HTTPS as an indication that a website is trustworthy.

Structured Data and Schema Markup

Structured data refers to the way content is presented to search engines, using labels they can understand. Schema markup is the code added to a website to implement structured data. Schema.org defines many different schemas depending on the type of content. Website administrators choose the schema most appropriate for the webpage and implement it with code snippets that search engines can then read. Structured data can lead to rich results on the SERP and can improve the crawling and indexing of website content. It is important to use accurate markup, focus it on the most relevant information on a page and test it with tools from Google or other search engines.

Benefit for Sitecore users: MSU’s Sitecore XM Cloud components were developed with structured data built in, so content authors do not need to add schema markup themselves.
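For sites outside Sitecore, schema markup is commonly added as a JSON-LD script in the page's HTML. A hedged sketch using Schema.org's Event type, with placeholder values:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Event",
      "name": "Spring Open House",
      "startDate": "2025-05-10T13:00",
      "location": {
        "@type": "Place",
        "name": "MSU Main Library"
      }
    }
    </script>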

Google Tools for Technical Improvement

Core Web Vitals

Google's Core Web Vitals are a specific set of metrics related to user experience on a webpage. They are used as part of Google's search ranking algorithm, and the data is pulled from real, anonymous users over a rolling 28-day period.

The three vitals focus on three key areas of user experience:

  • Loading speed (Largest Contentful Paint, or LCP), based on how long the main content of the page takes to load
  • Interactivity (Interaction to Next Paint, or INP), based on how responsive the page feels after a user interacts (e.g., clicks or taps)
  • Visual stability (Cumulative Layout Shift, or CLS), based on how much the layout jumps around as content elements load

Website administrators can use these three metrics as a gauge for website performance and to identify areas for optimization and improvement. To be considered “good,” a page needs to meet the threshold for all three Core Web Vitals (an LCP of 2.5 seconds or less, an INP of 200 milliseconds or less and a CLS of 0.1 or less) for at least 75% of visitors over the 28-day period.
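Administrators who want to observe these metrics from their own visitors can use Google's open-source web-vitals JavaScript library. A minimal TypeScript sketch; logging to the console stands in for sending values to a real analytics endpoint:

    import { onCLS, onINP, onLCP } from 'web-vitals';

    // Each callback fires in the visitor's browser once the metric value
    // is known. A production site would send these to analytics instead.
    function report(metric: { name: string; value: number }): void {
      console.log(`${metric.name}: ${metric.value}`);
    }

    onLCP(report); // Largest Contentful Paint (loading speed)
    onINP(report); // Interaction to Next Paint (interactivity)
    onCLS(report); // Cumulative Layout Shift (visual stability)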

Lighthouse Scores

Google Lighthouse is a free developer tool that audits a webpage in categories such as performance, accessibility, best practices and SEO, and it provides recommendations based on what it finds. The data is based on a one-time audit of the website at a specific point in time and does not use real-world user data.

Website administrators can use these scores to get a sense of possible areas for improvement, but the scores should not be considered representative of the experience all visitors have on the website. Scores range from zero to 100, and the goal is to get the highest score possible across all categories. The Semrush blog has a helpful explanatory article on how to use Google Lighthouse.
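Lighthouse can be run from the Lighthouse panel in Chrome DevTools or, for those comfortable with the command line, through its Node-based CLI. A quick sketch (the URL is a placeholder):

    # Audit a page and open the HTML report when it finishes
    npx lighthouse https://example.msu.edu --view

    # Or save the results as JSON for later comparison
    npx lighthouse https://example.msu.edu --output=json --output-path=./report.json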

SEO in MSU’s Enterprise CMS Platforms

Metadata SEO elements are found at the page level of the site tree in Content Editor.

  • Titles Section:
    • Page title — This is the H1 tag. 
  • Navigation Section:
    • Include in sitemap — This controls whether the page appears in the XML sitemap file, which search engine crawlers use to index a website.
  • Metatags Section:
    • Meta description — Text entered here may appear as the page’s snippet in search engine results.
    • Meta keywords — These are keywords that apply to the whole webpage. 
    • No index — Checking this box will prevent search engines from indexing the page.
    • No follow — Checking this box will prevent search engines from following links on the page to other pages or other websites.
    • Canonical — If another webpage has duplicative content and that page should be the one served up in search results, enter that page’s URL here.
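These fields correspond to standard elements in the rendered page. A rough sketch of the HTML they typically produce, with placeholder values:

    <head>
      <meta name="description" content="Learn how to apply to MSU.">
      <meta name="keywords" content="admissions, apply, MSU">
      <!-- Generated when the No index and No follow boxes are checked -->
      <meta name="robots" content="noindex, nofollow">
      <link rel="canonical" href="https://example.msu.edu/admissions">
    </head>
    <body>
      <!-- The Page title field renders as the H1 -->
      <h1>Admissions</h1>
    </body>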

In MSU’s other enterprise CMS platforms, metadata SEO elements are found at the page level under the Page Info tab on the Details menu.

  • Title — This is the H1 tag. 
  • Keywords — These are keywords that apply to the whole webpage.
  • Description — This is the meta description field. Text entered here will appear in search engine results.

Additional Resources

SEO Basic Principles

What is search engine optimization, why is it important and how do search engines rank content?

Content Best Practices for SEO

Consider how writing, metadata, link strategy, media optimization and page layouts can benefit SEO.

Documentation updated: May 2, 2024