SEO is an essential part of developing any website. Without proper implementation of SEO practices, the website and its content will not be findable on search engines. And if a website does not show up in search results, what was the point of developing it?
If you want the website to be seen and used by the target audience, then you should be considering SEO from the very start of the project. Web developers often view SEO as an item that the digital marketing team will work on after development has been completed – but there are a few key SEO considerations web developers should keep in mind as they develop the website.
Here are the top six SEO considerations for web developers and web development teams to keep in mind DURING the website development process.
1. Mobile First Web Development
This should come as no surprise: Google prioritizes mobile-first indexing. This means Googlebot scans the mobile version of a website before the desktop version when indexing for search. Furthermore, Google uses the mobile version of a website, rather than the desktop version, in its search ranking calculations. Prioritizing mobile-first development best practices will help the website rank once it has gone live.
Mobile First Web Development Best Practices
- Make sure Google can access and render your content
- Use the same meta tags on mobile and desktop.
- Don’t lazy-load primary content on user interaction.
- Googlebot does not click, scroll, or tap, so content that loads only after a user interaction will not be seen or indexed.
- Make sure content is the same on mobile and desktop
- If the mobile site has less content than your desktop site, consider updating your mobile site so that the primary content is the same as your desktop content.
- Use the same headings, make them meaningful & clear.
- Use the same meta titles & descriptions on mobile and desktop.
- Structured data
- Make sure your mobile and desktop sites have the same structured data.
- Use correct URLs in structured data.
- If the URLs change from desktop to mobile, use your mobile URLs.
- Check visual content
- Use high quality images but in small file sizes.
- Use supported image formats.
- Mobile images should have the same alt and title attributes as they do on desktop.
- Place videos high up on the page.
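To make the lazy-loading point above concrete, here is a minimal HTML sketch (the image paths and content are hypothetical): the browser's native `loading="lazy"` attribute is crawler-friendly, while content gated behind a click handler is not.

```html
<!-- Crawler-friendly: the image is in the markup, so Googlebot can discover it. -->
<img src="/images/hero.jpg" alt="Product hero shot" width="800" height="450" loading="lazy">

<!-- Risky: this content only exists after a user clicks, so Googlebot never sees it. -->
<button onclick="document.getElementById('more').innerHTML = '<p>Extra details</p>'">
  Show more
</button>
<div id="more"></div>
```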
2. Website Speed and Page Speeds
Site speed and page speed are two top Google ranking factors. Google especially prioritizes mobile site speed, as Google indexes mobile websites first.
Page speed is not only crucial for user experience but also affects a crawler’s ability to navigate your site. Slow page speed means that search engines can crawl fewer pages, which negatively affects how many of your pages get indexed.
Page speed has a large impact on user experience. The longer pages take to load the higher the bounce rate and the lower the average time on page will be. Long load times have also been shown to negatively affect conversions. Conversion rates drop about 4% for each additional second of page load time over 5 seconds. 70% of users polled said that slow page load times affect their willingness to complete a purchase on a website.
Site/Page Speed Key Metrics
- First Contentful Paint (FCP):
- When the browser renders the first bit of content.
- Goal: 1.2 seconds or less
- Font load time is particularly important for FCP; ensure text remains visible during webfont loads.
- Time To Interactive (TTI):
- The amount of time it takes for the page to become fully interactive.
- Goal: 2.2 seconds or less
- Speed Index:
- How quickly the content of the page is visibly populated.
- Goal: 3.4 seconds or less
- Largest Contentful Paint (LCP):
- Measures perceived load speed.
- It marks the point in the page load when the page’s main content (usually the largest image or text block) has finished loading.
- Goal: 2.5 seconds or less
- Contributing factors: server response time, resource load times, and client-side rendering.
- Total Blocking Time (TBT):
- The total amount of time that a page is blocked from responding to user input such as mouse clicks and keyboard presses.
- Goal: 200 milliseconds or less
- Cumulative Layout Shift (CLS):
- Measures the visual stability of a page; quantifies how often elements move around the layout unexpectedly.
- Goal: a score of 0.1 or less (CLS is a unitless score, not a time).
- You can improve CLS by including size attributes on image and video elements and by animating transitions in a way that provides continuity.
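The CLS advice above maps to very small markup changes; here is an illustrative sketch (file names are hypothetical). Explicit width and height attributes let the browser reserve space before the image loads, so the surrounding content does not shift.

```html
<!-- Good: space is reserved at layout time, so nothing jumps when the image arrives. -->
<img src="/images/banner.jpg" alt="Spring sale banner" width="1200" height="400">

<!-- Bad: without dimensions, the image pushes content down when it loads, hurting CLS. -->
<img src="/images/banner.jpg" alt="Spring sale banner">
```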
Site/Page Speed Best Practices
- File Compression
- Configure the web server to serve smaller files via gzip compression and improve page load times.
- Minification
- Remove any unnecessary characters from the source code to reduce file size.
- Reduce Redirects
- Redirects add an extra step while loading a page, and redirect chains add several extra steps, all of which increase page load times.
- Prioritize Critical Rendering
- Prioritize scripts that are necessary for the initial page render; once critical elements are rendered, other elements can begin to load.
- Optimize Images
- Keep image sizes small, and ensure they are in the right format and compressed for the web.
- Leverage Browser Caching
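As one possible implementation of the compression and caching points above, here is an nginx configuration sketch (the directive values are illustrative assumptions, not one-size-fits-all recommendations):

```nginx
# Enable gzip compression for text-based assets.
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;   # skip tiny files where compression gains little

# Tell browsers to cache static assets, reducing repeat-visit load times.
location ~* \.(css|js|jpg|png|webp|svg)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```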
3. Website Structure
Website structure is an important area to consider when optimizing for SEO. Structure is how both users and crawlers understand your website and its content: its categorization, hierarchy, and logical flow of information. These items are crucial to plan out during the development process in order to avoid major structural problems after the site launches.
Page URLs are more than just the address you use to reach a page: they establish your website structure. A clear site structure helps both users and search engine crawlers understand your website from a high level.
In particular, it shows how users travel around the website and how the content is organized hierarchically. As Google moves toward automation, a well-crafted URL should give both humans and search engines an easy-to-understand indication of what to expect at the destination page. Additionally, URLs carry some weight as a ranking factor: Google uses URLs to determine a page’s relevance to a search query.
Planning out URL structure during the development process creates clarity to the website structure, and helps to avoid URL changes after website launch which may create 404 errors or require additional 301 redirects.
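For instance, a URL hierarchy planned during development might look something like this (the domain and paths are purely illustrative):

```
example.com/                        → home page
example.com/services/               → category page
example.com/services/web-design/    → sub-page within a clear category
example.com/blog/                   → category page
example.com/blog/mobile-first-seo/  → individual post with a self-describing slug
```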
Website navigation should be a clear path of links to access the website’s pages, from how URLs are broken out into categories to how those URLs are linked into pages and menus. Navigation should be intuitive: neither users nor crawlers should have to work hard to find what they are looking for.
Common Website Navigation Mistakes
- Having a mobile navigation that shows different results than your desktop.
- Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler.
- Forgetting to link to a primary page on your website through your navigation.
- This is called orphaning your pages.
4. Crawling, Indexing & Robots.txt File
A search bot’s ability to crawl a website is the first step to being indexed and subsequently ranking, and bots get their crawling rules from the robots.txt file.
If you are not seeing a website show up in search results this could be due to several crawler related issues:
- Your site is brand new and hasn’t been crawled yet
- Your site isn’t linked to from any external sites
- Your site’s navigation makes it hard for a robot to crawl it effectively
- Your site contains crawler directives (found in the robots.txt file) that are blocking search engines
- Your site has been penalized by Google for spammy tactics (looking at you, keyword stuffers)
- Your site content is hidden behind a login
Crawling vs. Indexing
Crawling and indexing are the fundamental steps that take place while a robot is processing a website. It is important to know what these two processes are and what the difference is between them.
Crawling is when search engines scour the Internet for content. They are looking over the code/content for each URL they find. Indexing is when search engines store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
Robots.txt files are located in the root directory of a website (e.g. yourdomain.com/robots.txt). The file tells crawlers which parts of your site search engines should and shouldn’t crawl, as well as the speed at which they should crawl it.
Robots.txt Must-Knows
- In order to be found, a robots.txt file must be placed in a website’s top-level directory. The filename is case sensitive: it must be named “robots.txt”, not “Robots.txt”, “robots.TXT”, or any other variation.
- The /robots.txt file is publicly available. You can add “/robots.txt” to the end of any root domain to see a website’s directives.
- Each subdomain on a root domain uses separate robots.txt files. This means that both blog.example.com and example.com should have their own robots.txt files.
- It’s best practice to indicate the location of any sitemaps associated with the domain at the bottom of the robots.txt file.
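Putting the points above together, a minimal robots.txt might look like this (the disallowed paths and sitemap URL are hypothetical):

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /

# Best practice: point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```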
Web development teams should be aware of the different crawl errors and how to resolve them. These are 400- and 500-level HTTP status errors, which make pages entirely inaccessible to site visitors.
- 400-level errors mean that the content cannot be found or it is gone altogether.
- 500-level errors indicate an issue with the server.
5. Website Redirects
Redirects include permanent redirects (301s), temporary redirects (302s and 307s), and redirect chains. Setting up proper and appropriate redirects is essential, as they impact both user experience and crawlability.
Types Of Redirects
There are three main types of redirects: permanent redirects, temporary redirects, and redirect chains. They all have different uses.
- Permanent Redirects: 301 redirects tell both searchers and search engines that your page has moved permanently. This option is best for SEO.
- Temporary Redirects: 302 and 307 redirects divert users from one URL to another temporarily. These types of redirects do not pass much “link equity” between pages.
- Redirect Chains: a redirect from one page to another that redirects to yet another page, and so on. Every hop in the chain adds load time on the user’s side, and Google decreases the “link equity” passed with each redirect. When a chain is too long, Google won’t even attempt to reach the final page.
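One practical way to eliminate chains is to flatten a redirect map before deploying it, so every old URL redirects to its final destination in a single hop. Here is a minimal sketch of that idea (the function name and example URLs are hypothetical, not part of any standard API):

```javascript
// Given a map of redirects (old URL -> new URL), resolve each entry to its
// final destination so every old URL redirects in one hop instead of a chain.
function flattenRedirects(redirects) {
  const resolved = {};
  for (const source of Object.keys(redirects)) {
    let target = redirects[source];
    const seen = new Set([source]);
    // Follow the chain until we reach a URL that is not itself redirected.
    // The `seen` set guards against accidental redirect loops.
    while (target in redirects && !seen.has(target)) {
      seen.add(target);
      target = redirects[target];
    }
    resolved[source] = target;
  }
  return resolved;
}

const chain = {
  "/old-page": "/newer-page",
  "/newer-page": "/final-page",
};
console.log(flattenRedirects(chain));
// → { '/old-page': '/final-page', '/newer-page': '/final-page' }
```

Running a pass like this over your redirect rules whenever a URL changes keeps users and crawlers one hop away from the live page.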
Using 301 Redirects To Avoid 404 Errors
301 redirects are the optimal choice for SEO. A 301 passes the authority of the previous URL on to the new page, which helps Google find and index the new version; without the redirect, that authority is lost. Additionally, the redirect ensures users find the page they’re looking for, or something similar to it.
The presence of 404 errors on your site does not directly harm search performance, but it does hurt the user experience. Visitors who click “dead” links land on error pages instead of the content they were looking for, which can be frustrating.
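A 301 from an old URL to its replacement can be set up at the server level. Here is an nginx sketch (the paths are hypothetical):

```nginx
# Permanently redirect a moved page to its new URL in a single hop.
location = /old-services-page {
    return 301 /services/web-design/;
}
```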
6. On-Page SEO
On-page or on-site SEO is the practice of optimizing web page content for both search engines & users. These are the types of things that can improve search volume and rankings.
On-page SEO is key to gaining and improving SERP rankings and visibility. On-page SEO tells search engines about the content on your page and the value it provides to visitors.
Creating and publishing content is not enough. You must add value by optimizing for search engines and human experience. On-page SEO helps search engines find the most relevant search results for a query.
On-page SEO includes items such as:
- Including page meta descriptions and titles
- Proper title tag markup on links
- Proper alt and title tag markup of images
- Use of H structure
- Use of internal links
- Clear URL structure
- Length and quality of content
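The checklist above maps directly to markup; here is a hedged HTML sketch showing where each item lives (the page content, URLs, and file names are hypothetical):

```html
<head>
  <!-- Meta title and description: what searchers see in the SERP snippet -->
  <title>Mobile-First Web Design Services | Example Co.</title>
  <meta name="description" content="Example Co. builds fast, mobile-first websites that rank.">
</head>
<body>
  <!-- One clear H1, with meaningful H2s beneath it -->
  <h1>Mobile-First Web Design</h1>
  <h2>Why Mobile-First Matters</h2>

  <!-- Descriptive alt text on images -->
  <img src="/images/responsive-layout.png" alt="Responsive layout on phone and desktop"
       width="600" height="400">

  <!-- Internal link with a descriptive title attribute and a clear URL -->
  <p>Read more in our <a href="/blog/mobile-first-seo/" title="Mobile-first SEO guide">mobile-first SEO guide</a>.</p>
</body>
```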
So … Why Should Web Developers Care About SEO?
Simply put, just because you built a website does not mean people will see it, or that it will simply show up in relevant search results. Take the time to work with an SEO specialist early in the website development process. Working with a specialist, or following the guidelines we worked through here, will ensure each page on your site has proper SEO structure and markup.
If you are responsible for building a website, you are also responsible for making sure it has the ability to rank. Many ranking factors come into play early in the development process, such as site navigation, mobile development, and URL structure. If you are in the habit of looping in an SEO specialist at the end of the project, or solely assigning SEO tasks after the site is built, you are creating problems that do not need to exist.
Furthermore, site maintenance involves ongoing SEO practices such as handling crawl errors and redirects. Understanding these best practices and how to apply them without harming site visibility is an essential part of working with SEO as a website developer.