Auditing Technical SEO: A Step-by-Step Guide


Written by Jeremy Earle, JD

July 1, 2022

Learn how to create a technical SEO audit for your SEO clients by reading this in-depth tutorial.

An in-depth SEO audit is a big undertaking, and I'm not going to pretend otherwise. As an SEO professional, there are few finer words to hear than, "Your audit looks great! When can we bring you on board?" Even if you aren't actively looking for new work, knowing your SEO audit was spot-on is an enormous confidence booster.

But are you afraid to start? Is this your first SEO audit? Maybe you have no idea where to begin? Delivering a high-quality SEO audit puts you in the best possible position with a prospective client, so don't rush it. Remember that your main aim is to provide value to your customer, in both the short term and the long term, with your site recommendations.

In this column, I've summarized the essential steps involved in an SEO audit and shared a look at the first phase of my workflow when I take on a new client. The sections are listed below; if you're confident you've already mastered one, feel free to skip ahead to the next.

When Is the Best Time to Conduct an SEO Audit?

When a prospective client emails to express interest in working together and completes my survey, we schedule an intro call (Skype or Google Hangouts is preferred). Before the call, I do my own quick SEO audit based on their survey responses to become familiar with their market. It's like dating someone you've never met: you're going to check out their Facebook, Twitter, and Instagram. #soIcreep

A sample of my survey is shown in the image below. During that first call with the client, be sure to ask questions like these:
  • What are your overall business goals?
  • Do you have goals for specific channels such as PR or social media?
  • Who is your target audience?
  • Do you have any business partnerships?
  • How often is the website updated?
  • Do you have a web developer or an IT department on staff?
  • Have you ever worked with an SEO consultant before, or had any SEO work done previously?

Sujan Patel also has some great recommendations on questions to ask a new SEO client.

After the call, if we both agree we're a good fit, I send over my official proposal and contract. To start, I like to offer clients a one-month trial period to make sure we work well together; we get to know each other as friends before we start dating. During that month, I conduct the in-depth SEO audit. Depending on the size of the website, these audits can take anywhere from 40 to 60 hours. I organize them into three sections and present them in Google Slides:
  • Technical: crawl errors, indexing, hosting, etc.
  • Content: keyword research, competitor analysis, content maps, metadata, etc.
  • Links: backlink profile analysis, growth tactics, etc.

If the client likes my work after that first month, we begin implementing the recommendations from the SEO audit. Going forward, I perform a mini-audit monthly and an in-depth audit quarterly. To recap, the cadence for my clients looks like this:
  • First month: in-depth audit.
  • Monthly: mini-audit.
  • Quarterly: in-depth audit.

What You Need from a Client Before an SEO Audit

When I start working with a new client, I share a Google Doc asking for a list of their passwords and vendors, including:
  • Access to Google Analytics and any third-party analytics tools.
  • Google Ads and Bing Ads.
  • Webmaster tools.
  • Backend access to the website.
  • Social media accounts.
  • A list of vendors.
  • A list of internal team members (including any work they outsource).

SEO Audit Tools

Here are the tools I use when performing an SEO audit:
  • Screaming Frog.
  • Integrity (for Mac users) and Xenu Sleuth (for PC users).
  • SEO Browser.
  • Wayback Machine.
  • Moz.
  • BuzzSumo.
  • DeepCrawl.
  • Copyscape.
  • Google Tag Manager.
  • Google Tag Manager Chrome extension.
  • Annie Cushing's campaign tagging guide.
  • Google Analytics (if given access).
  • Google Search Console (if given access).
  • Bing Webmaster Tools (if given access).
  • You Get Signal.
  • Pingdom.
  • PageSpeed Insights.
  • Sublime Text (or any text editor).

Conducting a Technical SEO Audit

Tools needed for a technical SEO audit:
  • Screaming Frog.
  • DeepCrawl.
  • Copyscape.
  • Integrity for Mac (or Xenu Sleuth for PC users).
  • Google Analytics (if given access).
  • Google Search Console (if given access).
  • Bing Webmaster Tools (if given access).

Step 1: Add Your Site to DeepCrawl and Screaming Frog

Tools:
  • DeepCrawl.
  • Copyscape.
  • Screaming Frog.
  • Google Analytics.
  • Integrity.
  • Google Tag Manager.
  • Google Analytics code.

What to Look for When Using DeepCrawl

I start by adding my client's site to DeepCrawl. Depending on the size of the site, it may take a day or two for the crawl results to come back. Once they do, here is what I check for:

Duplicate Content

Check the "Duplicate Pages" report to locate duplicate content. If duplicate content is found, I'll add the <meta name="robots" content="noindex, nofollow"> tag to the duplicate pages and urge the client to rewrite them. Common duplicate content mistakes include:
  • Duplicate meta titles and meta descriptions.
  • Duplicate body content on tag pages (I'll use Copyscape to help determine whether anything is being copied).
  • Two domains (e.g., yourwebsite.co and yourwebsite.com).
  • Subdomains (e.g., jobs.yourwebsite.com).
  • The same content republished on a separate website.
  • Improperly implemented pagination pages (see below).

What to do:
  • Add the canonical tag to your pages to let Google know your preferred URL.
  • Disallow the incorrect URLs in the robots.txt file.
  • Rewrite the duplicate text (including body copy and metadata).

Here's an example of a duplicate content problem I ran into with one of my clients: they had URL parameters without the canonical tag, as seen below. These are the steps I took to remedy the problem:
  • Fixed any 301 redirect issues.
  • Added a canonical tag to the page I wanted Google to crawl.
  • Updated the URL parameter settings in Google Search Console to exclude any parameters that do not generate unique content.
  • Added the robots.txt disallow function to the incorrect URLs to improve crawl budget.

Pagination

There are two reports to review:
  • The "First Pages" report shows which pages are using pagination. You can then manually review the pages using it on the site to check whether pagination is implemented correctly.
  • The "Unlinked Pagination Pages" report tells you whether the rel="next" and rel="prev" tags are linking to the previous and next pages, i.e., whether pagination is working properly.

Using DeepCrawl, I was able to confirm that one client had reciprocal pagination tags in place.

What to do:
  • If you have a "view all" or "load more" page, add the rel="canonical" tag; Crutchfield is a good example of this.
  • If your content is split across separate pages, add the standard rel="next" and rel="prev" markup; Macy's offers a good illustration.
  • If you're using infinite scrolling, include the equivalent paginated page URL in your JavaScript; American Eagle is one example. A minimal sketch of this pagination markup is shown below.
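To make the markup above concrete, here is a minimal, hypothetical sketch of the head tags for page 2 of a paginated category; the example.com URLs are placeholders, and which tags you actually need depends on whether a "view all" page exists:

    <!-- Hypothetical <head> markup for page 2 of a paginated series -->
    <head>
      <!-- Self-referencing canonical that tells Google the preferred URL for this page -->
      <link rel="canonical" href="https://www.example.com/widgets/?page=2">

      <!-- Tell crawlers where the previous and next pages in the series live -->
      <link rel="prev" href="https://www.example.com/widgets/">
      <link rel="next" href="https://www.example.com/widgets/?page=3">

      <!-- Only on true duplicates that should stay out of the index -->
      <!-- <meta name="robots" content="noindex, nofollow"> -->
    </head>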
Max Redirections

Review the "Max Redirections" report to see all pages that redirect more than four times. In 2015, John Mueller mentioned that Google can stop following redirects if there are more than five in a chain. Gary Illyes has referred to crawl errors eating up your "crawl budget" as host load; you want your pages to resolve correctly so the search engine uses its crawl of your server efficiently. Some of the response codes you'll encounter include:
  • 301 – These make up the majority of the codes you'll see in your research. A 301 is fine as long as there is only one redirect and no redirect loop.
  • 302 – These are OK for a short period, but if they have been in place for more than a few months, I would personally change them to 301s so they are treated as permanent.
  • 400 – Users can't get to the page at all.
  • 403 – Users are unauthorized to access the page.
  • 404 – The page is not found (usually meaning the client deleted a page without a 301 redirect); on ecommerce sites, I'll often see this when an item goes out of stock.
  • 500 – Internal server error; you'll need to connect with the web development team to determine the cause.

What to do:
  • Remove any internal links pointing to old 404 pages and update them with the internal link of the redirected page.
  • Undo redirect chains by removing the middle redirects. For example, if redirect A goes to redirect B, C, and D, you'll want to undo redirects B and C so the result is a single redirect from A to D. There is also a way to do this in Screaming Frog and Google Search Console, described below.

What to Look for When Using Screaming Frog

Screaming Frog is the second tool I use when taking on a new client site. Depending on the size of the site, I may adjust the settings to crawl specific areas at a time; you can do this in the spider settings or by excluding sections of the site. Here is what I check for once I have my Screaming Frog results:

Google Analytics Code

Screaming Frog can help you identify which pages are missing the Google Analytics code (e.g., UA-1234568-9). To find the missing code:
  • Go to Configuration in the navigation bar, then Custom.
  • Add analytics.js to Filter 1, then change the drop-down to Does Not Contain.

What to do:
  • Contact your client's developers and ask them to add the code to the pages that are missing it.
  • For more on Google Analytics, skip ahead to the Google Analytics section below.

Google Tag Manager

Screaming Frog can also tell you which pages are missing the Google Tag Manager snippet, with similar steps:
  • Go to Configuration in the navigation bar, then Custom.
  • Add iframe src="//www.googletagmanager.com/ to the Filter, with Does Not Contain selected.

What to do:
  • Go into Google Tag Manager to see whether there are any errors and fix them as needed.
  • Share the code with your client's developers to see whether they can add it back to the site.

Schema

You'll also want to check whether your client's site is using schema markup. Schema, or structured data, helps search engines understand what a page is about. To check for schema markup in Screaming Frog:
  • Go to Configuration in the navigation bar, then Custom.
  • Add itemtype="http://schema.org/ to the Filter, with Contain selected. (A short sketch of what schema markup can look like is shown below.)
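For reference, here is a minimal, hypothetical example of structured data added as JSON-LD rather than as itemtype microdata attributes; the organization name and URLs are placeholders:

    <!-- Hypothetical Organization markup using JSON-LD -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Unicorn Supply Co.",
      "url": "https://www.example.com/",
      "logo": "https://www.example.com/images/logo.png",
      "sameAs": [
        "https://www.facebook.com/example",
        "https://twitter.com/example"
      ]
    }
    </script>

If your client's site uses JSON-LD instead of microdata, search for application/ld+json in the same Screaming Frog custom filter instead of itemtype.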
Indexing

To find out how many pages of your client's website are indexed, follow these steps in Screaming Frog:
  • Once the site finishes loading in Screaming Frog, go to Directives > Filter > Index and check for missing or problem pieces of code.

What to do:
  • If the site is new, Google may simply not have indexed it yet.
  • Check the robots.txt file to make sure you're not disallowing anything you want Google to crawl.
  • Make sure you've submitted your client's sitemap to Google Search Console and Bing Webmaster Tools.
  • Conduct manual research (covered below).

Flash

Google announced in 2016 that Chrome would begin blocking Flash because of its effect on page load times, so it's important to find out whether your new client is using Flash. To check in Screaming Frog:
  • Go to the Spider Configuration in the navigation.
  • Select Check SWF.
  • After the crawl is done, filter the Internal tab by Flash.

What to do:
  • Embed videos from YouTube instead (Google bought YouTube back in 2006, so it's a safe bet).
  • Or use HTML5 video standards instead of Flash; a sketch of an HTML5 video embed follows below.
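If the client is replacing Flash assets with HTML5 video, a minimal embed looks something like this; the file paths and poster image below are hypothetical placeholders:

    <!-- Hypothetical HTML5 video embed replacing a Flash player -->
    <video width="640" height="360" controls poster="/images/video-poster.jpg">
      <source src="/media/product-demo.mp4" type="video/mp4">
      <source src="/media/product-demo.webm" type="video/webm">
      Your browser does not support the video tag.
    </video>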
JavaScript

As of Google's 2015 announcement, JavaScript is OK to use on your website as long as you're not blocking anything in your robots.txt. But you'll still want to take a look at how JavaScript is being delivered to your site.

What to do:
  • Check robots.txt to verify that JavaScript isn't being blocked.
  • Verify that JavaScript is being run on the server (server-side rendering helps produce plain text data rather than purely dynamic content).
  • Ben Oren has a good post on why client-side Angular JavaScript can be harmful to your SEO efforts.
  • In Screaming Frog, go to the Spider Configuration in the navigation bar and select Check JavaScript. After the crawl is done, filter your results on the Internal tab by JavaScript.

Robots.txt

When you review a robots.txt for the first time, check whether anything important is being blocked or disallowed. For example, this robots.txt blocks all web crawlers from your client's entire site:

    User-agent: *
    Disallow: /

On the other hand, a robots.txt like Zappos' is fine: it only blocks web crawlers from the areas they have no business seeing, not from content that should be crawled.

What to do:
  • Your robots.txt file name should be lowercase ("robots.txt").
  • Remove any pages listed as Disallow that you want search engines to crawl.
  • By default, Screaming Frog will not load URLs that are disallowed by robots.txt; if you change the default settings, Screaming Frog will ignore robots.txt instead.
  • After your crawl is complete, you can also see blocked pages in Screaming Frog under the Response Codes tab with the Blocked by Robots.txt filter.
  • If your client's site has multiple subdomains, each subdomain should have its own robots.txt.
  • Make sure the sitemap is listed in the robots.txt.

Crawl Errors

I use DeepCrawl, Screaming Frog, and the Google and Bing webmaster tools to find and cross-check my clients' crawl errors. To find your crawl errors in Screaming Frog:
  • After the crawl is complete, go to Bulk Reports.
  • Scroll down to Response Codes, then export the server-side error report and the client-side error report.

What to do:
  • The client error report should let you 301 redirect most of the 404 errors on the back end of the site yourself.
  • The server error report requires collaboration with the development team to determine the root cause. Before fixing anything in the root directory, be sure to back up the site; you may simply need to create a new .htaccess file or increase the PHP memory limit.
  • You'll also want to remove any of these permanent redirects from the sitemap and from any internal or external links.
  • You can also use 404 in your URL to help track it in Google Analytics.

Redirect Chains

Redirect chains hurt user experience, slow down page speed, drag down conversion rates, and dilute whatever link love you may have earned previously. Fixing redirect chains is a quick win for any organization.

What to do:
  • In Screaming Frog, after your crawl is complete, go to Reports > Redirect Chains to see the crawl path of your redirects. Exported to an Excel spreadsheet, this lets you track your 301 redirects and confirm they remain 301 redirects; if you see a 404, you know something needs to be cleaned up.

Internal and External Links

When a user clicks a link to your site and gets a 404 error, it's not a good user experience, and it doesn't help your search engine rankings either. To find my broken internal and external links, I use Integrity for Mac; Xenu Sleuth does the job for PC users. I'll also show you how to find these internal and external links in Screaming Frog and DeepCrawl.

What to do:
  • Use Integrity or Xenu Sleuth to crawl your client's full site. You can fix the broken links yourself or work with your dev team.
  • In Screaming Frog, after the crawl is complete, go to Bulk Export in the navigation bar, then All Outlinks. Sort by URL to see which pages are returning a 404. Repeat the same step with All Inlinks.
  • In DeepCrawl, go to the Internal Links section and click the Unique Broken Links tab.

URLs

Every time you take on a new client, review their URL format. What am I looking for in URLs?
  • Parameters: if the URL has unusual characters like ?, =, or +, it's a dynamic URL, and dynamic URLs can create duplicate content if they aren't optimized.
  • User-friendliness: I like to keep URLs short and simple, and I remove any extra slashes.

What to do:
  • You can search for parameter URLs in Google with a query like site:www.buyaunicorn.com/ inurl:"?", or with whatever other characters you think the parameters might include.
  • After you've run the crawl in Screaming Frog, take a look at the URLs. If you find parameters creating duplicates of your content, recommend the following:
  • Add a canonical tag to the main URL. For example, if www.buyaunicorn.com/magical-headbands is the main page and you also see www.buyaunicorn.com/magical-headbands/?dir=mode123$, then a canonical tag pointing to www.buyaunicorn.com/magical-headbands needs to be added to the parameter URL (a sketch follows below).
  • Update your parameters in Google Search Console under Crawl > URL Parameters.
  • Disallow the duplicate URLs in the robots.txt.
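As a quick illustration, the canonical tag on the parameterized page from the hypothetical buyaunicorn.com example above would look roughly like this:

    <!-- Placed in the <head> of www.buyaunicorn.com/magical-headbands/?dir=mode123$ -->
    <link rel="canonical" href="https://www.buyaunicorn.com/magical-headbands">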

Step 2: Review Google Search Console and Bing Webmaster Tools

Tools:
  • Google Search Console (GSC).
  • Bing Webmaster Tools.
  • Sublime Text (or any text editor).
Set a Preferred Domain

When it comes to SEO, it's important to tell the search engines which domain you prefer. It also helps ensure your links are concentrated on one version of the site rather than spread across two or more.

What to do:
  • In Google Search Console, click the gear icon in the upper right corner and choose your preferred domain from the site settings.
  • You don't need to specify the preferred domain in Bing Webmaster Tools; just submit your sitemap, and Bing will use it to determine your preferred domain.

Backlinks

Now that Penguin is real-time, your client's backlinks need to meet Google's standards. If you notice a big chunk of backlinks coming to your client's site from one page on another website, you'll want to take the necessary steps to clean it up, and fast!

What to do:
  • In Google Search Console, go to Links and sort your Top linking sites.
  • Contact the companies that are linking to you from one page to ask them to remove the links.
  • Or add them to your disavow list. Be very careful about how and why you add companies to the disavow list; you don't want to remove valuable links. Here's an example of what my disavow file looks like.

Keywords

As an SEO consultant, it's my job to get to know my client's market. Who is their target audience? What are they searching for? How do they search? To start, I look at the keyword search terms that are already bringing traffic to their site: in Google Search Console, go to Search Traffic > Search Analytics to see which keywords are already sending your client traffic.

Sitemap

Sitemaps are essential for getting search engines to crawl and index your client's website; a sitemap acts like a translator for them. When reviewing or creating sitemaps, keep a few things in mind:
  • Do not include parameter URLs in your sitemap.
  • Do not include any non-indexable pages.
  • If the site has different subdomains for mobile and desktop, add the rel="alternate" annotation to the sitemap (a sketch is shown after this section).

What to do:
  • Go to Index > Sitemaps in Google Search Console to compare the number of URLs submitted in the sitemap against the number of URLs actually indexed.
  • Then do a manual search to determine why certain pages are not being indexed.
  • Remove any old, redirected URLs from your client's sitemap; if they remain, those outdated redirects can negatively affect your SEO.
  • If the client is new, submit a sitemap for them in both Bing Webmaster Tools and Google Search Console.
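For reference, here is a minimal, hypothetical sitemap entry showing the rel="alternate" annotation for a site that serves mobile visitors from a separate m. subdomain; the URLs are placeholders, so check the markup against Google's current sitemap documentation before shipping it:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.com/magical-headbands/</loc>
        <!-- Points crawlers to the equivalent mobile URL -->
        <xhtml:link rel="alternate"
                    media="only screen and (max-width: 640px)"
                    href="https://m.example.com/magical-headbands/"/>
      </url>
    </urlset>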
Crawl

Crawl errors are important to check because they're not only bad for the user, they're bad for your website's rankings; John Mueller has stated that a low crawl rate may be a sign of a low-quality site.
  • To check this in Google Search Console, go to Coverage > Details.
  • To check this in Bing Webmaster Tools, go to Reports & Data > Crawl Information.

What to do:
  • Manually check your crawl errors to determine whether they come from old products that no longer exist, or from pages that should be disallowed in the robots.txt file.
  • Once you've determined where the traffic to those dead pages is coming from, implement 301 redirects to similar pages that link to them.
  • You'll also want to cross-check the crawl stats in Google Search Console (such as time spent downloading a page and pages crawled per day) against the average page load time in Google Analytics to see whether there's a correlation.

Structured Data

See the Schema section above for checking structured data with Screaming Frog; here's how to check it in Google Search Console. Use Google Search Console's individual rich results status reports. (Note: the old structured data report has been retired.) This will help you determine which pages have structured data errors that you'll need to fix down the road.

What to do:
  • Google Search Console will tell you what it thinks is missing in the schema when you test the live version of a page.
  • Based on the error codes, rewrite the schema in a text editor and send it to the web development team to update. I use Sublime Text; PC users can use any comparable text editor.

Step 3: Go over your Google Analytics data

Tools:
  • Google Analytics.
  • Google Tag Manager Assistant Chrome extension.
  • Annie Cushing's campaign tagging guide.

Views

When I get a new client, I set up three different views in Google Analytics:
  • Reporting (raw data) view.
  • Master view.
  • Test view.

These different views give me the flexibility to make changes without affecting the data.

What to do:
  • In Google Analytics, go to Admin > View > View Settings to create the three different views above.
  • Make sure to check the Bot Filtering box to exclude all hits from bots and spiders.
  • Link Google Ads and Google Search Console.
  • Lastly, make sure Site Search Tracking is turned on.

Filter

Add your IP address and your client's IP address to the filters in Google Analytics so you don't get any false traffic.

What to do:
  • Go to Admin > View > Filters.
  • Then set the filter to Exclude > traffic from the IP addresses > that are equal to.

Tracking Code

You can manually check the source code, or you can use my Screaming Frog technique from above. If the code is present, you'll want to verify that it's firing in real time:
  • Go to your client's website and click around a bit.
  • Then go to Google Analytics > Real-Time > Locations; your location should populate.
  • If you're using Google Tag Manager, you can also check this with the Google Tag Assistant Chrome extension.

What to do:
  • If the code isn't firing, check the code snippet to make sure it's the correct one; if you manage multiple sites, you may have added a different site's code.
  • Before copying the snippet onto the web page, use a text editor, not a word processor, to copy it; a word processor can add extra characters or whitespace.
  • The code is case-sensitive, so check that it's lowercase where it should be.

Indexing

If you've had a chance to play around in Google Search Console, you've probably noticed the Coverage section. When I'm auditing a client, I compare their indexing in Google Search Console with their data in Google Analytics. Here's how:
  • In Google Search Console, open the Coverage report.
  • In Google Analytics, go to Acquisition > Channels > Organic Search > Landing Page.
  • Once you're there, go to Advanced > Site Usage > Sessions > 9 to filter for landing pages with more than nine sessions.

What to do:
  • Compare the two sets of numbers. If the site has many pages indexed but only a small percentage of them receive organic traffic, that gap is worth investigating.

Campaign Tagging

Last, you'll want to make sure your client is tagging their campaigns correctly in Google Analytics. You don't want your work to go uncredited because campaign tagging was forgotten.

What to do:
  • Set up a campaign tagging strategy for Google Analytics and share it with your client; Annie Cushing has put together a great campaign tagging guide. (An example of a tagged URL is shown at the end of this step.)
  • Set up Event Tracking if your client is using mobile ads or video.

Keywords

You can use Google Analytics to discover potential keyword gems for your client. To find keywords in Google Analytics, follow these steps:
  • Go to Behavior > Site Search > Search Terms. This gives you a view of what customers are searching for on the website.
  • Next, I'll use those search terms to create a New Segment in Google Analytics to see which pages on the site are already ranking for that particular keyword term.
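As a simple, hypothetical illustration of campaign tagging, a URL promoted in an email newsletter might carry the standard UTM parameters like this:

    https://www.example.com/magical-headbands/?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale

Keeping the source, medium, and campaign values consistent (and lowercase) makes the campaign reports in Google Analytics much easier to read.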

Step 4: Perform Manual Checks

Tools:
  • Google Analytics.
  • Access to the client's server and hosting.
  • You Get Signal.
  • PageSpeed Insights.
  • Wayback Machine.

One Searchable Version of Your Client's Site
Check all the ways someone might search for or type in your client's site. For instance:
  • http://annaisaunicorn.com
  • https://annaisaunicorn.com
  • http://www.annaisaunicorn.com

As Highlander would say, "there can be only one" searchable website. All non-canonical versions should 301 redirect to the canonical site; an example redirect configuration is sketched below.
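On an Apache server, a minimal .htaccess sketch that consolidates every variation onto the canonical https://www version might look like the following; this assumes mod_rewrite is enabled and uses the placeholder domain above, so treat it as a starting point rather than a drop-in fix (Nginx and managed platforms configure the same redirect differently):

    # Redirect all HTTP and non-www requests to https://www.annaisaunicorn.com
    RewriteEngine On
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.annaisaunicorn.com/$1 [L,R=301]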

Indexing

Conduct a manual search in Google and Bing to determine how many pages Google has indexed. This number isn't always an exact match for what you see in Google Analytics and Google Search Console, but it should give you a rough idea. To check:
  • Run a site: search for the domain in the search engines (example queries are shown below).
  • Manually scan the results to make sure only your client's brand is appearing.
  • Check that the homepage is showing up on the first page. (According to John Mueller, the homepage does not have to appear as the first result.)

What to do:
  • If another brand is showing up in the search results, you have a bigger problem on your hands and will need to dig into the analytics to diagnose it.
  • If the homepage isn't showing up on the first page, do a manual check of the website to see what it's missing. It could also mean the site has a penalty or poor site architecture, which is a bigger site redesign issue.
  • Cross-check the number of organic landing pages in Google Analytics against the number of search results you found; this tells you which pages the search engines consider valuable.
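By way of illustration, using the placeholder domain from earlier, the manual queries are as simple as:

    site:annaisaunicorn.com
    site:annaisaunicorn.com -inurl:www

The first query shows roughly how many pages are indexed; the second surfaces any indexed non-www duplicates. A plain search for the client's brand name then confirms that only their site is ranking for it.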

Caching

Run a quick check to see whether the top pages are being cached by Google. Google uses these cached pages to connect your content with search queries. To check whether Google is caching your client's pages, use a URL in this form:

http://webcache.googleusercontent.com/search?q=cache:https://www.searchenginejournal.com/pubcon-day-3-women-in-digital-amazon-analytics/176005/

Make sure to toggle over to the Text-only version. You can also check this in the Wayback Machine.

What to do:
  • Check the client's server to see whether it's down or operating slower than usual. There could be an internal server error or a database connection failure; this can happen if multiple users are trying to access the server at once.
  • Run a reverse IP address check to see who else is on the server; you can use the You Get Signal website for this step. If several sketchy websites are sharing your client's server, it may be worth upgrading the server or adding a CDN.
  • Ask whether the client is removing certain pages from the site.

Hosting

How your client's website is hosted is critical to SEO performance, even though it gets a bit technical for some. Bad hosting can hurt SEO, and all your hard work will be for nothing. You'll need access to your client's server to manually check for any issues. The most common hosting issues I see are the wrong TLD and slow site speed.

What to do:
  • If your client has the wrong TLD (top-level domain), make sure the country IP address is associated with the country your client operates in the most. If your client owns both a .co and a .com domain, redirect the .co to the client's primary .com domain.
  • If your client's site loads slowly, you'll want to address that right away, because site speed affects their search rankings. Use tools like PageSpeed Insights and Pingdom to find out what's slowing the site down. Some of the most common page speed problems include:
  • Host.
  • Large images.
  • Embedded videos.
  • Plugins.
  • Ads.
  • Theme.
  • Widgets.
  • Repetitive or bloated scripts.

Core Web Vitals Audit

Core Web Vitals is a collection of three metrics that represent a website's user experience. They're important because Google is incorporating them into its ranking algorithm in the spring of 2021, and even though the ranking factor is expected to be a minor one, it's still worth auditing Core Web Vitals scores to identify areas for improvement.

Why is it important to include Core Web Vitals in your audit? Improving the scores won't just help your search ranking; it can also pay off in more conversions and earnings, because gains in site speed and performance tend to lift sales, traffic, and ad clicks.

Upgrading the web hosting or installing a new plugin alone won't move the needle much. Core Web Vitals field data is collected from real visitors downloading your site on their own devices, which means their Internet connection and mobile hardware are part of the bottleneck: a fast server won't fix a slow connection on a low-cost mobile phone, and a page speed plugin is of limited value because many of the fixes involve changing code in a template or in the core files of the content management system itself.

There are plenty of resources available to help work through these problems, but most solutions require the help of a developer who is comfortable updating your CMS's templates and core files. Fixing Core Web Vitals issues can be difficult: content management systems such as WordPress and Drupal were not built to score well on Core Web Vitals, so improving the scores often means making a site do something it was never designed to do when the theme or CMS was built. The goal of a Core Web Vitals audit is to determine what needs fixing and to hand that information to a developer who can then make the necessary changes.

What Are the Core Web Vitals?

Your website needs to load quickly enough for users to interact with it (for example, by clicking a button), and it needs to stay visually stable while it loads so those interactions land where users expect. The three Core Web Vitals metrics are:
  • Largest Contentful Paint (LCP).
  • First Input Delay (FID).
  • Cumulative Layout Shift (CLS).

Core Web Vitals scores come in two flavors: lab data and field data.

Lab Data

Lab data is what's generated when you run a page through Google Lighthouse or PageSpeed Insights. The scores are produced with a simulated device and Internet connection; the purpose is to give the person working on the site an idea of which parts of the Core Web Vitals need improvement. PageSpeed Insights is useful because it pinpoints the specific lines of code and page elements that are contributing to a poor score.

Field Data

Field data consists of real Core Web Vitals measurements collected by the Google Chrome browser for the Chrome User Experience Report (also known as CrUX). The field data is available in Google Search Console under the Enhancements tab, through the link titled "Core Web Vitals" (https://search.google.com/search-console/core-web-vitals). Google Search Console only reports field data for a URL once it has a minimum number of visits and measurements; if Google doesn't collect enough data, it won't report a score.

Screaming Frog for a Core Web Vitals Audit

Screaming Frog version 14.2 can display a pass or fail Core Web Vitals assessment. To use it, Screaming Frog needs an API key for the PageSpeed Insights API:
  • Register your API key with Screaming Frog under Configuration > API Access > PageSpeed Insights. You'll see a place to enter your API key and connect it to the service.
  • In the same PageSpeed Insights window, select the Metrics tab and tick the boxes for the data you'd like to see.
  • Select Mobile as the device, since that's what matters for ranking purposes.
  • When the crawl is done, open the Opportunities tab to see a list of suggested fixes for the site (like defer offscreen images, remove unused CSS, etc.).

If you prefer, you can also query the same PageSpeed Insights API directly; an example request is sketched below.
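For reference, the raw PageSpeed Insights API that Screaming Frog connects to can also be called directly with a request along these lines; the domain and key are placeholders, and the response fields are worth verifying against Google's current API documentation:

    https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com/&strategy=mobile&key=YOUR_API_KEY

The JSON response includes a lighthouseResult object (the lab data) and, where Google has collected enough traffic for the URL, a loadingExperience object (the CrUX field data), which is where the LCP, FID, and CLS figures come from.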
Before You Begin Crawling

You could crawl and analyze every page on the site, but that usually isn't necessary. Before the real crawl, consider a preliminary crawl of a small set of pages that represent the different page types in each section or category of the website. Create a spreadsheet or a text file list of URLs, or paste the URLs in manually using the Upload tab in Screaming Frog.

Most sites are templated, so pages within the same section share similar structure and content: all the pages in a "news" category will be similar to one another, and all the pages in a "reviews" category will be similar to one another. Crawling a representative sample can save time by surfacing issues that are common to the entire site as well as issues specific to certain categories of pages. Because of those commonalities, the issues discovered will also be similar, and it may only be necessary to crawl a few representative pages from each category to identify what's unique to each section.

The problems being fixed are typically sitewide issues that affect the whole site, such as unused CSS loaded on every page or a Cumulative Layout Shift caused by an ad unit placed in the left margin of the pages. Because modern websites are templated, the fixes happen at the template level or in the stylesheet.

Crawl the Site with Screaming Frog

Once the URLs are crawled, you can click on the PageSpeed tab, read all the recommendations, and view the pass/fail notations for the various metrics.

Zoom In on URL Opportunities

A useful feature of the Screaming Frog Core Web Vitals audit is the ability to select a URL from the top pane and then see the opportunities for improvement in the bottom pane, with the details of each improvement opportunity shown in the right-hand pane.

Google's Official Tool

Google offers its own auditing tool at https://web.dev/measure/. Enter a URL to get a report on how the page is doing; if you're signed in, Google will track the page for you over time. Clicking the View Report link opens a new page containing a report of what's wrong, along with links to guides that explain how to fix each issue.
