Performance Metrics

Frequently Asked Questions

  • To get you started, we pre-analyze your top 10 pages by Page Authority.
  • Analysis can take up to 24 hours to complete. Once analysis is complete, you will receive an in-app notification.
  • Pages can be analyzed multiple times a day; however, if an analysis is already in progress, you must wait for it to complete before requesting a new one.
  • Your page limit resets monthly. You can see the reset date on the All URLs tab.
  • Performance Metrics within Moz is powered by Lighthouse, which is a different tool than PageSpeed Insights. For this reason, scores within Performance Metrics may vary from what you see in PageSpeed Insights and should not be compared directly.
  • A Page Performance Score may fluctuate within the app or differ from what you see in Lighthouse directly for several reasons. These include the distance of the hosted site from our server (especially whether the site is hosted outside the United States, depending on your CDN setup) and the device used. Our processes simulate a mediocre mobile device and internet connection, while running an analysis in Lighthouse in your own browser reflects your actual device and internet connection. Sites hosted farther from our servers and/or less optimized sites may see larger fluctuations in scores.
  • Google Lighthouse does not analyze pages which return 4xx status codes, so we are not able to offer analysis for those pages in Performance Metrics. Only pages returning a 200 OK status code will be available for analysis (see the sketch after this list for one way to pre-check your URLs).
  • If we were unable to crawl your site during your weekly Site Crawl, we will not list pages in the Top Pages by Page Authority subset within Performance Metrics. Check the Site Crawl section of your Campaign to identify and resolve any crawl issues, or select a different subset of pages within Performance Metrics to get started analyzing pages.
  • The Performance Metrics reports in Moz Pro are powered by the Google Lighthouse tool. You can run a full Lighthouse report here or learn more about Lighthouse performance scoring here.
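
Because only pages returning a 200 OK status code can be analyzed, it can be helpful to confirm your URLs' status codes before selecting them. Below is a minimal sketch using Python's requests library; the URL list is hypothetical and the script is only illustrative, not part of the Moz Pro tool itself.

```python
# Minimal sketch: pre-check which URLs return 200 OK before selecting
# them for a Performance Metrics analysis. The URL list is hypothetical.
import requests

urls = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/old-page/",  # may return a 4xx status
]

for url in urls:
    try:
        # HEAD keeps the check lightweight; some servers may require GET instead.
        response = requests.head(url, allow_redirects=True, timeout=10)
        status = response.status_code
    except requests.RequestException as exc:
        print(f"{url} -> request failed ({exc})")
        continue

    if status == 200:
        print(f"{url} -> 200 OK, eligible for analysis")
    else:
        print(f"{url} -> {status}, will not be analyzable")
```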

Introduction to Performance Metrics

With Performance Metrics in Moz’s Site Crawl, you can consolidate your technical audit workflows in one place. Within this tool, you can bulk analyze Google Core Web Vitals and other performance metrics. This streamlined approach to technical SEO helps you plan updates for your site so it continues to rank well in SERPs.

Performance Metrics is located within the Site Crawl section of Campaigns.

Running a Performance Metrics Analysis

To get started with Performance Metrics, head to the Site Crawl section of your Campaign and then to Performance Metrics within the left-hand navigation. You will then have the option to select the pages you’d like to analyze.

Using the Choose page source drop-down, select a subset of pages from your site to analyze. You will have the following options:

  • All Pages - Will display all pages crawled within a Campaign which returned a 200 HTTP status code
  • Top pages by Page Authority - Will display pages from your most recent Site Crawl based on Page Authority
  • Top landing pages from traffic - Will display pages from your site with the highest organic traffic based on Google Analytics data (Please note: you must have Google Analytics connected to your Campaign to see traffic data in this view)
  • Pages with crawl issues - Will display pages from your site which have been identified as having crawl issues based on your latest Site Crawl
  • Top pages by rank - Will display ranking pages based on the most recent rankings collection for your Campaign
Use the drop-down provided to select a subset of pages.

After selecting a page source, use the check boxes to the left to select the pages you’d like to analyze and click Analyze selected URLs at the top of the screen.

If needed, you can apply filters to further segment the pages available to analyze. Multiple filters can be applied at once.

Pages can only be analyzed once per day. If you are not able to select a URL in the table to analyze, it has already been analyzed within the last 24 hours or an analysis is currently in progress.

After selecting a subset of pages, select the exact pages you'd like to analyze using the check boxes on the left.

If you would like to select all the visible URLs to be analyzed at once, check the top box provided. You will then be given the option to Select all selectable rows in the grey box at the top of the table, which will select all the URLs in this subset of pages that are available for analysis.

URLs which are in the process of being analyzed will be marked as Analysis in progress. You will also see the date the URL was last analyzed listed in the Analysis status column.

The Analysis Status column indicates when a page was last analyzed.

As you analyze pages in Performance Metrics, the graphs shown will update to reflect the latest data. Within these graphs, the tool will illustrate the percentage of analyzed pages with scores that fall within each score range. Hover over the graph to see more information and use the radio buttons to toggle between Mobile & Desktop.
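
As a simplified illustration of the bucketing these graphs represent, here is a short sketch that groups a set of hypothetical page scores into the three score ranges used throughout the tool and reports the percentage in each.

```python
# Minimal sketch of how analyzed pages might be bucketed by score range,
# mirroring the percentages shown in the Performance Metrics graphs.
# The sample scores below are hypothetical.
scores = [34, 52, 61, 78, 91, 95, 88, 47, 99, 70]

buckets = {
    "Critical (0-49)": 0,
    "Needs improvement (50-89)": 0,
    "Looking good (90-100)": 0,
}

for score in scores:
    if score <= 49:
        buckets["Critical (0-49)"] += 1
    elif score <= 89:
        buckets["Needs improvement (50-89)"] += 1
    else:
        buckets["Looking good (90-100)"] += 1

for label, count in buckets.items():
    print(f"{label}: {count / len(scores):.0%} of analyzed pages")
```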

The bar graph shown will update as you continue to analyze pages.

Data Included in Performance Metrics

Once analysis is complete for a page or set of pages, you will receive an in-app notification letting you know that your reports are ready to view.

For each analyzed page you will see the following data:

A. URL - The URL of the analyzed page

B. Mobile Performance - The overall performance metric for this page on mobile

C. Desktop Performance - The overall performance metric for this page on desktop

D. Page Authority (PA) - A Moz proprietary metric from 1-100 which predicts how well a page will rank in Google, based on a machine learning model of link metrics

E. This is a dynamic, sortable column which displays a different metric depending on which page source is selected.

You can select the Page Source at the top of the tool using the drop-down above the graph.
  • When Top pages by Page Authority is selected, Crawl Depth will be noted
    • Crawl Depth - The number of links the crawler followed from the homepage to find the URL noted. Also thought of as the number of clicks a visitor would need to perform to get to this page.

  • When Top landing pages from traffic is selected, Traffic will be noted
    • Traffic - The number of unique organic visits this page received based on Google Analytics

  • When Pages with crawl issues is selected, Issues will be noted
    • Issues - The number of issues identified for this page during the most recent Site Crawl

  • When Top pages by rank is selected, Rank and Keyword will be noted
    • Rank - The highest ranking position for this URL during the most recent data collection
    • Keyword - The keyword associated with the Rank and URL noted

  • When All Pages is selected, Crawl Depth will be noted
    • Crawl Depth - The number of links the crawler followed from the homepage to find the URL noted. Also thought of as the number of clicks a visitor would need to perform to get to this page.

G. Largest Contentful Paint: Mobile - Measures when the browser renders the largest content element, which approximates when the main content of the page is visible to users (measured in seconds)

H. Total Blocking Time: Mobile - Measures the total amount of time that a page is blocked from responding to user input, calculated by adding the blocking portion of all long tasks (tasks taking longer than 50 milliseconds to complete) between First Contentful Paint and Time to Interactive. The blocking portion is the amount of time the task takes to complete beyond 50 milliseconds (measured in milliseconds)

I. Cumulative Layout Shift: Mobile - A sum of the layout shifts that occur more than 500 milliseconds after user input, which measures the instability of content. It considers how much visible content has shifted on the page as well as the distance the elements were shifted

J. Details - Provides a link to view the full report

Please note: To see all columns in the table, you may need to scroll to the right.

The page performance scores for mobile and desktop are a weighted average of the individual metric scores. According to Lighthouse, more heavily weighted metrics have a bigger effect on your overall page performance score. You can learn more about how metrics are weighted into the overall scores here. Within the full report and the Details drawer, the tool will identify Opportunities for improving your site’s performance. In general, Opportunities are not factored directly into your overall page performance scores. However, fixing them will likely improve the underlying metric values, so there is an indirect relationship.
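
As a simplified illustration of how a weighted average of metric scores works, here is a short sketch. The weights below are hypothetical placeholders, not the actual weights Lighthouse uses; see the Lighthouse scoring documentation for the real values.

```python
# Simplified sketch of a weighted-average page performance score.
# Each metric has already been converted to a 0-100 score; the weights
# below are hypothetical placeholders, not Lighthouse's actual weights.
metric_scores = {
    "First Contentful Paint": 85,
    "Speed Index": 72,
    "Largest Contentful Paint": 60,
    "Total Blocking Time": 45,
    "Cumulative Layout Shift": 90,
}

# Hypothetical weights that sum to 1.0; more heavily weighted metrics
# pull the overall score more strongly toward their individual score.
weights = {
    "First Contentful Paint": 0.10,
    "Speed Index": 0.10,
    "Largest Contentful Paint": 0.25,
    "Total Blocking Time": 0.30,
    "Cumulative Layout Shift": 0.25,
}

overall = sum(metric_scores[name] * weights[name] for name in metric_scores)
print(f"Overall page performance score: {overall:.0f}")
```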

When evaluating your Page Performance scores:

  • Red triangle - Indicates Critical
  • Orange square - Indicates Needs Improvement
  • Green circle - Indicates Looking Good

Click the arrow in the column on the far left to see more information about the Opportunities we’ve identified that could help improve your site’s performance. Once you’ve identified possible opportunities for improvement, you can work with your developers or webmasters to implement them on your site. After they’ve been implemented, you can re-run an analysis of the affected pages to see how your overall page performance scores have changed and whether the values for your Performance Metrics have improved (just a reminder that pages can only be analyzed once a day).

Click the arrow in the details column to learn more about your score and opportunities.

In addition to the Opportunities outlined in the expanded view, you may also see a Diagnostics section. The Diagnostics section outlines additional opportunities we’ve identified which may improve your page’s overall performance.

To view the full report, click See more details next to any of the Opportunities or Diagnostics noted for a specific page.

You can also click View Report in the Details column or click the link in the Report by URL to be taken to the full report and see even more data about this page and its Performance Metrics.

The View Report link to the right will take you to the full report for this analysis.

Please note: To see the View Report link, scroll all the way to the right of the table using the scroll bar at the bottom of the page.

How to Track URLs Over Time

Within Performance Metrics you have the option to track URLs automatically. By adding URLs to your Tracked URLs, the tool will re-analyze them on a weekly basis and update your graphs and data points accordingly.

To add URLs to your Tracked URLs, select URLs from the All URLs view using the check boxes on the left and then select Add to tracked.

Use the check boxes on the left to select which URLs you'd like to track over time.

When a URL is being tracked, it will have a Tracked label in place of the Analyze button.

If you decide you no longer want to track a URL, follow the same process of selecting the URLs with the check boxes on the left, but instead click Remove from tracked to stop tracking those URLs over time.

Tracked URLs Data View

You can view your tracked URLs and their associated data all in one place via the Tracked URLs view available at the top of the tool.

Within this view you will see when your tracked URLs were last analyzed and when the next analysis is expected to run. You will also be able to keep track of how many URLs you’re tracking out of your total allowance.

Below this information will be a Tracked URLs summary which will show you the average Mobile and average Desktop scores for your tracked URLs.

The Tracked URLs tab gives you an overview of your tracked URL data.

The Historical performance scores graph shows your Tracked URLs summary over time. Within the graph, the red line represents the Desktop score and the blue line represents the Mobile score. If there is a gap in analysis of 14 days or more, that span of time will be represented as a dotted line. Up to 90 days of data will be shown in the graph, based on when analyses were performed. Hover over the data points in the graph to see more information.

View of the Historical performance scores graph in the Tracked URLs tab.

Within the Tracked performance metrics chart, you will have the same data filters and view options available as seen in the All URLs view.

Viewing the Full Report

Clicking View Report in the Details column will take you to the full report for this page, where you can see more information about what is impacting its Performance Metrics.

You'll see your mobile and desktop performance scores at the top of the report along with a graph indicating your historical scores.

Within the Performance Metrics Report you will see the Mobile Page Performance score and the Desktop Page Performance score for the page.

  • Red (0 to 49) - Indicates a Critical (low) page performance score
  • Orange (50 to 89) - Indicates a Needs improvement (mid-level) page performance score
  • Green (90 to 100) - Indicates a Looking good (high) page performance score
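
For reference, here is a minimal sketch of how a 0 to 100 page performance score maps to these three rating levels, using the ranges above.

```python
# Minimal sketch mapping a 0-100 page performance score to the three
# rating levels described above.
def rating(score: int) -> str:
    if score <= 49:
        return "Critical (Red)"
    if score <= 89:
        return "Needs improvement (Orange)"
    return "Looking good (Green)"

for example_score in (35, 72, 94):
    print(example_score, "->", rating(example_score))
```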

Just a reminder that the Overall Performance scores for mobile and desktop are a weighted average of the individual metric scores. Naturally, more heavily weighted metrics have a bigger effect on your overall page performance score. You can learn more about how metrics are weighted into the overall scores here. In general, Opportunities are not factored directly into your overall page performance scores. However, fixing them will likely improve the underlying metric values, so there is an indirect relationship.

Next to the Overall Performance scores, you will see a graph illustrating the Historical performance scores for this page.

Within the graph the green line represents the page’s Desktop score and the purple line represents the page’s mobile score. If there is a gap in analysis for more than 60 days, the span of time will be represented as a dotted line. Up to 90 days of data will be represented in the graph based on when analysis was performed. Hover over the data points in the graph to see more information.

Within the Historical Performance Scores graph you can hover over the data points to see more information.

You can export historical data or download the image of the graph by clicking the three lines on the top right of the graph.

Use the menu nested under the 3 bars to the right to export your data.

Performance Metrics Data Chart

Within the Performance metrics chart, we will display the scores for each metric from the most recent analysis. These metrics are color-coded to indicate whether they are critical, need improvement, or are in good shape.

Click the arrow in the details column to see more information about a given metric.

Beside each metric you will also see a delta value. This value indicates the change in that metric since the last analysis. If there has been no change, no delta value will be shown.

By clicking the arrow in the right-hand column, you can see more information about the metric, the ranges for each level of score (Critical, Needs Improvement, and Looking good), along with historical graphs for both Mobile and Desktop performance.

In the expanded drawer you can see historical graphs of your data.

Largest Contentful Paint (Core Web Vital)

Largest Contentful Paint measures when the browser renders the largest content element - this approximates when the main content of the page is visible to users (Measured in seconds).

  • Critical (Red triangle): Mobile 4+, Desktop 2.4+
  • Needs improvement (Orange square): Mobile 2.5 - 4, Desktop 1.2 - 2.4
  • Looking good (Green circle): Mobile 0 - 2.5, Desktop 0 - 1.2


Total Blocking Time (Core Web Vital)

Total Blocking Time measures the total amount of time that a page is blocked from responding to user input, which is calculated by adding the blocking portion of all long tasks (tasks taking longer than 50 milliseconds to complete) between First Contentful Paint and Time to Interactive. The blocking portion is the amount of time the task takes to complete beyond 50 milliseconds (measured in milliseconds).

  • Critical (Red triangle): Mobile 600+, Desktop 350+
  • Needs improvement (Orange square): Mobile 290 - 600, Desktop 150 - 350
  • Looking good (Green circle): Mobile 0 - 290, Desktop 0 - 150
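
To illustrate the calculation described above, here is a minimal sketch that sums the blocking portion (time beyond 50 milliseconds) of each long task between First Contentful Paint and Time to Interactive. The task durations are hypothetical.

```python
# Minimal sketch of the Total Blocking Time calculation described above:
# sum the blocking portion (time beyond 50 ms) of every long task that
# runs between First Contentful Paint and Time to Interactive.
# The task durations below (in milliseconds) are hypothetical.
task_durations_ms = [30, 120, 55, 400, 70]

LONG_TASK_THRESHOLD_MS = 50

total_blocking_time_ms = sum(
    duration - LONG_TASK_THRESHOLD_MS
    for duration in task_durations_ms
    if duration > LONG_TASK_THRESHOLD_MS
)

print(f"Total Blocking Time: {total_blocking_time_ms} ms")
# Blocking portions: 120 -> 70, 55 -> 5, 400 -> 350, 70 -> 20; total = 445 ms
```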


Cumulative Layout Shift (Core Web Vital)

Cumulative Layout Shift is a sum of the total layout shifts that occur more than 500 milliseconds after user input, which measures the instability of content. It considers how much visible content has shifted on the page as well as the distance the elements were shifted.

  • Critical (Red triangle): Mobile 0.25+, Desktop 0.25+
  • Needs improvement (Orange square): Mobile 0.1 - 0.25, Desktop 0.1 - 0.25
  • Looking good (Green circle): Mobile 0 - 0.1, Desktop 0 - 0.1
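
As a simplified illustration of how layout shift scores accumulate: each individual shift is commonly scored as the fraction of the viewport affected multiplied by the distance the content moved, and those scores are then summed. This is a simplification of how browsers group shifts in practice, and the shift values below are hypothetical.

```python
# Simplified sketch of how a Cumulative Layout Shift value accumulates.
# Each individual layout shift is scored as impact fraction (how much of
# the viewport is affected) times distance fraction (how far the content
# moved). The values below are hypothetical; shifts occurring within
# 500 ms of user input would be excluded.
shifts = [
    {"impact_fraction": 0.50, "distance_fraction": 0.20},
    {"impact_fraction": 0.30, "distance_fraction": 0.10},
    {"impact_fraction": 0.75, "distance_fraction": 0.15},
]

cls = sum(s["impact_fraction"] * s["distance_fraction"] for s in shifts)
print(f"Cumulative Layout Shift: {cls:.4f}")
# 0.1000 + 0.0300 + 0.1125 = 0.2425, which falls in the Needs improvement range above
```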


First Contentful Paint

First Contentful Paint measures when the browser renders the first bit of content, providing the first feedback to a user that the page is actually loading (measured in seconds).

  • Critical (Red triangle): Mobile 4.02+, Desktop 1.6+
  • Needs improvement (Orange square): Mobile 2.33 - 4.02, Desktop 0.93 - 1.6
  • Looking good (Green circle): Mobile 0 - 2.33, Desktop 0 - 0.93


Speed Index

Speed Index measures how quickly the visible parts of a page are displayed during page load (measured in seconds).

  • Critical (Red triangle): Mobile 5.8+, Desktop 2.31+
  • Needs improvement (Orange square): Mobile 3.4 - 5.8, Desktop 1.31 - 2.31
  • Looking good (Green circle): Mobile 0 - 3.4, Desktop 0 - 1.31


Time to Interactive

Time to Interactive measures how long it takes for the content on a page to become fully interactive, so that the user can reliably interact with it (measured in seconds).

  • Critical (Red triangle): Mobile 7.33+, Desktop 4.51+
  • Needs improvement (Orange square): Mobile 3.75 - 7.33, Desktop 2.48 - 4.51
  • Looking good (Green circle): Mobile 0 - 3.75, Desktop 0 - 2.48


Click the arrow in the Details column to learn more about each of these Performance Metrics. A description will be noted for each along with the ranges for the 3 rating levels for Desktop and Mobile.

Performance Metrics Report - Opportunities

Below the Performance Metrics breakdown in your full report, we will outline key Opportunities for this page. These are fixes or changes you can implement to improve any Performance Metrics which may not be scoring as well as they should. Improving your scores for the individual Performance Metrics will help to raise the overall Mobile and Desktop Performance scores for this page.

Under Opportunities, you will have the option to select Desktop or Mobile. Since these platforms each receive their own Page Performance score, you may see different Opportunities listed for each.

For each Opportunity listed, we will note the Potentially affected metric. By clicking the arrow to the right, you can expand the view to see more information about this Opportunity including What it is, Why it’s an issue, and How to fix it.

Use the tabs to select mobile or desktop and then click the arrow to the right to learn more.

Opportunities marked with a red exclamation mark are considered more critical.

Performance Metrics Report - Diagnostics

Below the Opportunities outlined for this page, a Diagnostics section will note any additional suggestions that may help page performance. Fixing these can affect page performance, but they do not directly impact your Page Performance Scores.

In the Diagnostics section the tool will show you additional tips to help improve your page.

Dashboard and Insights Modules

After you’ve selected URLs to track over time in Performance Metrics you will see new modules within the Dashboard and Insights sections of your Campaign.

In Insights, you may see a Performance metrics summary module. This module will show you a summary of the scores for your tracked URLs as well as the top 3 URLs with the greatest increase and decrease in overall score for mobile. This can be a great way to see a high level overview of your scores week to week.

In your Dashboard you will see modules for your Tracked URLs summary and Historical performance scores. The Tracked URLs summary will display a high-level overview of your Mobile and Desktop scores for your tracked URLs along with how the scores have changed week over week. The Historical performance scores will show your average overall performance scores for your tracked URLs over time.

Please note: If you are not currently tracking URLs over time in Performance Metrics these modules will not show data. Click the button provided to head to Performance Metrics and start tracking.

Why is my data different than what I see in Lighthouse or CrUX?

When using the Performance Metrics tool in Moz Pro, you may see your data fluctuate or vary from what you see in Google Lighthouse directly, CrUX, or Google Search Console.

There are various reasons why a Page Performance Score may fluctuate within the app or from what you see in Lighthouse directly. These include things like:

  • The distance of the hosted site from our server, and especially whether or not the site is hosted outside the United States, depending on your CDN setup.

  • The device used. Our processes simulate a mediocre mobile device and internet connection. Running an analysis in Lighthouse on your own browser takes into account your actual device and internet connection.

CrUX data, which is also used in Google Search Console, is collected from the Chrome browsers of real users on your site. In this sense, it reflects the performance of the devices they use, so it may differ significantly from the lab data that Moz Pro provides, which often assumes a fairly mediocre device and connection.

In addition, some metrics can’t be replicated in the lab, most notably First Input Delay, which measures the time from the user’s first input to when the page begins processing that input. This depends heavily on when the user chooses to attempt that first action, so lab equivalents like Total Blocking Time may show significantly harsher scores.

In a general sense, lab data is data recorded in a controlled, artificial environment. It is beneficial because it is more consistent and easily reproducible. The downside is that it doesn’t always accurately represent nuances of the real world.

For Moz (and Performance Metrics), lab data is obtained using artificial tests that simulate a typical browser, device, and connection quality. This allows us to make a fair comparison between websites but it won’t take into account the differences between real users of your site.

