What Does Core Web Vitals “Not Enough Usage Data” Mean?
If you’re in the habit of checking your Core Web Vitals report on Google’s PageSpeed Insights or Search Console, you might have run into a frustrating scenario: instead of seeing graphs and performance metrics, there’s a blank “not enough usage data” message. If this sounds familiar, you’re not alone.
Core Web Vitals are essential for understanding and optimizing user experience, especially since Google now uses these metrics as a search ranking factor.
In this guide, we’ll dive into what “no data” means for your site and how to optimize its performance, even if Google doesn’t yet have enough field data.
What are Core Web Vitals and why do they matter?
Core Web Vitals are metrics used by Google to gauge essential aspects of a website’s performance:
- Largest Contentful Paint (LCP): How fast a page’s main content loads (ideally under 2.5 seconds).
- Interaction to Next Paint (INP): Measures a page’s responsiveness based on the time it takes to respond to user input.
- Cumulative Layout Shift (CLS): Measures the visual stability of a page, especially as content loads.
Google uses Core Web Vitals as part of its ranking algorithm, making them essential for site owners aiming to optimize user experience and improve SEO rankings.
Here are the performance ranges for each status:

| | Good | Needs improvement | Poor |
|---|---|---|---|
| LCP | <=2.5s | <=4s | >4s |
| INP | <=200ms | <=500ms | >500ms |
| CLS | <=0.1 | <=0.25 | >0.25 |
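These cut-offs translate directly into code. Here is a minimal TypeScript sketch that rates a measured value against the table (the function name is illustrative):

```typescript
type Rating = "good" | "needs-improvement" | "poor";

// Thresholds from the table above: [good upper bound, poor upper bound].
// LCP and INP are in milliseconds; CLS is a unitless score.
const THRESHOLDS = {
  LCP: [2500, 4000],
  INP: [200, 500],
  CLS: [0.1, 0.25],
} as const;

function rate(metric: keyof typeof THRESHOLDS, value: number): Rating {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs-improvement";
  return "poor";
}
```

The same thresholds apply whether the value comes from a lab test or from real-user measurements.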
Easy way to boost Core Web Vitals on WordPress
Take control of your site’s performance today and give FastPixel a try for free!
Difference between lab data vs. field data
Lab and field data aren’t unique to web performance. These concepts distinguish controlled experiments from real-world observations.
In web performance, they offer different but complementary insights.
Lab data
Lab data is collected in a stable, controlled environment, an approach known as synthetic monitoring.
Lab tools simulate a single device and network setup, loading your website under these set conditions.
This makes lab data ideal for diagnosing performance issues in a consistent way and testing new changes. However, it’s limited because it doesn’t capture the range of experiences real users might have.
Field data
Field data is collected from actual users’ visits. It reflects a variety of devices, network speeds, and locations, showing how users truly interact with the site.
This data can reveal performance issues missed in lab tests, especially since it’s shaped by real-world factors like varying internet connections and device capabilities.
Because lab data uses a specific setup and field data shows diverse user experiences, differences between the two are common.
For example, while lab data might reflect smooth performance under optimal conditions, field data may expose slower loading on older devices or poor connections.
Both types of data are useful – lab data for replicable, detailed diagnostics, and field data for understanding actual user experience.
Why do you see “Not enough usage data” in Core Web Vitals reports
The Chrome User Experience Report (CrUX) is the dataset behind Core Web Vitals. It aggregates real-world user experience data from Chrome browsers, which Google uses to generate insights on website performance.
This data collection depends on several factors:
- Traffic requirements: A website must have sufficient user traffic. Google has set a minimum threshold for a page or website to be included in CrUX, but the specific number isn’t disclosed. This requirement ensures that Google collects enough data to represent real-world usage accurately.
- User data opt-in: CrUX only collects data from Chrome users who have opted into data sharing and syncing. These users must:
- Be signed into Chrome with sync enabled.
- Not use a Sync passphrase.
- Be on a supported platform (Windows, macOS, Linux, Android).
- Platform limitations: CrUX does not collect data from Chrome on iOS or from other Chromium-based browsers (like Microsoft Edge).
When these conditions aren’t met, Google can’t collect enough data to provide insights in Core Web Vitals, hence the “not enough usage data” message.
Why Core Web Vitals field data might be missing for your site
CrUX has several eligibility requirements, covering everything from the visibility of your pages to their traffic levels.
Here’s a breakdown:
- Public discoverability
- A page must be publicly accessible and indexable. Google uses the same criteria as its search engine for determining if a page is “publicly discoverable.”
- Pages won’t be considered publicly discoverable if they return an HTTP status other than 200, contain a “noindex” robots meta tag, or serve a “noindex” directive in the X-Robots-Tag HTTP header.
- Sufficient traffic
- Pages and sites must meet a minimum level of popularity based on visitor numbers. CrUX tracks popularity at both the origin level (e.g., https://www.example.com, representing the entire website) and the page level.
- Pages that don’t meet the popularity threshold won’t be included in the CrUX dataset.
- Origin-based data
- CrUX aggregates user experiences at both the page and origin level, where an origin represents an entire website.
- While individual pages must meet the popularity threshold, experiences on multiple pages across the same origin count toward that origin’s overall popularity.
So, if your Core Web Vitals report lacks data, it’s typically because your site doesn’t meet CrUX’s eligibility or popularity thresholds.
Smaller websites or newer sites may not attract enough users to meet CrUX’s popularity requirements, making it difficult for Google to collect data.
Also, if Google can’t crawl or index your site, it won’t be included in the CrUX dataset. Make sure the website is publicly accessible, crawlable, and indexable.
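As a quick sanity check, the discoverability rules above can be expressed as a small predicate. This is a hedged TypeScript sketch; the function name and signature are hypothetical, and in practice you would feed it the status code and headers from a real HTTP response:

```typescript
// Mirrors CrUX's discoverability rules: the page must return HTTP 200 and
// carry no "noindex" directive in either the X-Robots-Tag header or the
// robots meta tag. Illustrative helper, not an official CrUX API.
function isPubliclyDiscoverable(
  status: number,
  xRobotsTag: string | null,
  metaRobots: string | null
): boolean {
  if (status !== 200) return false;
  const hasNoindex = (v: string | null) =>
    v !== null && v.toLowerCase().includes("noindex");
  return !hasNoindex(xRobotsTag) && !hasNoindex(metaRobots);
}
```

Note that passing this check only makes a page *eligible*; it still needs enough traffic to appear in CrUX.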
Lastly, CrUX depends on Chrome users who have opted into data sharing. If there aren’t enough users in the dataset, Google might not have the data needed to report on your site’s Core Web Vitals. In that case, you may need to wait until more users visit your site.
How to optimize Core Web Vitals without CrUX data
If your Core Web Vitals report shows “not enough usage data,” you can still improve performance using lab data and third-party testing tools.
Here’s how:
Run lab-based performance audits
Lab-based performance audits let you simulate visits and test your site’s performance under different conditions.
This helps you find bottlenecks affecting Core Web Vitals like LCP, INP, and CLS.
Start by running tests with Google’s PageSpeed Insights, which can simulate real-user conditions and give you insight into how your page performs on various Core Web Vitals.
You can filter relevant audits above the diagnostics list, helping you assess and improve your site’s metrics even without field data.
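PageSpeed Insights also exposes an HTTP API (the PageSpeed Insights API v5), so you can script lab tests instead of running them by hand. A minimal TypeScript sketch that builds a request URL; the helper name is illustrative, and for regular automated use you would supply your own API key:

```typescript
// Builds a PageSpeed Insights v5 API request URL. The endpoint and the
// url/strategy/category/key query parameters are Google's documented PSI
// API; the optional apiKey is something you provision yourself.
function buildPsiUrl(
  pageUrl: string,
  strategy: "mobile" | "desktop",
  apiKey?: string
): string {
  const params = new URLSearchParams({
    url: pageUrl,
    strategy,
    category: "PERFORMANCE",
  });
  if (apiKey) params.set("key", apiKey);
  return `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?${params}`;
}
```

Fetching that URL returns a JSON body containing the Lighthouse lab audit and, when available, CrUX field data for the page.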
Beyond PageSpeed Insights, Lighthouse and GTmetrix offer detailed analyses that reveal bottlenecks affecting loading times, script execution, and layout stability.
These tools provide breakdowns of elements like JavaScript execution time, image dimensions, and asset load sizes. By examining this data, you can identify specific areas where your site is underperforming and make targeted optimizations.
For example, using the SEO category in Lighthouse is a quick way to check if your site is easily “discoverable” by search engines. This audit flags any issues with indexing and crawlability, which directly impact whether Google can properly access and rank your pages.
Fixing discoverability issues ensures your content can reach users and meets the baseline for Core Web Vitals reporting.
Optimize for lab metrics
To reduce JavaScript execution time, defer non-critical scripts; this improves INP and lowers Total Blocking Time (TBT). By loading essential scripts first and delaying others, you free up the main thread for tasks that affect users right away. This keeps your page responsive while cutting down on delays caused by background scripts.
Compressing images is another easy win for improving LCP. Using next-gen image formats like WebP and AVIF reduces file sizes while keeping image quality high. Smaller images load faster, so they’re a quick way to get your page’s main elements up on screen fast.
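A common pattern for serving next-gen formats with a safe fallback is the `<picture>` element, which lets the browser pick the smallest format it supports. Here is an illustrative TypeScript helper that generates that markup; the same-basename file-naming convention is an assumption:

```typescript
// Emits a <picture> element offering AVIF and WebP sources with a JPEG
// fallback. Assumes the three variants share a basename and differ only
// by extension (e.g. hero.avif / hero.webp / hero.jpg).
function buildPicture(basename: string, alt: string): string {
  return [
    "<picture>",
    `  <source srcset="${basename}.avif" type="image/avif">`,
    `  <source srcset="${basename}.webp" type="image/webp">`,
    `  <img src="${basename}.jpg" alt="${alt}">`,
    "</picture>",
  ].join("\n");
}
```

For the image that forms your LCP element, avoid lazy-loading it so the browser can fetch it as early as possible.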
Another effective strategy for optimizing lab metrics in WordPress is using a website accelerator like FastPixel to reduce server load and improve load times. Implementing such a solution can drastically cut down the time it takes to serve pages, which benefits metrics like TTFB and FCP.
Here are the before/after FastPixel lab test results for a kid-centric portal, measured with GTmetrix.
Focus on server and network performance
Using a CDN is one of the most effective ways to cut down on load times. CDNs cache your site’s resources on servers located closer to your users, which means data has a shorter journey to their devices. Result? Faster loading, especially for global audiences.
Enabling caching is another trick to improve load times on repeat visits. When users return, they won’t need to download your site’s assets all over again. This cuts down on data transfer and speeds up page rendering.
For even snappier performance, optimizing your server’s response time is essential. A faster Time to First Byte (TTFB) means the server starts delivering content sooner. Upgrading hosting, reducing database query time, or enabling server-side caching can all reduce TTFB, giving users a quicker start to their browsing experience.
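To sketch how server-side caching helps TTFB, here is a minimal in-memory TTL cache in TypeScript. In production you would typically rely on your host's page cache or an object cache instead; all names and the TTL value here are illustrative:

```typescript
// Minimal TTL cache for expensive server-side work (e.g. a database query
// behind a page render). Repeat requests within the TTL are served from
// memory, so the server can start responding sooner.
class TtlCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number) {}

  get(key: string, compute: () => T): T {
    const hit = this.store.get(key);
    if (hit && hit.expires > Date.now()) return hit.value;
    const value = compute();
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
    return value;
  }
}
```

The design trade-off is staleness: a longer TTL means faster responses but older content, so pick a TTL that matches how often the underlying data changes.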
Gather user experience data manually
Gathering user feedback with tools like heatmaps, session recordings, and surveys can give you a detailed look at user experience, completely independent of CrUX data or Google’s thresholds.
Heatmaps, like those from Hotjar or Crazy Egg, show you where users are clicking, scrolling, or hovering on your pages. This data can highlight areas where users might be struggling or reveal which sections draw the most attention.
Session recordings with tools like FullStory let you watch real user interactions in action, giving you insights into potential roadblocks or confusing elements on your site.
User surveys are another way to gather direct feedback. Tools like SurveyMonkey allow you to ask users about their experience, helping you catch issues that might not be obvious from analytics alone. This qualitative feedback rounds out your understanding of user experience, guiding improvements in usability and engagement.
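If you instrument your own field measurements (for example with Google's open-source web-vitals JavaScript library), it helps to summarize them the way CrUX does: by the 75th percentile of all user samples. A minimal TypeScript sketch of that aggregation, with an illustrative function name:

```typescript
// Returns the 75th percentile of raw metric samples (e.g. LCP values in ms
// collected from real visits), the same summary statistic CrUX reports.
function percentile75(samples: number[]): number {
  if (samples.length === 0) throw new Error("no samples");
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest-rank method: the smallest value with at least 75% of samples
  // at or below it.
  const idx = Math.ceil(0.75 * sorted.length) - 1;
  return sorted[idx];
}
```

Rating your own p75 against the Good/Needs improvement/Poor thresholds gives you a preview of how CrUX would score the page once it has enough traffic.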
Key takeaways
Receiving “not enough usage data” in the Core Web Vitals report or Google Search Console doesn’t mean you can’t optimize your site for Core Web Vitals.
By focusing on lab data, improving discoverability and server performance, and proactively gathering user feedback, you can improve user experience on your website.
This positions your site to be ready for Core Web Vitals ranking benefits once you meet CrUX eligibility requirements.
Useful links
- Best practices for measuring Web Vitals in the field
- Debug performance in the field
- Why lab and field data can be different
- CrUX Methodology
Final thoughts
Google prioritizes data quality and user privacy by only including data from popular, public pages with Chrome users who have opted into data sharing.
As your traffic grows, your site will likely meet CrUX criteria for Core Web Vitals reporting.
Until then, follow best practices for Core Web Vitals to ensure a fast, responsive, and stable experience for your visitors.
Looking to improve Core Web Vitals on WordPress?
Take control of your site’s performance today and give FastPixel a try for free!