Googlebot Crawl Stats Report Analysis Guide

· 5 min read · Technical SEO Editor
## The X-Ray Film of Your Crawl Budget: What Are Crawl Stats?

The single most important data table an SEO expert checks during a general health check-up of a website lives under "Settings > Crawl Stats" in Google Search Console (GSC). This report, full of curves and giant colorful bars, shows you, over the last 90 days, how many times Googlebot hit your site, how many gigabytes of data it downloaded, how hard your server worked while serving that data (the Server Response Time graphs), and how many doors (URLs) it bounced off because you had blocked them. If your SEO traffic is sliding downhill and you cannot pin the drop on an algorithm update, the probability of finding a red flag here is very high.

## Which Red Flags Should You Focus on in the Report Details?

### 1. The Main Graph: Server Response Time Bloat

The "Average Response Time" shown among the three summary charts is critical. If Googlebot normally gets a 200-millisecond response from your site and one day this figure shoots up to 1,200 milliseconds (1.2 seconds) and keeps hovering there, you are inside a serious hardware or software problem: your database may be bloated, or you may be hitting the RAM limit of your hosting plan. A long server response time implies the bot will now crawl your site less often, since its time is precious (crawl budget constriction). The lower the milliseconds, the healthier the site.

### 2. Download Request Codes (HTTP Status Codes Graph)

You can see the percentage distribution of status codes in the sliced (pie chart / table) structure at the very bottom of the report.

* **[OK (200)] signals in the 80-90% band:** This is what we want. The bot entered your site, fetched what it wanted, and left.
* **Rising [Moved Permanently (301)] signals (e.g. 15%):** A sign that there are too many redirects on your site.
* **Yellow alarm: [Not Found (404)] signals in the 5-20% range:** This is a massive jump in your dead URL chains, where the bot knocks on the door and is not let inside. Your site has a broken-link disease: links to products you have deleted are still roaming around in categories or footers.
* **Critical crash: [Server Error (5XX)] signals:** The most toxic crawl error of all. Even if you see only 1%, you should urgently consult your server (DevOps) expert, because Google has received a "site down" warning and been confronted with a horrifying experience.

### 3. Crawl Purpose Distribution

The two purpose rows that appear in the table are just as important: Discovery and Refresh. If the Discovery percentage is lingering around 5%, the odds of your freshly published news or products being found and indexed have rusted badly, and the bot now visits your site only to ask "has what I previously indexed changed?" (Refresh), meaning your internal site hierarchy (link/tree architecture) is effectively broken.
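GSC only shows these signals at a 90-day aggregate, so it can help to cross-check them against your own server access logs. The sketch below, a minimal illustration rather than a production tool, computes the two key signals from sections 1 and 2 (Googlebot's status-code distribution and average response time) from raw log lines. It assumes an nginx-style combined log format with the request time appended as the final field; adjust the regular expression to your own log format.

```python
import re
from collections import Counter

# Assumed log shape (nginx combined + $request_time appended), e.g.:
#   66.249.66.1 - - [10/May/2024:12:00:00 +0000] "GET /p/42 HTTP/1.1" 200 5120 "-" "Googlebot/2.1" 0.214
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)" (?P<rt>[\d.]+)$'
)

def crawl_stats(log_lines):
    """Return (status-code percentage distribution, average response time in ms)
    for requests whose user agent contains 'Googlebot'."""
    statuses = Counter()
    times = []
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # skip unparseable lines and non-Googlebot traffic
        statuses[m.group("status")] += 1
        times.append(float(m.group("rt")))
    total = sum(statuses.values())
    pct = {code: round(100 * n / total, 1) for code, n in statuses.items()} if total else {}
    avg_ms = round(1000 * sum(times) / len(times), 1) if times else 0.0
    return pct, avg_ms
```

Feeding a day of logs through `crawl_stats` lets you check whether the 200 share is in the healthy 80-90% band, whether 404 or 5XX shares are creeping up, and whether your real response times match what the GSC graph claims. Note that a naive user-agent check can be spoofed; for serious audits, verify Googlebot IPs via reverse DNS as Google recommends.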