Google: Disallowed URLs Through Robots.txt Does Not Affect Crawl Budget

Gary Illyes from Google added a new question and answer to the big crawl budget article he published on the Google Webmaster Blog in January 2017. In short, it says that URLs you disallow in your robots.txt file do not count against your site's crawl budget, since Googlebot does not crawl them.
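For context, a minimal robots.txt sketch illustrating the kind of disallow rule being discussed (the paths here are hypothetical examples, not from Google's post):

```
# Hypothetical robots.txt example: block all crawlers from
# internal search result pages and a staging directory.
# Per Illyes' answer, URLs matching these rules are not crawled
# and so do not consume crawl budget.
User-agent: *
Disallow: /search
Disallow: /staging/

# Everything else remains crawlable.
```

Note that disallowed URLs can still appear in Google's index if other pages link to them; robots.txt controls crawling, not indexing.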

