Crawling

Crawling is the process by which search engines discover and scan web pages using automated programs called crawlers (also known as spiders or bots).
Example: Ensure your robots.txt file allows crawlers to access your important pages and does not accidentally block them.
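
A minimal robots.txt sketch illustrating the tip above; the paths and sitemap URL are placeholders, not recommendations for any specific site:

```
# Apply these rules to all crawlers.
User-agent: *

# Block a hypothetical internal area from crawling.
Disallow: /admin/

# Everything else remains crawlable by default.
Allow: /

# Help crawlers find pages via a sitemap (placeholder URL).
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing: a disallowed page can still appear in results if other sites link to it.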