An automated program that scans websites to determine their content and purpose. The name reflects how the software “crawls” through a site’s code by following links from page to page, which is why crawlers are also known as “spiders”. Google uses crawlers to discover new content and to evaluate the quality of webpages for its index. Webmasters and SEOs can request additional crawls of specific pages through Google Search Console.
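To illustrate the link-following behavior described above, here is a minimal crawler sketch in Python using only the standard library. It is a toy example, not how Googlebot actually works (a production crawler would respect robots.txt, throttle requests, render JavaScript, and much more); the seed URL, page limit, and `LinkExtractor` helper are all illustrative assumptions.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page (illustrative helper)."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting at seed_url, staying on one host."""
    host = urlparse(seed_url).netloc
    queue = deque([seed_url])
    seen = {seed_url}   # URLs already queued, to avoid crawling twice
    crawled = 0
    while queue and crawled < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        crawled += 1
        print("Crawled:", url)
        # Extract links and queue unseen same-host pages for later visits
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")  # assumed seed URL for demonstration
```

The breadth-first queue mirrors the basic discovery loop of a real crawler: fetch a page, extract its links, and schedule any new URLs for a future visit.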