Definitions for "Crawlers"
Keywords:  robot, spider, ceaselessly, visit, bots
Also called spiders or bots (short for robots), these programs automatically visit Web sites, read pages, and collect information. Often used by search engines, crawlers can artificially inflate a site's reported page visits by as much as 30 percent. The better traffic-analysis tools filter such visits out when creating traffic reports, as sketched below.
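To illustrate that filtering step, here is a minimal sketch in Python. The log format (a list of visits with a "user_agent" field) and the pattern list are assumptions for illustration, not the approach of any particular analytics product; real tools also use IP ranges and behavioral signals.

```python
import re

# Substrings that commonly identify well-known crawlers in a User-Agent header.
# This pattern list is illustrative only.
BOT_PATTERNS = re.compile(r"bot|spider|crawler|slurp", re.IGNORECASE)

def filter_bot_visits(visits):
    """Return only the visits whose User-Agent does not look like a crawler."""
    return [v for v in visits if not BOT_PATTERNS.search(v.get("user_agent", ""))]

# Example: two human visits and one Googlebot visit; the bot visit is dropped.
visits = [
    {"url": "/home", "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
    {"url": "/home", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"url": "/about", "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)"},
]
print(len(filter_bot_visits(visits)))  # prints 2: only the human visits remain
```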
Crawlers are the bits of software used by search engines to read your website during indexing.
Also know as a "robot" or "spider", a crawler is an automated software program that runs at many search engines, reads sites' content, analyzes it, and inserts them into the index (or collects information for later insertion into the index).