Also called spiders or bots (short for robots), these programs automatically visit Web sites, read pages, and collect information. Often used by search engines, crawlers can artificially inflate a site's reported page visits by as much as 30 percent. The better traffic-analysis tools filter such visits out when creating traffic reports.
Crawlers are the bits of software used by search engines to read your website during indexing.
Also known as a "robot" or "spider", a crawler is an automated software program that runs at many search engines, reads sites' content, analyzes it, and inserts it into the index (or collects information for later insertion into the index).
Automatic indexing agents sent out by search engines to visit your site once you have submitted its URL. See also "robots".
The term for the tools that search engines automatically send out to find web sites, record them, and index them within their databases. Also known as robots or spiders. Some crawlers visit only the home page of a web site, while others 'deep crawl' and index many sub-pages, depending on the structure of the site.
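The distinction between a shallow visit and a 'deep crawl' can be sketched as a breadth-first traversal of a site's link graph. Below is a minimal Python illustration; the in-memory SITE dictionary standing in for fetched pages and their outgoing links is a hypothetical example, not any search engine's actual data.

```python
from collections import deque

# Hypothetical in-memory "site": each URL maps to the links found on that page.
SITE = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": ["/products/widget", "/"],
    "/products/widget": [],
}

def deep_crawl(site, start):
    """Breadth-first 'deep crawl': start at one page, then follow every
    discovered link, indexing each page exactly once."""
    index = []
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        index.append(url)               # "read" and index this page
        for link in site.get(url, []):  # follow links to sub-pages
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

print(deep_crawl(SITE, "/"))
# → ['/', '/about', '/products', '/products/widget']
```

A shallow crawler would stop after the home page "/"; the deep crawl above reaches all four pages. A real crawler would fetch pages over HTTP and respect each site's robots.txt rules, but the link-following logic is the same.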
(also called spiders) - Programs, used by search engines, designed to search and categorize the World Wide Web.
A class of programs designed to ceaselessly search the Web, looking for specific content or simply following links to see where they go.