Crawler Definition

Also known as a Spider or Robot, a crawler is a program a search engine uses to "crawl" the web: it follows links from page to page, collects data, makes copies of new and updated pages, and stores their URLs in the search engine's Index. This allows search engines to provide faster, more up-to-date listings.
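The crawl loop described above (visit a page, follow its links, record each URL in the index) can be sketched as a breadth-first traversal. This is a minimal illustration, not a real crawler: the `SITE` dictionary stands in for actual HTTP fetching and link extraction, and all URLs in it are hypothetical examples.

```python
from collections import deque

# Toy web: URL -> list of outgoing links. A real crawler would fetch
# each page over HTTP and parse links out of the HTML; this mapping
# simulates that step for illustration.
SITE = {
    "http://example.com/":  ["http://example.com/a", "http://example.com/b"],
    "http://example.com/a": ["http://example.com/b"],
    "http://example.com/b": [],
}

def crawl(start):
    """Breadth-first crawl: visit each URL once and build an index of URLs."""
    index = []              # URLs stored in discovery order (the "Index")
    seen = {start}          # avoid re-crawling the same page
    queue = deque([start])
    while queue:
        url = queue.popleft()
        index.append(url)   # "store the URL in the index"
        for link in SITE.get(url, []):   # "follow links"
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```

Running `crawl("http://example.com/")` walks the three pages once each, which is the same deduplicated traversal a production crawler performs at much larger scale.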
