Crawler
A crawler (also known as a spider or robot) is a program that visits a website, collects data from its pages, and follows links to discover further pages to add to a search engine's index. Being crawled is what allows a website to appear in search results, and appearing in search results is what brings the site search traffic.
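The crawl loop described above can be sketched in a few lines: extract links from each page, then visit them breadth-first, recording every page seen. This is a minimal illustration only; an in-memory dict stands in for fetching pages over HTTP, and real crawlers also handle robots.txt, politeness delays, and deduplication.

```python
from collections import deque
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects href values from anchor tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start):
    """Breadth-first crawl over `site` (a dict mapping URL -> HTML).

    Returns the list of pages visited, i.e. what the crawler would
    hand to the search engine's index.
    """
    visited = []
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        visited.append(url)
        parser = LinkParser()
        parser.feed(site.get(url, ""))
        for link in parser.links:
            # Only follow links we haven't queued yet.
            if link in site and link not in seen:
                seen.add(link)
                queue.append(link)
    return visited

# A tiny simulated website; real crawling would download each page.
pages = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/blog/post-1">Post 1</a>',
    "/blog/post-1": "",
}
print(crawl(pages, "/"))  # → ['/', '/about', '/blog', '/blog/post-1']
```

Starting from the home page, the crawler discovers every linked page exactly once, which is why internal linking matters for getting all of a site's pages indexed.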