Web crawler – (Also spider, spiderbot, or simply crawler.)

A web crawler, or spider, is a computer program that automatically fetches the contents of web pages, typically following the links it finds on each page to discover further pages. The fetched content is then analysed, for example to build a search index. Search engines commonly use web crawlers to discover and index pages.
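As a rough illustration, the sketch below shows a minimal breadth-first crawler written in Python using only the standard library. The start URL, page limit, and the simple "analysis" step (just extracting links) are illustrative assumptions, not a description of any particular search engine's crawler.

```python
# Minimal sketch of a breadth-first web crawler (standard library only).
# The start URL and max_pages limit are placeholder assumptions.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags while a page is parsed."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Fetch pages breadth-first, starting from start_url."""
    seen = {start_url}
    queue = deque([start_url])
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        fetched += 1

        # "Analyse the content": here we simply extract links to follow next.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        print(f"Fetched {url} ({len(parser.links)} links found)")


if __name__ == "__main__":
    crawl("https://example.com")  # hypothetical start URL
```

A real crawler would additionally respect robots.txt, rate-limit its requests, and store the fetched content for later indexing rather than only printing a summary.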
