Web Crawler


How Search Engines Work – Web Crawlers

There are basically two types of search engines. The first type relies on robots called web crawlers or spiders (sometimes called spiderbots) to index websites. When you submit your website pages to a search engine by completing its required submission page, the search engine spider will index your entire site.

A spider is an automated program run by the search engine system. When a spider "crawls" a website, it reads the content and the meta tags, and it follows the links. The crawler then returns the information to a central repository, where the data is indexed. The spider will visit each link on your website and index those sites as well. Some spiders index only a certain number of pages per site, so don't create a site with 500 pages! Web crawlers usually discover pages from links within the site and…
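The crawling process described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch, not a real search-engine spider: the `fetch` callback, the page limit, and the breadth-first traversal are simplifying assumptions chosen to show the idea of reading a page, extracting its links, and visiting them in turn.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl starting at start_url.

    fetch(url) is a caller-supplied function returning the page's HTML
    (or None on failure). max_pages mimics a spider that indexes only a
    limited number of pages per site. Returns the list of URLs indexed.
    """
    seen = {start_url}
    queue = deque([start_url])
    indexed = []
    while queue and len(indexed) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        indexed.append(url)  # a real spider would store content and meta tags here
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:
            if link not in seen:   # avoid revisiting pages already queued
                seen.add(link)
                queue.append(link)
    return indexed
```

Passing `fetch` in as a parameter keeps the sketch self-contained and testable; in practice it would wrap an HTTP client, honor robots.txt, and rate-limit its requests.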
