Crawlers (or bots) collect information that is publicly available on the internet. By following a website's navigation menus and reading its internal and external links, a bot builds up a picture of the context of each web page. Of course, the words, images, and other content on a page also help search engines like Google understand what the page is about.
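As a rough illustration, the sketch below shows how a simple crawler might fetch a single page, separate its internal links from its external ones, and collect the visible text. It is a minimal example only, written in Python with the third-party `requests` and `beautifulsoup4` packages; the function name and the example URL are placeholders, not part of any real search engine's code.

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def crawl_page(url):
    """Fetch one page and split its links into internal and external sets."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()

    soup = BeautifulSoup(response.text, "html.parser")
    base_host = urlparse(url).netloc

    internal, external = set(), set()
    for anchor in soup.find_all("a", href=True):
        link = urljoin(url, anchor["href"])   # resolve relative links against the page URL
        host = urlparse(link).netloc
        (internal if host == base_host else external).add(link)

    # The visible text is what gives the crawler a sense of the page's topic.
    page_text = soup.get_text(separator=" ", strip=True)

    return internal, external, page_text


if __name__ == "__main__":
    internal_links, external_links, text = crawl_page("https://example.com")
    print(f"{len(internal_links)} internal links, {len(external_links)} external links")
```

A real crawler would of course do much more: respect robots.txt, queue the discovered links for further visits, and avoid revisiting pages it has already seen.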