I think you mean Google crawlers, i.e. the Google bots. They are sometimes called spiders because they discover new and updated pages to be added to the Google index.
Googlebot (the "robot") is software whose only job is to collect information about websites. Thousands of new sites are created every day, so Google gathers information about them by sending out its bot. The bot only collects the information; the indexer then indexes whatever Googlebot has collected.
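The crawl-then-index split described above can be sketched in a few lines of Python. This is a toy model, not Google's actual pipeline, and the URLs and page text are invented for illustration:

```python
from collections import defaultdict

# Pretend the crawler has already fetched these pages (made-up data).
pages = {
    "https://example.com/a": "google bot crawls the web",
    "https://example.com/b": "the indexer builds a searchable index",
}

# The "indexer" step: build an inverted index mapping each word to the
# set of page URLs that contain it, which is what makes search fast.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# A "search" is then just a lookup in the inverted index.
print(sorted(index["the"]))   # "the" appears on both pages
print(sorted(index["google"]))
```

The key design point is that crawling (fetching pages) and indexing (organizing their contents for lookup) are separate stages, exactly as the answer above describes.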
The Google robot is Google's crawler, or spider. It is responsible for crawling the web and finding newly added pages so that websites can be indexed and served in response to end users' search queries.
robots.txt, by contrast, is a file that tells search engines which parts of a site may or may not be crawled. By default, everything is allowed to be crawled. If you don't want a page of your site crawled, you add a Disallow rule for it in your robots.txt file.
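As a sketch of how a crawler interprets those rules, Python's standard library includes a robots.txt parser. The rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: Googlebot may crawl everything except /private/.
rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
])

# A well-behaved crawler checks before fetching each URL.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # blocked
print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # allowed
```

Note that robots.txt is advisory: it asks crawlers not to fetch certain paths, but it is the crawler that chooses to honor it.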
Whether your website is new or old, Google's robot crawls it regularly to gather information from it and feed that into the search results.
There's no fixed time of day for it. If content isn't posted regularly, crawling is less frequent; I think it's roughly daily for most active sites.
Googlebot is the search bot software used by Google; it collects documents from the web to build the searchable index behind Google Search. (Google has also worked on physical robots, e.g. through its former Boston Dynamics unit, but that is unrelated to Googlebot.)
The Google robot applies Google's algorithms to read and check your website, so if your site has strong SEO, Google will improve its position in the SERPs.
Google robot, or Googlebot, is software created by Google that collects information from the World Wide Web to build the searchable index behind the Google search engine.
Don't you just love it when something is explained 25 different ways... can I please have that in 25 different languages as well?