A search engine employs special software robots, called spiders, to build lists of the words found on Web sites. When a spider is building its lists, the process is called Web crawling.
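In code terms, that word-list building step might look roughly like the toy Python sketch below. Everything in it (the fetch helper, the regex-based tag stripping, the example URL) is a simplified stand-in for illustration, not Google's actual software.

```python
# A toy illustration of "building lists of the words found on Web sites".
# The example URL is just a placeholder; real crawlers start from huge seed lists.
import re
import urllib.request
from collections import defaultdict

def fetch(url):
    """Download a page and return its HTML as text (naive: no error handling, no robots.txt)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="ignore")

def build_word_list(url):
    """Map each word on the page back to the URL it was found on."""
    html = fetch(url)
    text = re.sub(r"<[^>]+>", " ", html)        # crudely strip HTML tags
    index = defaultdict(set)
    for word in re.findall(r"[a-zA-Z]{2,}", text.lower()):
        index[word].add(url)                    # word -> pages containing it
    return index

if __name__ == "__main__":
    index = build_word_list("https://example.com/")
    print(list(index.items())[:5])              # a peek at the word list
```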
Hi, I think the role of spiders is fundamental to any search engine, and probably the most important one. They crawl and index pages on the Internet according to their own algorithms, and search engines rank pages based on that data. All the best,
Hi, the Google spider is simply software that checks your site, follows its internal and external links, and maintains a temporary index of all of those links.
Why is it called the Google SPIDER? Because WWW means World Wide Web, and it's a spider that crawls your web (your website). A spider's web has many strands (these are your backlinks): the more strands (backlinks) lead to your web (your website), the more frequently the spider crawls it and picks up the relevant content on your site. In simple words, it's software that finds websites, and every search engine has one! P.S.: I'm feeling a little dizzy trying to post this. Hihihi.
Google spiders, I think, are programs that help the search engine find your website! Google then indexes your site's content and links.
Google spidering is the process by which Googlebot discovers new and updated pages to be added to the Google index; the spider is also known as a robot, bot, or crawler.
There has been a lot of talk about spiders here, but a spider's main job is to read website content regularly and index the site's pages.
A spider reads the content of your website and copies it into the search engine's database, so your site can be indexed and retrieved quickly when someone searches for relevant information.
Spiders, also known as robots or crawlers, are used by Google's search engine to crawl and index sites. When someone searches on Google, it pulls the results out of its cache and returns the links.
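That "pull results out of its cache" step is just a lookup against the index the spider already built. Here's a minimal sketch of the idea; the toy_index contents and URLs are invented for illustration and have nothing to do with Google's real data structures.

```python
# Toy query lookup against a pre-built inverted index (word -> set of pages).
toy_index = {
    "spider": {"https://example.com/crawlers", "https://example.com/seo"},
    "crawler": {"https://example.com/crawlers"},
    "index": {"https://example.com/seo"},
}

def search(query):
    """Return pages containing every word of the query."""
    results = None
    for word in query.lower().split():
        pages = toy_index.get(word, set())
        results = pages if results is None else results & pages
    return sorted(results or [])

print(search("spider crawler"))   # -> ['https://example.com/crawlers']
```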
A spider is a program that automatically reads Web pages and feeds what it finds to search engines. It's known as a spider because it crawls over the Web; these programs are also known as web crawlers.
A spider is an automated program that crawls web pages to gather information about them and, after gathering it, sends that information to the search engine's database.
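Roughly, the loop described above (fetch a page, record some information, follow its links) could look something like this toy Python sketch. The "database" here is just an in-memory dict and the seed URL is a placeholder; this is only a simplified illustration of the idea, not how any real crawler is implemented.

```python
# A toy crawl loop: fetch a page, store some info about it, then follow its links.
import re
import urllib.request
from collections import deque

def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="ignore")

def crawl(seed_url, max_pages=10):
    database = {}                        # url -> gathered info (stand-in for the real database)
    queue = deque([seed_url])
    seen = {seed_url}
    while queue and len(database) < max_pages:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            continue                     # skip pages that fail to load
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        database[url] = {
            "title": title.group(1).strip() if title else "",
            "size": len(html),
        }
        # follow absolute links found on the page
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return database

if __name__ == "__main__":
    print(crawl("https://example.com/"))
```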
Spiders are software made by Google. They follow links across the internet, index the data, and serve it up when someone searches for that specific data.
In my view, search engine spiders are also known as robots or crawlers. A spider's job is to visit your site and store your site's data in the search database.