Spiders are also known as robots; they follow links across the internet.
The Google spider is basically a robot. The process of visiting pages, indexing them, and distributing link juice according to the algorithm is called crawling.
They are called spiders because they crawl over the Web in search of content. Search engines gather data about a website by 'sending' the spider or bots to 'read' the site and copy its content, and that content is stored in the search engine's database. As they copy and absorb content from one document, they record the links it contains and send other bots to copy the content of those linked documents, and the process goes on and on.
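In rough terms, that loop looks something like the sketch below. This is just a minimal illustration in Python, not how Googlebot actually works; the start URL, page limit, and library choices are assumptions for the example, and a real spider would also respect robots.txt, crawl delays, and much smarter scheduling.

```python
# Minimal crawler sketch: fetch a page, store its content, queue the links it finds.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    index = {}                      # the "database": URL -> raw page content
    queue = deque([start_url])      # URLs waiting to be visited
    seen = {start_url}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            continue                # skip pages that fail to load

        index[url] = html           # copy and absorb the page's content

        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:   # follow the links to reach more documents
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com")   # hypothetical start URL
    print(f"Crawled {len(pages)} page(s)")
```

The key idea is the queue: every link found on a crawled page becomes a candidate for the next crawl, which is why a page with no links pointing to it tends to stay invisible.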
Good answers, guys. I'd add that it's important to make sure your internal links are good. I've been seeing that spiders have a hard time indexing pages with no internal links pointing to them; no matter how good the on-page SEO is, such a page may never be found, almost like a private page. Make sure your internal linking is solid; it will help the spiders.
Spiders are basically scanners that scan the web continuously, notice new entries, and index them.
A spider is simply software that does the job of a web crawler. It visits websites and pages, reads the information, content, and links, and those pages may then be added to the index (quality pages are always added to the Google index). That is what a spider does.
The spider is software made by Google that visits your website, checks the changes you have made recently, and can improve your website's rank, but that depends on the work you put in.
A spider is a program used to crawl new sites and any updates to your site, and to index them in Google's database.
A spider is also called a robot; it crawls our websites and indexes their pages into the search engine's database.
It crawls your pages and stores them in Google's database, so if you want the crawlers to keep coming back, you need to keep adding content.