In an interview, if they ask "What is a robot? What does it do?" is this correct: "A robot is the spider that comes to every website to collect data for searches"? Shall I use this? Please explain it to me simply; I have to say it in the interview.
Yes, you may say it just as you have written it: robots are spiders that collect data from websites!
Google has recently posted on this topic. You should read it: http://googlewebmastercentral.blogspot.com/2010/11/controlling-crawling-and-indexing-now.html http://code.google.com/web/controlcrawlindex/
The robots.txt file is the file that tells Google and other spiders what to do: which links to follow and which not, and which folders are off limits to crawl. For example, you want Google to crawl your site, but you surely don't want it to index your /admin folder or your /uploads folder. Also, Google Webmaster Tools has a robots.txt file generator. Hope this was helpful.
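To make the /admin and /uploads example concrete, here is a minimal robots.txt sketch (the file lives at the site root; the folder paths and sitemap URL are illustrative, not from any real site):

```txt
# Apply these rules to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /uploads/

# Optionally tell crawlers where the sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

Note that Disallow is only a request: well-behaved crawlers like Googlebot honor it, but it is not access control, so truly private folders still need server-side protection.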
A robots.txt file keeps web spiders and other web robots from accessing parts of a website that would otherwise be publicly viewable.
It really depends on the context in which the question is asked. There are many kinds of robot: a file, a spider, a movie, a machine, and many more.
Robots is a generic term for programs and automated scripts that "crawl" through the web and collect data from websites and anything else on the Internet that they can find.
Hi, yes, you said it right: a robot is a spider that comes to websites to collect data for searches.
Dear friend, whatever you have told them is correct, but your answer is not sufficient to impress, nor will it show your knowledge. Just go through this link and read about robots and how to answer: google.com/support/webmasters/bin/answer.py?hl=en&answer=156449 Always remember to answer in full and satisfy the question's criteria. Cheers!
Robots are generally software programs that come to your site to read its content so search engines can index it better on the search engine results page. Robots are also known as crawlers and bots. We generally create a robots.txt file to control how our site is indexed. There are various tools that generate effective robots.txt files, which help ensure that Google and other search engines crawl and index your site properly.
Hello. Robots.txt is a simple text file. It is used to tell search engines which content or pages on your website you would not like them to visit or index.
Yes, this is absolutely right: a robot is a spider that comes to websites to collect data for searches.
Actually, it is better to understand things than to memorize them. Robots are software programs that help search engines crawl your site and collect data for their databases. Every search engine has its own robot: Google's crawler is called "Googlebot" and Yahoo's is called "Slurp". Hope you are clear now.
Hi, it is basically a text file placed on your website to tell search robots which pages the webmaster would like them to crawl and which not. Thanks.
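To see how a well-behaved robot actually uses that text file, here is a short sketch using Python's standard-library `urllib.robotparser`. The site URL and paths are hypothetical; normally you would fetch the live file with `set_url()` and `read()`, but parsing the rules inline keeps the example self-contained:

```python
# Sketch: how a polite crawler checks robots.txt before fetching a page.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parse the rules directly instead of fetching them from a real site.
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# The crawler asks permission before requesting each URL.
print(rp.can_fetch("*", "http://example.com/admin/login"))  # False
print(rp.can_fetch("*", "http://example.com/about"))        # True
```

A crawler that gets `False` back simply skips the URL; that is all "obeying robots.txt" means in practice.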