What is a search engine spider or robot?

Discussion in 'Search Engine Optimization' started by sweetmaddy, Dec 1, 2010.

  1. #1
    A search engine robot is a software program that search engines use to seek out data on the internet, where literally billions of web pages exist. A spider visits your website and gathers the data you have published. The first thing the robot looks for on your web server is a file named “robots.txt”. This file, which resides in your root directory, tells the spider whether it is allowed to read your web pages. It is worth reading up on robots.txt to learn what exactly the file can do to improve your chances of getting indexed. If the robot is allowed to read your web pages, it reads them and reports the data back to the search engine that sent it on the mission. The search engine then places that data in a database that people query to find web pages. Such databases exist at Google.com, Inktomi, Teoma and other popular engines. All search engines have their own robots that scour the web for data.
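    The robots.txt check described above can be tried out with Python's standard-library `urllib.robotparser`, which applies the same allow/disallow rules a spider follows. A minimal sketch; the rules and URLs below are made-up examples, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block all robots from /private/,
# allow everything else.
rules = """User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved spider asks before fetching each page.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

    In a real crawler you would call `rp.set_url(".../robots.txt")` and `rp.read()` to fetch the live file instead of parsing a string.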
     
    sweetmaddy, Dec 1, 2010 IP
  2. brad12
    #2
    Googlebot is the search bot software used by Google. It collects documents from the web to build a searchable index for the Google search engine. For more information, see en.wikipedia.org/wiki/Googlebot.
     
    brad12, Dec 1, 2010 IP
  3. Deecoup

    Deecoup Greenhorn
    #3
    It is a software program designed by search engines to crawl the web and index pages.
     
    Deecoup, Dec 1, 2010 IP