I am looking at setting up a search site for a specific niche market, but I have a question about whether or not it is legal. If you look at Google, for example, they scrape content, images, videos etc. from all websites and display it in their search engine. Obviously they haven't got permission from every website, so one would assume what they are doing is perfectly legal. Is anyone able to explain why search engines such as Google are allowed to scrape and re-use data, and at what point it is considered stealing content? Cheers, Jake
Search engines follow the rules set out in robots.txt and in the meta name="robots" tag. If you want, you can disallow them from indexing your site. Every search service also publishes its own Terms of Service, for example http://www.google.com/intl/en/policies/terms/ Anyone who disagrees with those terms can disallow indexing of their site. The main thing is that the Terms of Service must not conflict with the laws of the country where the service is used.
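If you do end up crawling sites for your niche search engine, the polite approach is to honour robots.txt yourself, just as the big engines do. Here is a minimal sketch using Python's standard-library urllib.robotparser; the user agent name "NicheBot" and the example.com URLs are placeholders, not anything from this thread.

```python
from urllib import robotparser

ROBOTS_URL = "https://example.com/robots.txt"   # placeholder site
USER_AGENT = "NicheBot"                         # hypothetical crawler name

# Download and parse the site's robots.txt rules.
rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()

# Only fetch a page if the rules allow our user agent to crawl it.
page = "https://example.com/some/page.html"
if rp.can_fetch(USER_AGENT, page):
    print("Allowed to crawl:", page)
    # ...fetch and index the page here...
else:
    print("Disallowed by robots.txt:", page)

# Respect any Crawl-delay directive the site specifies (may be None).
delay = rp.crawl_delay(USER_AGENT)
if delay:
    print(f"Site asks crawlers to wait {delay} seconds between requests")
```

Honouring robots.txt doesn't settle the copyright question by itself, but it is the opt-out convention the answer above is referring to.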