Do you have any experience with preventing Google from indexing part of a page? Thanks in advance. Regards, Bane
1. Use a robots.txt robots exclusion file
2. Use "noindex" page meta tags
3. Password protect sensitive content
4. Nofollow: tell search engines not to spider some or all links on a page
5. Don't link to pages you want to keep out of search engines
6. Use X-Robots-Tag in your HTTP headers
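For reference, here is roughly what options 1, 2, 4, and 6 look like in practice; a minimal sketch, with placeholder paths and filenames:

robots.txt (option 1):

    User-agent: *
    Disallow: /private-page.html

Meta tag in the page's head (option 2):

    <meta name="robots" content="noindex">

Nofollow on a single link (option 4):

    <a href="/private-page.html" rel="nofollow">private page</a>

HTTP response header (option 6):

    X-Robots-Tag: noindex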
Cashcars gives a good answer, but one step is missing: remove the page from your XML sitemap and from your Webmaster Tools settings. cheers
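To illustrate, a minimal sitemap.xml with the blocked page simply left out; the URLs here are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://www.example.com/</loc></url>
      <url><loc>https://www.example.com/public-page.html</loc></url>
      <!-- /private-page.html is deliberately omitted -->
    </urlset>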
Thanks guys, but my question was about one part of a webpage, not the whole webpage. Do you have any ideas?
Yes, I have one idea: put the following in your robots.txt file, where the path is a placeholder for the page you want to keep out of the index:

    User-agent: *
    Disallow: /page-you-want-blocked
Apparently DP dolts still can't read. He said PART of his page, NOT all of it. As far as I know, you cannot stop Google from indexing part of a page (except by controlling link following) while still having it index the rest. There is POSSIBLY one way, but it depends on how Google indexes pages: write the content you don't want indexed as a RANDOM mixture of Unicode/ASCII character codes and plain text. I don't know whether that will work, but you can try it.
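For what it's worth, here is what that obfuscation might look like: the same phrase written normally and as a mix of literal characters with decimal (&#...;) and hex (&#x...;) character references. Both lines render identically in a browser; whether Googlebot decodes the references before indexing is exactly the open question above.

    <!-- plain text -->
    <p>secret phrase here</p>

    <!-- same text as a random mix of literal chars and numeric character references -->
    <p>s&#101;cr&#x65;t phr&#97;s&#x65; here</p>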