I am doing my first experiments with Google, and created a couple of article sites to see what would happen. After a couple of weeks the sites got listed and pages started to be crawled. However, in the last few days these sites have been losing indexed pages (I am using site:www.domain.com to check). Is this behaviour normal, or am I doing something wrong? Somebody please advise!
Sign up for Google Sitemaps and submit a sitemap. The service will tell you if there is a reason for the deindexing.
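If you have never made one, a sitemap is just an XML file listing your URLs. A minimal one looks roughly like this (www.example.com is only a placeholder for your own domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>

Save it as sitemap.xml in your site root and submit that URL in your Google Sitemaps account.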
I am trying to create a sitemap. Everything looks easy (except the robots.txt part). Thanks for pointing this out; I will let you know if it helps.
http://www.outfront.net/tutorials_02/adv_tech/robots.htm - Creating a robots.txt file is dead easy; you can handle it yourself.
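For example, a robots.txt that lets every spider crawl the whole site is just two lines (an empty Disallow means nothing is blocked):

User-agent: *
Disallow:

Put it in the root of your domain, so it is reachable at http://www.example.com/robots.txt (again, example.com standing in for your own domain).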
Hi, you should also check like this: site:http://domain.com. It might be that Google indexed the site that way. I faced the same problem: first I checked with site:www.domain.com and did not get any results, then I tried site:http://domain.com and got results. So try it.
Hi, at that link it says this:

4. Allow no spiders to index any part of your site
This requires just a tiny change from the command above - be careful!

User-agent: *
Disallow: /

If you use this command while building your site, don't forget to remove it once your site is live!

I can't understand the line "If you use this command while building your site, don't forget to remove it once your site is live!", because once you remove your robots.txt file it will have no effect anyway.