Hello guys! I'm using a robots.txt file on my site, but Google Webmaster Tools is showing only 1 blocked URL even though I have listed more than 60 URLs in the file. What could be the cause of that? Please help. Thanks
It could be lag; there are trillions of pages now. Google would likely tell you if there were a fault in the formatting of your robots.txt. If you use a sitemap.xml file that excludes those pages, you should be fine. If you MUST get them off Google, you can manually remove them in Webmaster Tools under Google Index :: Remove URLs.
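For reference, a minimal robots.txt sketch with several blocked URLs would look like this (the paths below are just placeholders, not the OP's actual URLs); each path needs its own Disallow line under a User-agent block:

    User-agent: *
    Disallow: /private-page.html
    Disallow: /old-category/
    Disallow: /temp/test-page.html

A common formatting mistake is listing several paths on a single Disallow line; crawlers treat that whole string as one path pattern, so most of those URLs end up not blocked at all, which could be why so few of your 60 URLs show up as blocked.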
I think the way Google works now (oversimplifying it here) is that it can add/update pages constantly, but removals, penalties, etc. happen during a big update. At least, that has been my experience.
Hi everyone, I have a query. I have created a sitemap.xml and a robots.txt file; where should I submit both of these?
You can submit them in Google Webmaster Tools. Create an account in Google Webmaster Tools and add your website first. In Webmaster Tools you will see the "Crawl" option, and below that, "Sitemaps". Click it and you will see the "Submit a sitemap" button. You can also test whether the sitemap is valid. You can check your robots.txt file under the same "Crawl" option by clicking the robots.txt Tester.
It's easy to do this: simply log in to your Google Webmaster account, and there you have an option to submit a sitemap. But take care with the sitemap format; it should be XML. If you have implemented a robots.txt file on your website, it will be detected automatically by Google Webmaster Tools, and you can also test which of your URLs are blocked and which are not.
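For anyone wondering what the XML format looks like, here is a minimal sitemap.xml sketch (example.com and the date are placeholders, not values from this thread):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2014-06-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
      </url>
    </urlset>

Upload it to the root of your site (e.g. http://www.example.com/sitemap.xml) and then submit that URL in the Sitemaps section of Webmaster Tools.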
Your title should not be more than 75 characters and your description should not exceed 150 characters. It is also better to use only 3 keywords, chosen by analyzing them with the Google Keyword Tool. I have been using that for my website http://worldleaks.com for the past year. It really helps me a lot.
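As a rough sketch (the site and wording below are made up, and the exact lengths Google displays can vary), the title and description go in the page's head section like this:

    <head>
      <title>Affordable Web Design Services | Example Agency</title>
      <!-- keep the title under roughly 75 characters -->
      <meta name="description" content="Example Agency builds fast, mobile-friendly websites for small businesses. Get a free quote today.">
      <!-- keep the description under roughly 150 characters -->
    </head>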
I am having a problem: my client is telling me to use the same content from http://www.godigitalonline.com on http://www.godigitalonline.in. So tell me, can I use the same content on both websites? I think that will be a copied-content issue. Am I right?
No. Don't use the same content on two different sites; it will be counted as duplicate content, no matter how similar the two sites are. You could try using canonical URL tags for this kind of situation, but sorry, I have never done that myself so I can't help you with it.
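For what it's worth, the canonical tag mentioned above is just a link element in the head of the duplicate page pointing at the version you want Google to treat as the original. A sketch using the two domains from the question (assuming the .com is meant to be the primary site, and /some-page.html is only a placeholder path):

    <!-- placed in the <head> of the matching page on godigitalonline.in -->
    <link rel="canonical" href="http://www.godigitalonline.com/some-page.html">

Keep in mind rel=canonical is a hint rather than a guarantee, and it only makes sense when the two pages really do carry the same content.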
Yep, it's really a big issue for your website's SEO. You can make some changes to the content and then post it. There should be other technical ways to bypass this problem; I will have to look into it and help you out.
Hi Cathndrew, my website's pages are getting crawled, but the title does not show up when they are crawled. What can I do in this situation? Thanks & regards, Danish Shaikh
Well, if I'm not wrong, you mean that the title tag you are using on the site's pages is different from the title shown in the SERP results. If that is the case, this depends on the search engine, and it can also be due to a title length issue.
In the end, it all depends on how Google wants to present your title in the SERPs. There is nothing to worry about if you are following the guidelines.