Hi, from looking around I understand that by using mod_rewrite it is possible to improve the URLs of a .asp website. For example, the page http://www.domainname.com.au/filename1.asp?pcat_ID=14 has the same title tag but displays a different product depending on the end number (14, 15, 16, and so on).

I have used robots.txt to stop crawlers from accessing the /filename1.asp?pcat_ID= pages. Has anyone had a similar situation and managed to get Google to ignore the dynamic pages? I have created flat pages with content, as the dynamic pages provide product info and not much keyword-rich content.

I look forward to any comments that might be useful. Cheers, SKP77
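For what it's worth, a minimal robots.txt sketch for the pattern you describe might look like this. Note that the standard Disallow directive is a simple prefix match against the URL path plus query string, so the line below is an assumption about your setup rather than a tested rule; Googlebot (and Bing) additionally understand the * wildcard, though it is not part of the original robots.txt spec:

```
# robots.txt - hypothetical sketch for the dynamic product pages
# Prefix match: blocks /filename1.asp?pcat_ID=14, ?pcat_ID=15, etc.
User-agent: *
Disallow: /filename1.asp?pcat_ID=
```

Keep in mind robots.txt only asks crawlers not to fetch the pages; it does not guarantee they drop out of the index if other sites link to them.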
Yes, robots.txt can be used to block search engines from indexing certain content. For example, in my WP blogs the same content can be indexed under several URLs:

www.sitename.com/category/post-title/
www.sitename.com/category/
www.sitename.com/page/..
www.sitename.com/feed/

So basically, just to avoid the Google duplicate-content penalty, I block certain paths using my robots.txt.
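As a sketch of what that kind of WordPress robots.txt might contain, assuming (hypothetically) that the canonical post URLs live directly under the site root, so the category, paging, and feed paths can all be blocked without touching the posts themselves:

```
# robots.txt - sketch of blocking duplicate WordPress paths
# Assumes canonical posts are NOT under /category/ - Disallow is a
# prefix match, so blocking /category/ would otherwise block posts too.
User-agent: *
Disallow: /category/
Disallow: /page/
Disallow: /feed/
```

If your permalinks do include the category (e.g. /category/post-title/), you would need a more selective set of rules instead of blocking /category/ outright.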