Hi DP members. I found crawl errors on my site in this format: node/21, node/25, etc. I suppose those pages were created without URL aliases and were indexed by Google. What is your opinion? How can I fix this problem? Thanks
You can use robots.txt to block the crawling of files and folders you don't want indexed by search engines. - Quoted from robotstxt.org Keep in mind that robots.txt is only advisory, though, so if you wanted to block those folders completely from public access you would be better off using .htaccess rules.
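A rough sketch of both approaches, assuming the unwanted URLs all start with node/ and the site runs on Apache with mod_rewrite enabled. First, a robots.txt entry asking crawlers to stay away:

```
User-agent: *
Disallow: /node/
```

And an .htaccess rule that actually blocks public access by returning 403 Forbidden for those paths:

```
RewriteEngine On
# Return 403 for any URL beginning with node/ (hypothetical path pattern)
RewriteRule ^node/ - [F,L]
```

Note that blocking pages outright can break them for visitors too; if the node/NN pages are just unaliased duplicates of real pages, a redirect to the aliased URL may serve you better than a 403.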