Hi, dear friends. My website has more than 1,000 bad links. How can I block them in robots.txt? I have two types of links:

http://MYDOMAIN.com/all?order=title&ord_t=asc
http://MYDOMAIN.com/all?order=cat&ord_t=asc
http://MYDOMAIN.com/all?order=loc&ord_t=asc
http://MYDOMAIN.com/all?order=price&ord_t=asc

http://MYDOMAIN.com/all?uid=1003

http://MYDOMAIN.com/all?se=1&se_regs[0]=72
http://MYDOMAIN.com/all?se=1&se_regs[0]=73
http://MYDOMAIN.com/all?se=1&se_regs[0]=74
http://MYDOMAIN.com/all?se=1&se_regs[0]=75

-----------------

http://MYDOMAIN.com/15?order=title&ord_t=asc
http://MYDOMAIN.com/15?order=cat&ord_t=asc
http://MYDOMAIN.com/15?order=loc&ord_t=asc
http://MYDOMAIN.com/15?order=price&ord_t=asc

http://MYDOMAIN.com/16?order=title&ord_t=asc
http://MYDOMAIN.com/16?order=cat&ord_t=asc
http://MYDOMAIN.com/16?order=loc&ord_t=asc
http://MYDOMAIN.com/16?order=price&ord_t=asc
Code (markup):

Is this code correct or not?

Disallow: /all*/
Disallow: /15*/
Disallow: /16*/
Code (markup):

Thanks a lot ♥
Disallow: /*.pdf$
Code (markup):

The sample code blocks crawlers from accessing all URLs that end in ".pdf". The $ is an end-of-URL anchor, not a wildcard. By comparison, omitting the $ would block any URL containing the string ".pdf" anywhere in its path, such as /docs.pdf/newcars.htm.
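To make the difference concrete, here is a minimal sketch of both forms side by side (the paths are made up for illustration; note that * and $ are extensions supported by Googlebot and other major crawlers, not part of the original robots.txt standard, so smaller bots may ignore them):

User-agent: *
# Blocks only URLs that END in .pdf, e.g. /files/report.pdf
Disallow: /*.pdf$

# Without the $, this would block any URL CONTAINING .pdf,
# e.g. /docs.pdf/newcars.htm
# Disallow: /*.pdf
Code (markup):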
Thanks! So can this code:

Disallow: /all*/
Code (markup):

block these links?

http://MYDOMAIN.com/all?order=title&ord_t=asc
http://MYDOMAIN.com/all?order=cat&ord_t=asc
http://MYDOMAIN.com/all?order=loc&ord_t=asc
http://MYDOMAIN.com/all?order=price&ord_t=asc

http://MYDOMAIN.com/all?uid=1003

http://MYDOMAIN.com/all?se=1&se_regs[0]=72
http://MYDOMAIN.com/all?se=1&se_regs[0]=73
http://MYDOMAIN.com/all?se=1&se_regs[0]=74
http://MYDOMAIN.com/all?se=1&se_regs[0]=75
Code (markup):
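For reference: under Googlebot's documented pattern matching, /all*/ means "starts with /all, then anything, then a slash", so it would not match these URLs, which have a ? after /all and no trailing slash. A sketch of rules that should match the listed URLs instead (verify them against your own URLs with a robots.txt testing tool before relying on them):

User-agent: *
# Matches every URL starting with /all? (the ? is a literal
# character in robots.txt patterns, not a wildcard)
Disallow: /all?

# Matches the sort links on numbered pages,
# e.g. /15?order=... and /16?order=...
Disallow: /*?order=
Code (markup):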