Do robots.txt restrictions drop my ranking if the pages are included in my sitemap?

Discussion in 'Search Engine Optimization' started by mpea, Apr 17, 2008.

  1. #1
    Hi there

    I currently own a forum and a main site (on separate URLs).

    In the UK the main site does really well (for both rank and traffic), yet the forum couldn’t be worse.

    They both have approximately the same number of links. I have targeted different keywords for each site, but the forum hasn't even come close for its keywords (it runs vBulletin).

    The only real problem I have seen is that I have added some pages to the sitemap but then restricted them in the robots.txt file. Would this cause a problem with SEO?
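
    To illustrate what I mean, the setup looks roughly like this (example.com and the /members/ path are just placeholders, not my real URLs):

        # robots.txt
        User-agent: *
        Disallow: /members/

        <!-- sitemap.xml still lists a page under the blocked path -->
        <url>
          <loc>http://www.example.com/members/profile.php</loc>
        </url>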
     
    mpea, Apr 17, 2008 IP
  2. astup1didiot (Notable Member)
    #2
    The only issue is that the XML sitemap will show errors in Google Webmaster Central; crawlers will obey the robots.txt, assuming of course that they follow the standard at all.
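
    If you want to see exactly which sitemap URLs are blocked, here is a quick sketch in Python (the URLs are placeholders, swap in your own site):

        # Sketch: report sitemap URLs that robots.txt blocks.
        # example.com and the file paths are placeholder values.
        import urllib.request
        import urllib.robotparser
        import xml.etree.ElementTree as ET

        ROBOTS_URL = "http://www.example.com/robots.txt"
        SITEMAP_URL = "http://www.example.com/sitemap.xml"

        # Load and parse the live robots.txt
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url(ROBOTS_URL)
        rp.read()

        # Pull every <loc> entry out of the XML sitemap
        ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
        tree = ET.parse(urllib.request.urlopen(SITEMAP_URL))
        for loc in tree.findall(".//sm:loc", ns):
            url = loc.text.strip()
            # can_fetch says whether a generic crawler may request this URL
            if not rp.can_fetch("*", url):
                print("Listed in sitemap but blocked:", url)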
     
    astup1didiot, Apr 17, 2008 IP
  3. mpea (Guest)
    #3
    So having errors in Google Webmaster Central doesn't mean Google will lower your ranking for any reason; it just means Google is making you aware that the search engine can't crawl the specified pages because the robots.txt doesn't allow it?

    I still can't figure out why the forum is doing so badly (private tutors link in my sig); it doesn't have a bad PR or number of inbound links.
     
    mpea, Apr 26, 2008 IP