URL restriction tips

Discussion in 'robots.txt' started by No_name, Jun 2, 2010.

  1. #1
    Hi,

    A while ago I launched www.wallpaperfx.com, and I realized I need to restrict a lot of pages, because they show the same content ordered by different filters, e.g. top downloads (last 7 days).

    Here is a little example:
    - I want Google to index all the sub-categories of the "Animals" category, but not pages like http://www.wallpaperfx.com/animals/top-rated/last-month, because those create a lot of new URLs for Google that all carry the same content.

    My concern is that I want to put this kind of page in robots.txt, but there are a lot of them (currently over 2,400, and the number will grow with every new sub-category added). Do you know another way to do this?

    I also have another problem (not really related, but still): according to my Webmaster Tools account, "wallpapers" is only the fifth most common keyword on my site, and I want it to be first. Does anyone know how to remove or restrict the current top 4 keywords?

    I know the site is running very slowly at the moment; the cause is the database, and I'm working on that too.

    Thanks.
     
    No_name, Jun 2, 2010 IP
  2. manish.chauhan

    manish.chauhan Well-Known Member
    #2
    I do have an answer to your first query. You can use wildcard characters to block multiple pages that follow the same URL structure. However, to give you an exact solution, I'd need the list of URLs you actually want to block.
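    For example, a wildcard rule along these lines would cover all the filtered listings at once instead of 2,400 separate entries. This is only a sketch based on the one example URL you posted; I'm assuming every filtered page sits under a path segment like /top-rated/ (the /top-downloads/ line is a guess at a second filter name):

    ```
    User-agent: *
    # Block filtered listings under any category, e.g.
    # /animals/top-rated/last-month (pattern assumed from your example URL)
    Disallow: /*/top-rated/
    # Hypothetical second filter; adjust to your real path names
    Disallow: /*/top-downloads/
    ```

    Note that the `*` wildcard is not part of the original robots.txt standard, but Googlebot and the other major crawlers support it.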

    Honestly, I don't understand your second query. Can you please explain it further so that I can review it and assist you?
     
    manish.chauhan, Jun 11, 2010 IP