Site pages indexed 2 times

Discussion in 'Site & Server Administration' started by varun8211, May 13, 2006.

  1. #1
    I have a good content site on herbal cures (www.herbsandcures.com). It has been online for more than a year now, and Google has indexed more than 1,000 pages from it. The problem is that it still doesn't show up in the top Google results, even for very specific keywords. Has it been sandboxed? I ask because there may be a reason for it. When the site started, my URLs were something like:
    http://www.herbsandcures.com/viewdiseasedetails.php?disease_id=11 for every disease (with a different disease_id, of course),
    and Google indexed most of those URLs because they were hard-coded on the pages.
    But after about 5 months I implemented mod_rewrite and changed the URLs to:
    http://www.herbsandcures.com/view-disease-details/arteriosclerosis.html

    I created another PHP page and redirected it to the "html" version.
    Now I have the same content on two sets of pages, and Google has indexed both of them.
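
    A minimal sketch of what a permanent (301) redirect from the old query-string URL to the rewritten version could look like, if it were placed at the top of viewdiseasedetails.php (the get_disease_slug() helper is hypothetical):

    <?php
    // Send the old query-string URL permanently to its rewritten "html"
    // counterpart so only one version stays crawlable.
    // get_disease_slug() is a hypothetical lookup that maps a disease_id
    // to the slug used in the rewritten URLs (e.g. 11 -> "arteriosclerosis").
    $disease_id = isset($_GET['disease_id']) ? (int) $_GET['disease_id'] : 0;
    $slug = get_disease_slug($disease_id);

    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.herbsandcures.com/view-disease-details/' . $slug . '.html');
    exit;
    ?>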

    Could this be the problem that is keeping my site out of the Google search results?

    Any advice would be appreciated.

    Thanks a ton
     
    varun8211, May 13, 2006 IP
  2. NetMidWest

    NetMidWest Peon

    Messages:
    1,677
    Likes Received:
    151
    Best Answers:
    0
    Trophy Points:
    0
    #2
    NetMidWest, May 13, 2006 IP
  3. varun8211

    varun8211 Peon

    Messages:
    483
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #3
    I think this would be the best and most efficient method:
    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

    Shall I place this code in viewdiseasedetails.php?disease_id=<diseaseid>? That would also mean changing the code on only one page.
    What do you say?
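
    If the rewritten /view-disease-details/<name>.html URLs are served by the same script through mod_rewrite, the tag would have to be emitted only for requests that actually arrive on the old query-string form, or the new URLs would be de-indexed as well. A rough sketch of one way to do that (the REQUEST_URI check is just one possible test):

    <?php
    // True only when the visitor (or Googlebot) requested the old-style URL
    // directly; internally rewritten requests keep the pretty
    // /view-disease-details/... path in REQUEST_URI.
    $is_old_url = strpos($_SERVER['REQUEST_URI'], 'viewdiseasedetails.php') !== false;
    ?>
    <head>
    <?php if ($is_old_url): ?>
    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
    <?php endif; ?>
    <!-- rest of the <head> markup -->
    </head>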
     
    varun8211, May 13, 2006 IP
  4. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #4
    I don't think so.
    Many webmasters have enabled the archive feature in vBulletin, which in effect creates duplicate content, but nobody has run into any problems because of it.

    I think you can wait a few weeks and check again; Google has been behaving very strangely these days.
     
    wisam74us, May 13, 2006 IP
  5. varun8211

    varun8211 Peon

    Messages:
    483
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I have been waiting for the last 6 months :)
    I definitely think there is some problem that I'm not able to figure out; otherwise, such relevant, content-rich pages could not be kept away from the top spots in the search engines.
     
    varun8211, May 13, 2006 IP
  6. NetMidWest

    NetMidWest Peon

    Messages:
    1,677
    Likes Received:
    151
    Best Answers:
    0
    Trophy Points:
    0
    #6
    I don't have any knowledge of the structure of your site, but that is the easiest approach: put the robots meta tag in, and use the URL removal tool for specific pages if they are not too numerous.
    The nice thing is that Google will give you the status of the removal via email. So if you need to, shut the whole site out from robots for a few days and use the tool. I know that sounds extreme, but could your rankings really suffer that much more for the few days it would go uncrawled?
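
    For what it's worth, shutting the whole site out from robots for a few days would just mean putting a robots.txt like this at the site root (and removing it again afterwards):

    # Temporarily blocks all compliant crawlers from the entire site
    User-agent: *
    Disallow: /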
     
    NetMidWest, May 13, 2006 IP
  7. theblight

    theblight Peon

    Messages:
    246
    Likes Received:
    9
    Best Answers:
    0
    Trophy Points:
    0
    #7
    Or recode the site so that the links it generates are the same as the rewritten URLs.
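
    For example, a small helper along these lines (the function name and slug value are just illustrative) would keep every internal link in the rewritten format:

    <?php
    // Hypothetical helper: build internal links in the rewritten form so the
    // old viewdiseasedetails.php?disease_id=... URLs never appear in the
    // site's own HTML.
    function disease_url($slug)
    {
        return 'http://www.herbsandcures.com/view-disease-details/' . rawurlencode($slug) . '.html';
    }

    echo '<a href="' . disease_url('arteriosclerosis') . '">Arteriosclerosis</a>';
    ?>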
     
    theblight, May 17, 2006 IP