Site being punished by Google - don't know why

Discussion in 'SEO' started by lhammer610, Aug 26, 2010.

  1. #1
    My website is
    http://www.visitoregonsouthcoast.com/
    and it is about the Oregon coast.

    When using Google Webmaster Tools (the Search Queries report), I see very low average positions for the following keywords, which are crucial to my site:

    oregon coast 240
    oregon coast hotels 140
    bandon or hotels 250
    motels in oregon 120
    crescent city lodging 130
    reedsport oregon hotels 130
    umpqua river camping 250
    gold beach 350

    visit oregon coast 3.6 - highly ranked

    I have over 500 pages of information, and the pages are dynamically created. I have followed what I thought were best SEO practices, but I suspect Google thinks I am keyword stuffing, since some of those same words occur frequently on the site - since that is what the site is about.

    Keywords - frequency

    Oregon 563
    Coast 540
    beach 401
    dunes 326
    camping 294
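    A sitewide tally like the one above can be sketched with a short script. This is a minimal, hypothetical example (the page texts and the `sitewide_keyword_counts` helper are illustrative, not taken from the actual site); in practice you would feed it the extracted text of each of the ~500 pages.

```python
from collections import Counter
import re

# Hypothetical helper: tally word frequency across every page of a site.
def sitewide_keyword_counts(pages):
    counts = Counter()
    for text in pages:
        # Lowercase and split on non-letters so "Coast," and "coast" match
        counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts

# Stand-in page texts for illustration only
pages = [
    "Camping on the Oregon coast near the dunes.",
    "Oregon coast hotels and beach access.",
]
counts = sitewide_keyword_counts(pages)
print(counts["oregon"], counts["coast"])  # 2 2
```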

    I used to use some of those words as alt text for the maps (which was appropriate), but I deleted those on 6/1. I have reworded pages to reduce their frequency. I have also changed the meta description, and have actually seen my traffic slowly decline.

    I would be willing to hire an SEO, but keep in mind that I am doing this on my own dime to help out the community (ads are not off-setting my costs and are mostly PSA).
     
    lhammer610, Aug 26, 2010 IP
  2. more_sem

    #2
    more_sem, Aug 26, 2010 IP
  3. lhammer610

    #3
    Good question on the robots.txt, but yes. The disallowed subdirectories are where the stored data for the dynamically created pages lives. For example, it disallows "*/venues". Under */venues, there is a page called 333 Hiking Trail. When that page is displayed, it appears at the URL
    visitoregonsouthcoast.com/333%20Hiking%20Trail
    The reason I disallowed */venues is that when Google's bot crawled the site, it found two copies of the page: one at the URL listed above, and one at
    visitoregonsouthcoast.com/venues/333%20Hiking%20Trail
    I was concerned that, since they are duplicate pages, Google would think I was keyword stuffing.
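    The effect of a Disallow rule like that can be checked with Python's standard-library robots.txt parser. This is a sketch under the assumption that the robots.txt contains a `Disallow: /venues` rule as described; the file content below is hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the rule described above
robots_txt = """\
User-agent: *
Disallow: /venues
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The /venues copy is blocked; the root-level copy is still crawlable
print(rp.can_fetch("*", "http://www.visitoregonsouthcoast.com/venues/333%20Hiking%20Trail"))
print(rp.can_fetch("*", "http://www.visitoregonsouthcoast.com/333%20Hiking%20Trail"))
```

    Note that blocking one copy this way only hides it from the crawler; it does not tell Google which URL is the preferred one the way a canonical tag or a 301 redirect would.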
     
    lhammer610, Aug 26, 2010 IP
  4. Alan Smith

    #4
    Having a good keyword density is good for any website, but one needs to keep in mind that keywords should not be overly stuffed. Many SEO experts consider a keyword density of 1 to 3 percent to be optimal.
     
    Alan Smith, Aug 27, 2010 IP
  5. ajivets

    #5
    Good advice for all. It's very useful for others to read this thread - the explanation is simple and easy to understand. Thanks, all.
     
    ajivets, Aug 27, 2010 IP
  6. mgold

    #6
    Nice one Alan ;-)
     
    mgold, Aug 27, 2010 IP
  7. lhammer610

    #7
    Hi Alan.
    On occasion, my keyword density may exceed 3% and approach 4%, but never higher. Mostly it is in the 1 - 3% range.
     
    lhammer610, Aug 27, 2010 IP
  8. lhammer610

    #8
    One of the frustrations with measuring keywords on a large site is that none of the keyword tools I am aware of measure the site as a whole. Instead, they measure only a single page. They also give different results. For example, SEO Tools (tools.seobook.com) gives a keyword density of:

    beach 9 2.26%
    river 9 2.26%
    coast 9 2.26%
    oregon 7 1.76%

    All of these are within the "acceptable range".

    However, Addme.com gives a different listing:

    coast 9 3.53%
    oregon 8 3.14%
    travel 8 3.14%
    river 7 2.75%
    views 7 2.75%

    I have Google ads on the side. Sometimes I think the tools are counting the Google ads. I deleted the alt text on my images, as it was pushing my keyword levels too high.

    Also note that the numbers (frequencies) are about the same, but the percentages are different, meaning the tools must be counting the total number of words differently.
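    That denominator effect is easy to demonstrate. A minimal sketch (the page copy and ad text below are made up for illustration): the keyword count stays the same, but the density drops when a tool includes extra boilerplate such as ad text in the total word count.

```python
import re

# Keyword density = keyword occurrences / total words, as a percentage.
def density(text, keyword):
    words = re.findall(r"[a-z]+", text.lower())
    return words.count(keyword) / len(words) * 100

page_copy = "oregon coast hotels " * 3        # 9 words, 3 of them "coast"
ad_text = "sponsored links and more links "   # 5 extra words a tool might count

print(round(density(page_copy, "coast"), 2))            # 33.33
print(round(density(page_copy + ad_text, "coast"), 2))  # 21.43 - same count, bigger total
```

    Two tools that disagree on what counts as "the page" (main copy only, or copy plus ads and navigation) will report different percentages for identical keyword counts.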
     
    Last edited: Aug 30, 2010
    lhammer610, Aug 30, 2010 IP
  9. lhammer610

    #9
    OK, no one seems to be able to come up with an answer for the above.

    How can I find out how many pages Google has currently indexed?

    If I use the Google "site:" search operator, I get a different number depending on which computer and browser I use.

    I get 106 on this computer and browser, but 148 on another computer and browser.

    When I use a tool at Selfseo.com, the Google search returns a "Not Available" message, and below it appears what looks to be CSS code. AltaVista has 568 pages indexed. I want to be certain that the way the site is set up is not causing problems for Googlebot.
     
    lhammer610, Sep 1, 2010 IP