How to avoid Dup. Content De-indexing with a DB driven site?

Discussion in 'Search Engine Optimization' started by yo-yo, Nov 8, 2005.

  1. #1
    Hey Guys -

    I have bought tons of Databases, and plan to buy many more.

    The only problem I'm having is that the sites will fully index in Google, then de-index or stop being cached and just show up as URLs (no title, no description).

    I'm assuming the problem is caused by the fact that others are using the same DBs and have the same content on their sites... any suggestions for getting around this? (Obviously I'm not going to add manual content to every listing of a 5,000-record DB.)
     
    yo-yo, Nov 8, 2005 IP
  2. TiGG

    #2
    If you're using databases of static content that are being sold to tons of people to use on their websites, your sites are exactly what the search engines are trying to filter. Therefore your goal is to outsmart them...

    Some ideas would be using text rewriters, adding more content or RSS feeds to the page, or getting enough high-PR links to be seen as the best of them... anything you can do manually can be programmed to run automatically, so work from there. If you come up with a working answer, you're rich.
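    For what it's worth, the "program it" idea can start as simple as a synonym-substitution rewriter. A minimal sketch in Python — the synonym table is tiny and made up for illustration, and the per-site seeding is just one way to make each site emit a different but stable variant of the same record:

    ```python
    import hashlib
    import random

    # Tiny illustrative synonym table; a real rewriter would need a much
    # larger dictionary (these entries are examples only).
    SYNONYMS = {
        "cheap": ["affordable", "low-cost", "budget"],
        "big": ["large", "huge", "sizable"],
        "fast": ["quick", "speedy", "rapid"],
    }

    def rewrite(text: str, site_id: str) -> str:
        """Swap known words for synonyms, seeded per site so every site
        sharing the same DB record emits a different, stable variant."""
        seed = int(hashlib.md5(site_id.encode()).hexdigest(), 16)
        rng = random.Random(seed)
        out = []
        for word in text.split():
            choices = SYNONYMS.get(word.lower())
            out.append(rng.choice(choices) if choices else word)
        return " ".join(out)

    print(rewrite("cheap and fast widgets", "site-a.example"))
    print(rewrite("cheap and fast widgets", "site-b.example"))
    ```

    Because the random generator is seeded from the site ID, the same page always renders the same text on a given site (so crawlers don't see churn), while two sites with the same DB usually render different text.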
     
    TiGG, Nov 8, 2005 IP
  3. nichewriter

    #3
    I've seen suggestions for adding RSS feeds based on different keywords (you could use a random selection of related keywords) and YouTube videos to make each page more unique.
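    A sketch of the random-keyword idea in Python — the keyword pool and the feed URL pattern are both hypothetical, and the choice is seeded per page so the embedded feed doesn't change on every crawl:

    ```python
    import random

    # Hypothetical pool of related keywords for a page's topic.
    KEYWORDS = ["blue widgets", "widget reviews", "buy widgets", "widget parts"]

    def feed_url_for(page_id: str) -> str:
        """Pick one related keyword deterministically per page and build a
        feed search URL from it (the example.com endpoint is made up)."""
        rng = random.Random(page_id)          # stable choice per page
        keyword = rng.choice(KEYWORDS)
        return "http://feeds.example.com/search?q=" + keyword.replace(" ", "+")
    ```

    Each page then pulls in a different feed, so pages built from the same DB records no longer share identical supplementary content.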
     
    nichewriter, Apr 10, 2011 IP