I have a web application that handles a few things reasonably well. For category directories, it shows URLs like www.example.com/directory/30-ASP-Shopping-Carts.asp, where the "30" comes from the database. If you click on a listing in that category, it takes you to something like www.example.com/links/130.asp. What I would like to know is: if I create a sitemap and point the categories to www.example.com/directory/ASP-Shopping-Cart and the links to www.example.com/links/candy-press-shopping-cart, will the search engines consider this duplicate content? I am thinking the answer is yes, but I was not certain whether it matters when the duplicate URLs are within the same website. If it does matter, would it help if I add more content to www.example.com/links/candy-press-shopping-cart and www.example.com/directory/ASP-Shopping-Cart? Otherwise, maybe I will get a developer to see if they can fix it. Thanks!
To give a correct answer I would have to see the actual URLs before I could comment on how to fix the problem. You can PM me the URLs if you want.
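That said, the usual cure when two URLs serve the same page is to pick one as the canonical address and 301-redirect the other to it (or at minimum add a rel="canonical" link tag pointing at the preferred URL). As a minimal sketch, assuming classic ASP since your pages end in .asp, something like this at the top of the old numeric page would send both visitors and engines to the keyword URL from your example; the target slug here is hypothetical and in practice you would look it up from your database by the listing's ID:

<%
' Sketch only: permanently redirect the old numeric URL
' (e.g. /links/130.asp) to the new keyword URL so search
' engines treat them as one page rather than duplicates.
' "candy-press-shopping-cart" is taken from the example above;
' a real page would fetch the slug from the database.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/links/candy-press-shopping-cart"
Response.End
%>

With a 301 in place the engines consolidate the two addresses into one, so the duplicate-content question largely goes away; adding extra copy to each version would not be a substitute for that.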
Think of content like movie plots. When a movie is predictable or the basic plot has been done ten times before, you want to see something new: a fresh spin from the director, better special effects, or an unexpected twist in the plotline, to name a few. If a movie is a repeat of five others you've seen before, and it doesn't give you anything new aside from different actors, how likely are you to watch it again or recommend it to others? Content follows the same concept in my mind. If you re-hash the same crap that's already out there with no added value, the site can be as *literally* unique as it wants, but that alone isn't going to earn it recommendations (links), and therefore rankings in the engines and word of mouth with people, because it isn't conceptually unique.