As we know, phpLD allows users to sort listings by PageRank, Hits, or Alphabetical at the top of every page. That's useful from the user's perspective, but what about the search engine's perspective? Don't we give search engines potential duplicate content this way? As far as I can see, we should make those links nofollow. I'd like to know your opinions.
Not really... you don't create duplicate content, but sites with higher PR get more backlinks that way because they appear on the first page.
The majority of people prefer listings ordered by PR. The several differently ordered pages are good candidates for supplemental results.
Yes, agreed... Octopedia is set up like that and it seems to be a very nice way of doing it. Thanks, malcolm.
Actually, I disagree, because nofollows automatically make SEOs (your potential traffic) suspicious. I choose to handle it in robots.txt, and if you're SEO friendly anyway, you're good:

User-agent: *
Disallow: /*?

This blocks all non-static pages.
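To illustrate what that wildcard pattern does (the example URLs here are just hypothetical):

User-agent: *
Disallow: /*?

# Blocked (query string present): /index.php?s=P, /index.php?search=widgets
# Still crawlable (static):       /latest-links.html, /some-category/

Anything with a ? in the URL gets disallowed, while the static pages stay open to crawlers.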
It's possible if it's used too much, but nofollow on your internal links is safer than on individual external links. I haven't figured this out yet, though some claim they lost PR because of the nofollow tag. Usually this happens when they erase a bunch of links all of a sudden. I suspect it has something to do with a link-age factor: if you have been linking to sites and suddenly put nofollow on those links, it can be a red flag to G.
Yeah, I agree, though I'm sure some will say it's 100% superstition. I'd rather use robots.txt: 1) due to exactly your point; 2) some spiders follow nofollows and CGI redirects; 3) robots.txt looks cleaner, and most friendly bots will look at it, maybe more than nofollows?
This is a great tip. It solves the problem of robots that follow the "no-follow" tag, like Yahoo and MSN.
CReed and I were discussing this a few days ago, and a few things were brought up. I wasn't sure if /*? would do it, and then came the thought of the latest and top pages, along with search and sort. So I was thinking that:

User-Agent: *
Disallow: /index.php?search=*

would take care of everything search-related, and

User-Agent: *
Disallow: /?s=*

instead of

Disallow: /?s=P
Disallow: /?s=H
Disallow: /?s=A

would take care of all sorting issues, all the while leaving the latest and top listings intact, so as to not disallow them together with the initially suggested Disallow: /*?

Would those be correct?
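Combined into a single record, I'm guessing it would look like this (one User-agent: * block, since I've read some bots only process one record per agent; and the trailing * should be redundant anyway, because Disallow matches URLs by prefix):

User-agent: *
Disallow: /index.php?search=
Disallow: /?s=

That would block all the search and sort variations while leaving the latest and top pages crawlable.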
Rewriting them to static URLs gave me a 404 header. I had them for a while and decided one day, out of the blue, to check the header responses, and to my surprise, although static, they were 404ing. They were rewritten as /latest-links.html and /top-links.html respectively, and they worked with no problems at all on the directory; they were just returning those 404s. Maybe I'll go back and check on it one of these days.
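If I do, the first thing I'll check is the rewrite rules. My guess is that a page can render fine but still send a 404 status when the rewrite rule is missing or mis-targeted and the 404 handler points back at the directory script. A minimal .htaccess mapping would look something like this (the s= targets are just my guess at the parameters; phpLD's actual ones may differ):

RewriteEngine On
# Map the static URLs back to the script. If a rule is missing or points
# at the wrong target, Apache falls through to the 404 handler; when that
# handler is the directory script itself, the page still renders normally
# while the response carries a 404 status.
RewriteRule ^latest-links\.html$ index.php?s=latest [L,QSA]
RewriteRule ^top-links\.html$ index.php?s=top [L,QSA]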