Hi Nosferatus, I always thought that the title tags could go either above or below the main meta descriptions. Most coding sites I've seen recommend putting the title above the meta tags (not particularly for SEO, just HTML tutorials I've read). I'd be really interested if you could describe (or link to) why going below the meta is better than above it. Cheers
I've looked at Google and 2 other sites and I have seen them below the meta tags. I have only been putting them above because I learnt it that way from a tutorial about 5 years ago.
They have only just gone to 30. They were at around 40 or 50, but still are not back at 1 yet. So what causes the -30 penalty these days? Does it definitely look like it's the dupe content on the .co.uk and .com versions of my sites? Personally, after looking at more sites' source code, I don't think it matters where in the <head> your <title> tags are, as long as they are in the head. It is best to have them either directly below the meta tags or directly after the start of the head tag. After changing one site of mine, I thought I would take a further and more advanced look, and it seems that a lot of sites ranked near the top also do it my way. I even saw 1 site that had <head><metatag><title><metatag> and it was ranked #3.
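For what it's worth, the ordering I keep seeing on well-ranked pages looks roughly like this (the title and description text are placeholders, not taken from any real site):

```html
<head>
  <!-- Title near the top of the head, before or right after the metas -->
  <title>Page Name - Site Name</title>
  <meta name="description" content="Short description of this page">
  <meta name="keywords" content="keyword1, keyword2">
</head>
```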
man... U still here.. just came by to grab a few messages... Here's some reading on duplicate content problems. cya soon
I have read the stuff from the link. Interesting read. I am just wondering what I should do now. Should I use robots files, or shall I use the server to forward it, or whatever you call it? Also, how easy would it be to get the penalty taken off? Shall I just ask them, or shall I just wait and see what happens?
I mainly suggested that since I think most bot algorithms look for things in that order. So it's more a case of optimizing for SEO, as opposed to breaking any HTML "rules", so to speak. As for the titles themselves, what is more important than placement is the text within the title. Always try to make it as descriptive as possible, since that will be one of the first things the bot sees (or the first thing it sees, if the title comes first). For example, if you have a short tagline for your site (e.g. XYZ Widgets, Inc. - California's Best Widgets), you should put that tagline in every page's title tag, since that will give you multiple index entries for those terms, as opposed to using it, say, only on the home page. Sort of a buy-one-get-ten-free deal, and it will, in a sense, operate as a super keyword.
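To illustrate with the hypothetical tagline above, each page would carry it at the end of its own title, e.g. (page names made up):

```html
<!-- Home page -->
<title>Home - XYZ Widgets, Inc. - California's Best Widgets</title>
<!-- A product page -->
<title>Blue Widgets - XYZ Widgets, Inc. - California's Best Widgets</title>
```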
Thanks for clearing that up. But your answer leads to another question: if you use the same tagline in ten title tags, won't this lead to a higher chance of those pages going supplemental?
Gang.. yer hijacking the thread.. this has NOTHING to do with tags or -30 silliness... If you do a little homework and follow what I have discussed earlier, John has 2 sites with substantially duplicate content. He is trying to geo target without incurring dupe problems... Ok? - thanks. Just start a new thread for the tag stuff.
OK! I have come to a decision and think it may be best if I scrap everything on the .co.uk domain apart from the shopping directory. I would like to keep that part as there is some potential there. So how would I do this? Shall I just leave the content on its own sub domain account on my server, or move it all to the main domain at simplysearch4it.co.uk so that it is on that instead of shopping.simplysearch4it.co.uk? The web directory is also different content, but I'd prefer to scrap that, as there is no potential there and it's too much work having to keep checking whether the submissions are UK-based sites. Also, I would keep the same templates, but the content is all UK products and merchants, and the .com shopping directory is all US products and merchants. Would this be a better approach, and would it likely stop the duplicate content penalty? The problem is that Google says to build your sites for your visitors and not for the search engines. That is what I have done, and I have a penalty. Now to solve this penalty I have to go and try to build the site for the search engine by scrapping a lot of duplicate content that serves two different types of geo-targeted traffic.
I did actually come to a suggestion at first that it might be best if I just kept the shopping.simplysearch4it.co.uk content and split the articles, games, and shopping sections of the site into new sites on different domains with different templates. This would work, but I already have a lot of visitors to this site and a good visitor base that returns time and time again, and having all this content together is what makes my site different from the rest. If I split it up, each part would just seem like another articles site, games site, or shopping price comparison site. So I have now figured that it would be best to just keep the .com site as it is, build more sections on it, and make it more of a portal like Yahoo, maybe even with news on it in the future and some other stuff apart from a meta search engine. That is something I will not bother looking into.
Well for now.. (I am still researching a proper geo targeting - IP routing type of thing..) .. you can simply use robots.txt to disallow, or a noindex, nofollow meta tag, for the dupes on ONE of the sites...
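A minimal sketch of both options, assuming the dupes sit under a /shopping/ path (the path is just an example, not John's actual layout). The robots.txt approach:

```
# robots.txt on the site you want kept out of the index
User-agent: *
Disallow: /shopping/
```

Or per page, inside the <head> of each duplicate page:

```html
<meta name="robots" content="noindex, nofollow">
```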
Would using robots.txt work? If I use it, would Google still give me a penalty, since I'd still have the duplicate content and would just not be letting their search engine access the .co.uk copies? What do you think? Also, the homepages of both sites are around 66% duplicate. Shall I also use robots.txt to stop that, or would it be best to change the page a little instead? I'd rather not block it, as it is the homepage, and if I stop the search engines accessing that, would they still bother crawling the other non-duplicate parts of the site? I will shortly find the thread on here that I replied to or started about 3 or 4 months ago. I had concerns then about the .com and .co.uk sites being duplicates and was worried about getting banned or penalised. People told me it would be fine and to just leave it the way it was, and now I have the penalty. I wish I hadn't listened to them.
Here's the thread: http://forums.digitalpoint.com/showthread.php?t=129830 . Some did mention that it might give a penalty, but more said it should be alright, which made me believe it would be fine. I wish I had gone with the ones who warned there may be a penalty. Gutted.
As I mentioned in the article I sent U to on dupes, there is a FILTER and a PENALTY. One needs to satisfy more than one factor (dupe content) to start moving into penalties. Soooo.. do we have any others? Sure do... we have the 'boiler plate' offence, meaning that not only do we have dupes (substantial dupes) but we also have a boilerplate/duplicate design (page segmentation) problem... Now say the algo flags it for further investigation - it could check the 'owner profile' and see the sites are on the same IP and owned by the same person. A spammy profile (unintentional) can start to emerge.
So what would be best? Would it be best to just keep the shopping part of the .co.uk and make the design slightly different? Doing this would mean no duplicate content, and the design would differ slightly too. Or shall I just use robots.txt for the duplicate content? Which would be best from a search engine's point of view? What I'm also thinking is that if I have robots.txt blocking major parts of the .co.uk site, then those pages' PageRank will go to 0, which will look bad to some people, and if I get backlinks they will probably not mean much for SEO, as they will be pointing at noindexed pages.
I have also used the same policy on all of my sites. Should I make them all non-indexable with robots.txt and just leave the copy on the main site crawlable?
If I do remove the games.simplysearch4it.co.uk content, shall I use a 301 redirect to point that sub domain at the games.simplysearch4it.com site, or would this be a bad idea?
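If it helps, on an Apache server a 301 for the sub domain could be sketched in its .htaccess like this (assuming mod_rewrite is available and the two sites share the same directory structure; this is just one way to do it):

```apache
RewriteEngine On
# Send everything on the .co.uk games sub domain to its .com twin with a 301
RewriteCond %{HTTP_HOST} ^games\.simplysearch4it\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://games.simplysearch4it.com/$1 [R=301,L]
```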
Holy CCrrraaaapppp John... slow down M8... My head is spinning. To be honest there is considerable planning involved. I am getting a headache here (lack of sleep, as U know). Let me come back to this in a bit. Sorry
I know. So far it has been a day of me sitting at the computer and doing nothing. All I have done is flick from site to site of mine, use Copyscape and other page copy tools, and look at databases making decisions, and so far I have not deleted anything or changed one thing on any site. This is the same as most of yesterday. I am planning which route will be the best and most profitable to take, which will not cause duplicate content, and which will suit my sites' visitors the best.