Hello everyone, I am wondering how often SEO professionals tweak a webpage. This is not necessarily about adding fresh content, but about optimization. I want to find a balance between Yahoo and Google optimization, so I need to tweak the pages slightly until I achieve that balance. My question is: how often should this tweaking occur? And how long does it take for crawlers to recrawl the website and update the SERP results? Suggestions are welcome. Sincerely, Travis Walters
It really depends on where you are now in the SERPs. If you are on page 37, you'd want to adjust more often (what do you have to lose?). If you are on page one at position 10 but want to move to position 5, you have to be more careful and move in much smaller steps (you might be one tweak away from dropping to page 2). You can check when Google last indexed your site by clicking the "cached version" link Google displays in the SERPs.
Hey there, I have two questions.

1. Originally, I had the titles of all the pages set to this: Holland Michigan Real Estate | Holland Michigan Homes for Sale | PAGE_NAME, where PAGE_NAME was the name of the page. Somebody told me I should not have all the pages like this because Google does not like it. A week after I changed the titles (I kept the keywords in them but mixed them up), the rankings fell. Do you think I should set the titles back to what I had before, or was mixing them up the better option?

2. There is an MLS service my client's real estate website uses. Other sites in Michigan use this service too, so a lot of west Michigan real estate websites are going to have the same property descriptions. How does Google penalize websites for duplicate content? Would they penalize all the west Michigan real estate websites, or just my client's? And if Google is only going to penalize my client's website, what should I do? Currently, my client's website has about 700 pages indexed, most of them property detail pages containing the duplicate content. I have heard that the more indexed pages you have, the better off you are. Would I be better off leaving all these pages in the index, or should I block the detail pages in the robots.txt file?

Suggestions are welcome. +REP for those with great responses. I need a good solution here; this is very important to my client. Thank you in advance. Sincerely, Travis Walters
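For reference, if you did decide to keep the duplicate detail pages out of the crawl, one option is a robots.txt rule. This sketch assumes all the detail pages are served through a single script path like /viewProperty.cfm; adjust it to the site's actual URL structure:

```
User-agent: *
Disallow: /viewProperty.cfm
```

One caveat: Disallow stops crawling, but it does not reliably remove URLs that are already indexed. If removal from the index is the actual goal, a noindex meta tag on the detail pages (with crawling still allowed) is the more dependable route.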
Well, it's not "pretty". A rank drop will occur after a change like this; it is normal and temporary, so just wait a while. But if your rank is still in the depths after that, you should try changing back. Testing is always advisable. Google will only display the strongest websites on the result pages; if your client's website is not considered strong, it will never appear in the search results even though it is still in the index. I believe the more frequently your site is updated, the better off you are; having 10,000 "old" pages in the index would not help much. IMO, you should just leave them indexed. It's basically the same situation as books sold at Amazon.com: you can find the same books sold at hundreds of other online stores.
I do it like once a week until I am happy with the on-page optimization quality. Although really, I can often get it right the first time...
Hey there, thanks for the great comments. I have another question you might be able to advise me on.

These properties get imported by the thousands, and we import them once a week, which helps keep our content fresh. However, when they are updated they receive a new ID number, because they are deleted from the database and then re-added. This ensures there are no properties left in the database that have been sold since the last update. The detail pages used to display like this: viewProperty.cfm?propertyID=12345. So Google would index the page, and within a week it would be removed because the property ID had changed.

That got me thinking: every property has an MLS number that never changes. So I tweaked the website to do this instead: viewProperty.cfm?mlsNumber=67890. Now when Google indexes the page, it stays in the index.

My question is this: would it have been better to leave it as it was with the ID number, or was changing to the stable link better? I know Google loves fresh content, so I was thinking the ID links could seem fresh because of the new ID each week, but the MLS number keeps the links stable, and Google loves stability as well. Let me know what you think. Thanks. Sincerely, Travis Walters
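For what it's worth, the weekly import can also be done without deleting and re-adding rows at all. Here is a minimal sketch in Python with SQLite (the table and column names are assumptions for illustration; the real site presumably uses a different database): upsert each listing on its stable MLS number, then purge anything missing from this week's feed (i.e. sold properties). The URLs keyed on mlsNumber survive the import unchanged.

```python
import sqlite3

# Illustration only: in-memory database, assumed table/column names.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE properties (mls_number TEXT PRIMARY KEY, description TEXT)")

def import_feed(feed):
    """Weekly import: upsert by stable MLS number, then purge listings
    that are no longer in the feed (i.e. properties that have sold).
    Assumes a non-empty feed."""
    for mls, desc in feed:
        # INSERT OR REPLACE keeps the same primary key, so the page URL
        # viewProperty.cfm?mlsNumber=... never changes between imports.
        db.execute(
            "INSERT OR REPLACE INTO properties (mls_number, description) VALUES (?, ?)",
            (mls, desc),
        )
    current = [mls for mls, _ in feed]
    placeholders = ",".join("?" * len(current))
    db.execute(
        f"DELETE FROM properties WHERE mls_number NOT IN ({placeholders})",
        current,
    )
    db.commit()

# Week 1: two listings live.
import_feed([("67890", "3BR ranch"), ("67891", "Lakefront condo")])
# Week 2: 67891 sold; 67890 re-listed with an updated description.
import_feed([("67890", "3BR ranch, price reduced")])
```

The design point is just that the primary key (and therefore the URL) belongs to the property, not to the import run.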
I think the stable URL would be far better; Google likes pages with some age, and it doesn't like 404 Not Found errors.
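On the 404 point: the old propertyID URLs Google already indexed will now be dead. A common practice (a suggestion, not something the site necessarily does) is to answer those retired URLs with a permanent redirect to the stable page instead of a 404. A rough sketch, with a hypothetical lookup table that would be maintained during the weekly import:

```python
# Hypothetical mapping from retired property IDs to stable MLS numbers.
ID_TO_MLS = {"12345": "67890"}

def respond_to_old_url(property_id):
    """Return (status, location): a 301 permanent redirect to the stable
    MLS-number URL when the retired ID is known, otherwise 410 Gone so
    crawlers drop the dead listing rather than retrying a 404."""
    mls = ID_TO_MLS.get(property_id)
    if mls is not None:
        return 301, "viewProperty.cfm?mlsNumber=" + mls
    return 410, None
```

A 301 passes the old page's standing on to the new URL, which a 404 never does.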
Hey there, thanks for the response. If that is the case, then the change I made was a good one. It looks like some of my keyword phrases are starting to get back into the top three. Woohoo! Sincerely, Travis Walters