I noticed that this week Google is intermittently returning just the URL without the link title and description. It's returning the links, but only about 30% of them seem to actually have titles and descriptions. I checked another one of my sites and noticed it did the same thing to those links. It seems to happen more with single pages that have variables after them (ex: www.domain.com/page.php?id=578). I also went and looked at a few other sites and noticed the same thing, so it's apparently not just me. Anyone else notice this? Ian
I have one site where all 10,000 pages now show just the URL. On another site where every page had a perfect cache, hundreds of them are now showing the URL only. Don't know what is happening, but I hope it is fixed soon.
Good....it's not just me. Anyone know if this is going to be a temporary thing, or do we need to change to .htm pages now? Ian
According to GG (supposedly Matt Cutts) @ WMW, Google is not quite finished implementing some sort of major update. It may be a few more days before things "settle". Of course, it could be that university students just don't like your site (if you have read Henk van Ess' blog)...
Unfortunately I've seen this happen just before Google removes the pages from the index (usually for dup. content). Hope that's not the case
I don't think that's the case this time. I noticed that one of the newspapers I read is having the same problem with their articles; they obviously use a dynamic link system too, since their articles are in a database. Looks like we're just going to have to wait this one out. Ian
The same thing happened a few days before they removed a ton of my indexed pages. The links remained at first, then the API stopped counting them (weight dropped in co-op). I added a sitemap and the pages that previously had no description have all returned (with descriptions). I know this is what happens just before Google is going to drop the pages, but I don't know if this is the only time it happens.
No description or title usually meant (prior to the update) that these pages were being blocked from being indexed properly due to duplicate content. With all the changes going on I'm not 100% sure that's correct in the current climate, but it was certainly a problem for me until I changed the content and, hey presto, the pages came back.
One of the newspapers I read is the Boston Herald, and about 75% of their articles are having that issue as well. It's interesting because while it's the same file displaying the articles, it's obviously not the same content. I have another site where I have thousands of articles archived too, so now the dilemma is how to correct the issue, as this is definitely going to become a problem. Obviously Google has their new sitemap feature. The problem I have now is creating a sitemap for a site that contains thousands of pages. I just don't understand how it can work fine for the past four years and then suddenly, overnight, decide wrongly that it's duplicate content. You would think Google could tell that the values after the page.php link are different and return different results. I guess I had better get to work. Ian
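For anyone stuck on the same sitemap-for-thousands-of-pages problem: since the article IDs already live in a database, the sitemap file can be generated by a quick script rather than by hand. Here's a minimal sketch in Python; the domain, the ID range, and the sitemap.xml output path are placeholders, and in a real setup the IDs would come from a query against your own articles table.

from xml.sax.saxutils import escape

BASE_URL = "http://www.domain.com/page.php?id="  # placeholder domain from the example above

def write_sitemap(article_ids, path="sitemap.xml"):
    # Write a Google Sitemaps file with one <url> entry per article ID.
    out = open(path, "w")
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
    for article_id in article_ids:
        out.write("  <url>\n")
        out.write("    <loc>%s</loc>\n" % escape(BASE_URL + str(article_id)))
        out.write("  </url>\n")
    out.write("</urlset>\n")
    out.close()

if __name__ == "__main__":
    # Stand-in for pulling the real article IDs out of the database.
    write_sitemap(range(1, 2001))

Once the file is uploaded to the site root, it just needs to be submitted (or resubmitted) through the Google Sitemaps account so the dynamic URLs get picked up again.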
Ian, or it could be that if these are new, dynamically generated pages, Google will need to spider them a few times before it indexes them properly. I have one template with 140+ different content pages; at first it only indexed 30 of them, and the rest of the pages were like yours. Rather than panic, I left it alone for a few weeks, and all 140+ pages are now indexed properly. I've always found dynamic pages take longer to get indexed.
The thing that stinks is that these pages were indexed, and then suddenly switched to URL-only. It's just one of those odd things, I guess. It doesn't matter; I'll be setting up the sitemap for Google tonight, and hopefully that will help take care of it after they crawl the sites again. Ian
Actually, one thing I just realized (and I've read others who have said the same thing) in regards to the duplicate content issue: maybe they're finally realizing that they're indexing pages more than once. For example, I have about 2,000 pages, yet they say I have 6,000. I'm not complaining, but maybe they're finally realizing there may be a glitch. Ian
Cool, I'm noticing loads of my pages are showing caches again... So that dumb duplicate content myth can be ruled out
Has anyone seen improvement in regards to this situation? I have noticed complete sections of our site turning URL-only. When I do a search on exact text from an article on our site that has gone URL-only, sometimes the correct article will be pulled up; however, it is a supplemental result. Any more clues on why this is happening?
Yeah, Google cached almost my entire site (about 15 pages were title-only and another 15 were not there at all). Then one day it dropped me to 30 pages, then 21. This may be due to my site going offline for 2-3 days, though.
I don't have any clue, and it's brought my business to a standstill. I've submitted my sitemap and still, after over a week, it has yet to re-index the pages. If it wasn't for my Froogle feed, where occasionally it gets a product into a Google result, I'd be getting nothing from Google. It's only my dynamic URLs (http://www.mysite.com/page.php?item=1234) that it's happened to. I've done everything people have suggested with no luck yet. This stinks. Ian
Same here. The PR went down from 4 to 0 and back to 4 again. The number of BLs seems to have stayed the same, but I'm dropping further and further down in the results for some reason. Let's hope this is a temporary glitch rather than G changing its algo completely.