Hi everybody, I have a website that I recently did SEO work on. The results are excellent, however some pages are not indexed as they should be. This is my sitemap, where I added ALL the web pages: http://www.sterlet.ro/ror.xml

In some searches I still get the old pages, which are still active; could that be a problem? Let's take an example of what I mean. KEYWORDS: studii de fezabilitate, studiu fezabilitate (feasibility studies; I know that "de" will be ignored, so that does not matter): http://www.google.com/search?source=ig&hl=ro&q=studii+de+fezabilitate&lr= Check the results; in the first 2 pages I have bolded the important parts.

So, I created a special page for these keywords: http://www.sterlet.ro/studii-fezabilitate.php The page title also contains the keywords. The thing is that this page is not INDEXED at all; instead, this page (which is still active) is indexed: http://www.sterlet.ro/servicii.php That page contains all the services, so the new page is just a part of it.

I also have other pages that were renamed, and Google still uses the OLD pages, which are still on the FTP even though they are not in the XML. Example: http://www.sterlet.ro/ranacultura-cresterea-broastelor.php and http://www.sterlet.ro/ranacultura.php. If I search for this, which in my opinion should REALLY point to the new file, I'm pointed to the old file.

So what suggestions do you have?

P.S. This is not advertising for that website, it's just an SEO discussion. Thanks
I recommend putting a link-based site map (non-XML) on your site and linking to it from every page in your footer. You should also get some links pointing to the page you would like to see indexed, with the links' anchor text being that page's keywords. With your number of posts, your forum signature (accessible within your control panel) might work ideally for this (if you continue to grow your post count); DP links are weighted quite well. Until your site gets some decent link juice flowing through it, the best way to make sure your internal pages get indexed is to have your internal structure treat those pages as important and to have other websites link to them.
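To make that concrete, here is a minimal sketch of what such a footer could look like. The markup, the "Site map" label, and the choice of studii-fezabilitate.php as the keyword-anchored link are assumptions for illustration, not taken from the actual site:

    <!-- Hypothetical footer fragment: a crawlable link to an HTML site map on
         every page, plus a keyword-anchored link to the page you want indexed. -->
    <div id="footer">
        <a href="http://www.sterlet.ro/site-map.html">Site map</a> |
        <a href="http://www.sterlet.ro/studii-fezabilitate.php">studii de fezabilitate</a>
    </div>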
- Put an .htaccess file in place and redirect your old pages to the new pages.
- Submit your new sitemap to Google Webmaster Tools; it will flush your indexed pages and then index all the pages in your sitemap.
That's a good suggestion; I wasn't clear on whether you had changed one page to another, since you said the old one still exists.
Can someone point me to how to use that .htaccess? What should the redirect look like? Also, you say that I should create an HTML sitemap...
try google "htaccess redirect", youll find a lot of articles i prefer xml sitemap (google standard) so you can submit it to "google webmaster tools".
My XML sitemap is already submitted and the name didn't change, so should I submit it again? This is what Google Webmaster Tools shows for it: ror.xml, RSS Feed, Web, submitted Jun 28, 2006, last downloaded May 2, 2007, status OK, 24 URLs. The May 2, 2007 date is the "last download", so basically Google already has the latest pages. Why hasn't it visited them?
Correct me if I am wrong, but you just submitted your sitemap on May 2, 2007. If that is correct, you should wait until the next Google update to see whether your new pages are already in.
Actually, the file was submitted on Jun 28, 2006; May 2 was just the last time it was downloaded. I made my changes about 2 weeks ago and my site has already been visited twice since then.
Don't worry about submitting a sitemap. Look at this post on SEOmoz. All this sitemap stuff is overblown hype.
I think I'm going to build an HTML sitemap that will appear on every page. Even now some pages are not indexed; the old ones appear instead. Should I delete the XML sitemap if I make an HTML one? Will the robot only visit the XML link structure, or will it also come to the HTML sitemap and crawl all the links it finds there? Later edit: http://www.sterlet.ro/site-map.html Is this HTML sitemap built correctly?
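For comparison, here is a minimal sketch of a link-based HTML site map: just an ordinary list of plain, crawlable links, one per page you want indexed. The pages listed are only the ones mentioned in this thread, and the anchor texts are guesses at suitable keywords:

    <!-- Minimal site map sketch: plain <a href> links that Googlebot can crawl.
         List every page you want indexed; keep the anchor text descriptive. -->
    <ul>
        <li><a href="http://www.sterlet.ro/studii-fezabilitate.php">Studii de fezabilitate</a></li>
        <li><a href="http://www.sterlet.ro/ranacultura.php">Ranacultura - cresterea broastelor</a></li>
        <li><a href="http://www.sterlet.ro/servicii.php">Servicii</a></li>
    </ul>

If every page's footer links to this file and each entry is a plain href (not JavaScript), the crawler can reach any listed page in one hop from anywhere on the site. It will use both the HTML page and the XML sitemap, so you should not need to delete the XML one.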