Simple tutorial, just thought I'd post it here for those who might find it useful and for those who asked. Basically this snippet redirects requests for URLs without 'www.' to the same URL WITH 'www.'. For example, if you key in 'kenetix.net' in your browser, it will automatically redirect to 'http://www.kenetix.net'. This is useful for consolidating your backlinks onto one domain instead of two. Since search engines recognize the www and no-www hostnames as two separate domains, merging both of them into one would help increase your PR. There are a couple of mod_rewrite methods out there that use .htaccess; this method is for PHP.

Pointing links without 'www.' to 'www.':

if (strpos(strtolower($_SERVER['HTTP_HOST']), 'www.') !== 0) {
    header('Location: http://www.' . strtolower($_SERVER['HTTP_HOST']) . $_SERVER['REQUEST_URI']);
    exit;
}

Links with 'www.' to no 'www.':

if (strpos(strtolower($_SERVER['HTTP_HOST']), 'www.') === 0) {
    header('Location: http://' . substr(strtolower($_SERVER['HTTP_HOST']), 4) . $_SERVER['REQUEST_URI']);
    exit;
}

The strtolower() is just to keep all characters in the hostname lowercase consistently; checking strpos() against position 0 makes sure 'www.' is only matched at the start of the host, and the exit stops the script once the redirect header is sent.
you can also do this in your .htaccess file so you don't have to have that code on every page. thanks for posting though
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain-name\.com [NC]
RewriteRule ^(.*)$ http://www.domain-name.com/$1 [R=permanent,NC,L]
Also... if you choose to do this via PHP for whatever reason... you might want to pass on a 301 header. Without the 301, there is no guarantee that the engines will see it as permanent. The only reason I can think of for doing this via PHP is if you are doing some type of cloaking. In that case a 301 is likely not in your plans.
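To make the point above concrete, here is a minimal sketch of the PHP redirect from the first post with an explicit 301 status attached. The canonical_target() helper is something I'm introducing for illustration; it is not part of the original snippet.

```php
<?php
// Build the canonical (www.) URL for a request, or return null if the host
// already starts with "www." and no redirect is needed.
function canonical_target($host, $uri) {
    $host = strtolower($host);
    if (strpos($host, 'www.') === 0) {
        return null; // already on the www. hostname
    }
    return 'http://www.' . $host . $uri;
}

// Only runs in a web context; under CLI HTTP_HOST is not set.
if (isset($_SERVER['HTTP_HOST'])) {
    $target = canonical_target($_SERVER['HTTP_HOST'], $_SERVER['REQUEST_URI']);
    if ($target !== null) {
        // The third argument to header() sets the response status code,
        // so engines see a permanent 301 instead of the default 302.
        header('Location: ' . $target, true, 301);
        exit;
    }
}
```

Without that third argument a bare Location header is sent as a 302, which is exactly the "no guarantee the engines see it as permanent" problem.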
yes .htaccess is easier to do, but it is not always the ideal solution. We just installed a new SEO rewrite mod for our shopping cart, and 1000s of pages changed. Loading each one into a .htaccess would not be practical. Therefore we created a blank .html page for each file of the old rewrite system, inserted the PHP snippet into each one, and it worked great.
DavidK1... so you created 000's of empty HTML files? Why not a custom 404 that smartly handled the old URLs? (I can see a justification for the script-based 301's)
Hmmm. Not familiar with that strategy. A 404 would defeat the purpose of a 301, no? Creating the blank pages wasn't that big of a chore using Dreamweaver.
you wouldn't have to load each one of your pages into the .htaccess file. Just put the code below into the .htaccess file and replace mydomain.com with your domain, and that's it: you have a fully functional 301 redirect for every page on your site, even the ones you haven't added yet.

RewriteEngine on
RewriteCond %{HTTP_HOST} ^mydomain\.com [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [L,R=301]
you missed my point. I wasn't talking about a canonical rewrite; I was referring to the rewritten HTML file names changing. The old rewrite did the php files like this: http://www.whatever.com/category/product-c32.html. The new rewrite made it cleaner (no alphanumeric suffix on the end). A lot of those old rewrites were indexed, and you certainly couldn't load every page into an .htaccess. I only brought this scenario up to show that .htaccess is not always the most efficient method for a redirect.
A custom 404 handler only handles the miss; it does not mean a 404 is sent to the user. If you have a custom 404 handler (easy to do in PHP, Perl or ASP), it is invoked whenever a file is not found (i.e. your 000's of shell html files mentioned earlier). You can process the request, and based on the URL and parameters you would know the intended location under the new URL structure. You can then send a 301 and redirect the user to the new location. In the case of a real 404, you can send the user wherever you want.

The thing with a custom 404 handler is that you must send a specific HTTP code if you want something other than a 404. I've used this approach to serve dynamic content before, as an alternative to mod_rewrite; you just need to explicitly send a 200 for the user (and spiders) not to think it was a page not found.

In your case, where you would need thousands of lines in an .htaccess OR the thousands of shell html files, a custom 404 handler could do it all in a single file (one 301 script for 000's of URLs). It is easy enough as well to put tracking in place here to keep up with where the old links are coming from (link exchanges, other sites, search engines, directories), and you may consider addressing some of those "bad" links.
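The handler described above can be sketched in a few lines of PHP, assuming Apache routes misses to it (e.g. "ErrorDocument 404 /404.php" in .htaccess). The new_location() helper, the example.com host, and the old/new paths in the map are all made-up illustrations, not the cart's real URLs; a real version might parse the "-c32" product id instead of using a literal lookup table.

```php
<?php
// Translate an old rewrite-style path to its new location, or return null
// for a genuine 404. The map entries are hypothetical examples.
function new_location($path) {
    $map = array(
        '/category/product-c32.html' => '/category/product.html',
        '/category/widget-c17.html'  => '/category/widget.html',
    );
    return isset($map[$path]) ? $map[$path] : null;
}

// Only runs in a web context; under CLI REQUEST_URI is not set.
if (isset($_SERVER['REQUEST_URI'])) {
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
    $new  = new_location($path);
    if ($new !== null) {
        // Recognized old URL: one 301 covers it, no shell .html file needed.
        header('Location: http://www.example.com' . $new, true, 301);
        exit;
    }
    // A real miss: send the 404 status explicitly, then any friendly page.
    header('HTTP/1.1 404 Not Found');
    echo 'Page not found.';
}
```

This is also where you could log $_SERVER['HTTP_REFERER'] to track where the old links are still coming from.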
Thank you for clarifying that, esoomllub. I will do some research, because I have never heard of that. I'm curious how it can handle a specific page-to-page redirect on an individual basis; it seems to me there should be a processing issue like you would have if you did it in a .htaccess. Help me see the light.