You can do this by simply renaming the file from .php to .html (for instance, rename "myfile.php" to "myfile.html"). Keep in mind, though, that by default the server will not run PHP code in a .html file, so any PHP in the renamed file will stop working, even though the same code runs fine under the .php extension.
I know, but I want to change the file extension from .php to .html; I don't want to change the file type (the files should still run as PHP).
Well, that is basically true. It is pretty much a no-brainer to add .html to the PHP parser in .htaccess, assuming the pages are being served by Apache. Something like AddType application/x-httpd-php55 .html is pretty simple.
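If your host allows it, here is a minimal sketch of that .htaccess approach. The exact handler name is host-specific, so application/x-httpd-php55 below is an assumption; some servers want AddHandler instead of AddType, so check with your provider.

# Run .html (and .htm) files through the PHP handler
# (handler name varies by host and PHP version -- this one is a placeholder)
AddType application/x-httpd-php55 .html .htm
Code (markup):

With that in place you can rename the files and the PHP inside them will still execute.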
Try using an .htaccess file in your project.

Step 1: Create a .htaccess file in any editor you work with.

Step 2: Write the code into the .htaccess file:

# Make sure you know how to handle these things.
RewriteEngine on
RewriteRule ^(.*)\.html$ $1.php [L]
Code (markup):

or

# Make sure you know how to handle these things.
# Redirect direct requests for .php URLs to .html,
# then internally serve .html requests from the .php file.
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]+)\.php
RewriteRule ^ /%1.html [R=301,L]
RewriteRule ^(.*)\.html$ $1.php [L]
Code (markup):

or

# Rewrite to www
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^tacnix\.com$ [NC]
RewriteRule ^(.*)$ http://www.tacnix.com/$1 [R=301,NC]

# 301 redirect old .php URLs to .html
RedirectMatch 301 ^(.*)\.php$ $1.html

# Caching schema
<FilesMatch "\.(jpg|png|css|js)$">
Header set Cache-Control "private, proxy-revalidate, max-age=3600"
</FilesMatch>

# Block bad bots
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]

# Prevent viewing of the .htaccess file
<Files .htaccess>
order allow,deny
deny from all
</Files>

# Prevent directory listings
Options All -Indexes
Code (markup):

You can also generate an .htaccess file with your own logic here: http://www.htaccessredirect.net/ and if you need something more specific about URL rewriting, I suggest a Google search.
You REALLY need to learn more about how things like servers work before answering questions. Renaming the files is the correct answer if the question is taken literally, but tacnix's approach of handling it as a 301 may be better, since it preserves old links while letting you use the new, proper .php links without renaming any files. I would do this IF all your new internal links are going to be .php and you just want .html to keep working for ancient backlinks.

...and if the OP isn't doing this to keep ancient backlinks working after rewriting the site, why would you do this? Really, that would be my reaction, answering your question with another: WHY?!? WHY would you do this?!? The only logical assumption is that you are taking an existing site and moving it to a PHP back-end, in which case the 301 is "better", but ONLY if you'll be using .php extensions going forward and the .html support is JUST for old off-site links.

Either way, if you have that level of access, it should be done in httpd.conf (assuming Apache) and NOT in .htaccess, for speed reasons; though how much that matters depends on the number of files in the directory, override rules for subdirectories, etc., etc. Hell, my own approach to site design doesn't even use extensions for web pages, since I'm using a whitelist redirect to make those "SEO friendly URIs", and stripping out the extension (any extension) is easy once you have that up and running.
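For what it's worth, a minimal sketch of the httpd.conf version of the "old .html backlinks 301 to .php" scenario (the ServerName, DocumentRoot, and paths below are placeholders, and mod_rewrite must already be loaded):

<VirtualHost *:80>
    # ServerName and DocumentRoot are assumptions for illustration
    ServerName example.com
    DocumentRoot /var/www/example

    # Rules live in the server config, so per-request .htaccess
    # scanning can be switched off entirely
    <Directory "/var/www/example">
        AllowOverride None
        Require all granted
    </Directory>

    RewriteEngine On
    # 301 ancient .html backlinks to the real .php pages
    RewriteRule ^/(.*)\.html$ /$1.php [R=301,L]
</VirtualHost>
Code (markup):

Flip the extensions if you want the opposite mapping; either way it runs once at request time from the compiled config instead of being re-read from .htaccess on every hit.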
As others have explained, if you are using Apache, mod_rewrite (configured through .htaccess files) does the job. However, if you are using nginx, you have to put the equivalent configuration inside nginx.conf (or a site config file included from it), since nginx does not read .htaccess files at all.
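A minimal sketch of what that nginx configuration could look like, assuming a PHP-FPM backend; the server_name, root, and fastcgi_pass socket path are all assumptions to adjust for your setup:

server {
    listen 80;
    server_name example.com;
    root /var/www/example;

    # Map requests ending in .html onto the matching .php script
    location ~ \.html$ {
        rewrite ^(.*)\.html$ $1.php last;
    }

    # Hand .php requests to PHP-FPM (socket path is an assumption)
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
}
Code (markup):

Test with nginx -t and reload with nginx -s reload after editing. Note this maps every .html request to PHP, so any genuine static .html files would need their own location block.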