I've been researching this for about a week and can't get a clear grasp of it, so if anyone could humor me it would be much appreciated.

I started a company site: http://www.rentittoday.com. It's a pretty extensive website that is predominantly DHTML. As you can see, we have multiple categories (vacation rentals, exotic cars, etc.), and each of these categories has its own table in our MySQL database, where the individual listings are stored.

I need some recommendations on how to create an XML sitemap that links directly to each of our individual listings by pulling from the MySQL database. That way, if someone posts a new item, it's automatically populated into the (dynamic?) sitemap. My plan is to have a basic sitemap that covers our static landing pages (sitemap1.xml) and a second, dynamic sitemap that pulls all of our listings (sitemap2.xml).

I've researched this extensively and have seen PHP scripts that claim to do this, among other things, but unfortunately I'm having a hard time wrapping my mind around it. If this takes a programming pedigree, should I consider sites like Rent a Coder? I'm not sure how daunting a task the above is. Is there a program or generator that streamlines this process?
Is there any logical hierarchy to the database's organization of the pages? If so, it would be more or less a matter of looping through the database.
If I'm understanding your question correctly, then yes, there is a logical hierarchy. We have 40-some categories, and each category has its own table, each with unique "items". For example, if you go to our exotic car category:

The webpage hierarchy takes you to our "exotic_cars.php" landing page. Hit search and it takes you to the "exotic_car_results.php" page. Click a listing and it pulls the appropriate "$exotic_car_rentalsRecord".

Thanks for the help.
I guess it would depend on how your database is set up, but I'm pretty sure it's possible and not very hard. I did this myself for my site, which is a very large site with 8 million different URLs. I'm not the best coder, but I'll include the script I made below (I've moved the closing </urlset> inside the PHP so the output is well-formed):

<?php
// how many rows to show per page (50,000 is the max URLs allowed per sitemap)
$rowsPerPage = 50000;

// by default we show the first page; if ?page= is set, use it as the page number
$pageNum = isset($_GET['page']) ? (int)$_GET['page'] : 1;

// counting the offset
$offset = ($pageNum - 1) * $rowsPerPage;

$db = mysql_connect("localhost", "username", "password");
mysql_select_db("dbname", $db);

$sql = "SELECT * FROM `YOURTABLE` LIMIT $offset, $rowsPerPage";
$result = mysql_query($sql, $db) or die(mysql_error());

echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
echo "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";

while ($row = mysql_fetch_array($result)) {
    $CATEGORY = $row["CATEGORY"];
    $BUSINESSNAME = $row["BUSINESSNAME"];
    echo "<url>\n";
    echo "  <loc>http://YOURDOMAIN.com/$CATEGORY-$BUSINESSNAME.htm</loc>\n";
    echo "</url>\n";
}

echo "</urlset>";
?>

This would be, say, xmlsitemap.php. I use mod_rewrite to name it whatever I want, and since I have 8 million URLs mine are simply 1.xml, 2.xml, 3.xml, etc., one file per 50,000 URLs, as that's the max allowed in a sitemap.
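As a side note, the mysql_* functions above were removed in PHP 7, so here's a rough PDO version of the same idea. The DSN, table name, column names, and URL pattern are all placeholders you'd swap for your own schema:

```php
<?php
// Sketch: build one sitemap page from one table using PDO.
// Column names (category, businessname) and the URL pattern are assumptions.
function buildSitemap(PDO $db, string $table, int $page, int $perPage = 50000): string
{
    $offset = ($page - 1) * $perPage;
    $stmt = $db->prepare("SELECT category, businessname FROM `$table` LIMIT :off, :cnt");
    $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
    $stmt->bindValue(':cnt', $perPage, PDO::PARAM_INT);
    $stmt->execute();

    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        // Encode each path segment so odd characters can't break the XML.
        $loc = 'http://YOURDOMAIN.com/' . rawurlencode($row['category'])
             . '-' . rawurlencode($row['businessname']) . '.htm';
        $xml .= "  <url><loc>" . htmlspecialchars($loc, ENT_XML1) . "</loc></url>\n";
    }
    return $xml . "</urlset>\n";
}

// Usage (credentials are placeholders):
// $db = new PDO('mysql:host=localhost;dbname=dbname', 'username', 'password');
// header('Content-Type: text/xml');
// echo buildSitemap($db, 'YOURTABLE', (int)($_GET['page'] ?? 1));
```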
The code given by edual200 should solve your problem already. It loops through the records in a table to generate a sitemap; run it once per table to get a separate sitemap for each, then consolidate them all with one master sitemap index file. If you still have problems you can PM me and I will whip up a quick script for you.
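The master index file itself is simple to generate as well; a minimal sketch (the filenames are just examples):

```php
<?php
// Sketch: build a sitemapindex that points at each per-table sitemap file.
// $files is a list of sitemap URLs -- adjust to your own naming scheme.
function buildSitemapIndex(array $files): string
{
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<sitemapindex xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($files as $url) {
        $xml .= "  <sitemap><loc>" . htmlspecialchars($url, ENT_XML1) . "</loc></sitemap>\n";
    }
    return $xml . "</sitemapindex>\n";
}

// Usage:
// echo buildSitemapIndex([
//     'http://YOURDOMAIN.com/sitemap1.xml',  // static landing pages
//     'http://YOURDOMAIN.com/sitemap2.xml',  // dynamic listings
// ]);
```

You'd submit only the index URL to the search engines; they follow it to each child sitemap.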
You need to pull the values from the database and generate the XML file dynamically. For example:

<?php
mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('new') or die(mysql_error());

$dat = date('Y-m-j');
header("Content-Type: text/xml");

echo '<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
            http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">';

$urls = mysql_query("SELECT * FROM url") or die(mysql_error());
while ($row_Tags = mysql_fetch_array($urls)) {
    echo "<url><loc>" . $row_Tags['url'] . "</loc><lastmod>" . $dat . "</lastmod><changefreq>daily</changefreq><priority>1</priority></url>";
}
echo "</urlset>";
?>

Reference: http://www.phponwebsites.com/2014/07/php-mysql-create-sitemap-dynamic.html
Just remember each sitemap has a limit of 50,000 URLs for Google. You can create multiple sitemap files on the fly and then use a sitemapindex file, also generated on the fly, to point to each of them.

I think it is definitely better to create the sitemaps using a cron job overnight or every couple of days. You can then gzip them, as most search engines can handle that format and sitemaps also have a file size limit. I am sure someone here will tell me why I shouldn't be doing that, lol! I have 30 million pages on one of my sites, and over 24 million successfully indexed on Google, using weekly created flat files.
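For the cron approach, the write-to-disk step is just a few lines; a rough sketch (the output path and script name are placeholders):

```php
<?php
// Sketch: write a sitemap string to disk as .xml.gz so search engines can
// fetch the compressed version. Path and XML content are illustrative only.
function writeGzippedSitemap(string $path, string $xml): void
{
    $gz = gzopen($path, 'wb9');  // 'wb9' = write binary, max compression
    gzwrite($gz, $xml);
    gzclose($gz);
}

// Then point cron at the script that builds and writes each file, e.g. nightly:
//   0 3 * * * php /var/www/scripts/build_sitemaps.php
```

Your robots.txt or the sitemapindex can then reference sitemap1.xml.gz, sitemap2.xml.gz, and so on.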