Hi guys, I'm planning to have multiple sites share one specific DB. Any thoughts on doing this? What steps are necessary so that I can still manage the individual sites later on, and transfer one of them without destroying the DB for the other sites?
What kind of scenario have you got? I'm in the process of developing multiple tutorial sites with a single DB and a single admin panel. The way I do it is to have a table called sites with two important fields: code and name. Code is something like afftut, and name could be Affiliate Tutorials. Then each content table, such as tutorials, has a field called site_code, and afftut would be the code for that site. That way multiple sites read the same DB and each just selects the material for its own site. You could also use a code like global for content shared by all of the websites. That's how I do it anyway.
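The setup described above could be sketched roughly like this. The table and column names follow the post, but the exact types and the extra columns are assumptions, not a tested design:

```sql
-- Hypothetical sketch of the shared-DB layout described above
CREATE TABLE sites (
    code VARCHAR(20) PRIMARY KEY,   -- e.g. 'afftut'
    name VARCHAR(100)               -- e.g. 'Affiliate Tutorials'
);

CREATE TABLE tutorials (
    id INT AUTO_INCREMENT PRIMARY KEY,
    site_code VARCHAR(20),          -- which site owns this row ('global' = shared)
    title VARCHAR(200),
    description TEXT
);

-- Each site pulls only its own rows plus the shared ones:
SELECT title, description
FROM tutorials
WHERE site_code IN ('afftut', 'global');
```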
Of course you can do it, but you'll have to allocate a bit more bandwidth for the database server, since every site's queries will hit it.
If all you need to fetch is the tutorial itself (text or HTML), just use cURL to scrape a page on the "server" (the site with the DB). You might lose a few milliseconds, since a direct database connection is a bit faster, but it would literally take you 10 minutes to write the code and have it working on all sites. I look at this method as similar to an API: I can store information on one site (like GeoIP info) and use it for all of my sites. Just put the master server on an underused VPS or dedi and monitor downtime closely.

If you are fetching by ID or keyword, simply set up one page that reads the database and exports the desired information. From there you just send the ID or keyword as a GET variable, scrape what you need, and display it immediately. So on your satellite pages:

```php
echo file_get_the_contents('http://www.databaseserver.com/db_output.php?id=234');

function file_get_the_contents($url) {
    $ch = curl_init();
    $timeout = 5; // set to zero for no timeout
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $file_contents = curl_exec($ch);
    curl_close($ch);
    return $file_contents;
}
```

And on db_output.php (your server):

```php
// database connection goes here
// note: the mysql_* functions are deprecated; mysqli or PDO is the modern equivalent
$query = mysql_query("SELECT description FROM tutorial_db WHERE id = " . (int) $_GET['id']);
echo htmlentities(mysql_result($query, 0, 'description')); // column name must be quoted
```

Just a little two-minute hack that works more easily than dealing with remote DBs. If you're concerned about other people using your setup, lock it down by only outputting data to specified IP addresses, and rickroll them if the IP addy doesn't match.
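The IP lockdown mentioned at the end could look something like the sketch below, placed at the top of db_output.php. The `ip_allowed` helper and the addresses in `$allowed` are my own placeholders, not anything from the thread:

```php
<?php
// Hypothetical sketch: lock db_output.php down to known satellite IPs.
// The $allowed list is an assumption; fill in your satellites' real addresses.
function ip_allowed($ip, array $allowed) {
    return in_array($ip, $allowed, true);
}

$allowed = array('203.0.113.10', '203.0.113.11'); // example satellite IPs

if (isset($_SERVER['REMOTE_ADDR']) && !ip_allowed($_SERVER['REMOTE_ADDR'], $allowed)) {
    // unknown caller: send the promised rickroll instead of your data
    header('Location: https://www.youtube.com/watch?v=dQw4w9WgXcQ');
    exit;
}
```

Checking `$_SERVER['REMOTE_ADDR']` is enough for this hack, but note it can be spoofed behind some proxy setups.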
It is possible, but you may have to use the same webhost for all the websites. Many shared hosting companies block remote connections to their MySQL databases for security reasons, though some let you whitelist specific remote hosts in the control panel.
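Where the host does allow it, remote access is granted per user and host on the MySQL side. A rough sketch, assuming you control the MySQL server (the user name, IP address, password, and database name here are placeholders, not details from the thread):

```sql
-- Hypothetical: let one satellite server's IP read the shared database
CREATE USER 'siteuser'@'203.0.113.10' IDENTIFIED BY 'change-this-password';
GRANT SELECT ON tutorial_db.* TO 'siteuser'@'203.0.113.10';
FLUSH PRIVILEGES;
```

Granting only SELECT keeps a compromised satellite from modifying the shared data.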