Would it be wise to import 30k–2 MB .txt files into a MySQL database? I could probably import them on my server by splitting them up or with a script, so no problem there. How sluggish do the largest MySQL databases get? Do people often design their databases so that records are split up and each database stays smaller and easier to work with? And how would I go about it? Whether it's wise or not, I'll share the results with whoever tells me. I've got Project Gutenberg's ebooks on my hard drive and would like to import 4 to 8 GB of text into a .sql file. Any suggestions?

Regards,
Ventage
Hmm... that's a question that can have many answers. It all depends on various factors, like what you want to do with the data, e.g. performing searches on the text field, or serving downloads straight from SQL. If the latter is the only use case, then I would not recommend it, as it will add a lot of overhead, especially if you have many visitors. As for your question of how: I would create a binary (BLOB) column in the table and run the text through "mysql_real_escape_string" (assuming you're using PHP) before building the INSERT statement. If you could be more specific about what exactly you plan to do with the data, I can give you more in-depth advice.
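A minimal sketch of what that could look like, assuming PHP. The table name `books`, its columns, and the file path are hypothetical, and it uses the procedural mysqli functions (mysqli_real_escape_string being the current counterpart of the mysql_real_escape_string mentioned above):

<?php
// Sketch: insert each .txt file as one row in a table with a LONGBLOB column.
// Assumed schema (not from the original posts):
//   CREATE TABLE books (filename VARCHAR(255), content LONGBLOB);

$db = mysqli_connect('localhost', 'user', 'password', 'gutenberg');
if (!$db) {
    die('Connection failed: ' . mysqli_connect_error());
}

foreach (glob('/path/to/gutenberg/*.txt') as $path) {
    $text = file_get_contents($path);

    // Escape the raw text before embedding it in the INSERT statement.
    $name    = mysqli_real_escape_string($db, basename($path));
    $content = mysqli_real_escape_string($db, $text);

    $sql = "INSERT INTO books (filename, content) VALUES ('$name', '$content')";
    if (!mysqli_query($db, $sql)) {
        echo 'Insert failed for ' . basename($path) . ': ' . mysqli_error($db) . "\n";
    }
}

mysqli_close($db);
?>

Note that for files of a couple of megabytes you may need to raise MySQL's max_allowed_packet setting, and a prepared statement with a bound parameter would avoid the manual escaping altogether.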