Just a general question: does anybody here have experience running their own servers with a large database (~1000 MB)? If so, what PC specs would you recommend (processor, memory, etc.)? I'm trying to develop a large Perl-driven directory, and my current load times are awful: the raw site without the directory structure loads in under 0.01 seconds, but with the directory structure load times run up to 17-20 seconds. Some of the site's Perl scripts crash with an "Internal Server Error" message, which I'm putting down to low PC specs. Currently running Windows Pro, 1.7 GHz with 500 MB RAM, PHP 5, MySQL 4.1.22, and Perl.
1 GB is a small database; put it on a machine with 2 GB of memory and it'll fly, since all the tables will fit in memory. Also check your DB design, indexes, and queries so you don't have table scans; that matters more than the speed of the hardware.
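For example, you can ask MySQL how it plans to execute a query with EXPLAIN (the table and column names here are just placeholders for illustration):

    EXPLAIN SELECT * FROM directory WHERE category = 'sports';

If the output shows type = ALL, MySQL is reading every row in the table. After you add an index on the column in the WHERE clause, you should see type = ref (or range) and a much smaller rows estimate.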
I tried that but ran into some clashes. The problem is I can't load a .dat file which is 800+ MB. The directory structure loads OK (80 MB, into one table), but it slows the site down 100+ times. You mentioned "table scans"; what were you referring to? I know a little about MySQL and Perl.
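One thing worth checking on the load failure: if the .dat file is plain delimited text, LOAD DATA INFILE streams it from disk rather than pulling it all into memory, so the 800 MB size by itself shouldn't be fatal. A sketch, assuming a tab-delimited file and a table called directory (adjust the terminators and names to match your actual format):

    LOAD DATA LOCAL INFILE 'directory.dat'
    INTO TABLE directory
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';

For a bulk load that size it also helps to disable the table's secondary indexes first and rebuild them afterwards, since updating indexes row by row is much slower than building them in one pass at the end.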
Table scans: if you don't have a good index on a column and want to retrieve data from a table, SQL doesn't know which rows to fetch, so it searches through the whole table, from the first row to the last. That takes a lot of time, so always put a good primary key on a table and add indexes on the other columns you will select on.
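As a concrete example (table and column names are placeholders), fixing a scan on a lookup column is usually one statement each:

    ALTER TABLE directory ADD PRIMARY KEY (id);
    CREATE INDEX idx_category ON directory (category);

With those in place, the query from the EXPLAIN example above goes from reading every row to a direct index lookup, which is where most of your 17-20 second page times are likely going.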