I need big MySQL Optimization Help

Discussion in 'MySQL' started by smitts, Jul 20, 2007.

  1. #1
    I'm desperately trying to improve the page load times on DealScans, without much success so far. Here are some details:

    My Main Query which gets run most frequently is:

    SELECT SQL_CALC_FOUND_ROWS HIGH_PRIORITY
           itemandstore.ID, STORETXT, TITLE, COST, AVAIL, PRICECHNG, PERCENTCHNG, TIMECHNG, IMGURL, RATE
    FROM items
    LEFT JOIN itemids ON itemandstore.ID = items.ID
    LEFT JOIN stores ON itemids.STOREID = stores.STOREID
    WHERE TIMECHNG > DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 24 HOUR)
      AND RATE >= 25
      AND AVAIL = 1
    ORDER BY RATE DESC
    LIMIT 0, 25

    This is the output of EXPLAIN for the above statement:

    id  select_type  table           type    possible_keys        key        key_len  ref                        rows   Extra
    1   SIMPLE       items           range   TIMEINDEX,RATEINDEX  TIMEINDEX  5        NULL                       11251  Using where; Using filesort
    1   SIMPLE       itemandstoreid  eq_ref  PRIMARY              PRIMARY    8        scannerdb.items.ID         1
    1   SIMPLE       stores          eq_ref  PRIMARY              PRIMARY    4        scannerdb.itemids.STOREID  1
    This query can sometimes take 3-4 seconds or more. (Note: I have seen it use RATEINDEX instead of TIMEINDEX, in which case the filesort goes away.)


    I have multiple-column indexes set up for each parameter that can be searched on, and the query does seem to be using them, but I think I'm doing something wrong since it's not very fast, and when I look at my MySQL runtime info I'm getting high values for Handler_read_rnd and Handler_read_rnd_next.
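
    For reference, here is a minimal sketch of the kind of composite index that usually helps with this shape of query (the column names come from the query above; the index name is made up). AVAIL is an equality filter and RATE drives both the >= 25 range and the ORDER BY, so putting them together in one index lets MySQL read the rows already in RATE order and skip the filesort:

        -- Hypothetical composite index: equality column first, then the range/sort column.
        ALTER TABLE items ADD INDEX idx_avail_rate (AVAIL, RATE);

        -- Re-check the handler counters afterwards to see whether the random reads drop.
        SHOW STATUS LIKE 'Handler_read%';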


    The table has 4.5 million items, and the machine it runs on has 384MB of RAM. Do I need a new server or more RAM for better performance? I'm guessing the real problem is just that my queries aren't optimized.
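
    For what it's worth, one quick way to see whether the 384MB box itself is the limit is to compare the index size of the big table against the key buffer (a sketch, assuming MyISAM tables, which were the default at the time):

        -- Data_length / Index_length show how big the 4.5M-row table really is.
        SHOW TABLE STATUS LIKE 'items';

        -- How much memory MySQL is allowed to use for MyISAM index blocks.
        SHOW VARIABLES LIKE 'key_buffer_size';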

    Thanks!
     
    smitts, Jul 20, 2007 IP
  2. lowridertj

    lowridertj Well-Known Member

    #2
    No, I suggest setting up a global function to call the database and keeping the information cached on the server. That way it is readily available rather than having to be reloaded each time it's called.

    This can cut your load time by 3-6x and make your script perform faster.
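
    A server-side variant of the same idea (not mentioned above, so treat it as a side note) is MySQL's built-in query cache. One caveat: it will not cache statements that use non-deterministic functions such as CURRENT_TIMESTAMP or NOW(), which the query in post #1 does, so it would only help the other searches:

        -- Check whether the query cache is compiled in and how big it is.
        SHOW VARIABLES LIKE 'query_cache%';

        -- Example sizing only; tune to the 384MB of RAM available.
        SET GLOBAL query_cache_type = 1;
        SET GLOBAL query_cache_size = 32 * 1024 * 1024;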

    Also, if the data is updated from time to time, you can set up a simple maintenance script to run every hour or day to optimize the tables, so they stay fast to query as well.
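
    A minimal sketch of that kind of maintenance job (table names taken from the query in post #1; scheduling via cron or similar is up to you):

        -- Rebuilds the tables, defragments them, and sorts/refreshes the indexes (MyISAM).
        OPTIMIZE TABLE items, itemids, stores;

        -- Lighter option that only refreshes the index key-distribution statistics.
        ANALYZE TABLE items, itemids, stores;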

    TJ
     
    lowridertj, Jul 20, 2007 IP
  3. smitts

    smitts Peon

    #3
    I have thought about that, and I will probably go that route soon, but I have an option where users can run any search using any combination of filters they want.

    I could cache them for the next time the user performs the same search, but I still think it is important for me to get the queries optimized.
     
    smitts, Jul 20, 2007 IP
  4. lowridertj

    lowridertj Well-Known Member

    #4
    The queries can be optimized so that they go through global functions and use cached information.

    i.e.:

    $getinfo = $db->query($querie);   // the wrapper handles the connection and caching

    rather than

    $result  = mysql_query($querie);
    $getinfo = mysql_fetch_array($result);


    Hope that helps. If you need more information, I'd need the full line of code to give a better example of how it would be set up.


    The $db-> above would basically call your global database connection file and the other files that help handle the cached database information.


    TJ
     
    lowridertj, Jul 20, 2007 IP
  5. smitts

    smitts Peon

    #5
    I'm not quite sure what you're suggesting.

    I plan on caching the pages that get searched the most often, but I still need to index the table better so that queries come back quicker for those that aren't cached.


    I would be willing to pay someone for their expertise in this issue, PM me if you are interested.

    Thanks
     
    smitts, Jul 20, 2007 IP
  6. lowridertj

    lowridertj Well-Known Member

    #6
    So set the indexes based on what's being queried:

    ALTER TABLE `your_table` ADD INDEX ( `column_queried` );
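
    After adding an index it's worth re-running EXPLAIN to confirm the new key shows up in the `key` column and "Using filesort" disappears from Extra. A cut-down sketch against the driving table only (columns taken from the query in post #1):

        EXPLAIN SELECT ID, RATE, TIMECHNG
        FROM items
        WHERE TIMECHNG > DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 24 HOUR)
          AND RATE >= 25
          AND AVAIL = 1
        ORDER BY RATE DESC
        LIMIT 0, 25;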
     
    lowridertj, Jul 21, 2007 IP