I have a web app (MySQL and PHP) that lets people create an item with up to 200 records, which I store in a single table. Any user subscribing to that item pulls up to 4 of that item's records each time they load a page. Sometimes the 4 records are chosen randomly, which can take many selects to find 4 correct ones. I expect thousands of items to be created, which could lead to millions of records and billions of selects. Will this approach scale, or should I be creating a new table for each set of records? That could lead to thousands of tables in the database. Any problems there?
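For reference, here is a minimal sketch of the single-table approach being asked about. The table and column names (`item_records`, `item_id`, `content`) are assumptions for illustration, not taken from the actual app:

```sql
-- Single shared table; every item's records live here, keyed by item_id.
CREATE TABLE item_records (
    id       INT UNSIGNED NOT NULL AUTO_INCREMENT,
    item_id  INT UNSIGNED NOT NULL,   -- which item this record belongs to
    content  VARCHAR(255) NOT NULL,   -- placeholder for whatever each record holds
    PRIMARY KEY (id),
    KEY idx_item (item_id)            -- index so per-item lookups stay fast as rows grow
) ENGINE=InnoDB;
```

With an index on `item_id`, each page load only touches the handful of rows belonging to one item, regardless of how many millions of rows the table holds overall.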
Why not just do one big select? SELECT &lt;1st criteria&gt; UNION SELECT &lt;2nd criteria&gt; UNION ... etc. What are you storing that requires 200 records per item? Is your database properly normalized?
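Something like the following is presumably what's meant here; the criteria and table name are placeholders, since the actual selection logic isn't shown in the question:

```sql
-- Combine several per-criterion lookups into one round trip to the server.
SELECT * FROM item_records WHERE item_id = 42 AND content LIKE 'a%'
UNION
SELECT * FROM item_records WHERE item_id = 42 AND content LIKE 'b%'
UNION
SELECT * FROM item_records WHERE item_id = 42 AND content LIKE 'c%';
```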
Thanks for your reply. I'm performing joins and such where possible, and I'm not worried so much about that aspect of my app. Put more simply, my question is: should I store many sets of records, grouped by an ID, in one table, or separate them by creating a new table for each set of records a user creates?
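Assuming the single grouped table sketched above, the "4 random records" case from the original question could look like this; `ORDER BY RAND()` is usually fine here because it only shuffles the rows matching the indexed `item_id` (at most ~200), not the whole table:

```sql
-- Pull 4 random records for one item from the shared table.
SELECT *
FROM item_records
WHERE item_id = 42
ORDER BY RAND()
LIMIT 4;
```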