I've got a PHP+MySQL website that displays data for 300,000 products: name, description, how to fix it, where to buy it, etc.
The MySQL table is currently designed as follows:
- product_id: MEDIUMINT
- category_id: SMALLINT
- weight_kg: SMALLINT
- height_cm: SMALLINT
- width_cm: SMALLINT
- depth_cm: SMALLINT
- name: VARCHAR(100)
- label: VARCHAR(100)
- short_description: VARCHAR(200)
- long_description: VARCHAR(9999)
- how_to_fix_it: VARCHAR(9999)
- where_to_buy_it: VARCHAR(9999)
- similar_products: VARCHAR(9999) <--- stores "ACME 12|acme-12#ACME 555|acme-555" to link directly to other products, avoiding subsequent queries to look up the names and labels of the similar products (decoded as sketched below).
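For clarity, here is how that packed string decodes (a minimal PHP sketch; the format is assumed from the example value above: `#` separates products, `|` separates a name from its label):

```php
<?php
// Decode "ACME 12|acme-12#ACME 555|acme-555" into name => label pairs.
// Format assumed from the example value above.
function decodeSimilarProducts(string $raw): array
{
    $pairs = [];
    foreach (explode('#', $raw) as $entry) {
        if (strpos($entry, '|') === false) {
            continue; // skip empty or malformed segments
        }
        [$name, $label] = explode('|', $entry, 2);
        $pairs[$name] = $label;
    }
    return $pairs;
}

print_r(decodeSimilarProducts('ACME 12|acme-12#ACME 555|acme-555'));
// Array ( [ACME 12] => acme-12 [ACME 555] => acme-555 )
```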
I've found that the table is huge, mainly because of the long text fields (the VARCHAR(9999) columns), and this can slow down queries and the website. I think these VARCHAR(9999) fields do not need to live inside the table, since I never perform SQL operations on them; I only read them when displaying the page for a single product.
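To illustrate the access pattern (a minimal PDO sketch; the connection details and the `products` table name are my assumptions, the column names are from the schema above):

```php
<?php
// Placeholder connection; host/db/credentials are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass');

// Listing/search pages: only the small columns; the VARCHAR(9999)
// fields are never selected here.
$rows = $pdo->query(
    'SELECT product_id, name, label, short_description FROM products LIMIT 50'
)->fetchAll(PDO::FETCH_ASSOC);

// Single-product page: the only query that touches the large text columns.
$stmt = $pdo->prepare(
    'SELECT long_description, how_to_fix_it, where_to_buy_it, similar_products
       FROM products WHERE product_id = ?'
);
$stmt->execute([12345]);
$product = $stmt->fetch(PDO::FETCH_ASSOC);
```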
I'm considering creating 300,000 text files (product_id.txt) to be read by PHP, each one storing the text information, one field per line:
long_description: blah blah
how_to_fix_it: blah blah
where_to_buy_it: blah blah
similar_products: blah blah
Each time I display a single product, PHP would read the corresponding text file from disk, which I think would be faster.
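A minimal sketch of that read path, assuming the `field_name: value` one-line-per-field layout shown above (the base directory and the helper name are placeholders):

```php
<?php
// Read one product's text file and split it back into fields.
// Assumes the "field_name: value" one-field-per-line layout shown above.
function readProductFile(int $productId, string $baseDir = '/data/products'): array
{
    $path = sprintf('%s/%d.txt', $baseDir, $productId);
    $fields = [];
    foreach (file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        if (strpos($line, ': ') === false) {
            continue; // skip lines that do not match the expected layout
        }
        [$key, $value] = explode(': ', $line, 2);
        $fields[$key] = $value;
    }
    return $fields;
}

$info = readProductFile(12345);
echo $info['long_description'];
```

One caveat with this layout: it assumes no field value contains a newline, otherwise a multi-line long_description would spill into the next field.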
I would appreciate hearing about any similar experience dealing with these issues. Is there a smart way to store the 300,000 files? I was thinking of 300 subdirectories ('000', '001', ..., '299'), each holding 1,000 files, to make reads faster.
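The directory layout I have in mind would map product_id to a path like this (a sketch; the sequential bucketing assumes ids run densely from 1 to 300,000):

```php
<?php
// Map a product_id to its shard: 300 buckets ('000'..'299'),
// 1,000 sequential ids per bucket, assuming ids run 1..300000.
function productFilePath(int $productId, string $baseDir = '/data/products'): string
{
    $bucket = intdiv($productId - 1, 1000); // 0 for ids 1..1000, ..., 299 for 299001..300000
    return sprintf('%s/%03d/%d.txt', $baseDir, $bucket, $productId);
}

echo productFilePath(1), PHP_EOL;      // /data/products/000/1.txt
echo productFilePath(1001), PHP_EOL;   // /data/products/001/1001.txt
echo productFilePath(300000), PHP_EOL; // /data/products/299/300000.txt
```

The readProductFile() sketch above would then build its $path with productFilePath() instead of a flat directory.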