
This is a follow-up to my question from yesterday, where a comment asking why I don't store the 'cache' in the database (and my testing of a folder architecture with several million files split up into subfolders holding up to 1,000 each) made me rethink whether my approach is right at all:

1 million or more files in one folder, for include (cache)

To go into a bit more detail: I have an expensive calculation that needs many database queries. Every request targets a specific user, and the result for each user is always exactly the same (until that user makes major changes). So I was thinking: why do this operation every time someone visits the user's page, if I can do it once and store it?

The result is a set of different PHP objects holding various data, settings, etc. So I was thinking about creating a .inc file that creates those objects (with predefined values that would otherwise come from SELECTs). Every time someone calls up that specific user, I just do a quick include and the problem is solved.
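For illustration, a minimal sketch of what I mean - the paths and the helper names (writeUserCache(), runExpensiveCalculation()) are just placeholders, not existing code:

    <?php
    // Build the cache file once, right after the expensive queries have run.
    // var_export() works directly for arrays/scalars; objects need __set_state().
    function writeUserCache($userId, array $data) {
        $file = "/var/cache/app/user_{$userId}.inc.php";
        $code = "<?php\nreturn " . var_export($data, true) . ";\n";
        file_put_contents($file, $code, LOCK_EX);
    }

    // On every page call: include the file if it exists, otherwise rebuild it.
    function loadUserCache($userId) {
        $file = "/var/cache/app/user_{$userId}.inc.php";
        if (is_file($file)) {
            return include $file;
        }
        $data = runExpensiveCalculation($userId);   // the existing expensive logic
        writeUserCache($userId, $data);
        return $data;
    }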

Now I'm not so sure about that solution anymore. It would mean that every user has at least one .inc file (multiple are possible) to include. With an UNSIGNED MEDIUMINT as the PK of the user table, I would need a folder structure that supports at least 8388607 * ~2 files (worst case).
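Roughly, the splitting I mean looks like this (the layout and the path are only an example):

    <?php
    // Hypothetical sharding: derive a subfolder from the numeric user ID so no
    // single directory holds more than ~1,000 files.
    function cachePathForUser($userId) {
        $padded = str_pad((string) $userId, 8, '0', STR_PAD_LEFT);
        $dir    = '/var/cache/app/' . substr($padded, 0, 5);   // up to ~17,000 dirs
        if (!is_dir($dir)) {
            mkdir($dir, 0775, true);
        }
        return $dir . '/' . substr($padded, 5) . '.inc.php';   // <= 1,000 files each
    }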

I think I'm searching in the wrong direction here. What are my possibilities?

  1. SELECT the data and calculate / put it together on every page call - always getting the same result
  2. The same as 1., but store the resulting objects in a file as a code snippet, to include
  3. The same as 2., but serialize the resulting objects, save them in a file or in the database, and unserialize them on every page call (sketched below this list)
  4. ?
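To make option 3 concrete, a rough sketch of the file-based serialize variant (names and paths are again placeholders):

    <?php
    // Serialize the finished objects once, unserialize them on later page calls.
    function getUserObjects($userId) {
        $file = "/var/cache/app/user_{$userId}.ser";
        if (is_file($file)) {
            // The classes of the cached objects must already be loaded here.
            return unserialize(file_get_contents($file));
        }
        $objects = runExpensiveCalculation($userId);   // the existing expensive logic
        file_put_contents($file, serialize($objects), LOCK_EX);
        return $objects;
    }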

Are there other solutions or ways to solve this kind of issue? None of these options sounds too clean to my ears, but I'm struggling to find another way. Holding 8 million files, or calculating the same thing over and over again - there should be something in between, somehow.

Any suggestions?

Thanks! - and excuse my poor English.

Katai

1 Answer


I think that there are a number of options for you here, though without further detail it is hard to pick the best one:

You could semi-cache these objects in their own database table if they aren't likely to change constantly, and only do the expensive calculation when the result isn't found in the semi-cache table (placing it there once it has been calculated). You might even want a schedule that rebuilds these entries regularly (nightly, say) to take advantage of downtime on your server.
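A rough sketch of what I mean - user_result_cache and runExpensiveCalculation() are invented names standing in for your table and your existing logic:

    <?php
    // Invented table: user_result_cache(user_id PK, payload MEDIUMBLOB, built_at DATETIME).
    // Run from cron (e.g. nightly) to rebuild entries that have gone stale.
    function refreshStaleCaches(PDO $pdo) {
        $ids = $pdo->query(
            'SELECT user_id FROM user_result_cache
              WHERE built_at < NOW() - INTERVAL 1 DAY'
        )->fetchAll(PDO::FETCH_COLUMN);

        $update = $pdo->prepare(
            'UPDATE user_result_cache SET payload = ?, built_at = NOW() WHERE user_id = ?'
        );
        foreach ($ids as $userId) {
            $payload = serialize(runExpensiveCalculation($userId)); // existing logic
            $update->execute(array($payload, $userId));
        }
    }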

You could store the results in the session or a cookie, check there as needed, and only recalculate from the database if they aren't present.
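For the session route, something along these lines (again, runExpensiveCalculation() stands in for your existing code):

    <?php
    // Per-visitor cache in the session: recalculate at most once per session.
    session_start();
    $key = 'user_result_' . $userId;               // $userId comes from the request
    if (!isset($_SESSION[$key])) {
        $_SESSION[$key] = runExpensiveCalculation($userId);
    }
    $result = $_SESSION[$key];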

You could use the MySQL query cache to hold these expensive calculations, so the results are likely to already be cached when they're needed. That could even mean pre-warming the cache for visitors you expect.
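A hedged example of pre-warming - the table name is invented, and keep in mind the query cache (MySQL 5.x, removed in 8.0) needs query_cache_type/query_cache_size enabled, only matches byte-identical query text, and is flushed whenever the underlying table changes:

    <?php
    // Warm the cache for users you expect to visit soon.
    $expectedUserIds = array(17, 42, 108);          // however upcoming visitors are predicted
    foreach ($expectedUserIds as $userId) {
        // SQL_CACHE asks MySQL to keep this result set in the query cache.
        $stmt = $pdo->prepare('SELECT SQL_CACHE * FROM user_settings WHERE user_id = ?');
        $stmt->execute(array($userId));
        $stmt->fetchAll();
    }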

Fluffeh
  • Thanks for the input, it helped me test a couple of things - I figured out that storing in the DB isn't the right solution after all; a direct lookup of an already gzipped file is simply better for avoiding DB calls – Katai Aug 21 '12 at 08:27