I have a 'theoretical' question, to see if a solution I'm planning makes sense or not:
I have a script that reads a lot of data from the database (settings, configuration, etc.) and puts it all together for every registered user. I won't go into too much detail about why or what exactly.
My idea was that I could do that work only once and cache the result in a .inc file named after the user's ID. If the user changes something, the file would of course be recreated.
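To make it concrete, something like this is what I have in mind (a rough sketch, the function names and paths are just placeholders, not my real code):

```php
<?php
// Sketch of the per-user cache idea: one includable file per user ID.

function cacheFileForUser(int $userId): string {
    return __DIR__ . '/cache/user_' . $userId . '.inc';
}

function rebuildUserCache(int $userId, array $settings): void {
    // Dump the assembled settings as PHP code so it can simply be include()d later.
    $php = '<?php return ' . var_export($settings, true) . ';';
    // Write to a temp file first, then rename over the old one, so readers never see a half-written file.
    $tmp = cacheFileForUser($userId) . '.tmp';
    file_put_contents($tmp, $php, LOCK_EX);
    rename($tmp, cacheFileForUser($userId));
}

function loadUserSettings(int $userId): ?array {
    $file = cacheFileForUser($userId);
    if (is_file($file)) {
        return include $file;   // cache hit: the file returns the cached array
    }
    return null;                // cache miss: rebuild from the database
}
```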
But now, let's suppose I do that with 1,000,000 files, or even more. Will I run into issues when including those files (always one specific file per request, never all of them at once)? Is that generally a good idea, or am I just putting even more stress on the server?
And I'm planning to put everything in the same cache folder. Would I see performance improvements if I split that folder up into multiple subfolders? For that I was thinking of something along the lines of the sketch below.
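This is just one possible layout, sharding the folder on the first characters of an md5 of the ID (the two-level scheme is purely an assumption on my side, not something I've settled on):

```php
<?php
// Sketch of a sharded cache layout, e.g. cache/ab/cd/user_123.inc

function shardedCacheFileForUser(int $userId): string {
    $hash = md5((string) $userId);
    $dir  = __DIR__ . '/cache/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true);   // create the shard directory on first use
    }
    return $dir . '/user_' . $userId . '.inc';
}
```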
Thanks for the help.