I am building a website with the expectation that hundreds (I wish thousands!) of 'get' queries per day will be cached in the filesystem for a couple of months.
Reading the cache documentation, however, I observe that the default values lean towards a small and fast cache cycle.
An old post describes how a strategy like the one I imagine wreaked havoc on their servers.
Of course, the current Django code seems to have evolved since 2012. However, the cache defaults still remain the same...
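For reference, this is roughly the configuration I have in mind (a sketch only; the path and the raised limits are illustrative, and I am not sure the values are sane for this use case):

```python
# settings.py -- filesystem cache with a long timeout (values illustrative)
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
        "LOCATION": "/var/tmp/django_cache",  # hypothetical path on my server
        "TIMEOUT": 60 * 60 * 24 * 60,  # ~two months, vs. the default of 300 seconds
        "OPTIONS": {
            # The default MAX_ENTRIES is 300, clearly tuned for a small, fast cycle;
            # I would need something far larger for a long-lived bulk cache.
            "MAX_ENTRIES": 100_000,
            "CULL_FREQUENCY": 4,  # cull 1/4 of entries when MAX_ENTRIES is reached
        },
    }
}
```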
I wonder whether I am on the right track or not.
My familiarity with caching is limited to enjoying the results of W3 Total Cache after it saved thousands of files in the relevant directories, without understanding anything beyond its basic settings.
How would an experienced developer approach "stage 1" of this task:
Assume there is no budget -yet- for solutions based on, for example, Redis (I know that "no budget" is not a valid argument).
How would you cache a steadily growing number of queries -large enough to form a bulk over time- for a long period, running on rather basic server resources?
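To make "stage 1" concrete, this toy sketch is roughly what I picture happening on disk: each query result stored as a pickled file keyed by a hash, recomputed only on a miss or expiry. This is my own illustration (all names hypothetical), not Django's actual implementation:

```python
import hashlib
import os
import pickle
import tempfile
import time

CACHE_DIR = os.path.join(tempfile.gettempdir(), "query_cache")  # hypothetical location
TWO_MONTHS = 60 * 60 * 24 * 60  # the long timeout I have in mind, in seconds

def cached_query(key, compute, timeout=TWO_MONTHS):
    """Return the result for `key` from disk, calling `compute()` only on a miss."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.md5(key.encode()).hexdigest())
    if os.path.exists(path):
        with open(path, "rb") as f:
            expires, value = pickle.load(f)
        if time.time() < expires:
            return value  # cache hit, still fresh
    # Miss (or expired): compute, then persist alongside its expiry timestamp
    value = compute()
    with open(path, "wb") as f:
        pickle.dump((time.time() + timeout, value), f)
    return value
```

My worry is exactly what happens when hundreds of thousands of such files accumulate: directory listing costs, culling, and inode usage on a basic server.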