
I am running Nginx and PHP on a Virtual Linux machine with Digital Ocean. I use a very simple folder and file system as a database that looks like this:

--Users
  --User1
    --file1.json
    --file2.json
  --User2
    --file1.json
    --file2.json

Nginx easily blocks the Users directory from public access, and each user has around 60 json files. At login, I do a file_get_contents() on each of them and send all of the data to the browser. When I need to save a file, I use file_put_contents() on just that file.
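
Roughly, the current approach looks like this (the $userId variable, the path under Users, and the helper names are simplified for this post, not the real code):

    <?php
    // Rough sketch only: paths and helper names are illustrative.

    function loadUserData(string $userId): array {
        $payload = [];
        foreach (glob(__DIR__ . '/Users/' . $userId . '/*.json') as $file) {
            // Key each file's contents by its base name, e.g. "file1" => {...}
            $payload[basename($file, '.json')] = json_decode(file_get_contents($file), true);
        }
        return $payload;
    }

    function saveUserFile(string $userId, string $name, array $data): void {
        // Only the one changed file gets rewritten.
        file_put_contents(
            __DIR__ . '/Users/' . $userId . '/' . $name . '.json',
            json_encode($data)
        );
    }

    // At login: send everything to the browser in one payload.
    // echo json_encode(loadUserData($userId));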

On user logout, I'd like to combine all of that user's files into 1 json file so that at the next login I can load the user faster by doing a file_get_contents() on just one file (a rough sketch of this merge follows the questions below). This is easy enough to do with PHP, but it just doesn't seem like the right way to do it. It feels like PHP should hand this off to some kind of service on the machine rather than handling it itself. My 3 questions are:

  1. How is this type of task normally handled?
  2. Is there a particular name I could google for the practice of maintaining data in two different forms (one for the initial load, and another that lets me update a small subset of the data without having to do a file_get_contents() on the whole file)?
  3. Or should I do away with the 60 smaller files and just store everything as 1 file? If I do this, should I be concerned about pulling in the whole 116kb file every time a user wants to update less than 1kb of data?
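
Here is the rough sketch of the logout-time merge I have in mind (combined.json and the helper name are just placeholders made up for this post):

    <?php
    // Sketch of the merge at logout; combined.json is a placeholder name.

    function combineUserFiles(string $userId): void {
        $userDir  = __DIR__ . '/Users/' . $userId;
        $combined = [];

        foreach (glob($userDir . '/*.json') as $file) {
            if (basename($file) === 'combined.json') {
                continue; // skip an older snapshot so it isn't folded into itself
            }
            $combined[basename($file, '.json')] = json_decode(file_get_contents($file), true);
        }

        // One write now, one read at the next login.
        file_put_contents($userDir . '/combined.json', json_encode($combined));
    }

    // Next login becomes a single read:
    // $payload = json_decode(file_get_contents(__DIR__ . '/Users/' . $userId . '/combined.json'), true);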
Shawn Cooke

1 Answer


You should really look into SQL databases. If your application is small, you could use SQLite, and if your application is more demanding, you could use MySQL. You can still store your data in JSON format in the database, then just extract your information and echo it to the client.
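
For example, a minimal sketch with SQLite through PDO could look something like this (the database file, table, and column names below are only for illustration, not something from your setup):

    <?php
    // Illustrative only: app.db, user_docs, and its columns are made-up names.
    $pdo = new PDO('sqlite:' . __DIR__ . '/app.db');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // One row per (user, document); the JSON body is stored as text.
    $pdo->exec('CREATE TABLE IF NOT EXISTS user_docs (
        user_id  TEXT NOT NULL,
        doc_name TEXT NOT NULL,
        body     TEXT NOT NULL,
        PRIMARY KEY (user_id, doc_name)
    )');

    // Update one small document (replaces file_put_contents on one file).
    $updatedData = ['example_key' => 'example value']; // stand-in for the ~1kb update
    $stmt = $pdo->prepare('INSERT OR REPLACE INTO user_docs (user_id, doc_name, body) VALUES (?, ?, ?)');
    $stmt->execute(['User1', 'file1', json_encode($updatedData)]);

    // Load everything for a user at login (replaces ~60 separate reads).
    $stmt = $pdo->prepare('SELECT doc_name, body FROM user_docs WHERE user_id = ?');
    $stmt->execute(['User1']);
    $payload = [];
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $payload[$row['doc_name']] = json_decode($row['body'], true);
    }
    echo json_encode($payload);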

Chico3001