I have a large and important project, but I'm stuck on choosing how to handle its data. I'm deciding between two approaches:
- Database: with a database I could sort and filter the data. Do you think filtering would be faster this way than coding it in Python? And would querying, filtering, or otherwise accessing the database from more than one program at the same time cause any problems?
- Files: I currently store data in text files and parse it out of them, but many threads and multiprocessing workers need to open the same file, so I sometimes get "PermissionError: Permission denied", which is understandable.
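A simplified version of what I'm doing now (the file name and record fields are made up for illustration; the real project has many more workers):

```python
import json
from concurrent.futures import ThreadPoolExecutor

DATA_FILE = "data.txt"  # hypothetical name; each line is one JSON record

def write_sample():
    # In the real project another script rewrites this file every 10 seconds
    with open(DATA_FILE, "w") as f:
        for i in range(100):
            f.write(json.dumps({"id": i, "price": i}) + "\n")

def filter_records(min_price):
    # Every thread/process opens the same file; when another process
    # has it open for writing at that moment, this open() sometimes
    # raises "PermissionError: [Errno 13] Permission denied" (on Windows)
    with open(DATA_FILE) as f:
        records = [json.loads(line) for line in f]
    return [r for r in records if r["price"] >= min_price]

write_sample()
with ThreadPoolExecutor(max_workers=4) as pool:
    counts = [len(r) for r in pool.map(filter_records, [10, 50, 90])]
print(counts)  # -> [90, 50, 10]
```

This works most of the time, but once in a while one of the concurrent opens fails with the permission error.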
In short: my software does a lot of data processing and filtering (based on some conditions), and often several functions or separate Python scripts need to access the same data at the same time, which with files produces the "Permission denied" error.
So how can I filter and process the data without running into these errors? What tools should I use? I need a fast method for all of this.
Note: the data is updated every 10 seconds.
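For the database option, I was considering something like the sketch below with the built-in sqlite3 module (the file name, table name, and columns are made up; the real schema is bigger). Would having one writer refresh the table every 10 seconds while several other processes run SELECT queries be safe and fast enough?

```python
import sqlite3

DB_FILE = "data.db"  # hypothetical name; would replace the text files

def setup():
    con = sqlite3.connect(DB_FILE)
    con.execute(
        "CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, price REAL)"
    )
    con.commit()
    con.close()

def refresh(rows):
    # Would run every 10 seconds in the updater script;
    # "with con:" wraps the delete+insert in one transaction
    con = sqlite3.connect(DB_FILE)
    with con:
        con.execute("DELETE FROM records")
        con.executemany("INSERT INTO records (id, price) VALUES (?, ?)", rows)
    con.close()

def filter_records(min_price):
    # The filtering condition is pushed into SQL instead of a Python loop
    con = sqlite3.connect(DB_FILE)
    rows = con.execute(
        "SELECT id, price FROM records WHERE price >= ?", (min_price,)
    ).fetchall()
    con.close()
    return rows

setup()
refresh([(i, float(i)) for i in range(100)])
print(len(filter_records(50)))  # -> 50
```

Is this the right direction, or is there a better tool for this kind of concurrent read/filter workload?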