I need to make a script that filters a dataset and then fetches the matching records.
The problem is that the total file size will be about 50 GB.
I want to filter the data as fast as possible.
I thought of two solutions:
- A MySQL database
- An in-memory database
From what I have heard, MySQL would be a really bad fit, because I have a huge amount of data to read and the query speed could be terrible.
On the other hand, I think an in-memory database could be a really good solution:
load the 50 GB of content when the script starts, then read everything from memory, which should be much faster according to this post (and some others saying the same thing):
https://www.quora.com/Is-the-M-2-SSD-approaching-RAM-speeds
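To make it concrete, here is roughly what I mean by "memory database" (a minimal Go sketch; the file name `data.csv`, the comma-separated layout, and the filter condition are just placeholders, not my real data):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	// Load the whole file into memory once at startup.
	f, err := os.Open("data.csv") // placeholder file name
	if err != nil {
		panic(err)
	}
	defer f.Close()

	var records []string
	scanner := bufio.NewScanner(f)
	scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // allow long lines
	for scanner.Scan() {
		records = append(records, scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		panic(err)
	}

	// Then filter the in-memory slice as often as needed.
	// Placeholder filter: keep rows whose second column equals "FR".
	var matches []string
	for _, r := range records {
		cols := strings.Split(r, ",")
		if len(cols) > 1 && cols[1] == "FR" {
			matches = append(matches, r)
		}
	}
	fmt.Printf("loaded %d rows, %d matched\n", len(records), len(matches))
}
```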
Can someone confirm that my choice is correct, and if not, what would be the best solution? I could write the in-memory system in C#, Node.js, VB, or maybe Go.