I need to make a script to fetch data after filtering it.
The problem is that the total file size should be about 50GB,
and I want to filter the data as FAST as possible.
I thought of two solutions:
- MySQL database
- In-memory database

From what I've heard, MySQL should be a REALLY bad solution, because I have a huge amount of data to read and the speed could be pretty terrible.
On the other hand, I think an in-memory database could be a really good solution:
load the 50GB of content when the script starts, and then reading through memory would be much faster, according to this post (and some others saying the same thing):
https://www.quora.com/Is-the-M-2-SSD-approaching-RAM-speeds
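
Here is roughly what I have in mind, as a minimal sketch in Go (the file name `data.txt` and the filter predicate are just placeholders, and it assumes the whole dataset actually fits in RAM, which for 50GB means a machine with comfortably more than 50GB of memory):

```go
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("data.txt") // hypothetical input file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Load every record into a slice held in memory, once, at startup.
	var records []string
	scanner := bufio.NewScanner(f)
	scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // allow long lines
	for scanner.Scan() {
		records = append(records, scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}

	// Filtering is now a pure in-memory scan, with no disk I/O.
	var matches []string
	for _, r := range records {
		if strings.Contains(r, "needle") { // placeholder filter predicate
			matches = append(matches, r)
		}
	}
	fmt.Printf("matched %d of %d records\n", len(matches), len(records))
}
```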

Can someone confirm that my choice is correct, and if not, what would be the best solution? I could build the in-memory system in C#, Node.js, VB, or maybe Go?

xPyth
    Nothing prevents you from loading your complete MySQL database in memory. Your question is based on a faulty premise. – Robby Cornelissen Oct 25 '21 at 11:47
  • Well, so I would build a whole SQL database, then load the results into memory, right? Hmm, that could be a solution too. – xPyth Oct 25 '21 at 11:57
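
If I go that route, I guess it would look something like this (a rough sketch only; the table `records`, the columns `category`/`payload`, and the DSN are made up, and it assumes the go-sql-driver/mysql driver):

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // assumed MySQL driver
)

func main() {
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/mydb")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Let MySQL do the filtering; an index on the filtered column
	// (here, a hypothetical `category`) keeps the scan fast.
	rows, err := db.Query("SELECT payload FROM records WHERE category = ?", "needle")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	// Pull the already-filtered rows into memory.
	var matches []string
	for rows.Next() {
		var p string
		if err := rows.Scan(&p); err != nil {
			log.Fatal(err)
		}
		matches = append(matches, p)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("matched %d rows\n", len(matches))
}
```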

0 Answers