I am new here, so please bear with me. I am a beginner in data processing and analysis, and I would like to ask for help with a task.
I have three datasets (logs) in JSON format. Each is approximately 1.5 GB in size, and all three have the same attributes.
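To show what I am working with, here is how I peek at a single record (I am assuming the logs are newline-delimited JSON; the file name is a placeholder for my real files):

```python
import json

# Placeholder path -- one of the three log files.
with open("logs_part1.json", "r", encoding="utf-8") as f:
    first_record = json.loads(f.readline())

# All three files share the same attributes, so one record shows the schema.
print(sorted(first_record.keys()))
```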
I would like to analyze these datasets together (statistics and graphs for various attributes). Later, I would also like to detect patterns, trends, and relationships in the data.
How can I do this efficiently? What are the best practices? How should I deal with data of this size? I tried the pandas library, but it is very time-consuming. I prefer Python, but I'm open to other solutions :)
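For reference, here is roughly what I tried with pandas (a minimal sketch; the file names are placeholders, and again I am assuming newline-delimited JSON):

```python
import pandas as pd

# Placeholder names for my three ~1.5 GB log files.
files = ["logs_part1.json", "logs_part2.json", "logs_part3.json"]

# Loading each file whole and concatenating -- this is the slow part.
frames = [pd.read_json(path, lines=True) for path in files]
df = pd.concat(frames, ignore_index=True)

df.info()
```

Even reading a single file this way takes a very long time on my machine.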
This is very important to me. Thank you in advance for any help.