I am currently trying to use my Google Maps history and the Google Maps API to show something interesting, and I want to save all of the data into my MySQL DB. phpMyAdmin has an import size limit of 20MB. I tried PHP's file_get_contents, but it echoes nothing, since the whole file exceeds 80MB. I don't want to divide the file into several parts manually. Can anyone tell me something about processing very big (or just big) files in PHP, Java, or Python? PHP preferred. Thanks a lot!
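One way around the "divide the file manually" problem is to split the dump programmatically. A minimal Python sketch (the question lists Python as an option) that cuts any file into fixed-size pieces small enough for phpMyAdmin's 20MB upload limit — file names and the 15MB piece size here are illustrative assumptions, not part of the question:

```python
import os

def split_file(path, piece_size, out_prefix="part_"):
    """Split `path` into numbered pieces of at most `piece_size` bytes.

    Note: this cuts at byte boundaries, which is fine for feeding a
    loader script, but a raw SQL dump may be cut mid-statement; for
    phpMyAdmin imports you would split on statement boundaries instead.
    Returns the list of piece file names.
    """
    pieces = []
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(piece_size)
            if not chunk:
                break
            name = "%s%03d" % (out_prefix, index)
            with open(name, "wb") as dst:
                dst.write(chunk)
            pieces.append(name)
            index += 1
    return pieces
```

Importing via the mysql command-line client (`mysql -u user -p dbname < dump.sql`) also avoids phpMyAdmin's upload limit entirely, with no splitting at all.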
- In the PHP config file (displayed if you run phpinfo();), you can change the maximum execution time and size limits for PHP; I'm not sure about the other languages, though. – Jack hardcastle Aug 22 '15 at 22:52
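The limits that comment refers to live in php.ini; phpMyAdmin's import ceiling is derived from the upload/post limits there. A sketch of the relevant directives (the values shown are illustrative, not recommendations):

```
; php.ini — raise the upload/import ceilings (illustrative values)
upload_max_filesize = 128M
post_max_size = 128M
memory_limit = 512M
max_execution_time = 300
```

After editing, restart the web server so the new values take effect; phpinfo() will confirm what is actually loaded.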
- A general technique when processing big, big files is to avoid trying to keep it all in memory in the first place, and to process it incrementally. SAX works like that for XML, for example. I'm not aware of any PHP JSON-processing libraries that work that way, though. – icktoofay Aug 22 '15 at 22:57
- I already tried raising the ceiling to 512MB, which I think is big enough, but nothing changed. I also tried an event-based parser in Java, namely SAX, but I'd prefer something in PHP, since that's what I'm working in. Anyway, thank you all. – cinqS Aug 22 '15 at 23:03
- Possible duplicate: [Processing large JSON files in PHP](http://stackoverflow.com/questions/4049428/processing-large-json-files-in-php) – Ryan Vincent Aug 23 '15 at 02:55