
So I have a JSON file which is a cache of an API result that updates frequently. It has thousands of items (more than 110,000 lines in total, currently stored as a text file, about 3.50 MB). The PHP script has to iterate through it hundreds of times when it runs, so unsurprisingly it takes about 20 seconds to finish.

I was wondering if there is anything I could do to optimize it, or maybe a different approach entirely.
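For reference, a minimal sketch of the pattern that gets slow, assuming the cache decodes to a list of items with an `id` field (the field name is an assumption): each lookup re-scans the whole decoded array, so hundreds of lookups over 110,000+ lines add up.

```php
<?php
// Sketch of the slow pattern: every lookup does a full linear scan of the
// decoded cache, so N lookups cost O(N * number_of_items).
// The 'id' and 'price' fields are illustrative assumptions.

$items = json_decode('[{"id":"a1","price":10},{"id":"b2","price":25}]', true);

function findById(array $items, string $id): ?array {
    foreach ($items as $item) {   // full scan on every call
        if ($item['id'] === $id) {
            return $item;
        }
    }
    return null;
}

echo findById($items, 'b2')['price'], "\n"; // 25
```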

Thanks

bockzior
  • possible duplicate of [Processing large JSON files in PHP](http://stackoverflow.com/questions/4049428/processing-large-json-files-in-php) – Vikas Arora Feb 22 '14 at 05:49
  • 3.5 MB or 3.5 GB? What is the actual structure and usage? – Iłya Bursov Feb 22 '14 at 05:50
  • If you have a lot of data to iterate over, why not consider writing a separate application and running it with [exec](http://php.net/function.exec), letting it handle the data and output the desired result for the PHP script to pick up? What performance improvement can you expect when the data is already an array in PHP, other than restructuring the array order and such? – Feb 22 '14 at 05:59

1 Answer


This JSON parser should help you solve your problem:

https://github.com/kuma-giyomu/JSONParser
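Since the linked library's exact API isn't shown here, a library-agnostic sketch of the usual fix for repeated lookups: decode the cache once, build a hash index in a single pass, then every later lookup is a constant-time array access instead of a scan (the `id` field is an assumption about the cache structure).

```php
<?php
// Library-agnostic sketch: decode once, index once, look up many times.
// The 'id' and 'price' fields are illustrative assumptions.

$items = json_decode('[{"id":"a1","price":10},{"id":"b2","price":25}]', true);

// One linear pass to build a hash index keyed by id
$byId = [];
foreach ($items as $item) {
    $byId[$item['id']] = $item;
}

// Every subsequent lookup is O(1) instead of a full scan
echo $byId['b2']['price'], "\n"; // 25
```

The same index can also be built with `array_column($items, null, 'id')`; either way, the cost drops from hundreds of full scans to one pass plus constant-time lookups.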

Ananth