I use cURL to fetch a large chunk of data from a web service, and I keep exhausting PHP's memory limit when I run json_decode() on that data. I know I could raise the limit, but that is not a good solution since the data keeps growing.
The real problem is that I only need a small portion of the JSON I am fetching. So, to simplify things a bit, my data looks something like this:
{
    "array": [
        {   // object 1
            "field1": "xxx",
            "field2": "yyy",
            ...
            "field30": "zzz"
        },
        ...
        // object 15,000
    ]
}
Right now there are about 15,000 objects in the array, each with 30 fields. I expect the number of objects to grow to around 50,000 in the coming months.
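To put a number on it, here is a rough back-of-envelope test I can run: build a synthetic payload shaped like mine (15,000 objects × 30 short string fields) and measure what decoding it costs. This is synthetic data, not my real payload, so the absolute numbers are only indicative:

```php
<?php
// Build a synthetic payload shaped like the real one:
// 15,000 objects, each with 30 short string fields.
$objects = [];
for ($i = 0; $i < 15000; $i++) {
    $obj = [];
    for ($f = 1; $f <= 30; $f++) {
        $obj["field$f"] = "value$f";
    }
    $objects[] = $obj;
}
$json = json_encode(['array' => $objects]);
unset($objects); // keep only the JSON string, like after a curl_exec()

// Measure how much memory the decoded structure itself takes.
$before = memory_get_usage();
$data   = json_decode($json, true);
$after  = memory_get_usage();

printf("decoded structure: %.1f MB\n", ($after - $before) / 1048576);
```

Even with short placeholder values the decoded array is large, and it will roughly triple when the service hits 50,000 objects.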
Since I need every object but only field1 and field6 from each, I am wondering if I can somehow reduce the above to something more like this:
{
    "array": [
        {   // object 1
            "field1": "xxx",
            "field6": "aaa"
        },
        ...
        // object 15,000
    ]
}
I imagine that would reduce the memory usage substantially. Any ideas?
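For clarity, here is a sketch of the filtering I have in mind, with a tiny inline sample standing in for the real curl response. The obvious catch is that json_decode() still has to materialize the full structure before I can strip it down, so peak memory would be unchanged, which is exactly why I'm asking:

```php
<?php
// Tiny inline sample; in reality $json is the full curl response body.
$json = '{"array":[{"field1":"xxx","field2":"yyy","field6":"aaa"}]}';

$data = json_decode($json, true);

// Keep only the two fields I actually need from each object.
$slim = array_map(
    fn($obj) => ['field1' => $obj['field1'], 'field6' => $obj['field6']],
    $data['array']
);
unset($data); // the slimmed-down copy is all I keep around afterwards

// Problem: json_decode() already built the full 30-field structure above,
// so filtering after the fact does nothing for peak memory usage.
```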