
I ran some tests with 100,000 records, and json_encode is still faster than calling createMyModel($builder, $id, ...) for every fetched row.

I'm just doing the following:

//Flatbuffer Version

$query->execute();
$builder = new \App\Http\Controllers\FlatbufferBuilder(0);
while ($row = $query->fetch()) {

   MyFlatBufferGeneratedModel::createMyModel($builder,
              $row['id'],
//  ... about 24 more attributes here ...
      );
}

$builder->dataBuffer();

return "Finish";

And:

//Json Version

$query->execute();
$result_array = [];

while ($row = $query->fetch()) {

  $result_array[] = $row;

}

json_encode($result_array);

return "Finish";

Extra notes:
- I'm using Laravel 5.3 and MySql as data source.
- The JSON version takes about 1.8s; the FlatBuffer version takes about 10s.

aaron0207

1 Answer


This is comparing apples and oranges: json_encode runs in native code, whereas the FlatBuffer version is executed entirely in PHP. On top of that, you're fetching each row's columns by name, which isn't fast.
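One way to cut the per-column name lookups is to fetch numerically indexed rows instead of associative ones. A minimal sketch, assuming $query is a PDOStatement (as the question's $query->fetch() usage suggests) and that the column order in the SELECT list matches the argument order of createMyModel:

```php
// Sketch: fetch rows as numerically indexed arrays so columns are
// read by position rather than by name.
$query->execute();
while ($row = $query->fetch(PDO::FETCH_NUM)) {
    // $row[0] is the first column of the SELECT list (e.g. id),
    // $row[1] the second, and so on.
    MyFlatBufferGeneratedModel::createMyModel($builder, $row[0] /*, ... */);
}
```

This avoids building an associative array per row, though the bulk of the cost is still the PHP-side builder work.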

If PHP had a built-in flatbuffers_encode, it would probably be a lot faster than JSON.

Also, with this much data, make sure the buffer is pre-allocated to an appropriate size so it doesn't have to reallocate as often.
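Concretely, the FlatbufferBuilder constructor takes an initial buffer size in bytes, and the question passes 0. A sketch using the question's own builder class; the 1 MiB figure is an illustrative guess, not a measured value:

```php
// Sketch: start the builder with a generous initial capacity so the
// internal byte buffer doesn't repeatedly grow-and-copy while 100k
// rows are appended. Tune the size to your actual payload.
$builder = new \App\Http\Controllers\FlatbufferBuilder(1024 * 1024);
```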

Aardappel
  • How can I avoid fetching the rows by name? Which other option would you use to fetch them? Thanks for your reply – aaron0207 Apr 17 '17 at 07:46
  • Actually, it is not like comparing apples and oranges. It is about finding the fastest solution for a real-world problem. Since there is no native FlatBuffers, of course native JSON and non-native FlatBuffers must be compared. – DamirR Aug 10 '19 at 19:30