
I have the following array which needs to be reduced, with the internal `value` entries being summed.

Input:

$array = [
    [
        121 => [ "number" => 121, "name" => "Some Name 1", "value" => "2.222" ],
        116 => [ "number" => 116, "name" => "Some Name 2", "value" => "1.111" ],
          1 => [ "number" =>   1, "name" => "Some Name 3", "value" => "1.232" ]
    ],
    [
        121 => [ "number" => 121, "name" => "Some Name 1", "value" => "1.111" ],
        116 => [ "number" => 116, "name" => "Some Name 2", "value" => "2.222" ],
          1 => [ "number" =>   1, "name" => "Some Name 3", "value" => "3.111" ]
    ]
];

Desired result:

0 => array:3 [
  121 => array:3 [
    "number" => 121
    "name" => "Some Name 1"
    "value" => "3.333"
  ] 
  116 => array:3 [
    "number" => 116
    "name" => "Some Name 2"
    "value" => "3.333"
  ]
  1 => array:3 [
    "number" => 1
    "name" => "Some Name 3"
    "value" => "4.343"
  ]
]

What would be an efficient/effective way to calculate the resulting array from the input, assuming that the length of the input array is undefined but the nesting level is at most 2?

I am seeking a performance optimized solution for PHP5.6 as well as PHP7+.

mickmackusa
har256b
2 Answers


Use nested loops to access your deep subarrays. As you iterate, push first-encountered rows into the result array while preserving their numeric/associative keys. Use those keys to determine, while looping, whether a row has been encountered before -- if so, merely add the row's `value` to the cached row's `value`.

Code: (Demo)

$result = [];
foreach ($array as $group) {
    foreach ($group as $key => $row) {
        if (!isset($result[$key])) {
            $result[$key] = $row;
        } else {
            $result[$key]['value'] += $row['value'];
        }
    }
}
var_export($result);
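One caveat worth noting (my addition, not part of the original answer): because `value` is stored as a numeric string, `+=` coerces it to a float. If the three-decimal string format matters downstream, you can cast the sums back with `number_format()`. A minimal sketch, re-running the answer's loop on the question's sample input:

```php
<?php
// Sample input from the question.
$array = [
    [
        121 => ["number" => 121, "name" => "Some Name 1", "value" => "2.222"],
        116 => ["number" => 116, "name" => "Some Name 2", "value" => "1.111"],
          1 => ["number" =>   1, "name" => "Some Name 3", "value" => "1.232"],
    ],
    [
        121 => ["number" => 121, "name" => "Some Name 1", "value" => "1.111"],
        116 => ["number" => 116, "name" => "Some Name 2", "value" => "2.222"],
          1 => ["number" =>   1, "name" => "Some Name 3", "value" => "3.111"],
    ],
];

$result = [];
foreach ($array as $group) {
    foreach ($group as $key => $row) {
        if (!isset($result[$key])) {
            $result[$key] = $row;                     // first sighting: cache the whole row
        } else {
            $result[$key]['value'] += $row['value'];  // string + string yields a float
        }
    }
}

// Optional: cast the summed floats back to 3-decimal strings.
foreach ($result as $key => $row) {
    $result[$key]['value'] = number_format($row['value'], 3, '.', '');
}
```

This works identically on PHP 5.6 and PHP 7+.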
mickmackusa

I updated the functionality to handle any number of children. In the repl I've linked, I added an extra child to both array keys 0 and 1.

// Merge the groups in $a and sum up the `value` key

$merge_sum = array_map(function($key, ...$a) {
  $merged = array_merge(...$a);
  $merged['value'] = array_sum(array_column($a, 'value'));
  return [ $key => $merged ];
}, array_keys($a[0]), ...$a);

// Fix the array keys

$final = array_reduce($merge_sum, function($c, $i) {
  return $c + $i;
}, []);

var_dump($final);

Test it on this repl: https://repl.it/Id7T/6

Perspective
  • thanks, but actually this was just a sample. the real array consists of 100+ children. also, is using a **function not worse** than _looping through foreach_? – har256b Jun 09 '17 at 02:58
  • looping is anyway faster than function calls, or correct me if I am wrong please – har256b Jun 09 '17 at 02:59
  • I believe that to be true; this is just the way I would tackle the issue. The trade-off would end up being maintainability -- you'd have less potential to break the code. If speed is of utmost importance, I'd write a loop that accomplishes the same and compare the trade-offs. – Perspective Jun 09 '17 at 03:30
  • @harehman I've updated the functionality to handle any number of children. – Perspective Jun 09 '17 at 03:44
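Since the comments above debate loop-vs-function-call performance, here is a minimal micro-benchmark sketch (my addition; the sizes are arbitrary and timings will vary by PHP version and hardware) comparing the plain `foreach` accumulator from the first answer with an equivalent `array_reduce` formulation:

```php
<?php
// Build a synthetic input: 1000 groups of 100 rows each.
$array = [];
for ($i = 0; $i < 1000; $i++) {
    $group = [];
    for ($j = 0; $j < 100; $j++) {
        $group[$j] = ['number' => $j, 'name' => "Name $j", 'value' => '1.111'];
    }
    $array[] = $group;
}

// Plain nested foreach accumulator.
$start = microtime(true);
$result = [];
foreach ($array as $group) {
    foreach ($group as $key => $row) {
        if (isset($result[$key])) {
            $result[$key]['value'] += $row['value'];
        } else {
            $result[$key] = $row;
        }
    }
}
$loopTime = microtime(true) - $start;

// Same accumulation expressed with array_reduce.
$start = microtime(true);
$final = array_reduce($array, function ($carry, $group) {
    foreach ($group as $key => $row) {
        if (isset($carry[$key])) {
            $carry[$key]['value'] += $row['value'];
        } else {
            $carry[$key] = $row;
        }
    }
    return $carry;
}, []);
$reduceTime = microtime(true) - $start;

printf("foreach: %.4fs, array_reduce: %.4fs\n", $loopTime, $reduceTime);
```

Both variants run on PHP 5.6 and PHP 7+; run the script a few times and compare the printed timings on your own target version before committing to either style.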