
I'm trying to call one API inside another API call in a foreach loop, but the execution time is too long. I've tried many ways but get the same problem. This is my code:

<?php
require_once 'vendor/autoload.php';

$client = new GuzzleHttp\Client(['headers' => ['X-EXEMPLE' => '00-00-10']]);

$res        = $client->get('https://api.example.com/index.php/records/orders');
$OrdersData = json_decode($res->getBody(), true);

foreach ($OrdersData['records'] as $key) {
    $res        = $client->get('https://api.example.com/index.php/records/users/' . $key['user_id']);
    $UserData   = json_decode($res->getBody());
    $User_Name  = $UserData->name;
    $User_Phone = $UserData->phone;

    echo $key['unique_order_id'] . ' ' . $User_Name . ' ' . $User_Phone . '<br />';
}
?>
Redone
  • How long is too long? If you're just interested in general performance optimization, consider that parallel HTTP requests greatly reduce latency issues. So instead of looping over the Guzzle requests one at a time, you could [do parallel requests](https://stackoverflow.com/a/19599351/1878262) and compute the result later. – Sherif Jul 31 '21 at 16:41
  • @Sherif it's up to 20 seconds, it's too long – Redone Jul 31 '21 at 16:47
  • Consider why 20 seconds is too long. Are you doing this check regularly, or in a user-facing request? If so, perhaps consider caching the result for future performance gains. – Sherif Jul 31 '21 at 16:53
  • In addition to caching results, if this isn't user facing perhaps you could offload API requests to a queue. Multiple workers assigned to the queue would enable you to send multiple API requests in parallel to speed things up. Keeping in mind rate limits of course. – grimmdude Jul 31 '21 at 17:14
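Following Sherif's suggestion, here is a minimal sketch of how the loop could issue all user requests concurrently with Guzzle promises, so the total wait is roughly the slowest single request rather than the sum of all of them. It assumes the same endpoints and header as the question and Guzzle 7 (where `GuzzleHttp\Promise\Utils::settle()` is available; on Guzzle 6 use the `GuzzleHttp\Promise\settle()` function instead):

```php
<?php
// Sketch: concurrent user lookups instead of one blocking request per iteration.
// Endpoints and the X-EXEMPLE header are taken from the question; Guzzle 7 assumed.
require_once 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client(['headers' => ['X-EXEMPLE' => '00-00-10']]);

$res        = $client->get('https://api.example.com/index.php/records/orders');
$OrdersData = json_decode($res->getBody(), true);

// getAsync() returns a promise immediately, so all requests start at once.
$promises = [];
foreach ($OrdersData['records'] as $key) {
    $promises[$key['unique_order_id']] =
        $client->getAsync('https://api.example.com/index.php/records/users/' . $key['user_id']);
}

// Wait for every request to complete; settle() never throws, it records each
// outcome as 'fulfilled' or 'rejected' so one failed lookup can't abort the rest.
$results = Utils::settle($promises)->wait();

foreach ($results as $orderId => $result) {
    if ($result['state'] === 'fulfilled') {
        $UserData = json_decode($result['value']->getBody());
        echo $orderId . ' ' . $UserData->name . ' ' . $UserData->phone . '<br />';
    }
}
```

If the API enforces rate limits, as grimmdude's comment warns, `GuzzleHttp\Pool` with a `concurrency` option is the usual way to cap how many requests are in flight at once.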

0 Answers