5

In my React app I have a component that sends requests to an online service that can handle at most 50 requests. I now have a new requirement to execute 7000 MACs.

function App() {
  const [data, setData] = useState([]);

  useEffect(() => {
    const fetchData = async () => {
      await axios.all([
        axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 }),
        axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 })
        // Adding all the mac addresses .......
      ]).then((responseArr) => {
        setData(responseArr);
      });
    };
    fetchData();
  }, []);

I would like to extend the fetchData function so that it sends only 50 requests at a time and waits until that iteration is complete.

When the iteration is complete, the next 50 will be executed.

Thank you

angus
  • Make use of [es6-promise-pool](https://www.npmjs.com/package/es6-promise-pool). It helps you to manage the concurrency with parallel requests – Prathap Reddy Aug 27 '20 at 15:12
  • Could you complete your code so it is clear where the mac addresses are stored, and how you pass them to the request? – trincot Aug 27 '20 at 16:03
  • 3
    Uh, just *don't* write a clientside component to execute 7000 HTTP requests. – Bergi Aug 27 '20 at 16:10

4 Answers

2

Here is how you can do it without any external libraries:

const ips = [
  /* List of MAC addresses. */
];

useEffect(() => {
  const fetchData = async () => {
    let loadedData = [];

    // Iterate over the slices of the array until all the ips have been processed.
    for (const sliceIps of sliceGenerator(ips)) {
      const gettingData = sliceIps.map(getDataFromIp);
      const sliceLoadedData = await axios.all(gettingData);
      loadedData = loadedData.concat(sliceLoadedData);
    }
    setData(loadedData);
  };
  fetchData();
}, []);

const getDataFromIp = (ip) =>
  axios.get(`/ipdn/${ip}`, { timeout: 10000 });

// Generates slices of an array, each slice holding at most 50 elements.
function* sliceGenerator(arr) {
  const sliceSize = 50;
  let i = 0;
  while (i < arr.length) {
    yield arr.slice(i, i + sliceSize);
    i += sliceSize;
  }
}

I'm using a generator (function* sliceGenerator) here to produce the slices of the ips array. This way you batch-process them 50 by 50.

I'm also using a for (... of ...) loop. It's very convenient because you can use the await keyword inside it.
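As a side note, here is a minimal standalone sketch (not part of the answer; `wait` is a stand-in for a real request) showing that `await` really does pause a `for...of` loop between iterations, which is what makes the batching above work:

```javascript
// Sketch: await inside for...of pauses the loop between iterations,
// so each "batch" finishes before the next one starts.
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function sequential(delays) {
  const order = [];
  for (const ms of delays) {
    await wait(ms);   // the loop genuinely waits here
    order.push(ms);
  }
  return order;       // completion order matches input order
}
```

With `sequential([30, 20, 10])` the result is `[30, 20, 10]`: even though the last timer is the shortest, it only starts after the previous ones finish. An `async` callback inside `Array.prototype.forEach` would not give you this behavior, since `forEach` ignores the returned promises.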

Baboo
  • It's because I `await axios.all` in the loop – Baboo Aug 27 '20 at 16:27
  • It's in the `for of` loop : `const sliceIps of sliceGenerator(ips)`. Here is some [documentation](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of) – Baboo Aug 27 '20 at 17:12
  • 1
    Did you mean to use a while loop instead of the `if` statement? – Ramesh Reddy Aug 27 '20 at 17:24
  • @RameshReddy, consider the generator results like `[[50 reqs], [50 reqs], ...]`. Since `for ... of` iterates element by element, you will get 50 requests for every iteration. – Prathap Reddy Aug 27 '20 at 17:45
  • 1
    @PrathapReddy that's what I'm trying to say his function only has one yield so it'll only give the first splice. He should use a loop inside the generator function. His function will give an array with a single nested array of length 50. – Ramesh Reddy Aug 27 '20 at 18:04
  • 1
    @RameshReddy, my bad. Didn't observe that. He should be using loop instead of `if`. – Prathap Reddy Aug 27 '20 at 18:11
  • 1
    Good point I didn't see your point. I updated my code – Baboo Aug 27 '20 at 23:08
2

Without a library, you could use this function:

function poolPromises(iterPromises, poolSize) {
    return new Promise((resolve, reject) => {
        let promises = [];
        function nextPromise() {
            let { value, done } = iterPromises.next();
            if (done) {
                resolve(Promise.all(promises));
            } else {
                promises.push(value); // value is a promise
                value.then(nextPromise, reject);
            }
            return !done;
        }
        
        while (promises.length < poolSize && nextPromise()) { }
    });
}

This function will take promises from an iterator up to the pool size. Whenever a promise resolves, it will get the next promise from the iterator so the pool is complete again. So the pool does not have to be emptied completely before the next chunk of promises is generated. As soon as a spot is free it will be used again.

It is important that the iterator only creates a next promise when one is pulled from it via the next() method.
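To see why laziness matters, here is a hypothetical sketch (not part of the answer; `startRequest` stands in for `axios.get`): building all the promises eagerly, e.g. with `map`, fires every request immediately, so the pool has nothing left to throttle, while a generator starts each request only when `next()` pulls it:

```javascript
// Sketch: eager vs. lazy promise creation (startRequest is hypothetical).
let started = 0;
const startRequest = (id) => {
  started += 1;              // a real HTTP request would fire here
  return Promise.resolve(id);
};

// Eager: map() starts every request immediately - nothing left to pool.
const eager = [1, 2, 3].map(startRequest);
console.log(started);        // 3 - all requests already in flight

// Lazy: a generator starts a request only when next() is called.
started = 0;
function* lazyRequests() {
  for (const id of [1, 2, 3]) {
    yield startRequest(id);
  }
}
const it = lazyRequests();
console.log(started);        // 0 - nothing started yet
it.next();
console.log(started);        // 1 - exactly one request in flight
```

This is exactly the contract `poolPromises` relies on: each call to `iterPromises.next()` should be what triggers the next request.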

In your use case, you can call it as follows:

const fetchData = async () => {
    function * iterRequests() {
        for (let macAddress of macAddresses) {
            yield axios.get("/ipdn/" + macAddress, { timeout: 10000 });
        }
    }
    return poolPromises(iterRequests(), 50).then(setData);
}    

Note: fetchData does not have to be declared async, since there is no await in there.

trincot
  • with `while (promises.length < poolSize && nextPromise()) { }` and e.g a `poolSize` of maximum 50 promises a time, but more than 50 requests via `*iterRequests`, how does one ever manage to process all promises? Does one ever reach `done` via `let { value, done } = iterPromises.next();`? – Peter Seliger Aug 27 '20 at 17:13
  • 2
    The first batch of calls are initiated from that `while` loop, which usually ends when `poolSize` is reached, but then when one of these 50 promises resolve, the same function `nextPromise` is called again (via `then`), and this continues until the `iterPromises` is depleted. So in the end all promises are created, and collected in the `promises` array, which in your case would reach a length of 7000. – trincot Aug 27 '20 at 17:31
  • I didn't read `value.`, thus I totally missed its importance within this line `value.then(nextPromise, reject);` ... thanks for patiently repeating the now obvious to me. – Peter Seliger Aug 27 '20 at 17:41
  • 1
    I like this approach more and more ... for its pureness and thus beauty of how to ensure a constant use to capacity of a controlled queue like (in terms of how to *enqueue*) structure of promises ... I learned a lot today. – Peter Seliger Aug 27 '20 at 18:04
  • 1
    A **Wow** answer. Thanks for the knowledge share @trincot – Prathap Reddy Aug 27 '20 at 18:31
1

If you don't have any issues with using an external library, you can make use of es6-promise-pool to manage concurrent requests, as below:

import PromisePool from 'es6-promise-pool';


// macs - Array of mac addresses
useEffect(() => {
  const fetchData = () => {
    const results = [];
    const generatePromises = function*() {
      for (let count = 0; count < macs.length; count++) {
        yield axios.get(`/ipdn/${macs[count]}`, ...);
      }
    }
    const promiseIterator = generatePromises();
    // Create a pool with 10 concurrent requests max
    const pool = new PromisePool(
      promiseIterator,
      10 // Configurable
    );
    // To listen to result
    pool.addEventListener('fulfilled', function (event) {
      console.log('Fulfilled: ' + event.data.result);
      results.push(event.data.result);
    });
    // Start the pool
    pool.start().then(function () {
      setData(results);
      console.log('Complete');
    });
  };
  fetchData();
}, []);
Prathap Reddy
  • 1
    Thank you Prathap. Can you please add some code to see how to incorporate this with useEffect. – angus Aug 27 '20 at 15:57
  • Moved logic inside `useEffect` and captured the result. You can move it to an array and set it using `setData` after pool execution completes. (Not added error handling) – Prathap Reddy Aug 27 '20 at 16:19
  • 1
    Thank you all for the great solutions. Prathap I will vote for your answer since I am using this one. – angus Aug 27 '20 at 17:28
  • Try considering @Bergi comment (7000 requests from client) as well into consideration, incase if there is any way to convince your boss. (Considering slow/unreliable internet connection, mobile networks etc...). Happy coding – Prathap Reddy Aug 27 '20 at 17:36
  • 1
    why using async if await has not used? – RRR May 30 '21 at 03:44
  • Good find @RRR, have removed `async`. Thanks. – Prathap Reddy Jun 02 '21 at 19:45
0

I am not familiar with axios.all, so here is an approach that just uses Promise.all. The idea is to split the input array into chunks of 50 addresses, then process the chunks one by one.

function App() {
  const [data, setData] = useState([]);

  useEffect(() => {
    // helper function, split array to chunked array
    const splitToChunks = (items, chunkSize = 50) => {
      const result = [];
      for (let i = 0; i < items.length; i += chunkSize) {
        result.push(items.slice(i, i + chunkSize));
      }
      return result;
    }
    
    const fetchData = async () => {
      const result = []; // init value
      const macAddresses = []; // array of mac addresses - your mac addresses
    
      const chunkedArray = splitToChunks(macAddresses); // return array of array [[...50 mac adds], [], []]
    
      for (const macs of chunkedArray) { // now macs is [...50 mac adds]
        const promises = macs.map((mac) => {
          return axios.get(`/ipdn/${mac}`, { timeout: 10000 });
        });
        // now promises is array contains 50 Promises
        const response = await Promise.all(promises); // wait until finish 50 requests
        result.push(...response); // copy response to result, and continue for next block
      }
    
      setData(result);
    };
    fetchData();
  }, []);
}
hoangdv