I'm not sure exactly what you mean by the backend only allowing around 70 requests. But if you wish to control the number of parallel requests at a time, you could use the RxJS `from`, `bufferCount`, and `forkJoin` functions with the `concatMap` operator.

Try to avoid `toPromise()`. It is an easy way to switch back to the familiar Promise paradigm, but it is deprecated in RxJS 7 and will be removed in RxJS 8. Instead, subscribe to the observables (or, if you really need a Promise, use `firstValueFrom`/`lastValueFrom` introduced in RxJS 7).
For controlled parallel requests, try the following:

const reqs = this.data.map(item => this.http.post(url, item, { headers: reqHeader }));

from(reqs).pipe(
  bufferCount(6),                        // <-- adjust the number of parallel requests here
  concatMap(buffer => forkJoin(buffer))  // <-- wait for each batch before starting the next
).subscribe({
  next: res => console.log(res),
  error: err => console.log(err),
  complete: () => console.log('complete')
});
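If it helps to see the batching logic outside RxJS, here is a minimal sketch of the same idea with plain Promises: slice the inputs into groups (what `bufferCount` does) and await each group in parallel before starting the next (what `concatMap` + `forkJoin` do). `postItem` is a hypothetical stand-in for `this.http.post(...)`.

```typescript
// Hypothetical stand-in for this.http.post(url, item, { headers: reqHeader })
async function postItem(item: number): Promise<string> {
  return `ok:${item}`;
}

async function postInBatches<T, R>(
  items: T[],
  batchSize: number,
  send: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = [];
  // Like bufferCount(batchSize): slice the inputs into groups.
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Like concatMap(buffer => forkJoin(buffer)): run one group in
    // parallel, and only start the next group once this one settles.
    results.push(...(await Promise.all(batch.map(send))));
  }
  return results;
}
```

A call like `postInBatches(this.data, 6, item => lastValueFrom(this.http.post(...)))` would then cap concurrency at 6, the same as the RxJS pipeline above.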
Update: delay each request

You could forgo the `bufferCount` and instead send each request individually with an explicit delay between them. Try the following:

from(this.data).pipe(
  concatMap(item => this.http.post(url, item, { headers: reqHeader }).pipe(
    delay(3000)  // <-- wait 3 seconds between requests
  ))
).subscribe({
  next: res => { },
  error: err => { }
});
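For comparison, the same sequential-with-pause pattern sketched with plain Promises (again with a hypothetical `postItem` in place of `this.http.post(...)`):

```typescript
// Hypothetical stand-in for this.http.post(url, item, { headers: reqHeader })
async function postItem(item: number): Promise<string> {
  return `ok:${item}`;
}

const sleep = (ms: number) => new Promise<void>(resolve => setTimeout(resolve, ms));

// Like concatMap + delay: one request at a time, with a pause after each.
async function postSequentially<T, R>(
  items: T[],
  gapMs: number,
  send: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = [];
  for (const item of items) {
    results.push(await send(item)); // wait for this request to finish...
    await sleep(gapMs);             // ...then pause before the next one
  }
  return results;
}
```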
Update: count the emissions

You could introduce a variable (e.g. `count`) and use the `map` operator to return both the count and the response. Try the following:

someFunc() {
  let count = 0;
  from(this.data).pipe(
    concatMap(item => this.http.post(url, item, { headers: reqHeader }).pipe(
      map(res => {
        count++;
        return {
          count: count,
          response: res
        };
      }),
      delay(3000)  // <-- wait 3 seconds between requests
    ))
  ).subscribe({
    next: res => {
      console.log(res.count);    // <-- the count of the emission
      console.log(res.response); // <-- the response from `this.http.post()`
    },
    error: err => { }
  });
}