I have this bash script that makes an API request for each entry in a very big list of user accounts (almost 10,000):
#!/bin/bash
# Variables
Names_list="/home/debian/names.list"
auth_key="***"

# Read the list one name per line (avoids the word-splitting
# that comes with cat-ing the file into an unquoted variable)
while IFS= read -r n
do
    curl --silent --request GET \
        --url "https://api.example.com/$n" \
        --header "authorization: Bearer $auth_key" \
        --data '{}' >> /home/debian/results.list
done < "$Names_list"
echo "Done."
My pain is that, with the current way of working, my bearer token expires before the calls can complete. It only has a 30-minute lifetime, and the calls start returning an unauthorized error at around seven to eight thousand requests.
I understand that I can just split up the big list file with something like "split" and then have the script background each batch with "&", but I cannot quite wrap my head around that part.
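For reference, this is roughly the split invocation I had in mind; the chunk size, output directory, and prefix are just placeholders I picked, not anything final:

# Split names.list into chunks of 2000 lines each, written as
# /home/debian/split.d/chunk_aa, chunk_ab, and so on
mkdir -p /home/debian/split.d
split -l 2000 /home/debian/names.list /home/debian/split.d/chunk_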
Since the API I am using is private and has no rate limiting, I was thinking of bursting the ~10,000 calls in batches of one or two thousand.
Like this:
#!/bin/bash
Split_results="/home/debian/split.d"

# Print one chunk file; takes the file as an argument
# instead of relying on a global variable
cat_split(){
    cat "$1"
}

# Kick off one background job per chunk file
for file in "$Split_results"/*
do
    cat_split "$file" &
done
wait
Yes, that does work as a PoC, but I don't know the best way to proceed from here. Should I place my API call in another function, or have one function that does the cat and then the API call? What would you consider a proper way of going about this?
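To make the question concrete, this is a rough sketch of the one-function-per-chunk idea I am weighing. The chunk directory, the per-chunk results files, and the trailing wait are my assumptions; I have not tested this against the real API:

#!/bin/bash
auth_key="***"
Split_dir="/home/debian/split.d"

# One function per chunk: read the chunk file line by line
# and fire the API call for each name in it
process_chunk(){
    local chunk="$1"
    while IFS= read -r n
    do
        curl --silent --request GET \
            --url "https://api.example.com/$n" \
            --header "authorization: Bearer $auth_key" \
            --data '{}' >> "${chunk}.results"
    done < "$chunk"
}

# Background one job per chunk, then wait for all of them to finish
for chunk in "$Split_dir"/*
do
    process_chunk "$chunk" &
done
wait
echo "Done."

Writing one results file per chunk and concatenating them afterwards would sidestep interleaved output from concurrent appends, though I am not sure how much that matters in practice.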
Thanks in advance for any advice.