Hello everyone!
I'm trying to make parallel curl requests from an array of URLs to speed up a bash script. After some research I found that there are several approaches: GNU parallel, xargs, curl's built-in --parallel option (available starting from 7.68.0), and backgrounding jobs with an ampersand. Ideally I would avoid GNU parallel, since it has to be installed as an extra package and I'm not allowed to install anything.
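For instance, if I understand the built-in option correctly, it would be invoked roughly like this (just a sketch; the --parallel-max value is arbitrary and I'm not sure the %{stderr}/%{url_effective} write-out trick is the right way to get per-URL status codes out of it):

# sketch: let curl itself run the transfers in parallel (needs the 7.68.0+ mentioned above);
# the per-transfer --write-out line goes to stderr, so the page bodies can be thrown away via stdout
curl --parallel --parallel-max 10 -sSL --max-time 60 --connect-timeout 30 -4 \
     -w '%{stderr}%{http_code} %{url_effective}\n' \
     "${external_links[@]}" > /dev/null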
Here is my initial script:
#!/bin/bash

external_links=(https://www.designernews.co/error https://www.awwwards.com/error1/ https://dribbble.com/error1 https://www.designernews.co https://www.awwwards.com https://dribbble.com)
invalid_links=()

for external_link in "${external_links[@]}"; do
    # -f makes curl fail on HTTP errors; stderr (curl's error message) is captured, the body is discarded
    curlResult=$(curl -sSfL --max-time 60 --connect-timeout 30 --retry 3 -4 "$external_link" 2>&1 > /dev/null) && status=$? || status=$?
    if [ "$status" -ne 0 ]; then
        # curl's message ends in "error: NNN", so pull out the three-digit HTTP code
        if [[ $curlResult =~ (error: )([0-9]{3}) ]]; then
            error_code=${BASH_REMATCH[2]}
            invalid_links+=("${error_code} ${external_link}")
            echo "${external_link}"
        fi
    fi
done
echo "Found ${#invalid_links[@]} invalid links: "
printf '%s\n' "${invalid_links[@]}"
I tried changing the curl options and experimented with xargs and backgrounded jobs (&), but could not get it to work. All the examples I found either rely on GNU parallel or read the URLs from a file; none of them worked with a variable containing an array of URLs (Running programs in parallel using xargs, cURL with variables and multiprocessing in shell). Could you please help me with this?
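For reference, this is roughly the xargs variant I tried (check_link is just my own helper name, the -P 4 limit is arbitrary, and I suspect the quoting or the export is where it goes wrong; the results end up on stdout but I couldn't figure out how to collect them back into the invalid_links array):

# helper doing the same check as the loop body above, printing "CODE URL" for bad links
check_link() {
    local url=$1 result
    result=$(curl -sSfL --max-time 60 --connect-timeout 30 --retry 3 -4 "$url" 2>&1 > /dev/null) || {
        if [[ $result =~ (error: )([0-9]{3}) ]]; then
            echo "${BASH_REMATCH[2]} ${url}"
        fi
    }
}
export -f check_link

# run up to 4 checks at a time; each URL is handed to a fresh bash as $0
printf '%s\n' "${external_links[@]}" | xargs -n 1 -P 4 bash -c 'check_link "$0"'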