
I'm running a really inefficient `for` loop over a large text file, and I'm trying to figure out how to accomplish the same thing in parallel instead of going through the file one line at a time:

for i in `cat file.txt` ; do curl -sX POST "https://xxx/api/string" -d "Field=$i" >/dev/null ; done
  • Does `file.txt` contain one entry per line? Is that where the `Field` values come from? – David C. Rankin Aug 10 '20 at 01:52
  • I think you would be better off using Python or Ruby or some other (better) scripting language that has proper support for threads, thread pools, etc. for this. The problem with trying to do this in bash is that it could have the same effect as a "fork bomb". – Stephen C Aug 10 '20 at 01:57
  • @DavidC.Rankin One entry per line is correct. – Underflow Aug 10 '20 at 02:00
  • You could try backgrounding each task with a `&` before `; done` — however, depending on how big your file is, you will probably want to limit the number of concurrent jobs somehow. – bob dylan Aug 10 '20 at 07:53
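Following up on the backgrounding idea above: a common way to cap concurrency from plain shell is `xargs -P`, which keeps at most N jobs in flight. A minimal sketch, assuming one entry per line in `file.txt` (the sample contents and the choice of 4 parallel jobs here are arbitrary); the `echo` stands in for the real request so the sketch is runnable, and the question's `curl` call is shown in a comment to swap in:

```shell
#!/bin/sh
# Sample input; in real use file.txt already exists, one entry per line.
printf 'alpha\nbeta\ngamma\n' > file.txt

# tr turns lines into NUL-delimited arguments so entries containing
# spaces survive intact (unlike `for i in $(cat file.txt)`).
# -P 4: run at most 4 jobs at once; -I{}: substitute one entry per job.
# The echo is a stand-in; for the real job replace it with:
#   curl -sX POST "https://xxx/api/string" -d "Field={}" >/dev/null
tr '\n' '\0' < file.txt | xargs -0 -P 4 -I{} echo "would POST Field={}"
```

Because the jobs run concurrently, the output lines may arrive in any order; if ordering matters, that has to be handled separately (e.g. by sorting results afterwards).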

0 Answers