
I have several subdirectories within a parent folder, each containing a URLs.txt file.

$ ls

H3K4me1_assay H3K4me2_assay H3K4me3_assay ... +30 or so more *_assay directories

Each *_assay directory contains one URLs.txt file:

$ cat URLs.txt

https://www.encodeproject.org/files/ENCFF052HMX/@@download/ENCFF052HMX.bed.gz
https://www.encodeproject.org/files/ENCFF052HMX/@@download/ENCFF052HMX.bed.gz
https://www.encodeproject.org/files/ENCFF466DMK/@@download/ENCFF466DMK.bed.gz
... +200 or more URLs

Is there a way to run a single command from the parent folder that reads the URLs.txt file in each subdirectory and downloads the listed files into that same subdirectory?

I can cd into each directory and run the following commands to download all of the files:

$ cd ~/largescale/H3K4me3_assay
$ ls URL* | xargs -L 1 -d '\n' zcat | xargs -L 1 curl -O -J -L

But I would have to run this command for experiments with 300+ folders, so cd'ing into each one isn't really practical.

I have tried running this; it does download the correct files, but into the parent folder rather than the subdirectories. Any idea what I am doing wrong?

$ for i in ./*_assay; do cd ~/largescale/"$i" | ls URL* | xargs -L 1 -d '\n' zcat | xargs -L 1 curl -O -J -L; done
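
My current guess is that piping cd into ls is the problem, and that each iteration needs to run in a subshell so the directory change is scoped to that iteration. A rough, untested sketch of what I mean, reusing the same file pattern and curl options as above:

$ for i in ./*_assay; do ( cd ~/largescale/"$i" && ls URL* | xargs -L 1 -d '\n' zcat | xargs -L 1 curl -O -J -L ); done

Here the parentheses start a subshell, so the cd only affects that one iteration, and the && stops the download pipeline from running if the cd fails.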

Thanks, Steven

  • Subshell or command grouping: [Save file to specific folder with curl command](https://stackoverflow.com/questions/16362402/save-file-to-specific-folder-with-curl-command) – Zac Anger Nov 28 '22 at 04:48
  • What is zcat doing there? Is URLs.txt compressed? – oguz ismail Nov 28 '22 at 05:31
  • @oguzismail yes, I had to compress URLs.txt; it was extracted via: `for fname in *_assay_metadata; do zcat "$fname" | cut -f 48 | gzip >"URLS_${fname}"; done` – Steven James Nov 29 '22 at 09:52

0 Answers