What you need is parallelism. If a single thread can't download the files fast enough, you need multiple threads (or processes). It may, however, be that the limiting factor is your Internet connection bandwidth, in which case nothing will help.
Have you thought of manually splitting the file into, say, ten or a hundred pieces and then starting ten or a hundred uget processes, one per piece? This would be an easy hack to add parallelism to the download process.
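The splitting step itself is trivial to script. A minimal sketch in Java, assuming a file called urls.txt with one URL per line (both the file name and the part count are just placeholders):

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;

// Split urls.txt into N roughly equal part files, urls-0.txt .. urls-(N-1).txt,
// each of which can then be fed to a separate uget process.
public class SplitUrlList {
    public static void main(String[] args) throws IOException {
        int parts = 10;                                   // number of pieces / uget processes
        List<String> urls = Files.readAllLines(Paths.get("urls.txt"));
        int chunk = (urls.size() + parts - 1) / parts;    // ceiling division
        for (int i = 0; i < parts; i++) {
            int from = i * chunk;
            int to = Math.min(from + chunk, urls.size());
            if (from >= to) break;                        // fewer URLs than parts
            Files.write(Paths.get("urls-" + i + ".txt"), urls.subList(from, to));
        }
    }
}
```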
Of course, you could use e.g. Python or Java to write a program that starts the threads for you and distributes the URLs among them, but then you need to be familiar with thread programming. In either case, it's probably simpler to just split the file into several pieces and start multiple uget processes: writing the program takes time, and you may not get that time back by using it.
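If you do go the do-it-yourself route, the threading part is not very long. A minimal sketch in Java using a fixed thread pool, again assuming a urls.txt input file and a pool size of 10 (both are assumptions, not anything from your setup):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.*;
import java.util.List;
import java.util.concurrent.*;

// Download every URL listed in urls.txt into the current directory,
// with up to 10 downloads running concurrently.
public class ParallelDownloader {
    public static void main(String[] args) throws Exception {
        List<String> urls = Files.readAllLines(Paths.get("urls.txt"));
        ExecutorService pool = Executors.newFixedThreadPool(10);  // 10 worker threads
        for (String u : urls) {
            pool.submit(() -> {
                try (InputStream in = new URL(u).openStream()) {
                    // Derive the local file name from the last path segment of the URL.
                    String name = u.substring(u.lastIndexOf('/') + 1);
                    Files.copy(in, Paths.get(name), StandardCopyOption.REPLACE_EXISTING);
                } catch (IOException e) {
                    System.err.println("Failed: " + u + " (" + e + ")");
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);  // wait for all downloads to finish
    }
}
```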
Is the server under your control? Is there one server or several? If all the images are on a single server that you don't control, I would be careful not to place too much load on it.
I have had the same kind of problem before. In that case I used Java code to download the images, with only one thread, and I placed deliberate sleep calls between downloads so as not to load the server too much. So I wasn't after performance; I wanted to avoid putting too much load on the server, which in that case was a single server not controlled by me.
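For illustration, that throttled, single-threaded approach looks roughly like the sketch below. This is not my original code, just a minimal example along the same lines; the urls.txt file name and the 2-second pause are placeholders you would adjust:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.*;
import java.util.List;

// Download the URLs in urls.txt one at a time, sleeping between requests
// so the server never sees a burst of load.
public class PoliteDownloader {
    public static void main(String[] args) throws Exception {
        List<String> urls = Files.readAllLines(Paths.get("urls.txt"));
        for (String u : urls) {
            String name = u.substring(u.lastIndexOf('/') + 1);
            try (InputStream in = new URL(u).openStream()) {
                Files.copy(in, Paths.get(name), StandardCopyOption.REPLACE_EXISTING);
            } catch (Exception e) {
                System.err.println("Failed: " + u + " (" + e + ")");
            }
            Thread.sleep(2000);  // deliberate pause (here 2 s) between downloads
        }
    }
}
```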