Don't use multiple threads; that will just slow things down and give you headaches. Instead, use an existing engine that can perform parallel requests.
For example, Net::Curl::Multi and WWW::Curl::Multi provide access to libcurl, a proven, powerful, and fast engine. (I've used the former in production.) You can still process the responses in different threads if you so desire.
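For instance, here's a minimal sketch of fetching several URLs in parallel with Net::Curl::Multi (the URLs are placeholders, and I'm going from memory on the bookkeeping, so double-check against the Net::Curl::Easy and Net::Curl::Multi docs):

    use strict;
    use warnings;
    use Net::Curl::Easy qw(:constants);
    use Net::Curl::Multi;

    my @urls = ('http://www.example.com/', 'http://www.example.org/');  # placeholders

    my $multi = Net::Curl::Multi->new();
    for my $url (@urls) {
        my $easy = Net::Curl::Easy->new();
        $easy->setopt(CURLOPT_URL, $url);
        $easy->{url}  = $url;    # easy handles are hash-based, so we can stash our own data
        $easy->{body} = '';
        $easy->setopt(CURLOPT_WRITEDATA, \$easy->{body});  # default handler appends the body here
        $multi->add_handle($easy);
    }

    my $running;
    do {
        # Drive all transfers forward, then reap the ones that finished.
        $running = $multi->perform();
        while (my ($msg, $easy, $result) = $multi->info_read()) {
            $multi->remove_handle($easy);
            printf "%s: %d bytes (%s)\n", $easy->{url}, length $easy->{body}, $result;
        }
        if ($running) {
            # Sleep until libcurl has something to do or its suggested timeout expires.
            my ($r, $w, $e) = $multi->fdset();
            my $timeout = $multi->timeout();   # milliseconds; can be -1 for "no suggestion"
            select($r, $w, $e, $timeout / 1000) if $timeout > 0;
        }
    } while ($running);

The perform/info_read/select loop is the standard libcurl multi idiom: a single process, no threads, yet all transfers progress concurrently.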
AnyEvent::HTTP and AnyEvent::Curl::Multi are two other such engines. However, using these would add a lot of overhead (which may affect performance and robustness), and I don't know how well the various event loops deal with threaded environments.
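For comparison, a sketch of the same fetch with AnyEvent::HTTP, using the condvar begin/end idiom as a counting latch so we only leave the event loop once every callback has fired (again, the URLs are placeholders):

    use strict;
    use warnings;
    use AnyEvent;
    use AnyEvent::HTTP;

    my @urls = ('http://www.example.com/', 'http://www.example.org/');  # placeholders

    my $done = AnyEvent->condvar;
    $done->begin;       # guard so we can't finish before all requests are issued
    for my $url (@urls) {
        $done->begin;
        http_get $url, sub {
            my ($body, $headers) = @_;
            printf "%s: %s (%d bytes)\n", $url, $headers->{Status}, length($body // '');
            $done->end;
        };
    }
    $done->end;
    $done->recv;        # run the event loop until every request has completed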
If any of these modules doesn't have a Debian package, just create one!
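If memory serves, dh-make-perl can generate and build one straight from CPAN (the module name here is just an example):

    $ dh-make-perl --build --cpan Net::Curl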