Libcurl in my app seems to have trouble with non-existent domains: it stalls for a minimum of 7-20 s on NXDOMAIN requests (which seems to match CURLOPT_CONNECTTIMEOUT).
Here is the pmp (poor man's profiler) output:
2585 __GI___poll,Curl_poll,curl_multi_wait,curl_easy_perform,getweb,athread,start_thread,clone,??
1281 __GI___poll,Curl_poll,curl_multi_wait,curl_easy_perform,getweb,getweb,athread,start_thread,clone,??
100 nanosleep,__sleep,athread,start_thread,clone,??
...
The curl command-line tool doesn't seem to have this issue; it finishes the same request in under a second.
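To see where the time actually goes inside libcurl, the transfer's timing counters can be queried after the perform call. A minimal sketch (assuming the same easy handle curl as in the code below, after curl_easy_perform has returned):

double t_lookup = 0.0, t_connect = 0.0, t_total = 0.0;
curl_easy_getinfo(curl, CURLINFO_NAMELOOKUP_TIME, &t_lookup);
curl_easy_getinfo(curl, CURLINFO_CONNECT_TIME, &t_connect);
curl_easy_getinfo(curl, CURLINFO_TOTAL_TIME, &t_total);
// On an NXDOMAIN, the whole delay should show up in the name-lookup phase.
fprintf(stderr, "dns=%.3fs connect=%.3fs total=%.3fs\n", t_lookup, t_connect, t_total);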
The question could be related to this one, as it seems I've solved one problem and ended up with another. I can't tell whether it started after the last Ubuntu update or before.
Here is the libcurl code in my project:
int PAGE_TIMEOUT = 20;
curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, (long)(PAGE_TIMEOUT / 3)); // one third of the overall timeout (integer division already truncates)
curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 20L);
curl_easy_setopt(curl, CURLOPT_TIMEOUT, (long)PAGE_TIMEOUT);
curl_easy_setopt(curl, CURLOPT_URL, argv);
curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_to_string);
curl_easy_setopt(curl, CURLOPT_WRITEHEADER, &header);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);
curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L);
curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 0L);
curl_easy_setopt(curl, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
pthread_mutex_lock(&running_mutex);
dns_index = DNS_SERVER_I; // snapshot the shared round-robin index while holding the lock
if(DNS_SERVER_I > DNS_SERVERS.size())
{
    DNS_SERVER_I = 1;
}
else
{
    DNS_SERVER_I++;
}
pthread_mutex_unlock(&running_mutex);
string dns_servers_string = DNS_SERVERS.at(dns_index % DNS_SERVERS.size()) + ","
                          + DNS_SERVERS.at((dns_index + 1) % DNS_SERVERS.size()) + ","
                          + DNS_SERVERS.at((dns_index + 2) % DNS_SERVERS.size());
curl_easy_setopt(curl, CURLOPT_DNS_SERVERS, dns_servers_string.c_str());
curl_easy_setopt(curl, CURLOPT_DNS_USE_GLOBAL_CACHE, 0L);
struct curl_slist *slist=NULL;
slist = curl_slist_append(slist, "Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,*/*;q=0.5");
slist = curl_slist_append(slist, "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7");
slist = curl_slist_append(slist, "Accept-Encoding: gzip,deflate");
slist = curl_slist_append(slist, "Accept-Language: en-gb,en;q=0.5");
slist = curl_slist_append(slist, "Connection: keep-alive");
slist = curl_slist_append(slist, "Keep-Alive: 300");
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, slist);
string useragent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.8.1.16) Gecko/20080702 Firefox/2.0.0.16";
curl_easy_setopt(curl, CURLOPT_USERAGENT, useragent.c_str());
sm=curl_easy_perform(curl);
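It may also help to check which error curl_easy_perform actually returns on the bad domains: a resolver that answers quickly with NXDOMAIN should come back as CURLE_COULDNT_RESOLVE_HOST, while a resolver that hangs runs into the timeout code. A sketch, reusing sm from above:

if(sm == CURLE_COULDNT_RESOLVE_HOST)
    fprintf(stderr, "resolver answered (NXDOMAIN?): %s\n", curl_easy_strerror(sm));
else if(sm == CURLE_OPERATION_TIMEDOUT)
    fprintf(stderr, "hit CONNECTTIMEOUT/TIMEOUT: %s\n", curl_easy_strerror(sm));
else if(sm != CURLE_OK)
    fprintf(stderr, "other failure: %s\n", curl_easy_strerror(sm));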
I've built libcurl 7.30 from source (with c-ares and without IPv6) and installed it over the Ubuntu 12.10 repository version (not sure if it was actually overridden; I used make install).
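To check whether the binary actually loads the self-built c-ares-enabled libcurl rather than the repository one, the runtime version info can be printed. A minimal sketch (note that CURL_VERSION_ASYNCHDNS is also set by the threaded resolver, so the ares field is the decisive bit):

#include <curl/curl.h>
#include <cstdio>

int main()
{
    curl_version_info_data *info = curl_version_info(CURLVERSION_NOW);
    printf("libcurl %s, async DNS: %s, c-ares: %s\n",
           info->version,
           (info->features & CURL_VERSION_ASYNCHDNS) ? "yes" : "no",
           info->ares ? info->ares : "(none)");
    return 0;
}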
After the last automatic update I've been getting an error, but I don't know if it's related.