
I am trying to download the /home.htm file from around 6000 HTTP addresses. For speed I tried to use grequests to send them all at once, but I only get around 200 responses; most of the rest fail with a connection refused error. When I split the addresses into chunks of 100 and send each chunk separately, around 1200 addresses answer me (i.e. the download of their /home.htm succeeds), even though I use the same addresses as before.

I run it with Python 3.6 on Ubuntu 16.04.

import grequests
import requests
import sys
import os
import resource

# Counts exceptions and prints them
exception_count = 0

def exceptionh(request, exception):
    global exception_count
    exception_count += 1
    print(request.url, exception)

# Yields successive n-sized chunks
def make_chunks(req, n):
    for i in range(0, len(req), n):
        yield req[i:i+n]

def run(ipport):
    # Make http links
    http_links = []
    for ip in ipport:
        http_links.append('http://' + ip.strip() + '/home.htm')

    # Raise the open-file limit; without it there are too many Errno 24 (too many open files) exceptions
    resource.setrlimit(resource.RLIMIT_NOFILE, (131072, 131072))

    # Request making
    rq = []
    for link in http_links:
        rq.append(grequests.get(link, timeout=30, headers={'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.75 Safari/537.36'}, stream=False))
    rq = list(make_chunks(rq, 100))
    # Send requests
    results = []
    for chunk in rq:
        results.append(grequests.map(chunk, exception_handler=exceptionh))

    # Save .html
    for chunk in results:
        for response in chunk:
            if response is not None:
                # write it in an html file named after the host
                host = response.url.split('/')[2]
                with open(host + '.html', 'w') as f:
                    f.write(response.text)
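As an aside on the resource limit: an unprivileged process can only raise the soft RLIMIT_NOFILE up to its hard limit, so the hard-coded 131072 may fail with a ValueError on some systems. A safer sketch (this is a suggested variation, not my current code) reads the hard limit first:

```python
import resource

# Raise the soft open-files limit only as far as the hard limit allows;
# an unprivileged process cannot go beyond the hard limit.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
print(resource.getrlimit(resource.RLIMIT_NOFILE))
```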

As described above, the results differ: when I send the requests in chunks I get more responses than when I send them all at once. Why is that? Is there a better approach to this problem?
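For comparison, the chunk-of-100 behaviour can also be reproduced without gevent by capping concurrency with a thread pool from the standard library (grequests.map also accepts a `size` argument that limits concurrency in a similar way). A minimal, untested sketch, where `max_workers=100` mirrors the chunk size and `fetch`/`fetch_all` are hypothetical helper names:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen, Request
from urllib.error import URLError

USER_AGENT = 'Mozilla/5.0'

def fetch(url, timeout=30):
    """Fetch one URL; return (url, body) on success, (url, None) on error."""
    try:
        req = Request(url, headers={'User-Agent': USER_AGENT})
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.read()
    except (URLError, OSError):
        return url, None

def fetch_all(urls, max_workers=100):
    """Fetch URLs with at most max_workers concurrent connections open."""
    if not urls:
        return []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, urls))
```

The pool keeps at most 100 connections in flight but starts the next request as soon as one finishes, instead of waiting for a whole chunk of 100 to complete the way per-chunk grequests.map calls do.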
