I'm trying to query ~50 Wikipedia pages through the MediaWiki API. I've been using the requests package to make the GET requests, but I've been switching over to grequests because I hear it offers much better performance.
In my tests (using a handful of titles while I debug), the performance improvement is minimal. Am I doing something wrong?
import grequests  # imported before requests so gevent's monkey-patching applies first
import requests
from urllib.parse import quote
from time import time

url = 'https://en.wikipedia.org/w/api.php?action=query&titles={0}&prop=pageprops&ppprop=disambiguation&format=json'
titles = ['Harriet Tubman', 'Car', 'Underground Railroad', 'American Civil War', 'Kate Larson']
urls = [url.format(quote(title)) for title in titles]

def sync_test(urls):
    # fetch the URLs one at a time, sequentially
    results = []
    s = time()
    for url in urls:
        results.append(requests.get(url))
    e = time()
    return e - s

def async_test(urls):
    # fire all the requests concurrently and wait for them all to finish
    s = time()
    results = grequests.map(grequests.get(url) for url in urls)
    e = time()
    return e - s

def iterate(urls, num):
    # total wall-clock time over `num` rounds of each approach
    sync_time = 0
    async_time = 0
    for i in range(num):
        sync_time += sync_test(urls)
        async_time += async_test(urls)
    print("sync_time: {}\nasync_time: {}".format(sync_time, async_time))
output:

sync_time: 8.945282936096191
async_time: 7.97578239440918
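One thing I've been unsure about is grequests.map's size parameter, which caps how many requests run concurrently (I've been leaving it at the default, which is unbounded). This is the variant I mean, with 10 as an arbitrary cap and async_test_sized as my own name for it:

def async_test_sized(urls, size=10):
    # same as async_test, but limit the gevent pool to `size`
    # concurrent requests (10 is an arbitrary choice)
    s = time()
    results = grequests.map((grequests.get(url) for url in urls), size=size)
    e = time()
    return e - s

I wouldn't expect capping concurrency to help with only a handful of URLs, but I mention it in case the default matters here.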
Thanks!
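Edit: one thing I wondered about is whether connection setup is dominating the timings, since each requests.get call opens a fresh connection. A Session-based sync baseline would reuse one keep-alive connection instead; this is just a sketch of what I mean (sync_test_session is my own name for it), and I haven't benchmarked it yet:

def sync_test_session(urls):
    # reuse a single keep-alive connection instead of
    # reconnecting for every request
    results = []
    s = time()
    with requests.Session() as session:
        for url in urls:
            results.append(session.get(url))
    e = time()
    return e - s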