I'm trying to speed up my requests by multithreading them, but I'm not sure how to do it the way I want. I know about grequests, but it seems to require a list of URLs up front. My code builds each URL from a starting number, and I'd like all threads to stop once one of them gets a status_code of 200.

I tried to do this with grequests but couldn't make it work. I also tried the threading module, but I don't know how to stop all the threads once a working URL has been found.
import requests
import webbrowser

def url_request(number):
    url = "http://website.com/download/" + str(number) + ".zip"
    r = requests.head(url)
    if r.status_code == 404:
        print(url + " - 404 Not Found!")
        number += 1
        url_request(number)
    elif r.status_code == 200:
        webbrowser.open(url)
        print(url + " - 200 Found!")

if __name__ == "__main__":
    url_request(int(input("Starting number: ")))
What I want the code to do is run multiple requests.head calls at once, counting up from the number entered at the "Starting number: " prompt, and stop all of them as soon as one thread finds a URL with status_code 200.
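One possible approach (a sketch, not a drop-in answer): use a ThreadPoolExecutor so several probes run at once, a shared counter so each worker takes the next untried number, and a threading.Event that every worker checks so they all stop after a hit. The `check` predicate and `find_first` helper below are names I've made up for illustration; the commented usage at the bottom shows how the asker's URL pattern and requests.head could plug into it.

```python
import itertools
import threading
from concurrent.futures import ThreadPoolExecutor

def find_first(check, start, workers=8):
    """Probe numbers start, start+1, ... concurrently.

    Returns the first number for which check(number) is True.
    All workers stop looping once any worker finds a hit.
    """
    found = threading.Event()          # signals every worker to stop
    counter = itertools.count(start)   # shared source of numbers to try
    lock = threading.Lock()            # guards counter and result
    result = []

    def worker():
        while not found.is_set():
            with lock:                 # hand out each number exactly once
                n = next(counter)
            if check(n):
                with lock:
                    if not result:
                        result.append(n)
                found.set()            # tell the other workers to stop

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(workers):
            pool.submit(worker)
    # the with-block waits for all workers to exit before returning
    return result[0] if result else None

# Hypothetical usage against the URL pattern above (network-dependent,
# so left commented out):
# import requests, webbrowser
# def is_live(number):
#     url = "http://website.com/download/" + str(number) + ".zip"
#     return requests.head(url).status_code == 200
# hit = find_first(is_live, int(input("Starting number: ")))
# if hit is not None:
#     webbrowser.open("http://website.com/download/" + str(hit) + ".zip")
```

One caveat: a thread that is already blocked inside requests.head when the event is set will finish that request before it notices the flag, so "stop" here means "stop starting new probes", not "abort in-flight requests". Also note that with concurrent workers the number returned is the first one *found*, which under unusual timing might not be the smallest number that responds with 200.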