
Hi, I am very new to Python programming. I'm trying to write a Python script that gets a status code using a GET request. I can do it for a single URL, but how do I do it for multiple URLs in a single script? Here is the basic code I have written, which gets the response code from one URL.

import requests
import json
import jsonpath

#API URL
url = "https://reqres.in/api/users?page=2"

#Send Get Request
response = requests.get(url)
if response:
    print('Response OK')
else:
    print('Response Failed')

# Display Response Content
print(response.content)
print(response.headers)

#Parse response to json format
json_response = json.loads(response.text)
print(json_response)

#Fetch value using Json Path
pages = jsonpath.jsonpath(json_response,'total_pages')
print(pages[0])
Sameer

3 Answers


Try this code:

import requests


with open("list_urls.txt") as f:
    for url in f:
        url = url.strip()  # drop the trailing newline
        response = requests.get(url)
        print("The url is", url, "and status code is", response.status_code)

I hope this helps.
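For 100+ URLs it can also help to skip blank lines in the file and not let one unreachable host crash the whole run. A minimal sketch of that idea (the helper names `load_urls` and `check_urls`, and the `timeout` value, are mine, not from the answer; the `list_urls.txt` filename is from the answer):

```python
import requests

def load_urls(path):
    """Read URLs from a file, one per line, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def check_urls(path):
    """Print the status code for every URL in the file, surviving failures."""
    for url in load_urls(path):
        try:
            response = requests.get(url, timeout=10)
            print("The url is", url, "and status code is", response.status_code)
        except requests.RequestException as exc:
            print("The url is", url, "and the request failed:", exc)
```

Calling `check_urls("list_urls.txt")` then prints one line per URL instead of stopping at the first connection error.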

jose praveen
  • Thanks, it works. Assume a case: I kept all my URLs in a file, e.g. url.list, and I want to read that file (url.list) in the Python script. Is it possible? I have more than 100 URLs to check. I know how to do it in shell but not in Python, please help me – Sameer Mar 08 '20 at 10:34
  • @SameerSheik I have changed the code as per your requirements. The list_urls.txt file represents the 100+ URLs file. Now you can get each URL's [HTTP status code](https://en.wikipedia.org/wiki/List_of_HTTP_status_codes) – jose praveen Mar 08 '20 at 11:02

You can access the status code with response.status_code.

You can put your code in a function like this:

def treat_url(url):
    response = requests.get(url)
    if response:
        print('Response OK')
    else:
        print('Response Failed')

    # Display Response Content
    print(response.content)
    print(response.headers)

    #Parse response to json format
    json_response = json.loads(response.text)
    print(json_response)

    #Fetch value using Json Path
    pages = jsonpath.jsonpath(json_response,'total_pages')
    print(pages[0])

And have a list of URLs and iterate through it:

url_list = ["https://www.google.com", "https://reqres.in/api/users?page=2"]
for url in url_list:
    treat_url(url)
Ángel Igualada
  • Thanks Ángel, it works, but my question is how to loop through multiple URLs at once, say 100 maybe – Sameer Mar 08 '20 at 08:22

A couple of suggestions: the question itself is not very clear, so a clearer articulation would be useful for all the contributors here :) ...

Now, coming to what I was able to comprehend, there are a few modifications you can make:

response = requests.get(url) will always return a response object. I think you want to check the status code here, which you can do with response.status_code, and based on its value decide whether or not you got a success response.
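As an illustration of that check, here is a small sketch (the helper names `is_success` and `report` are mine, not from the answer; HTTP 2xx codes are the success range):

```python
import requests

def is_success(status_code):
    """HTTP status codes in the 2xx range indicate success."""
    return 200 <= status_code < 300

def report(url):
    """Fetch a URL and print whether the status code indicates success."""
    response = requests.get(url)
    if is_success(response.status_code):
        print(url, "-> success,", response.status_code)
    else:
        print(url, "-> failed,", response.status_code)
```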

Regarding looping, you can read the last page from the response JSON as response_json['last_page'] and run a for loop over range(2, last_page + 1), appending the page number to the URI to fetch each individual page's response.
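A sketch of that pagination loop, under the assumption that the API reports its page count in the response JSON (the reqres.in API from the question actually names this field `total_pages`, not `last_page`; the helper names `page_urls` and `fetch_all_pages` are mine):

```python
import requests

def page_urls(base, total_pages):
    """Build the URLs for pages 2..total_pages (page 1 is fetched first)."""
    return [f"{base}?page={n}" for n in range(2, total_pages + 1)]

def fetch_all_pages(base):
    """Fetch page 1, read the page count, then fetch the remaining pages."""
    first = requests.get(f"{base}?page=1").json()
    total_pages = first["total_pages"]  # reqres.in's name for 'last_page'
    pages = [first]
    for url in page_urls(base, total_pages):
        pages.append(requests.get(url).json())
    return pages
```

Calling `fetch_all_pages("https://reqres.in/api/users")` would return a list with one parsed JSON document per page.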

You can fetch JSON directly from the response object with response.json(). Please refer to the requests documentation.

suyash