Currently I am doing the following to get the response and save the file locally on my PC:

    response = requests.get(url)
    if response.status_code != status.HTTP_200_OK:  # status and BadRequestError come from my project
        raise BadRequestError('message')
    return subprocess.call(['wget', url])  # downloads the same URL a second time

But what I want is to save the response to my PC at the point where I call response = requests.get(url), with error handling, instead of downloading the file a second time with subprocess.call(['wget', url]).
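
Roughly, what I have in mind is something like the sketch below (the file name and chunk size are just placeholders), so that requests both downloads the file and writes it to disk in one pass:

    response = requests.get(url, stream=True)  # stream=True so a large file is not held in memory
    if response.status_code != status.HTTP_200_OK:
        raise BadRequestError('message')

    with open('downloaded_file', 'wb') as f:  # placeholder file name
        for chunk in response.iter_content(chunk_size=8192):
            f.write(chunk)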

Any help would be appreciated.

nick

2 Answers

Something like this?

import requests
import pickle

response = requests.get('https://stackoverflow.com/questions')

if response: # <- you can be more precise here
    with open('out_file.pickle', 'wb') as f:
        pickle.dump(response, f)
else:
    raise Exception # <- Your exception here

pickle is the de facto serializer in Python, but it does have issues: the format is Python-specific, and unpickling data from an untrusted source is unsafe.
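
To get the response object back later, the reverse step would look roughly like this (assuming the same out_file.pickle as above):

import pickle

with open('out_file.pickle', 'rb') as f:  # pickled data must be read in binary mode
    response = pickle.load(f)

print(response.status_code)    # the restored object still carries the status code
print(response.content[:100])  # ...and the saved body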

James Schinner

You can pickle the response and save it. pickle serializes the object so that it can be stored on disk.

import pickle
import requests

response = requests.get(url)
if response.status_code != status.HTTP_200_OK:
    raise BadRequestError('message')
else:
    with open('new.pkl', 'ab') as f:  # 'ab' appends; see the note below
        pickle.dump(response, f)      # dump the response object into the file

with open('new.pkl', 'ab') as f: opens the file in append mode, so every response is appended to new.pkl. If it were opened in 'wb' mode instead, the file would be overwritten each time you open it.

Later, to read the responses back, you can do:

import pickle

with open('new.pkl', 'rb') as f:  # pickled data must be read in binary mode
    while True:
        try:
            response = pickle.load(f)  # load the next pickled response
            print(response)
        except EOFError:  # reached the end of the file
            break

The reason to unpickle in a loop like this is that append mode leaves several pickled objects in the same file, so you keep calling pickle.load until it raises EOFError.

Pratik Kumar