
I have been looking for a better way to automatically upload a new or existing file to my repository. The following code can only update an existing file (not a big deal for me right now) and does not work with a big file.

import base64
import json

import requests


def push_to_github(filename, repo, branch, token):
    url = "https://api.github.com/repos/" + repo + "/contents/" + filename

    # Base64-encode the local file, as the contents API expects
    base64content = base64.b64encode(open(filename, "rb").read())

    # Fetch the current file metadata to get its blob sha
    data = requests.get(url + "?ref=" + branch,
                        headers={"Authorization": "token " + token}).json()
    sha = data['sha']

    # Only push if the content actually changed
    if base64content.decode('utf-8') + "\n" != data['content']:
        message = json.dumps({"message": "update",
                              "branch": branch,
                              "content": base64content.decode("utf-8"),
                              "sha": sha
                              })

        resp = requests.put(url, data=message,
                            headers={"Content-Type": "application/json",
                                     "Authorization": "token " + token})

        print(resp)
    else:
        print("nothing to update")

token = "lskdlfszezeirzoherkzjehrkzjrzerzer"
filename="foo.txt"
repo = "you/test"
branch="master"

push_to_github(filename, repo, branch, token)

This code gives me the following error:

KeyError: 'sha'

but it works fine on a small file.

When I inspect the data variable in the code, I get the following message, which makes me think I need a different approach to handle a bigger file of 60 MB or even more.

{'message': 'This API returns blobs up to 1 MB in size. The requested blob is too large to fetch via the API, but you can use the Git Data API to request blobs up to 100 MB in size.', 'errors': [{'resource': 'Blob', 'field': 'data', 'code': 'too_large'}], 
'documentation_url': 'https://docs.github.com/rest/reference/repos#get-repository-content'}
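Reading that error, I think the response for a large file contains only this error object and no 'content' or 'sha' keys, which would explain the KeyError. One workaround I have been considering just for getting the sha (untested, and the helper name is my own) is to list the parent directory instead, since that listing seems to include each entry's sha without its content:

import os
import requests

def get_existing_sha(filename, repo, branch, token):
    # Listing the parent directory returns metadata (including 'sha') for
    # each entry, so the 1 MB blob limit on fetching a single file should
    # not apply here.
    directory = os.path.dirname(filename)
    url = "https://api.github.com/repos/" + repo + "/contents/" + directory
    entries = requests.get(url + "?ref=" + branch,
                           headers={"Authorization": "token " + token}).json()
    for entry in entries:
        if entry["path"] == filename:
            return entry["sha"]
    return None  # file does not exist yet on this branch

I am not sure, though, whether the PUT in my function would then accept a 60 MB payload.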

Does anyone know how to handle this? It's been hard for me to follow the suggested link.
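
In case it helps, this is how I currently read the Git Data API flow the error message points to: create a blob, build a tree on top of the branch's current tree, create a commit, then move the branch ref. The following is only my untested sketch of that flow (the function name and the commit message are my own):

import base64
import requests

API = "https://api.github.com"

def push_via_git_data_api(filename, repo, branch, token):
    headers = {"Authorization": "token " + token}

    # 1. Upload the file content as a blob (the docs say up to 100 MB).
    content = base64.b64encode(open(filename, "rb").read()).decode("utf-8")
    blob = requests.post(API + "/repos/" + repo + "/git/blobs",
                         json={"content": content, "encoding": "base64"},
                         headers=headers).json()

    # 2. Find the commit the branch currently points to, and its tree.
    ref = requests.get(API + "/repos/" + repo + "/git/refs/heads/" + branch,
                       headers=headers).json()
    parent_sha = ref["object"]["sha"]
    parent = requests.get(API + "/repos/" + repo + "/git/commits/" + parent_sha,
                          headers=headers).json()

    # 3. Create a new tree that adds/replaces the file on top of the old tree.
    tree = requests.post(API + "/repos/" + repo + "/git/trees",
                         json={"base_tree": parent["tree"]["sha"],
                               "tree": [{"path": filename,
                                         "mode": "100644",
                                         "type": "blob",
                                         "sha": blob["sha"]}]},
                         headers=headers).json()

    # 4. Create a commit pointing at the new tree, then move the branch to it.
    commit = requests.post(API + "/repos/" + repo + "/git/commits",
                           json={"message": "update " + filename,
                                 "tree": tree["sha"],
                                 "parents": [parent_sha]},
                           headers=headers).json()
    resp = requests.patch(API + "/repos/" + repo + "/git/refs/heads/" + branch,
                          json={"sha": commit["sha"]},
                          headers=headers)
    print(resp)

Is something along these lines the intended way, or is there a simpler option I am missing?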
