
I need to get logs for all builds from a Jenkins instance.

def get_builds():
    builds_api_res = session.get(
        'https://build.org.com/api/json?depth=3',
        stream=True,
        auth=(username, access_token),
    )
    # Note: calling .json() here would parse the whole body into memory,
    # so iterate over the raw streamed response instead.
    for chunk in builds_api_res.iter_content(chunk_size=1024 * 1024):
        # Do something with the chunk of data
        print(chunk)

get_builds()

The problem is, this returns such a huge response that the Jenkins instance itself runs out of memory.

So, the other approach would be to get all builds from each folder individually. That is where I am facing the problem.

When I look at the folders, some builds are two folder levels down, some are three, and there may be builds four levels down from the root folder as well. I am not sure how to keep looking for folders inside folders until I find a build.
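For reference, the descend-until-you-find-a-job idea can be sketched as a recursive walk. This is only an illustration, not code from the question: `find_jobs` and `fetch_json` are hypothetical names (`fetch_json` stands in for `session.get(...).json()`), and the check relies on Jenkins reporting folders with a `_class` value containing `Folder`:

```python
def find_jobs(url, fetch_json, found=None):
    """Depth-first walk: recurse into folders, collect URLs of actual jobs."""
    if found is None:
        found = []
    data = fetch_json(url + 'api/json')  # job/folder URLs end with '/'
    for item in data.get('jobs', []):
        if 'Folder' in item.get('_class', ''):  # a folder: holds more jobs
            find_jobs(item['url'], fetch_json, found)
        else:
            found.append(item['url'])  # a buildable job, at any depth
    return found
```

Passing the fetcher in as a parameter also makes the traversal easy to test without a live Jenkins instance.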

for project in projects_api['jobs']:
    folder_url = JENKINS_URL + '/job/' + project['name'] + '/api/json'
    folders = session.get(folder_url, auth=(username, access_token)).json()
    for job in folders['jobs']:
        job_det_url = job['url'] + 'api/json'
        all_builds = session.get(job_det_url, auth=(username, access_token)).json()
        latest_build = all_builds['builds'][0]  # Jenkins lists builds newest-first
        print(latest_build['url'])
        log_url = latest_build['url'] + 'consoleText'  # build URLs already end in '/'
        print(log_url)
        logs = session.get(log_url, auth=(username, access_token))
        print(logs.text)

This works for builds that are inside the root folder. However, if there are more folders inside that folder, it fails. How do I locate builds directly from the parent folder using the API, if that is possible at all? Or is there a better approach to the whole thing?

Abhishek Rai

1 Answer


I decided to keep checking for folders inside folders in a loop and break out of it once a build is found. It seems to be working.

def check_builds(job):
    try:
        if job['color']:  # only buildable jobs carry a 'color' field
            print("Build found, sleeping for 7 seconds")
            time.sleep(7)
            build_url = job['url'] + 'api/json'
            builds = session.get(build_url, auth=(username, access_token)).json()
            log_url = builds['builds'][0]['url'] + 'consoleText'
            print("Logs found. Sleeping for 10 seconds")
            time.sleep(10)
            logs = session.get(log_url, auth=(username, access_token))
            print(logs.text)
            return True
    except (KeyError, IndexError):
        # No 'color' key means this item is a folder (or it has no builds yet),
        # so hand its URL back for another level of descent.
        return job['url']

def no_build(job):
    print("No builds found. Checking folders in 7 seconds")
    time.sleep(7)
    next_url = job + 'api/json'
    print(next_url)
    try:
        next_level = session.get(next_url, auth=(username, access_token)).json()
        for job in next_level['jobs']:
            resp = check_builds(job)
            if resp is not True:
                no_build(resp)  # recurse one folder level deeper
    except (TypeError, KeyError):
        # job was None or the response had no 'jobs' key; nothing to descend into
        pass


# Get a list of all jobs from the root folders
projects_api = session.get(JENKINS_URL + '/api/json', auth=(username, access_token)).json()
for root_folder in projects_api['jobs']:
    root_folder_job_url = (JENKINS_URL + '/job/' + root_folder['name']
                           + '/api/json?tree=jobs[name,url,builds[number,url]]')
    root_jobs = session.get(root_folder_job_url, auth=(username, access_token)).json()
    for r_job in root_jobs['jobs']:
        check = check_builds(r_job)
        if check is not True:
            no_build(check)