I need to get logs for all builds from a Jenkins instance.
import requests

session = requests.Session()  # username and access_token assumed defined

def get_builds():
    # .json() parses the entire body into memory, so it cannot be combined
    # with iter_content(); keep the raw response and stream it instead.
    res = session.get('https://build.org.com/api/json?depth=3',
                      stream=True, auth=(username, access_token))
    for chunk in res.iter_content(chunk_size=1024 * 1024):
        # Do something with the chunk of data
        print(chunk)

get_builds()
The problem is that this returns such a huge response that the Jenkins instance itself runs out of memory.
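As an aside, the Jenkins remote API also accepts a `tree` query parameter that restricts the response to the named fields, which usually keeps the payload far smaller than `depth=3`. A minimal sketch of building such a URL (the `jobs_api_url` helper and the particular field list are my own illustration, not part of the code above):

```python
from urllib.parse import urlencode

JENKINS_URL = 'https://build.org.com'  # assumption: same host as above

def jobs_api_url(base=JENKINS_URL):
    # Ask only for each job's name, URL, and its builds' number/url,
    # instead of expanding every object three levels deep with depth=3.
    query = urlencode({'tree': 'jobs[name,url,builds[number,url]]'})
    return f'{base}/api/json?{query}'
```

The resulting URL can be passed to `session.get` exactly like the `depth=3` one.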
So the alternative would be to fetch the builds from each folder individually, and that is where I am running into trouble.
When I look at the folders, some builds are nested two levels below the root folder, some three, and possibly four or more. I am not sure how to keep descending into folders within folders until I find a build.
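One way to keep descending is to recurse on anything whose `_class` marks it as a folder. A hedged sketch, assuming the standard folder class names (these vary by installed plugin, so adjust the set as needed); `fetch` here is a stand-in for `session.get(url, auth=...).json()`:

```python
# Class names Jenkins reports for container-type items; extend as needed.
FOLDER_CLASSES = {
    'com.cloudbees.hudson.plugins.folder.Folder',
    'jenkins.branch.OrganizationFolder',
    'org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject',
}

def find_jobs(item, fetch):
    """Yield the JSON of every non-folder job reachable from `item`.

    `item` is one entry of a folder's 'jobs' list; `fetch` takes a URL
    and returns the parsed api/json payload for it.
    """
    if item.get('_class') in FOLDER_CLASSES:
        folder = fetch(item['url'] + 'api/json')
        for child in folder.get('jobs', []):
            yield from find_jobs(child, fetch)
    else:
        yield item
```

Calling `list(find_jobs(project, fetch))` on each top-level entry of `projects_api['jobs']` would then replace the fixed two levels of nesting in the loop below.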
for project in projects_api['jobs']:
    folder_url = JENKINS_URL + '/job/' + project['name'] + '/api/json'
    folders = session.get(folder_url, auth=(username, access_token)).json()
    for job in folders['jobs']:
        job_det_url = job['url'] + 'api/json'
        all_builds = session.get(job_det_url, auth=(username, access_token)).json()
        # Jenkins lists builds newest first, so index 0 is the latest.
        latest_build = all_builds['builds'][0]
        print(latest_build['url'])
        # Build URLs already end with '/', so no extra slash is needed.
        log_url = latest_build['url'] + 'consoleText'
        print(log_url)
        logs = session.get(log_url, auth=(username, access_token))
        print(logs.text)
This works for builds directly inside the root folder, but it fails when there are further folders nested inside. Is there a way to locate the builds directly from the parent folder using the API, if there is one at all? Or is there a better approach to the whole thing?