I have a Python Rumps application that monitors a folder for new files using the rumps.Timer(...)
feature. When it sees new files, it transfers them offsite (to AWS S3) and runs a GET request. Sometimes that transfer and GET request together take over 1 second, and sometimes up to about 5 seconds. During this time, the application is frozen and can't do anything else.
Here is the current code:
import os

import boto3
import requests
import rumps


class MyApp(rumps.App):
    def __init__(self):
        super(MyApp, self).__init__("App", quit_button="Stop")
        self.process_folder = '##folder##'  # path to the watched folder
        self.files_in_folder = set()
        self.s3_client = boto3.client('s3')
        self.process_timer = rumps.Timer(self.my_tick, 1)
        self.process_timer.start()

    def my_tick(self, sender):
        # collect the .jpg files currently in the watched folder
        named_set = set()
        for file in os.listdir(self.process_folder):
            fullpath = os.path.join(self.process_folder, file)
            if os.path.isfile(fullpath) and fullpath.endswith(('.jpg', '.JPG')):
                named_set.add(file)
        if len(named_set) == 0:
            self.files_in_folder = set()
        new_files = sorted(named_set - self.files_in_folder)
        for new_file in new_files:
            # upload file
            self.s3_client.upload_file(
                os.path.join(self.process_folder, new_file),
                '##bucket##',
                '##key##'
            )
            # GET request
            requests.get(
                '##url##',
                params={'file': new_file}
            )
        self.files_in_folder = named_set


if __name__ == "__main__":
    MyApp().run()
Is there a way to have this transfer and GET request run as a background process?
I've tried using subprocess, with the transfer code moved into a separate script:

    subprocess.Popen(['python3', 'transferscript.py', new_file])

and it doesn't appear to do anything. That line works if I run it outside of rumps, but once it's inside rumps, it will not run.
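For reference, here is the subprocess hand-off I'm attempting, reduced to a self-contained sketch that anyone can run. The worker body below is just a stand-in that echoes its argument to a file (the real transferscript.py does the S3 upload and GET request), and I've used sys.executable instead of a bare 'python3' so the sketch doesn't depend on PATH:

```python
import os
import subprocess
import sys
import tempfile

# Stand-in for transferscript.py: writes its argument to a file so the
# parent process can verify the child actually ran.
worker_src = """import sys
with open(sys.argv[2], 'w') as f:
    f.write('transferred ' + sys.argv[1])
"""

workdir = tempfile.mkdtemp()
script = os.path.join(workdir, 'transferscript.py')
outfile = os.path.join(workdir, 'result.txt')
with open(script, 'w') as f:
    f.write(worker_src)

# Same shape as the call in the app. The app itself would not wait();
# we wait here only so the sketch can check the result.
proc = subprocess.Popen([sys.executable, script, 'photo.jpg', outfile])
proc.wait()

with open(outfile) as f:
    print(f.read())  # -> transferred photo.jpg
```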
Edit: code provided