
I have transferred the forecasts.py file from GitHub to my virtual machine via Azure Pipelines. If I start the script from the virtual machine terminal with python3 forecasts.py &, everything goes smoothly and the script keeps running in the background. For some reason, I get the following message from Azure Pipelines if I try to start the script the same way there:

The STDIO streams did not close within 10 seconds of the exit event from process '/bin/bash'. This may indicate a child process inherited the STDIO streams and has not yet exited.

Full debug logs can be found here

The core content of forecasts.py is the following:

import schedule
import time

def job():
    print("I'm working...")

schedule.every().minute.at(":00").do(job)

while True:
    schedule.run_pending()
    time.sleep(5)

This script should print "I'm working..." once per minute. Should I start the script in some different way?

EDIT

The azure-pipelines.yml might help to solve this:

variables:
- name: system.debug
  value: true 

jobs:
- deployment: fmi_forecasts_deployment
  displayName: fmi_forecasts
  environment:
    name: AnalyticsServices
    resourceType: VirtualMachine
  strategy:
      rolling:
        maxParallel: 2  #for percentages, mention as x%
        preDeploy:
          steps:
          - download: current
          - script: echo initialize, cleanup, backup, install certs
        deploy:
          steps:
          - checkout: self
          - script: sudo apt install python3-pip
            displayName: 'Update pip'
          - script: python3 -m pip install -r requirements.txt
            displayName: 'Install requirements.txt modules'
          - script: rsync -a $(Build.SourcesDirectory) /home/ubuntu/$(Build.Repository.Name)/
            displayName: 'Sync files to $(Build.Repository.Name)'
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: python3 /home/ubuntu/$(Build.Repository.Name)/s/forecasts.py &
            displayName: 'Start the script'
        routeTraffic:
          steps:
          - script: echo routing traffic
        postRouteTraffic:
          steps:
          - script: echo health check post-route traffic
        on:
          failure:
            steps:
            - script: echo Restore from backup! This is on failure
          success:
            steps:
            - script: echo Notify! This is on success    
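
Based on the warning text, one thing to try might be to redirect all of the script's streams in the inline step so that the Bash task's STDIO can actually close. A minimal sketch of that step, reusing the paths above (the log file location is just an example):

# Inline script for the Bash@3 task; $(Build.Repository.Name) is the pipeline
# macro and is expanded by Azure Pipelines before the script runs.
# Redirect stdin, stdout and stderr so no stream stays attached to the task,
# then disown the job so the shell is not left waiting on it.
nohup python3 -u /home/ubuntu/$(Build.Repository.Name)/s/forecasts.py \
  > /home/ubuntu/forecasts.log 2>&1 < /dev/null &
disown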

EDIT

I edited the forecasts.py file to print "Sleeping..." every 5 seconds. When I execute it with nohup python -u /home/ubuntu/$(Build.Repository.Name)/s/forecasts.py &, I receive the following logs. So the script works, but when I look at the running processes on the VM, there are no Python processes running. The script dies when the pipeline ends, I assume.
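
If the agent is cleaning up the background process together with the pipeline job, one possible workaround (assuming the VM runs systemd) would be to launch the script as a transient systemd unit instead of as a child of the task's shell. A rough sketch, where the unit name forecasts is only a placeholder:

# Start forecasts.py as a transient systemd service so it runs outside the
# agent's process tree; $(Build.Repository.Name) is again the pipeline macro.
sudo systemd-run --unit=forecasts \
  python3 -u /home/ubuntu/$(Build.Repository.Name)/s/forecasts.py

The unit could then be checked on the VM with systemctl status forecasts; on a redeploy the old unit would have to be stopped first (systemctl stop forecasts) before the same unit name can be reused.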

pentti
  • **How** are you starting the script? What does your pipeline look like? – Daniel Mann Aug 19 '20 at 18:47
  • I added the `azure-pipelines.yml`, where the information can be found. – pentti Aug 20 '20 at 06:40
  • Azure Pipelines are for building code, not for running code (forever). – Wolfgang Kuehn Oct 11 '20 at 21:58
  • @WolfgangKuehn Do you know where to start, which technologies to choose, etc., in order to start my code automatically when GitHub (or another repo) is updated? I am a beginner in this scene, so any advice is beneficial. – pentti Nov 10 '20 at 12:42

1 Answer


##[debug]The task was marked as "done", but the process has not closed after 5 seconds. Treating the task as complete.

According to the debug log, this is more of an informational message indicating that some process is still running and has not been cleaned up, rather than an error: it was not written to the standard error stream and it does not fail the task.

If you want the script to keep running in the background after the task has finished, you could try using the Start-Process command to launch the script. This will make sure that the launched job keeps running when the task is finished. However, the job will still be closed when the build is finished.

Start-Process powershell.exe -ArgumentList '-Command python xxx\forecasts.py'

For details, please refer to the workaround in this ticket.

Hugh Lin
  • I assume that could work on Windows. The Ubuntu instance does not recognize `start-process`. Do you know what the equivalent command is on Ubuntu? Maybe it would be possible to install PowerShell on Ubuntu as well, but I'd rather not do that. – pentti Aug 20 '20 at 06:26
  • How about the way mentioned in this [ticket](https://askubuntu.com/questions/396654/how-to-run-a-python-program-in-the-background-even-after-closing-the-terminal)? – Hugh Lin Aug 20 '20 at 07:36
  • Unfortunately, using `nohup python3 xxxx/forecasts.py` resulted in the same issue. – pentti Aug 20 '20 at 07:46
  • The advice here helped a bit, but did not solve the issue. [link](https://stackoverflow.com/questions/32213565/nohup-for-python-script-not-working-when-running-in-the-background-with) – pentti Aug 20 '20 at 11:18
  • You can try the `gnome-terminal -x bash -c "` command. Here is a [reference](http://pwet.fr/man/linux/commandes/gnome_terminal/). – Hugh Lin Aug 21 '20 at 10:06
  • The `-x` is deprecated, but I used `--` instead. I received the following error: `Unable to init server: Could not connect: Connection refused # Failed to parse arguments: Cannot open display:` I could not find an answer with quick googling. – pentti Aug 24 '20 at 06:41
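
Since gnome-terminal needs a display, a display-less alternative along the same lines might be setsid, which starts the script in a new session with all of its streams redirected. A minimal sketch (the paths are only examples, and nothing here is specific to Azure Pipelines):

# Run the script in its own session, fully detached from the task's terminal
# and STDIO streams.
setsid python3 -u /home/ubuntu/forecasts/s/forecasts.py \
  > /home/ubuntu/forecasts.log 2>&1 < /dev/null &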