
I'm calling the following command from a GitHub Actions runner:

aws ssm send-command --document-name "AWS-RunRemoteScript" \
--targets "Key=instanceids,Values=<my instance id>" \
--parameters file://param.json --region us-east-1 \
--cloud-watch-output-config CloudWatchLogGroupName=CITestLogGroup,CloudWatchOutputEnabled=true

I'd like to be able to follow the script execution from the GitHub interface (as opposed to CloudWatch).
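
Each of the approaches below needs the command ID from the send-command output. For reference, a minimal way to capture it (COMMAND_ID is just a shell variable name I use in the sketches below; the --query path follows the shape of the send-command response):

COMMAND_ID=$(aws ssm send-command --document-name "AWS-RunRemoteScript" \
--targets "Key=instanceids,Values=<my instance id>" \
--parameters file://param.json --region us-east-1 \
--cloud-watch-output-config CloudWatchLogGroupName=CITestLogGroup,CloudWatchOutputEnabled=true \
--query 'Command.CommandId' --output text)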

Solutions I've considered:

--follow parameter

aws logs tail CITestLogGroup --format short --follow --log-stream-name-prefix <my command id>

This works, but when the script execution completes, aws logs tail does not return; it waits indefinitely, which makes it a poor fit for non-interactive use.
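
One workaround I could imagine (a rough, untested sketch; COMMAND_ID and INSTANCE_ID are placeholders for the values used above) is to run the tail in the background and kill it once the SSM invocation reaches a terminal status:

aws logs tail CITestLogGroup --format short --follow \
--log-stream-name-prefix "$COMMAND_ID" &
TAIL_PID=$!

# Poll the invocation status until it leaves the in-progress states.
# If the invocation isn't registered yet, treat it as Pending.
while true; do
  STATUS=$(aws ssm get-command-invocation --command-id "$COMMAND_ID" \
    --instance-id "$INSTANCE_ID" --query 'Status' --output text 2>/dev/null || echo Pending)
  case "$STATUS" in
    Pending|InProgress|Delayed) sleep 10 ;;
    *) break ;;
  esac
done

# Give CloudWatch a moment to deliver the final lines, then stop tailing
sleep 15
kill "$TAIL_PID"

# Fail the workflow step if the script did not succeed
[ "$STATUS" = "Success" ]

This still feels clunky, but at least the step terminates on its own.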

Batch fetch

I could fetch log entries in batches. Every 10 seconds I could grab the last 10 seconds of log data:

aws logs tail CITestLogGroup --format short --since 10s --log-stream-name-prefix <my command id>

...and check the command's completion status, using it as the condition to end the loop. This seems fragile: it's likely to miss or duplicate log entries if the timing isn't tight. A possible solution to the timing problem, albeit with a good bit more complexity and tight coupling to the CloudWatch log format, is this.
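
A rough sketch of what I have in mind (untested; same placeholder variables and status check as in the sketch above, with the fetch window deliberately overlapping the sleep, so duplicated lines rather than gaps are the failure mode):

while true; do
  # 15s window vs. a 10s sleep: overlap to reduce the chance of missed entries
  aws logs tail CITestLogGroup --format short --since 15s \
    --log-stream-name-prefix "$COMMAND_ID"
  STATUS=$(aws ssm get-command-invocation --command-id "$COMMAND_ID" \
    --instance-id "$INSTANCE_ID" --query 'Status' --output text 2>/dev/null || echo Pending)
  case "$STATUS" in
    Pending|InProgress|Delayed) sleep 10 ;;
    *) break ;;
  esac
done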

Single fetch

I could get an accurate log by waiting until the command finishes and then grabbing the entire history, but this is a long-running script, so I'd rather follow along as it runs.
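
For completeness, a sketch of that (again untested; the wide --since window is arbitrary, and the command-id prefix keeps the fetch scoped to this invocation):

# After the status polling above reports a terminal state,
# pull the whole history in one pass
aws logs tail CITestLogGroup --format short --since 1d \
--log-stream-name-prefix "$COMMAND_ID"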

Is there any other approach I should consider?
