
I have a step in an Azure DevOps pipeline that scans a container image using Trivy.

The Azure PowerShell task is as follows.

# Print the Trivy version, then scan the image and write a JSON report
trivy -v
$folder = Get-Location
$filename = "report-$(Build.BuildId)-$(Build.DefinitionName).json"
trivy image -f json -o $filename python:3.4-alpine
$fullfile = Join-Path $folder $filename
write-host $fullfile

# Show where the report ends up relative to the standard pipeline directories
echo "Structure of work folder of this pipeline:"
tree $(Agent.WorkFolder) /f

echo "Build.ArtifactStagingDirectory:"
echo "$(Build.ArtifactStagingDirectory)"

echo "Build.BinariesDirectory:"
echo "$(Build.BinariesDirectory)"

echo "Build.SourcesDirectory:"
echo "$(Build.SourcesDirectory)"

The generated file is as follows:

/home/vsts/work/1/s/report-6949-my-test-pipeline.json

I would like to be able to download the file from the pipeline as an artifact, or perhaps upload it to a storage account.


1 Answer


Just add a publish pipeline artifact step that publishes the .json file:

  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.SourcesDirectory)/report-$(Build.BuildId)-$(Build.DefinitionName).json'
      artifact: 'trivy-output'
      publishLocation: 'pipeline'

Result:

(Screenshot: the published trivy-output artifact on the pipeline run.)
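The question also asks about copying the report to a storage account. A minimal sketch of that, using an AzureCLI@2 task, could look like the following; the service connection (my-azure-connection), storage account (mystorageaccount) and container (trivy-reports) names are placeholders to replace with your own:

  - task: AzureCLI@2
    inputs:
      azureSubscription: 'my-azure-connection'    # placeholder service connection
      scriptType: 'pscore'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # Upload the Trivy report to a blob container (all names are placeholders)
        az storage blob upload `
          --account-name mystorageaccount `
          --container-name trivy-reports `
          --name "report-$(Build.BuildId)-$(Build.DefinitionName).json" `
          --file "$(Build.SourcesDirectory)/report-$(Build.BuildId)-$(Build.DefinitionName).json" `
          --auth-mode login

With --auth-mode login the service connection's identity needs a data-plane role such as Storage Blob Data Contributor on the container; alternatively the upload can authenticate with an account key or SAS token.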

  • Thank you for this, worked a treat. Was wondering if it's possible to automate writing the report to a SQL database: would it be written to a storage account first, and then have an ETL that sweeps up the files and imports them into the DB? Any ideas here? – learner Jun 11 '23 at 22:14
  • I guess it's possible, but I'm not familiar with this :/ – Shayki Abramczyk Jun 12 '23 at 07:11
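A minimal sketch of the "write it straight to a SQL database" idea from the comment, skipping the storage-account/ETL hop, would be to parse the Trivy JSON in the same pipeline and insert one row per finding with Invoke-Sqlcmd from the SqlServer module. The table dbo.TrivyFindings, its columns, the server/database/credential values and the assumed report layout (Results[].Vulnerabilities[]) are all placeholders to adapt:

# Parse the Trivy report and write one row per vulnerability to an Azure SQL table.
# dbo.TrivyFindings and the connection details below are placeholders.
Install-Module SqlServer -Scope CurrentUser -Force

$reportPath = "$(Build.SourcesDirectory)/report-$(Build.BuildId)-$(Build.DefinitionName).json"
$report = Get-Content $reportPath -Raw | ConvertFrom-Json

foreach ($result in $report.Results) {
    foreach ($vuln in $result.Vulnerabilities) {
        $query = "INSERT INTO dbo.TrivyFindings (BuildId, Target, VulnerabilityID, PkgName, Severity) " +
                 "VALUES ('$(Build.BuildId)', '$($result.Target)', '$($vuln.VulnerabilityID)', '$($vuln.PkgName)', '$($vuln.Severity)')"
        Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' `
                      -Database 'mydb' `
                      -Username '$(sqlUser)' -Password '$(sqlPassword)' `
                      -Query $query
    }
}

The storage account + ETL route described in the comment would also work; this sketch just shows the shortest path from the build itself.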