I'm trying to set up some integration tests using a real SQL database. I've found this excellent tutorial https://blog.dangl.me/archive/running-sql-server-integration-tests-in-net-core-projects-via-docker/ and implemented it; it works like a charm locally. Now I would also like these tests to be part of the CI build pipeline, which runs in Azure DevOps. Unfortunately, this doesn't work out of the box, as Azure DevOps doesn't allow access to registries like Docker Hub without a service connection.
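For reference, the container that the tutorial's code spins up locally corresponds to roughly the following configuration (a docker-compose sketch I wrote purely for illustration; the values mirror the arguments of my pipeline task below):

```yaml
# Illustration only: the SQL Server container the test code sets up
# locally, expressed as a compose file (values are from my setup).
services:
  sqlserver:
    image: mcr.microsoft.com/mssql/server:2017-latest
    container_name: SQLServer2017
    environment:
      ACCEPT_EULA: "Y"
      SA_PASSWORD: "123"
    ports:
      - "1437:1433"
```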
As a first solution, I've created a YAML task that creates the container, and the code checks whether it already exists. The task looks like this:
```yaml
- task: Docker@2
  displayName: 'Start SQL Server Container'
  inputs:
    containerRegistry: DockerHub
    command: run
    arguments: '-e "ACCEPT_EULA=Y" -e "SA_PASSWORD=123" -p 1437:1433 --name SQLServer2017 -d mcr.microsoft.com/mssql/server:2017-latest'
```
This works quite well, but I now have a dependency on a task, two separate container setup paths (one in the code for local development and a task for Azure DevOps), and I'm far less flexible, for example if I'd like to try one container per test.
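One variant I could imagine (a sketch only; the service connection name `DockerHub` and the test step details are assumptions from my setup) would be to keep just the authentication in a task and leave the `docker run` to the test code:

```yaml
# Sketch: authenticate via the service connection once; later steps
# (including test code that shells out to docker) reuse the login.
- task: Docker@2
  displayName: 'Docker Hub Login'
  inputs:
    containerRegistry: DockerHub   # service connection name (assumption)
    command: login

# The integration tests then start and stop the container themselves,
# as they already do locally.
- task: DotNetCoreCLI@2
  displayName: 'Run Integration Tests'
  inputs:
    command: test
    projects: '**/*IntegrationTests.csproj'   # project pattern (assumption)
```

But even then, I'm not sure whether this is the intended way to combine a service connection with container logic living in the code.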
I think it would be much better and more understandable if I could get rid of this task and have all the logic in the code. To do so, I would guess I somehow need access to the service connection, or need to tell Azure DevOps to allow access to Docker Hub? Searching for that, I found questions like How can a script access Service Connections? (Azure Devops Pipelines), but no hints on how I could use a service connection in the code. So my questions are:
- Is the approach even feasible or is there a better solution at hand?
- Can I use a service connection to allow access to Docker Hub via code?
- Are there other possibilities to allow access to Docker Hub via code?
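Regarding the last question, one possibility I've been wondering about (a sketch; the variable group name `dockerhub-credentials` and the variable names are assumptions) would be to skip the service connection entirely and log in from a script step using secret pipeline variables:

```yaml
# Sketch: log in to Docker Hub without a service connection, using
# credentials stored as secret variables in a variable group (assumed names).
variables:
- group: dockerhub-credentials   # contains DOCKERHUB_USER and DOCKERHUB_TOKEN

steps:
- script: echo "$(DOCKERHUB_TOKEN)" | docker login --username "$(DOCKERHUB_USER)" --password-stdin
  displayName: 'Docker Hub Login via Secret Variables'
```

I'm unsure whether this is considered good practice compared to a proper service connection, though.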