I have a docker-compose.local.yml file that looks like this:
version: "3.6"
services:
  rabbitmq:
    image: "rabbitmq:latest"
    ports:
      - "15672:15672"
      - "5672:5672"
    volumes:
      - "./enabled_plugins:/etc/rabbitmq/enabled_plugins"
  abc-service: &a-service
    image: ${DOCKER_REGISTRY}unicorn/a-service:${DOCKER_IMAGE_LABEL:-latest}
    build:
      context: .
      dockerfile: ./debugging/Dockerfile
    ports:
      - "0.0.0.0:8080:8080"
    depends_on:
      - rabbitmq
    environment:
      - NODE_ENV=development
      - GOOGLE_SHEET_TAB=abc
    configs:
      - source: abc-service
        target: ./bin/csv/main.csv
  def-service:
    <<: *a-service
    environment:
      - NODE_ENV=development
      - GOOGLE_SHEET_TAB=def
    configs:
      - source: def-service
        target: ./bin/csv/main.csv
configs:
  abc-service:
    file: ./csvs/abc.csv
  def-service:
    file: ./csvs/def.csv
So this creates three services: a RabbitMQ service, a service called abc-service, and a service called def-service that copies abc-service's definition (via the YAML anchor) but changes the GOOGLE_SHEET_TAB environment variable. Each of these two services runs off a different csv file.
My folder structure looks like this:
- src
- bin
  - csv
- csvs
My Dockerfile copies src and bin into the container. I want docker-compose to copy the correct csv from the csvs folder into bin/csv for each service, as specified in the configs section.
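In case it helps, here is a simplified sketch of what the Dockerfile does. The base image, WORKDIR and CMD below are illustrative placeholders, not my real ./debugging/Dockerfile; the only relevant part is that src and bin get copied in.

# simplified placeholder, not the real ./debugging/Dockerfile
FROM node:18
WORKDIR /app
COPY src ./src
COPY bin ./bin
CMD ["node", "src/index.js"]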
When I try to run my services locally, the csv never gets copied into that folder. This seems to be because configs require Docker Swarm to be running.
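I believe I can check whether swarm mode is active locally with something like:

docker info --format '{{.Swarm.LocalNodeState}}'   # prints "inactive" if swarm mode is not running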
Am I doing this correctly? I've seen a lot of talk about having to run CLI commands to register the csv files as Docker configs, but I don't fully understand that, or how it would later work with my GitLab auto-deployment. Also, how can I easily test that my application is using (or at least copying) the right csv into the bin/csv folder?
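For reference, the CLI-based approach I keep seeing mentioned looks roughly like this (as far as I understand these commands only work with swarm mode enabled; the names here just mirror my configs section):

docker config create abc-service ./csvs/abc.csv
docker config create def-service ./csvs/def.csv
docker config ls    # list the registered configs

And I assume I could verify what actually lands in a running container with something like this (the in-container path is a guess based on where my WORKDIR puts bin/csv):

docker compose -f docker-compose.local.yml exec abc-service cat bin/csv/main.csv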