
The solution to this problem is to define two Rasa services in the Compose file: one that trains the model and one that runs the server. Set `depends_on` on the `rasa` service so that it only starts once `rasa train` has completed. Example:

version: '3.0'
services:
  # One-shot service: runs `rasa train`, writes the model to
  # ./rasa_Chatbot/models, and exits.
  train:
    image: rasa/rasa:3.4.2-full
    ports:
      # Optional: `rasa train` itself does not serve HTTP,
      # so any port (or none) works here.
      - 8080:8080
    volumes:
      - ./rasa_Chatbot:/app
    command:
      - train
  # Server: starts only after the train service has exited successfully,
  # thanks to the depends_on condition below.
  rasa:
    image: rasa/rasa:3.4.2-full
    ports:
      - 5005:5005
    depends_on:
      train:
        condition: service_completed_successfully
    volumes:
      - ./rasa_Chatbot:/app
    command:
      - run
      - -m
      - models
      - --enable-api
      - --cors
      - "*"

With a single `docker-compose up`, Docker trains the model and then runs the server. I spent hours trying to run multiple commands inside one Rasa container without success, and I spent all day searching for a solution without finding one, so I worked this out myself and am posting it for future programmers who hit the same problem.
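For completeness, here is the whole flow from the shell: a sketch, assuming the Compose file above is saved as `docker-compose.yml` in the project root and that your `rasa_Chatbot` directory contains a valid Rasa project.

```shell
# Start both services: 'train' runs `rasa train` and exits;
# once it completes successfully, 'rasa' starts the server on port 5005.
docker-compose up

# Quick sanity check once the server is up
# (the /version endpoint is available because --enable-api was passed):
curl http://localhost:5005/version
```

If training fails, the `service_completed_successfully` condition means the `rasa` service will not start, which is exactly the behavior you want: the server never comes up with a stale or missing model.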
