7

I would like to open a Rails console in a Fargate container to interact with my production installation.

However, after searching the web and posting in the AWS forum, I could not find an answer to this question.

Does anyone know how I can do this? This seems like a mandatory thing to have in any production environment, and having no easy way to do it is kind of surprising coming from such a respected cloud provider as AWS.

Thanks

Deepak Mahakale
Fred grais
  • AWS is a mix of IaaS and PaaS. I mean, AWS provides a lower level of abstraction than "Rails application". When people talk about AWS they usually speak in terms of databases, load balancers, etc. If I were you I would look into the Fargate + Application Load Balancer combination. – Molecular Man Apr 25 '19 at 09:50
  • I already have this in place for my API based on Rails; I do not see how it relates to my original question, which is how to open a console in a Fargate container? – Fred grais Apr 25 '19 at 11:05
  • ok, got it wrong. sorry – Molecular Man Apr 25 '19 at 11:42

6 Answers

8

[Update 2021]: It is now possible to run a command in interactive mode with AWS Fargate!

News: https://aws.amazon.com/fr/blogs/containers/new-using-amazon-ecs-exec-access-your-containers-fargate-ec2/

The command to run is (the --container flag can be omitted if the task has a single container):

aws ecs execute-command \
  --cluster cluster-name \
  --task task-id \
  --container container-name \
  --interactive \
  --command "rails c"

Troubleshooting:

Check the AWS doc for IAM permissions: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-exec.html#ecs-exec-prerequisites
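
Note that ECS Exec also has to be enabled on the service before its tasks will accept an interactive session (this is covered in the prerequisites linked above). A rough sketch with the AWS CLI, where the cluster and service names are placeholders for your own:

# Enable ECS Exec on an existing service (names are placeholders),
# then force a new deployment so freshly started tasks pick it up.
aws ecs update-service \
  --cluster cluster-name \
  --service service-name \
  --enable-execute-command \
  --force-new-deployment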

BTL
5

After trying lots of things, I found a way to open a Rails console pointing to my production environment, so I will post it here in case somebody comes across the same issues.

To summarise, I had a Rails application deployed on Fargate connected to an RDS Postgres database.

What I did was create a VPN Client Endpoint into the VPC hosting my Rails app and my RDS database.

Then, after connecting to this VPN, I simply run my Rails production container locally (with the same environment variables), overriding the container command to run the console startup script (bundle exec rails c production).

Since the container runs on my local machine, I can attach a TTY to it as usual and access my production console.
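
For reference, a minimal sketch of that local run (the image name, RDS endpoint, and credentials below are placeholders for your own production values):

# Hypothetical example: run the production image locally with a TTY while
# connected to the VPN, so the RDS endpoint is reachable as if from the VPC.
docker run -it --rm \
  -e RAILS_ENV=production \
  -e DATABASE_URL="postgresql://user:password@your-rds-endpoint:5432/your-db" \
  your-registry/your-rails-app:latest \
  bundle exec rails c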

I think this solution is good because it allows any developer working on the project to open a console without incurring any additional costs, and a well-thought-out security policy on the AWS end ensures that console access stays secure; plus, you don't have to expose your database outside of your VPC.

Hope this helped someone

Fred grais
  • I would really appreciate it if you could explain this in more detail. I think this is a highly sought-after feature that is not well described anywhere on the internet. How do you connect from your command line to the VPC, and what commands do you run to launch the Rails console? – bo-oz May 10 '19 at 16:19
  • OK, so basically you have to create a VPN endpoint to access your VPC (this can be done via the AWS console). Then connecting to this VPN on your local machine allows it to be part of the VPC, as if it were an EC2 instance for example. Then you can launch your Rails production container on your local machine (with all the production env variables set, of course) with the command `bundle exec rails c`. Is it clear enough? – Fred grais May 11 '19 at 18:00
  • @bo-oz I was using it with Fargate containers, yes – Fred grais May 13 '19 at 11:42
  • Well, I'm pretty new to all this VPN/SSH kind of stuff... so I don't think I will get this to work without a foolproof step-by-step manual, I'm afraid. Strange that such a thing is not available anywhere on the internet. – bo-oz May 13 '19 at 12:05
  • @bo-oz If you are new to VPN/SSH, even a step-by-step manual will not be enough, I'm afraid. First things first: get up to speed on these subjects, then come back here and it will all be clear, don't worry – Fred grais May 14 '19 at 13:28
  • Ok, so basically you are saying that getting rails c to work on Fargate will be a challenging project? – bo-oz May 14 '19 at 14:27
  • If you don't know about VPN/SSH, yes indeed – Fred grais May 15 '19 at 21:25
2

Doing any sort of docker exec is a nightmare with ECS and Fargate, which makes things like shells or migrations very difficult.

Thankfully, a Fargate task on ECS is really just an AWS server running a few heavily customized docker run commands. So if you have docker, jq, and the AWS CLI either on EC2 or on your local machine, you can fake some of those docker run commands yourself and enter a bash shell. I do this for Django so I can run migrations and enter a Python shell, but I'd assume it's the same for Rails (or any other container that you need bash in).

Note that this only works if you only care about one container from your task definition running at a time, although I'd imagine you could jerry-rig something more complex easily enough.

For this, the AWS CLI will need to be logged in with the same IAM permissions as your Fargate task. You can do this locally by using aws configure and providing credentials for a user with the correct IAM permissions, or by launching an EC2 instance that has a role with either identical permissions or (to keep things really simple) the very role your Fargate task runs with, plus a security group with identical access (and a rule that lets you SSH into the bastion host). I like the EC2 route, because funneling everything through the public internet and a VPN is... slow. Plus you're always guaranteed to have the same IAM access as your tasks do.

You'll also need to be on the same subnet as your Fargate tasks, which can usually be done via a VPN or by running this code on a bastion EC2 host inside your private subnet.

In my case I store my configuration parameters as SecureStrings within the AWS Systems Manager Parameter Store and pass them in using the ECS task definition. Those can be pretty easily acquired and set to a local environment variable using

export DATABASE_URL=$(aws ssm get-parameter --region $REGION \
                      --with-decryption --name parameter.name.database_url \
                      | jq '.Parameter["Value"]' -r)

I store my images on ECR, so I then need to log my local Docker client in to ECR

eval $(aws ecr get-login --no-include-email --region $REGION)
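
Note that `aws ecr get-login` only exists in AWS CLI v1; if you're on v2, a rough equivalent (assuming the same $REGION and the $ACCOUNT_ID variable from the full script below) is:

# AWS CLI v2: fetch a registry password and pipe it straight into docker login
aws ecr get-login-password --region $REGION \
  | docker login --username AWS --password-stdin \
    $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com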

Then it's just a case of running an interactive docker container that passes in the DATABASE_URL, pulls the correct image from ECR, and enters bash. I also expose port 8000 so I can run a webserver inside the shell if I want, but that's optional.

docker run -i -t \
           -e DATABASE_URL \
           -p 8000:8000 \
           $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$DOCKER_REPO_NAME:$TAG \
           /bin/bash

Once you run that you should see your copy of docker download the image from your container repository then launch you into bash (assuming bash is installed inside your container.) Docker has a pretty solid cache, so this will take a bit of time to download and launch the first time but after that should be pretty speedy.

Here's my full script

#!/bin/bash

# Defaults; override any of these via environment variables before running
REGION=${REGION:-us-west-2}
ENVIRONMENT=${ENVIRONMENT:-staging}
DOCKER_REPO_NAME=${DOCKER_REPO_NAME:-reponame}
TAG=${TAG:-latest}

# Look up the current AWS account ID to build the ECR registry hostname
ACCOUNT_ID=$(aws sts get-caller-identity | jq -r ".Account")

# Fetch this environment's database URL from SSM Parameter Store
export DATABASE_URL=$(aws ssm get-parameter --region $REGION \
                      --with-decryption --name projectname.$ENVIRONMENT.database_url \
                      | jq '.Parameter["Value"]' -r)

# Authenticate the local Docker client against ECR (AWS CLI v1 syntax)
eval $(aws ecr get-login --no-include-email --region $REGION)

IMAGE=$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$DOCKER_REPO_NAME:$TAG

# Pull the image and drop into an interactive bash shell with DATABASE_URL set
docker run -i -t \
           -e DATABASE_URL \
           -p 8000:8000 \
           $IMAGE \
           /bin/bash
NickCatal
1

You cannot SSH to the underlying host when you are using the Fargate launch type for ECS. This means that you cannot docker exec into a running container.

Ian Massingham
  • So using Fargate with a Rails API is prohibited? To properly debug anything in production you need the console. Or am I missing something obvious here to be able to launch a console without paying for a full-time EC2 instance to run it (not to mention that it would limit the number of consoles able to run concurrently)? I find this astonishing coming from AWS – Fred grais Apr 26 '19 at 01:26
  • 1
    I found a way to connect to the production DB with a rails console from my local machine (see other responses) – Fred grais May 14 '19 at 13:26
0

I haven't tried this on Fargate, but you should be able to create a Fargate task in which the command is rails console.

Then if you configure the task as interactive, you should be able to launch the interactive container and have access to the console via stdin.

Javier Ramirez
  • Hello, thanks for the answer. I already tried this, but even after specifying the interactive and pseudoTerminal flags, when I run the task with aws ecs run-task it just launches the task asynchronously and does not connect my terminal to it – Fred grais Apr 25 '19 at 11:04
  • This also sounds like a nice solution; do you know if there's any documentation that explains how this is set up and connected to? – bo-oz May 10 '19 at 16:20
  • 1
    This is not possible on Fargate. – sj26 Jul 18 '19 at 12:36
  • This only works in EC2 launch-type environment. Not Fargate. https://aws.amazon.com/about-aws/whats-new/2018/09/amazon-ecs-now-allows-three-additional-docker-flags/ – Zulhilmi Zainudin Jun 01 '20 at 09:41
0

Ok, so I ended up doing things a bit differently. Instead of trying to run the console on Fargate, I just run a console on my localhost, but configure it to use RAILS_ENV='production' and let it use my RDS instance.

Of course, to make this work you have to expose your RDS instance through an inbound rule in your security group. It's wise to configure it so that it only allows your local IP, to keep it a bit more secure.
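
For example, a rough sketch of such a rule with the AWS CLI (the security group ID and IP address below are placeholders, and 5432 assumes the default Postgres port):

# Allow Postgres traffic from a single local IP to the RDS security group
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 5432 \
  --cidr 203.0.113.10/32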

The docker-compose.yml then looks something like this:

version: '3'
services:
  web:
    stdin_open: true
    tty: true
    build: .
    volumes:
      - ./rails/.:/your-app
    ports:
      - "3000:3000"
    environment: &env_vars
      RAILS_ENV: 'production'
      PORT: '8080'
      RAILS_LOG_TO_STDOUT: 'true'
      RAILS_SERVE_STATIC_FILES: 'true'
      DATABASE_URL: 'postgresql://username:password@your-aws-rds-instance:5432/your-db'

When you then run `docker-compose run web rails c`, it uses your local Rails codebase but makes live changes to your RDS DB (the prime reason you'd want access to a Rails console anyway).

bo-oz
  • 2
    Yes, this is basically what I suggested to you by using a VPN to the VPC hosting your RDS, which has the added benefit of not exposing your RDS. However, it can become costly, as running a VPN endpoint is quite expensive – Fred grais Feb 26 '20 at 14:36