36

Is it possible to run multiple docker containers in one EC2 instance through AWS ECS (EC2 Container Service)?

tugberk
  • @tugberk – Hi, can you please share some information on how you achieved this? It would be helpful for me and anyone else. – ketan Jan 19 '18 at 15:11

5 Answers

27

Yes.

AWS's documentation and product details never quite say it explicitly, but they talk about launching many containers into a cluster, and a cluster can be a single instance.

When configuring a container, you specify its memory and CPU usage. ECS uses those values to "schedule" (or "pack") Docker containers onto an EC2 instance.
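
For example, here is a minimal boto3 sketch (the family name, container names, and images are all placeholders) of a task definition whose per-container memory and cpu reservations ECS would use when packing the containers onto an instance:

```python
import boto3

ecs = boto3.client("ecs")

# Each container declares its memory (MiB) and cpu (CPU units, where
# 1024 = 1 vCPU); ECS uses these reservations to pack containers
# onto the cluster's instances.
ecs.register_task_definition(
    family="my-task",  # hypothetical family name
    containerDefinitions=[
        {
            "name": "web",
            "image": "nginx:latest",
            "memory": 256,   # hard memory limit in MiB
            "cpu": 128,
            "essential": True,
        },
        {
            "name": "worker",
            "image": "busybox:latest",
            "command": ["sleep", "3600"],
            "memory": 128,
            "cpu": 64,
            "essential": False,
        },
    ],
)
```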

tedder42
7

All containers defined in one ECS task are deployed onto the same instance.

Even if the cluster has many instances, all containers defined in one task are placed on the same EC2 instance, and the containers can reach each other through the links defined between them.

This is equivalent to a pod in Kubernetes.
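
As a rough illustration of that pod-like behavior, here is a hedged boto3 sketch (family, container names, and the app image are hypothetical) of one task definition whose two containers are linked in bridge network mode:

```python
import boto3

ecs = boto3.client("ecs")

# Two containers in one task definition; with "bridge" network mode,
# "links" let the app container reach "db" by hostname. (In "awsvpc"
# network mode, containers in the same task share a network namespace
# and can reach each other on localhost instead.)
ecs.register_task_definition(
    family="pod-like-task",  # hypothetical family name
    networkMode="bridge",
    containerDefinitions=[
        {
            "name": "db",
            "image": "postgres:13",
            "memory": 512,
            "essential": True,
        },
        {
            "name": "app",
            "image": "my-app:latest",  # placeholder image
            "memory": 256,
            "essential": True,
            "links": ["db"],  # app can now connect to db:5432
        },
    ],
)
```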

Shibashis
5

Yes. To do that, write a task definition that defines multiple containers.
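
As a sketch, assuming a multi-container task definition like the ones above has been registered under a hypothetical family name, launching it onto the cluster with boto3 might look like this:

```python
import boto3

ecs = boto3.client("ecs")

# Launch one copy of the multi-container task on the cluster's EC2
# instances; all of the task's containers land on the same instance.
resp = ecs.run_task(
    cluster="my-cluster",      # hypothetical cluster name
    taskDefinition="my-task",  # family registered earlier
    count=1,
    launchType="EC2",
)
print(resp["tasks"][0]["taskArn"])
```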

Vaibhav Jain
  • Thanks for that. That did it: just revising the task and adding multiple containers to the task definition. So easy, but for some reason it was not very clear – cameck Sep 09 '16 at 15:01
  • Tangentially, in my local environment I have one container for the client and one for the server. The client can issue HTTP requests to the server (serving on port 3000) using 'http://localhost:3000/api/stuff'. This worked on my Mac, but on AWS ECS, where the two containers reside and run the same as they did locally, the client no longer reaches the server. Is 'localhost' not a thing in these Linux EC2s? – Nick Feb 16 '18 at 03:45
  • @Nick Did you ever figure out a solution to that networking problem? – pdoherty926 Jan 11 '21 at 21:29
2

Exactly, that's possible.

Write one task definition per Docker image and run it through a service to automate the deployment. You also need to be careful when dividing memory and CPU among the different tasks that run the different Docker containers.

Here is the link for reference.
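
A minimal boto3 sketch of such a service, assuming a hypothetical cluster name and an already-registered task definition:

```python
import boto3

ecs = boto3.client("ecs")

# A service keeps the desired number of copies of the task running
# and replaces failed ones, automating the deployment.
ecs.create_service(
    cluster="my-cluster",      # hypothetical names throughout
    serviceName="web-service",
    taskDefinition="my-task",
    desiredCount=2,
    launchType="EC2",
)
```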

Anoop
0

This can be achieved using a task definition. If you are familiar with Kubernetes, a task definition is roughly like a pod in k8s.

Go to Task Definitions -> Create new revision (or create a new task definition).


Go to the Containers section and add all the containers you want to run in the task. Define each container's name and image URL.


Note: allocate the memory/CPU required by each container.

Now when you save and run the task on any EC2/Fargate resource, all the containers defined in the task definition run together.
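
The console steps above correspond roughly to this boto3 sketch (cluster and family names are hypothetical), which launches the task and then checks that every container in the task definition is running together in the same task:

```python
import boto3

ecs = boto3.client("ecs")

# Run the multi-container task on the EC2 launch type.
task_arn = ecs.run_task(
    cluster="my-cluster",      # hypothetical cluster name
    taskDefinition="my-task",  # hypothetical family name
    count=1,
    launchType="EC2",
)["tasks"][0]["taskArn"]

# List each container in the task and its status (may still be
# PENDING right after launch).
desc = ecs.describe_tasks(cluster="my-cluster", tasks=[task_arn])
for container in desc["tasks"][0]["containers"]:
    print(container["name"], container["lastStatus"])
```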

deepanmurugan