
I have 10 servers (16 GB memory, 8 cores each) and want to deploy Hadoop and Spark. Can you tell me which plan makes the best use of the resources?

  1. deploy directly on the servers (bare metal);

  2. install OpenStack and deploy the environment in virtual machines;

  3. use Docker, e.g. Spark on Docker;

I know resource utilization depends on the usage scenario; what I actually want to know is the advantages and disadvantages of the three plans above.

Thank you.

gudaoxuri

1 Answer


For the highest resource utilization, deploying a single resource manager for both Spark and Hadoop is the best way to go. There are two options for that:

  • Deploy a Hadoop cluster with YARN, since Spark can run on YARN.
  • Deploy an Apache Mesos cluster and run Hadoop jobs and Spark on it.

Isolating the Spark cluster from the Hadoop cluster provides no advantage over this, and it will cause higher overhead and lower resource utilization.
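
For the YARN option, here is a minimal sketch of what this looks like from the Spark side. The application name, resource values, and object name are illustrative assumptions, not part of the answer: the point is only that Spark asks YARN for executors, and YARN arbitrates memory and cores between Spark and other Hadoop jobs on the same 16 GB / 8-core nodes.

    // Minimal sketch, assuming HADOOP_CONF_DIR points at the cluster's YARN config.
    // The resource figures are illustrative for 16 GB / 8-core nodes, leaving
    // headroom for the OS and the DataNode/NodeManager daemons.
    import org.apache.spark.sql.SparkSession

    object YarnUtilizationSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("yarn-utilization-sketch")     // hypothetical application name
          .master("yarn")                         // let YARN schedule the executors
          .config("spark.executor.memory", "6g")  // per-executor heap on a 16 GB node
          .config("spark.executor.cores", "4")    // share the 8 cores with the daemons
          .getOrCreate()

        // Trivial job, just to show executors running under YARN.
        println(spark.range(0, 1000000L).count())
        spark.stop()
      }
    }

The same jar would typically be submitted with spark-submit using --master yarn --deploy-mode cluster; with the Mesos option, the main change on the Spark side is pointing the master at the Mesos cluster instead.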

Jihun