I have 10 servers (16 GB of memory and 8 cores each) and want to deploy Hadoop and Spark on them. Can you tell me which plan makes the best use of the resources?
1. Deploy directly on the bare metal;
2. Install OpenStack and deploy the environment in virtual machines;
3. Use Docker, e.g. Spark on Docker (see the sketch below).
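
To make the resource question concrete, here is a rough sketch of how I would size a Spark application on these nodes. The master URL `spark://master:7077` and the exact core/memory numbers are only illustrative assumptions, not a final configuration:

```
from pyspark.sql import SparkSession

# Hypothetical standalone master URL; under OpenStack or Docker the
# workers would run inside VMs/containers instead of on the bare metal.
MASTER_URL = "spark://master:7077"

spark = (
    SparkSession.builder
    .master(MASTER_URL)
    .appName("utilization-test")
    # Illustrative sizing for a 16 GB / 8-core node on bare metal:
    # leave ~1 core and ~4 GB for the OS and Hadoop daemons. With VMs
    # or containers, hypervisor/daemon overhead would shrink this further.
    .config("spark.executor.cores", "7")
    .config("spark.executor.memory", "12g")
    .getOrCreate()
)

# Trivial job just to confirm executors come up with the requested resources.
print(spark.sparkContext.parallelize(range(1000)).sum())
spark.stop()
```

My understanding is that the application code would be identical under all three plans; what changes is where the workers run (bare metal, VM, or container) and how much headroom each layer costs.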
I know resource utilization depends on the usage scenario; what I actually want to know are the advantages and disadvantages of the three plans above.
Thank you.