We have a Hadoop cluster (HDP 2.6.5, managed by Ambari, with 25 datanode machines).
We are running a Spark Streaming application (Spark 2.1) on top of Hortonworks 2.6.x.
Currently the Spark Streaming application runs on all 25 datanode machines.
We now want the application to run only on the first 10 datanodes, so that the remaining 15 datanodes are excluded from running Spark containers.
Can this scenario be achieved with Ambari features, or some other approach?
For example, we found these references on YARN Node Labels:

https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.3.2/bk_yarn_resource_mgt/content/configuring_node_labels.html

http://crazyadmins.com/configure-node-labels-on-yarn/

but we are not sure whether Node Labels can help us here.
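For reference, here is a rough sketch of how Node Labels might be applied to this case. The hostnames (`datanode01` … `datanode10`), the label name `spark`, and the queue name `sparkq` are placeholders we made up; in practice the yarn-site.xml and capacity-scheduler.xml properties would be set through Ambari rather than edited by hand:

```shell
# Prerequisite (set via Ambari in yarn-site.xml):
#   yarn.node-labels.enabled=true
#   yarn.node-labels.fs-store.root-dir=hdfs:///yarn/node-labels

# 1. Create an exclusive cluster node label for the Spark nodes.
yarn rmadmin -addToClusterNodeLabels "spark(exclusive=true)"

# 2. Attach the label to the first 10 datanodes
#    (repeat the host=label pair for datanode03 .. datanode10).
yarn rmadmin -replaceLabelsOnNode "datanode01=spark datanode02=spark"

# 3. Let a Capacity Scheduler queue access the label
#    (set via Ambari in capacity-scheduler.xml):
#   yarn.scheduler.capacity.root.sparkq.accessible-node-labels=spark
#   yarn.scheduler.capacity.root.sparkq.accessible-node-labels.spark.capacity=100

# 4. Submit the streaming job pinned to the labelled nodes so both the
#    ApplicationMaster and the executors run only on those 10 machines.
spark-submit \
  --master yarn \
  --queue sparkq \
  --conf spark.yarn.am.nodeLabelExpression=spark \
  --conf spark.yarn.executor.nodeLabelExpression=spark \
  my_streaming_app.jar
```

With an exclusive label, containers from other queues should not be scheduled on the labelled nodes, and the labelled queue's containers should only land on them.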