I have an Apache Spark 1.6.1 standalone cluster set up on a single machine with the following specifications:
- CPU: Core i7-4790 (# of cores: 4, # of threads: 8)
- RAM: 16GB
I left everything at the default values, which for cores means "all the available cores". Given that, my question is:

Why does Spark detect 8 cores when I only have 4?
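For context, here is a minimal sketch of what I believe is happening: the JVM's `Runtime.availableProcessors()` reports *logical* processors, so a 4-core CPU with Hyper-Threading (8 hardware threads) reports 8, and I assume Spark's standalone worker relies on this call when no core count is configured:

```java
public class CoreCheck {
    public static void main(String[] args) {
        // availableProcessors() counts logical processors (hardware threads),
        // not physical cores, so an i7-4790 (4 cores / 8 threads) reports 8.
        // Spark's standalone worker presumably uses this JVM call to infer
        // the default number of cores (an assumption on my part).
        System.out.println(Runtime.getRuntime().availableProcessors());
    }
}
```

On the machine above this would print 8; on a machine without SMT it would match the physical core count.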