
I have an Apache Spark 1.6.1 standalone cluster set up on a single machine with the following specifications:

  • CPU: Core i7-4790 (# of cores: 4, # of threads: 8)
  • RAM: 16GB

I didn't set anything, so Spark takes the default values, which for cores is "all the available cores". Based on that, the question is:

Why is Spark detecting 8 cores, when I only have 4?


User2130
  • Do you have hyperthreading enabled? It's a feature available on your processor that basically allows each core to act as two. That would be my first guess. Try toggling hyperthreading off and on in the BIOS and see whether the number of cores changes. – Rob S. Jun 04 '16 at 16:23
  • Thanks! The hyperthreading topic helped too, as a complement to the answer given below. – User2130 Jun 04 '16 at 20:55

1 Answer


I assume that "all the available cores" means Spark is also counting virtual (logical) cores.

Since your CPU supports Hyper-Threading, it exposes 8 virtual cores to the operating system.
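
You can check where that number comes from: the JVM reports logical processors, not physical cores, and Spark's "all the available cores" default picks up that value. A minimal sketch to verify it on your machine (the object name is just for illustration):

    // Prints the number of logical processors visible to the JVM.
    // On a 4-core CPU with Hyper-Threading enabled this prints 8,
    // the same count Spark's default is based on.
    object CoreCheck {
      def main(args: Array[String]): Unit = {
        println(Runtime.getRuntime.availableProcessors())
      }
    }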

If you want to use only the physical cores, I assume there is a specific setting for that.
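
For example (a sketch, assuming a default standalone setup): you can cap a single application with the spark.cores.max property, or cap the worker itself with SPARK_WORKER_CORES in conf/spark-env.sh. Limiting an application to the 4 physical cores could look like this; the master URL is a placeholder:

    import org.apache.spark.{SparkConf, SparkContext}

    // Cap this application at 4 cores instead of the default,
    // which is every logical core the standalone master offers.
    // spark://localhost:7077 is a hypothetical master URL.
    val conf = new SparkConf()
      .setAppName("four-cores-only")
      .setMaster("spark://localhost:7077")
      .set("spark.cores.max", "4")
    val sc = new SparkContext(conf)

Note that this only caps the scheduler's core count; the threads still run on hyperthreaded hardware unless you disable Hyper-Threading in the BIOS, as suggested in the comments.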

SilverWarior