
I'm running a foreach loop with the snow back-end on a Windows machine. I have 8 cores to work with. The R script is executed via a system call embedded in a Python script, so there is an active Python instance too.

Is there any benefit to having #workers < #cores instead of #workers = #cores, so that there is always an opening for system processes or the Python instance?

It runs successfully with #workers = #cores, but do I take a performance hit by saturating the cores (the maximum possible threads) with the R worker instances?
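For reference, here is a minimal sketch of the kind of setup being described, using the doSNOW backend (the worker count, task count, and loop body are placeholders, not the asker's actual code):

```r
library(doSNOW)  # snow backend for foreach; also loads snow and foreach

# Hypothetical sizing: leave one core free for the OS and the
# calling Python process by using fewer workers than cores.
n_workers <- parallel::detectCores() - 1
cl <- makeCluster(n_workers, type = "SOCK")
registerDoSNOW(cl)

results <- foreach(i = 1:100, .combine = c) %dopar% {
  sqrt(i)  # placeholder for the real per-task work
}

stopCluster(cl)
```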

imouellette81

1 Answer


It will depend on:

  1. Your processor (specifically hyperthreading)
  2. How much info has to be copied to/from the different images
  3. Whether you're implementing this over multiple boxes (a LAN)

For 1), hyperthreading helps. I know my machine does it, so I typically use twice as many workers as cores, and my code completes in about 85% of the time it takes when I match the number of workers to the number of cores. It won't improve beyond that.
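As an illustrative check (not part of the original answer), parallel::detectCores() distinguishes physical from logical cores, which is the relevant count on a hyperthreaded machine:

```r
library(parallel)

detectCores(logical = FALSE)  # physical cores, e.g. 8
detectCores(logical = TRUE)   # logical cores; typically 16 with hyperthreading
```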

For 2), if you're not forking (if you're using sockets, for instance), you're working in a distributed-memory paradigm, which means creating one copy of the data in memory for every worker. This can take a non-trivial amount of time. Also, multiple images on the same machine may take up a lot of space, depending on what you're working on. I often match the number of workers to the number of cores because doubling the workers would make me run out of memory.
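To make the memory point concrete, a hypothetical illustration: with a socket cluster, each worker receives and holds its own copy of any exported object, so the footprint scales with the worker count (the object and its size below are invented for the example):

```r
library(snow)

cl <- makeCluster(8, type = "SOCK")

big_matrix <- matrix(rnorm(5e7), ncol = 1000)  # ~400 MB of doubles

# Serialized and sent to all 8 workers: roughly 8 extra copies
# in memory, on top of the master's original.
clusterExport(cl, "big_matrix")

stopCluster(cl)
```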

This is compounded by 3), network speed across multiple workstations. Locally, between machines, our switch transfers data at about 20 MB/s, which is 10x faster than my internet download speed at home but a snail's pace compared to making copies within the same box.
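For what it's worth, spreading a snow socket cluster across boxes is just a matter of listing hostnames (the names below are placeholders); every copy described above then has to cross that switch:

```r
library(snow)

# Hypothetical hostnames: each entry starts one worker on that machine.
hosts <- c("localhost", "localhost", "workstation2", "workstation2")
cl <- makeCluster(hosts, type = "SOCK")

stopCluster(cl)
```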

You might consider increasing R's nice value so that the Python process has priority when it needs to do something.
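One way to do that from within R (a sketch, assuming a snow socket cluster `cl` as above; note that on Windows, tools::psnice() maps the niceness value onto process priority classes rather than Unix nice levels):

```r
# Drop each worker's scheduling priority so the OS and the calling
# Python process are served first when they need CPU time.
clusterEvalQ(cl, tools::psnice(value = 19))  # 19 = lowest priority
```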

mateen
  • Yes, the machine has hyperthreading, and when I say #workers = #cores I mean #workers = #logical cores. I'm also using sockets, since it's a Windows machine, and it's a single workstation. Copying the data to each worker takes less than 1% of the time. The question is more whether, once the computing begins, there is any benefit to leaving a core open for OS processes or for the Python process that submits the system call spawning the parallel R job. – imouellette81 Oct 23 '15 at 18:37