
Recently I wrote an R script for the lab that is essentially just

a <- system(cmd, wait = TRUE, intern = TRUE)

in a loop. It calls some command-line tools. I ran it from the terminal as

sudo Rscript mycode.R

The problem is that it runs on a 4 GB RAM, 1-CPU laptop at the same speed as on a 16 GB RAM, 12-CPU PC. When it runs on the PC, it uses only 1.6 GB of RAM and a single CPU at 100%. The PC runs Ubuntu 16.04. How can I make it faster? Is this a limit imposed by R? Thanks in advance.

  • Possible duplicate of [R not using more than 4GB of memory](https://stackoverflow.com/questions/15594318/r-not-using-more-than-4gb-of-memory) – mischva11 Jul 29 '18 at 20:25
  • Thank you! How can I change that? I guess it won't be the same as setting the limit from R – Farid Ahadli Jul 29 '18 at 20:44
  • It sounds like your code (1) is not parallelized, and (2) only requires 1.6 GB of RAM. From (2), having 4 GB vs. 16 GB of RAM makes no difference to performance when running this script. From (1), if the single-core performance is the same between CPUs, having 12 cores vs. 1 core will not improve performance. If your code can benefit from parallelization, you could speed up execution on your PC that way (see the sketch after these comments). [Here's](https://www.r-bloggers.com/how-to-go-parallel-in-r-basics-tips/) one example of some info on how to go parallel. – duckmayr Jul 29 '18 at 22:09
  • Thank you @duckmayr! Your comment led me in the right direction. – Farid Ahadli Aug 01 '18 at 22:26
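Following up on that comment, here is a minimal sketch of one way to parallelize such a loop with R's base `parallel` package. The vector `cmds` and the tool name `mytool` are hypothetical stand-ins for whatever commands the script actually runs:

    library(parallel)

    # hypothetical list of shell commands, one per input file
    cmds <- sprintf("mytool %s", list.files(pattern = "\\.fastq$"))

    # run the commands concurrently, one worker per CPU core;
    # mclapply() forks, so this works on Linux/macOS but not on Windows
    results <- mclapply(cmds,
                        function(cmd) system(cmd, intern = TRUE),
                        mc.cores = detectCores())

Each worker blocks on its own system() call, so all cores can run one external tool invocation at a time instead of everything queuing on a single CPU.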

1 Answer


I resolved the problem! The second comment led me to the solution. My job was a bioinformatics analysis, and it turned out that the input file could be split into independent parts. I split it and then processed the parts concurrently with GNU Parallel. Thank you!
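A minimal sketch of that workflow, driven from R with system() calls as in the original script; `input.txt`, the chunk size, and `mytool` are hypothetical placeholders for the real input file and tool:

    # split the input into fixed-size pieces named chunk_aa, chunk_ab, ...
    system("split -l 100000 input.txt chunk_")

    # let GNU Parallel run the tool on all pieces concurrently;
    # by default it starts one job per CPU core
    system("parallel 'mytool {} > {}.out' ::: chunk_*")

    # stitch the per-chunk outputs back into a single result
    system("cat chunk_*.out > result.txt")

Because GNU Parallel defaults to one job per core, this keeps all 12 CPUs of the PC busy, whereas the original loop ran one tool invocation at a time.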