
I have been compiling a C/C++ project that takes about 1.5 hours to build with the make command on a 4-core machine. I also have 10 more machines on the network that I could use for compilation. I know about make's "-j" option, which runs the build with the specified number of parallel jobs, but "-j" only parallelizes on the current machine, not across the other 10 machines connected to the network.
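
For illustration, the local parallel build I am referring to looks like this (the job count is just an example matching a 4-core machine):

    # run up to 4 compile jobs in parallel, but only on this machine
    make -j4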

We could use MPI or another parallel programming technique, but that would mean rewriting the make implementation itself to fit that parallel programming model.

Is there any other way to make use of the other available machines for compilation? Thanks.

rahulk
  • Ok, I get the [tag:gnu-parallel]. But why [tag:linux-kernel]? Did you mean [tag:linux] instead? – Baum mit Augen Sep 06 '15 at 13:05
  • @4566976 It is not about getting an unwanted crowd. Since a similar problem exists in other domains, such as Linux kernel compilation, I want those people to look into it too, and it will be helpful for future reference. So instead of looking at the tags, concentrate on the problem statement. – rahulk Sep 06 '15 at 13:06
  • @BaummitAugen Yes, but now the tags do not include that the question is about compiling C and C++. – 4566976 Sep 06 '15 at 13:12
  • @4566976 While one could argue that the question is not about actual C or C++ code, I would not have removed the tags for that very reason. I just wanted to point out that "it reduces the audience" is not a valid reason against an edit. – Baum mit Augen Sep 06 '15 at 13:17

2 Answers


Yes, there is: distcc.

distcc is a program to distribute compilation of C or C++ code across several machines on a network. distcc should always generate the same results as a local compile, is simple to install and use, and is often two or more times faster than a local compile.

Unlike other distributed build systems, distcc does not require all machines to share a filesystem, have synchronized clocks, or to have the same libraries or header files installed. Machines can be running different operating systems, as long as they have compatible binary formats or cross-compilers.

By default, distcc sends the complete preprocessed source code across the network for each job, so all it requires of the volunteer machines is that they be running the distccd daemon, and that they have an appropriate compiler installed.

The key is that you still keep your single make, but the gcc wrapper arranges things appropriately: the preprocessor runs locally (so headers are resolved on your machine), while the compilation to object code is farmed out over the network.
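
As a rough sketch of how this looks in practice (the host names, the subnet, and the -j value are placeholders, and distccd options can vary by version):

    # on each helper machine: run the distcc daemon and allow
    # connections from the build network (placeholder subnet)
    distccd --daemon --allow 192.168.1.0/24

    # on the build machine: list the helpers (placeholder host names)
    export DISTCC_HOSTS="localhost helper1 helper2 helper3"

    # keep the single make; route compilation through distcc and
    # pick a -j value roughly matching the total number of available cores
    make -j20 CC="distcc gcc" CXX="distcc g++"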

I have used it in the past, and it is pretty easy to setup -- and helps in exactly your situation.

Dirk Eddelbuettel

https://github.com/icecc/icecream

Icecream was created by SUSE and is based on distcc. Like distcc, Icecream takes compile jobs from a build and distributes them among remote machines, allowing a parallel build. But unlike distcc, Icecream uses a central server that dynamically schedules each compile job to the fastest free machine. This advantage pays off mostly for shared machines; if you are the only user of your x machines, you already have full control over them.
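
A minimal sketch of an Icecream setup, assuming the usual daemon names and that icecc is on the PATH (package names and init details differ by distribution):

    # on one machine: start the central scheduler (daemonized)
    icecc-scheduler -d

    # on every machine that should take jobs, including the build host
    iceccd -d

    # on the build machine: keep the single make, compile through icecc
    make -j20 CC=icecc CXX=icecc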

4566976