I'm wondering if there's a way to build a .dpr using dcc32 but skip the linking step. I work on a very large project that has been divided into sub-projects for modular compilation. The problem is that every sub-project, when compiled, generates an EXE file which is completely unnecessary, and this step also increases the build time.

Any ideas?

Bruno Santos
    To get this straight - you want to create DCUs but not the EXE? – gabr Nov 26 '15 at 10:38
  • What are you going to do without any executable files? – David Heffernan Nov 26 '15 at 10:39
  • It depends on your goal. If you just need to check the code for errors, use the `Check Syntax` command. – Abelisto Nov 26 '15 at 10:40
  • The sub-projects exist for the sole purpose of modular compilation, meaning that if nothing in a sub-project has changed, I skip its compilation. I only need them to generate the dcus, while the "main projects" will use those dcus. I know I can just dcc32.exe a single .pas, but I don't want to have to change my build every time a unit is added to the sub-project. Having a .dpr is the perfect way of keeping the files of a sub-project together, but if I dcc32 a .dpr I always get an EXE, which I don't need. – Bruno Santos Nov 26 '15 at 10:47
  • This is somewhat pointless. The compiler already has good facilities for avoiding unnecessary re-compilation. Your attempt to re-invent this wheel will lead to more fragility and slower compile times. – David Heffernan Nov 26 '15 at 10:53
  • Note the difference between `Compile, Ctrl-F9` and `Build, Shift-F9`. The former compiles only units that have changed, the latter all units. I *think* the linker time is insignificant, but don't know what time savings you are expecting. – Tom Brunberg Nov 26 '15 at 10:58
  • If you only need dcu's why don't you use packages instead of full application projects? – Dalija Prasnikar Nov 26 '15 at 11:04
  • I don't suppose building a bpl is much faster than building an exe. – Bruno Santos Nov 26 '15 at 11:33
  • Do you appreciate that the Delphi compiler already does exactly what you need and skips compilation for units that have not changed and do not depend on units that have changed? – David Heffernan Nov 26 '15 at 12:46
  • Yes I do. For some reason we at some point came to the conclusion that this feature of Delphi's compiler wasn't reliable for such a large project as ours. – Bruno Santos Nov 26 '15 at 13:41
  • @DavidHeffernan It is true that the Delphi compiler is smart enough not to go and recompile every unit every time you build your project. But if you have lots of small units the linking time can become quite long. The best way to avoid this is to use BPL's as Dalija Prasnikar suggested. If done correctly you can even use these BPL's as runtime libraries. This can be useful during debugging because you don't even need to link these BPL's into the executable. Of course for the final build you probably would not want to use runtime libraries, so they would be linked directly into the executable. – SilverWarior Nov 26 '15 at 17:44
  • @Silver Packages are the worst possible approach. They cause deployment and versioning issues where none are needed. – David Heffernan Nov 26 '15 at 17:52
  • @Bruno How large is the project? I've never heard of anyone reaching a conclusion like yours. – David Heffernan Nov 26 '15 at 17:52
  • @Bruno: *came to the conclusion that this feature wasn't reliable*. That's nonsense. The incremental build of code in Delphi isn't broken, even on very large projects (1M+ LOC, with 1K+ units plus third-party source). – Ken White Nov 26 '15 at 20:08
  • If you want to speed up compilation, you should try Andy's [FastDCC and IDE Fix Pack](http://andy.jgknet.de/blog/ide-tools/ide-fix-pack/) tools. FastDCC is for command-line usage; it wraps the original dcc and replaces some slow WinAPI calls with faster versions. IDE Fix Pack is an IDE package which installs the fastdcc fixes while the IDE is running. Using these tools in my projects halved the compilation time (from approx. 2 hours to ~50 min). – Serge Kraikov Nov 27 '15 at 11:54

1 Answer

If the sub-projects exist solely to facilitate modular compilation (as you have mentioned in the comments on the question) then this attempt to "optimise" the build is at least partly responsible for causing the delays you are trying to eliminate.

When building in the IDE, the compiler already compiles only the units that have changed since the previous compilation. Your sub-projects force the compiler to make this assessment on the units involved at least twice: once for each sub-project and then again for the "real" project.

In addition, the compiler has to link and emit an EXE for each sub-project, even though you say these EXEs are not actually required as an output of your build.

You will improve your build times simply by eliminating the redundant "sub-projects".
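As a sketch of what that leaves you with: a single project compiled in "make" mode, letting dcc32's own dependency checking skip unchanged units. The switches below are standard dcc32 command-line switches, but the project name and the `dcu` folder are hypothetical examples; adjust them to your setup.

```bat
rem -M : "make" mode - recompile only units whose source has changed
rem -N : directory in which to place generated .dcu files
rem -U : unit search path, where existing .dcu files are looked up
dcc32 -M -N"dcu" -U"dcu" MainProject.dpr
```

Run this way, the second and subsequent builds only recompile what has actually changed, which is exactly what the sub-projects were attempting to achieve by hand.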

If the sub-projects added some other value, such as providing automated test coverage, then their overhead might be considered worthwhile, but that does not appear to be the case here.

One other thing to bear in mind is that the compiler decides whether to re-compile a unit based on whether that unit has itself changed. However, if conditional defines or other compiler settings have changed, then all units should be recompiled; otherwise the changed settings take effect only in the recompiled units, which can itself lead to strange behaviours and errors when the changed settings apply to some units (the changed ones) and not others.

When building under the IDE this requires you to remember to perform a full build after changing compiler settings. In a dedicated build process it is generally advisable to always perform a full build in a "clean room" environment, that is, deleting all compilation by-products (.dcu files etc.) either before or after each build.
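A "clean room" build of that sort might look something like the following Windows batch sketch (the project name and `dcu` folder are again hypothetical examples):

```bat
rem Delete all compilation by-products so no stale .dcu files can survive...
del /s /q dcu\*.dcu

rem ...then force a full build (-B) so every unit is compiled with the
rem current compiler settings.
dcc32 -B -N"dcu" MainProject.dpr
```

The `-B` switch tells dcc32 to build all units regardless of timestamps, so the output cannot contain a mixture of units compiled under old and new settings.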

NOTE: With these redundant sub-projects you have created yet more work and potential trouble for yourself, since you must ensure that your compiler settings are consistent across all the sub-projects and the "master" project.

Deltics
  • You can avoid having to rebuild when changing conditional directives by including $(Config) in the unit output path and then changing directives only by switching configurations. Then you'll have a separate set of .dcu files for each configuration, with each configuration always using the same directives. – Jan Goyvaerts Mar 15 '16 at 02:59
  • This doesn't avoid the problem. The problem persists if you change a compiler setting in one of those configurations: you still need to force a full build for *that* configuration's set of dcu's to be rebuilt. Which is the same problem, just qualified differently. For this to "avoid the problem" you would have to ensure that you **never** change any compiler option in **any** configuration set, but always create a **new** configuration set for *every* variation of compiler settings, *no matter how minor*. This would quickly become unwieldy imho. – Deltics Mar 15 '16 at 04:16
  • For the OP's problem of wanting to improve compile times, he's likely working with a limited set of configurations at any time, such as a Debug config and a Release config. While those configurations remain fixed, having separate output folders for those configurations avoids the need to rebuild. If a configuration needs to change, then a rebuild will still be needed. No way around that. – Jan Goyvaerts Mar 15 '16 at 09:18
  • Using `$(config)` in the path only changes the situations under which the problem manifests *and* creates a new problem of ensuring that your output path is correctly configured w.r.t previous builds to be able to rely on existing dcu's. You would *always* be well advised to do a full build after changing configurations to avoid "stale" dcu's. There may be other benefits to commend it, but as a solution to this problem this practice would not be 100% reliable. With no way to determine when/if it can be relied on, it cannot be relied on *at all* (to solve this particular problem). imho. ymmv – Deltics Mar 15 '16 at 20:57
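For reference, the `$(Config)` scheme discussed in these comments amounts to putting the configuration name into the unit output directory in the project options, along the lines of the following fragment (the `dcu` folder name is an example):

```
Unit output directory:  .\dcu\$(Config)
```

With a Debug and a Release configuration this yields `.\dcu\Debug` and `.\dcu\Release`, so each configuration keeps its own set of .dcu files; as noted above, a configuration whose settings change still needs its own full rebuild.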