
The R docs describe the ByteCompile field in the "DESCRIPTION file" section as:

The ‘ByteCompile’ logical field controls if the package code is to be byte-compiled on installation: the default is currently not to, so this may be useful for a package known to benefit particularly from byte-compilation (which can take quite a long time and increases the installed size of the package)
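For reference, opting in is a one-line field in the package's DESCRIPTION file; a minimal sketch (the package name, version, and title here are made-up placeholders):

```
Package: mypkg
Version: 1.0.0
Title: An Example Package
ByteCompile: yes
```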

I infer that the only detrimental side-effects of byte-compiling are (a) time-to-install and (b) installation size. I haven't found a package that takes too long during installation/byte-compiling, and the general consensus is that GBs are cheap (for storage).

Q: When should I choose to not byte-compile packages I write? (Does anybody have anecdotal or empirical limits beyond which they choose against it?)

Edit: As noted in the comments of an older question, the rationale that debugging is not possible with byte-compiled code has been debunked. Other related questions on SO have discussed how to do it (either manually with R CMD INSTALL --byte-compile ... or with install.packages(..., type="source", INSTALL_opts="--byte-compile")), but have not discussed the ramifications of or arguments against doing so.

r2evans
  • This is a duplicate of http://stackoverflow.com/q/8343243/602276, but the answer over there seems incomplete. Perhaps time for a new answer? – Andrie Jun 20 '16 at 22:13
  • @Andrie That question/answer is certainly in need of an update. – lmo Jun 20 '16 at 22:17

1 Answer


I have yet to find a downside for byte-compiling, other than the ones you mention: slightly increased file size and installation time.

In the past, compiling certain code could cause a slow-down, but in recent versions of R (> 3.3.0) this no longer seems to be a problem.
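For anyone who wants to check this on their own code, base R's compiler package lets you byte-compile a single function and compare it against the interpreted original; a minimal sketch (`f` is just a made-up, loop-heavy example):

```r
library(compiler)  # base R package, nothing extra to install

f <- function(n) {          # interpreted version
  s <- 0
  for (i in seq_len(n)) s <- s + i
  s
}
fc <- cmpfun(f)             # byte-compiled version of the same function

identical(f(1e5), fc(1e5))  # same result either way
system.time(f(1e6))         # interpreted timing
system.time(fc(1e6))        # compiled timing; usually faster on loops like this
```

Note that since R 3.4.0 the JIT compiles closures automatically on first use, so the gap is mostly visible on older versions or with the JIT disabled.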

csgillespie
  • I've found byte compiling during development to be problematic: debugging is more spartan when byte compiled; though my dev cycle typically involves frequent `source`ing vice build/install, the latter is still necessary pre-deployment. I wasn't aware of that previous JIT problem, good to know it is history. Thanks for your thoughts on a slightly "stale" question! :-) – r2evans Sep 18 '16 at 19:58
  • Could you provide a few more details on "debugging"? I've been using byte-compiling for a while and have not had any issues. – csgillespie Sep 18 '16 at 19:59
  • Using emacs/ESS (I do not use RStudio), when debugging a `source`d function, ESS does next-line highlighting in the source file, following step-by-step. With a byte-compiled function, the source file is not known so this is not possible. Perhaps it's a limitation of emacs/ESS and not of R? – r2evans Sep 18 '16 at 20:01
  • ... though without the source file reference (`srcref` and http://stackoverflow.com/a/32749240/3358272), I don't know how any IDEs would be able to follow and provide in-source context. – r2evans Sep 23 '16 at 20:23
  • 1
    Support for source references in byte-compiled code has just been added to the R development version last week. You can, even in older versions of R, debug byte-compiled functions using `debug` (set `debug` on the function and then run it) - effectively this works by interpreting the AST of the function (ignoring its byte-code). – Tomas Kalibera Oct 07 '16 at 07:12
  • You are very unlikely to get a slowdown with the byte-code compiler. If you use JIT on very short-running code, there could be slowdown because of the compilation time. Then you can get slowdown in case of very strange programming patterns (like frequent replacement of closure-environment that causes the need to re-compile, so again the overhead is compile time). This happens very rarely. – Tomas Kalibera Oct 07 '16 at 07:16