
I have a large ecosystem of IDL code, mostly implemented as custom objects, from which a large number of different applications are built and run as regular cron jobs or post-processing tasks. At the moment each 'application' is run from a script that first populates the !PATH as appropriate and then runs whatever top-level procedure is required; compilation happens on the fly. This is becoming a pain for version control (ensuring no one touches the operational code the cron jobs require, and instead works on the development code, but then remembers to push changes to the operational area). It would clearly be more efficient to build a .sav file for each application, allowing the code to be changed more freely and the .savs rebuilt when modifications are complete (not to mention the runtime savings of not re-compiling the same code every time each script is run).

Now, the issue I am having with this is ensuring that all required modules are compiled into the .sav file. Simply compiling the top-level routine and then using RESOLVE_ALL doesn't seem to work, as it misses many of the objects (i.e. when I RESTORE the .sav and run the code, more routines get compiled that weren't in the .sav). Looking at the docs, it appears IDL doesn't resolve object definitions very well. There is the CLASS keyword to RESOLVE_ALL that is supposed to get around this, but you then need to list the classes to compile manually, which defeats the point of having IDL resolve the routines in the first place. Even running the script once to ensure everything is compiled doesn't work, since occasional edge cases cause some objects to be required that aren't needed in normal runs, so that isn't a robust way of ensuring all routines are resolved.
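For reference, the kind of build procedure I'm attempting looks roughly like this (routine and class names are placeholders; the hand-maintained CLASS list is exactly the part I want to avoid):

```idl
; Sketch of the naive build. RESOLVE_ALL follows explicit calls, but
; methods on objects created via OBJ_NEW are missed unless every class
; is enumerated by hand in the CLASS keyword.
PRO build_my_app
  COMPILE_OPT idl2
  RESOLVE_ROUTINE, 'my_app_main'
  RESOLVE_ALL, CLASS=['my_reader', 'my_writer'], /CONTINUE_ON_ERROR
  SAVE, /ROUTINES, FILENAME='my_app.sav'
END
```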

I did see mention of a user-written RESOLVE_EVENMORE routine on David Fanning's page at https://www.idlcoyote.com/tips/compile.html that looked promising, but the link to the code appears to be dead.

Can anyone suggest a good approach here?

I would really prefer to avoid having to create workbench projects for each of these applications, since there are a lot of them and I'd like to be able to make global changes, such as changing where code is stored, for all applications easily by making build scripts that refer to environment variables for all this deployment-specific info. Basically, I want the IDL equivalent of GNU Make.
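To illustrate, I'm imagining each application having a small wrapper along these lines, with all deployment-specific detail coming from environment variables (every name and path below is made up):

```shell
#!/bin/sh
# Hypothetical per-application build wrapper. APP_NAME, CODE_ROOT and
# SAV_DIR would be set by the deployment environment; the values here
# are only illustrative fallbacks.
: "${APP_NAME:=myapp}"
: "${CODE_ROOT:=/ops/idl}"
: "${SAV_DIR:=/ops/sav}"

# Assemble the IDL invocation that compiles the app and writes its .sav
# file (build_${APP_NAME} would be the app's IDL build procedure).
BUILD_CMD="idl -IDL_PATH +${CODE_ROOT}/${APP_NAME}:<IDL_DEFAULT> -e build_${APP_NAME}"
echo "$BUILD_CMD"
```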

Bogdanovist
    Good question, though I am not expert enough to answer it. There aren't many IDL folks lurking here, unfortunately ... you'll probably get more visibility if you post it to the idl-pvwave Google group. – Ajean Jul 08 '14 at 05:16
  • Thanks, yeah I did cross-post this there as well. No response yet. If I get a good resolution there I will post as an answer here (if not otherwise answered here). – Bogdanovist Jul 08 '14 at 05:48

1 Answer


[Edited to clarify] You could use FILE_SEARCH to find all of the .pro code within your particular application's directory and all subdirectories [not everything in IDL's !path, just your own application directory]. Then loop through each file and use RESOLVE_ROUTINE, with the name of the routine given by the file name (minus the .pro). You will probably need a catch block to ignore any compile errors, and you might need a list of files to skip. Finally, use RESOLVE_ALL to pick up anything in IDL's lib directory that might be required.

Using this method, as long as all of your code is within some top-level directory, you will end up with everything that you need in your save file.
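A rough sketch of what I mean, assuming your application lives under a single top-level directory (names are illustrative, and you may need to grow the skip handling for your own code):

```idl
PRO build_app_sav, appdir, savfile
  COMPILE_OPT idl2
  ; Find every .pro file under the application directory tree.
  files = FILE_SEARCH(appdir, '*.pro', COUNT=nfiles)
  FOR i = 0, nfiles - 1 DO BEGIN
    name = FILE_BASENAME(files[i], '.pro')
    ; Catch block so one bad file doesn't abort the whole build.
    CATCH, err
    IF err NE 0 THEN BEGIN
      CATCH, /CANCEL
      PRINT, 'Skipping (compile error): ', name
      CONTINUE
    ENDIF
    ; Compile whether it is a procedure or a function.
    RESOLVE_ROUTINE, name, /EITHER, /COMPILE_FULL_FILE
    CATCH, /CANCEL
  ENDFOR
  ; Pick up anything required from IDL's own lib directory.
  RESOLVE_ALL, /CONTINUE_ON_ERROR, /QUIET
  SAVE, /ROUTINES, FILENAME=savfile
END
```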

Chris Torrence
  • Thanks Chris, just to clarify, this will essentially just put everything in the !path into the .sav file, as opposed to only what is needed as in the case of using RESOLVE_ALL? That's okay, the .sav files will be bigger than necessary, but that's not really an issue. – Bogdanovist Feb 23 '15 at 01:27