Here is my desired use case:
I have a package with a single module that reads HDF5 files and writes some of their data to Haskell records. To do the work, the library uses the `bindings-hdf5` package. `reader-types` is a package I wrote that defines the types of the Haskell records that hold the read-in data. Here is my cabal file's `build-depends`:
```
build-depends: base >=4.7 && <4.8
             , text
             , vector
             , containers
             , bindings-hdf5
             , reader-types
```
Note that my cabal file does not currently use `extra-libraries` or `ghc-options`. I can load my module, `src/Mabel.hs`, in ghci as long as I specify the required `hdf5_hl` library:

```
ghci src/Mabel.hs -lhdf5_hl -L/long/nixos/path/lib
```

Within ghci, I can run my function perfectly fine.
Now, what I want to do is compile this library/module into a single compiled file that I can later load with the GHC API from a different Haskell program. By a single file, I mean that it needs to run even if the `hdf5_hl` library does not exist on the system. Preferably, it would also run even if `text`, `vector`, and/or `containers` are missing, but this is not essential because `reader-types` requires those types anyway. When loading the module with the GHC API, I want it to load in already-compiled form, not run interpreted.
My purpose for doing this is that I want the self-contained file to act as a single, pre-compiled plugin that is later loaded and executed by a different Haskell executable. Other plugins might not use HDF5 at all, and the only package they are guaranteed to use is `reader-types`, which essentially defines the plugin interface types.
The HDF5 library on my system contains the following files: `libhdf5_la.la`, `libhdf5_hl.so`, `libhdf5.la`, `libhdf5.so`, and similar files that have the version number in the file name.
I have done a lot of googling, but I am getting confused by all the edge cases I keep finding. Here are some examples that I am either sure do not fit my case or cannot tell:
- I do not want to compile a Haskell library for use from C or Python; the loader will itself be a Haskell program using the GHC API.
- I do not want to compile C wrappers for a C++ library into a Haskell module, because the bindings already exist and the library is already a C library.
- I do not want to compile a library that is entirely self-contained, because, since I am loading it with the GHC API, I do not need the GHC runtime included in the library. (My understanding is that plugins must be compiled with the same GHC version that will later load them through the GHC API.)
- I do not want to compile C bindings and the C library at the same time, because the C library is already compiled and the bindings are specified in a separate package (`bindings-hdf5`).
- The closest resource for what I want to do is this exchange on the mailing list from 2009. However, I added `extra-libraries: hdf5_hl` or `extra-libraries: hdf5` to my cabal file, and in both cases the resulting .a, .so, .dyn_hi, .dyn_o, .hi, and .o files in `dist/build` are all exactly the same size as without `extra-libraries`, so I am confident it is not working correctly.
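For reference, here is roughly what the library stanza looked like with that change (a sketch rather than my literal cabal file; the `extra-lib-dirs` line uses the same path I pass to ghci with `-L`):

```
library
  exposed-modules: Mabel
  hs-source-dirs:  src
  build-depends:   base >=4.7 && <4.8
                 , text
                 , vector
                 , containers
                 , bindings-hdf5
                 , reader-types
  extra-libraries: hdf5_hl
  extra-lib-dirs:  /long/nixos/path/lib
```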
What changes to my cabal file do I need to make to create a self-contained, standalone file that I can later load with the GHC API? If this is not possible, what are the alternatives?
Instead of using the GHC API, I am also open to using the `plugins` library to load the plugin, but the self-containment requirement is still the same.
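To make the goal concrete, this is roughly the kind of loading call I have in mind with `plugins` (a sketch only: the object path `plugins/Mabel.o`, the symbol name `readRecords`, and its `FilePath -> IO [String]` type are placeholder assumptions, since the real interface types would come from `reader-types`):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}

import System.Plugins.Load (LoadStatus (..), load)

main :: IO ()
main = do
  -- load :: FilePath -> [FilePath] -> [PackageConf] -> Symbol -> IO (LoadStatus a)
  -- "readRecords" is a hypothetical function the plugin exports at a known type.
  status <- load "plugins/Mabel.o" [] [] "readRecords"
  case status of
    LoadSuccess _ (readRecords :: FilePath -> IO [String]) ->
      readRecords "data.h5" >>= mapM_ putStrLn
    LoadFailure errs -> mapM_ putStrLn errs
```

This only works if `Mabel.o` really is self-contained, which is exactly the part I cannot get cabal to produce.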
EDIT: I do not care what form the compiled "plugin" takes (I assume an object file is the right way), but I want to load it dynamically from a separate executable at run time and execute functions it defines with known names and known types. The reason I want a single file is that there will eventually be other, different plugins, and I want them all to behave the same way without having to worry about library paths and dependencies for each one. A single compiled file is a simpler interface for this than zipping and unzipping archives that contain Haskell object code and its dependencies.