
I have a locally authored Haskell project, which produces both:

  1. a binary executable, and
  2. several new Haskell modules, which I'd like to make accessible to my other Haskell-based executables.

After:

    stack build
    stack install

I'm finding that:

  1. the binary executable (#1 above) runs just fine from any directory.
  2. But the new Haskell modules (#2 above) are only found when running from within my project directory! (That is, by any executable other than #1 above.)

I need to be able to find the new modules from anywhere. How can I achieve this?

dbanas
  • Sounds similar to [this question only a couple hours earlier](https://stackoverflow.com/questions/47989939/is-there-a-declarative-way-to-specify-packages-to-be-installed-into-global-proje), and like there I would suggest you use Cabal-install instead of stack, then you never need to worry about making module installs global. – leftaroundabout Dec 27 '17 at 20:26
  • Thanks for your comment. Yes, "cabal install" did solve my problem. However, now I've got two separate, parallel, and largely redundant Haskell installations chewing up hard drive space, and that seems terribly wasteful and unnecessary. It's particularly infuriating, because the binary executable I'm producing can run fine, from any directory, which means it knows how to find my new Haskell modules from anywhere on my system (since it imports them). So, why can't I make these new modules available to other Haskell executables, as well?! – dbanas Dec 28 '17 at 16:27
  • It absolutely is unnecessary, that's why I use _only_ Cabal-install on my laptop, and Stack merely on Travis. — Note that the way your executable finds imported modules is very different from how the compiler finds them for a source file. In fact, if you link statically then there _isn't_ anything external that needs to be found, as everything is already included in the binary. If you link dynamically, it looks up a hard-coded path to a particular hashed dynamic-library file, but that only works because the version resolver and linker have done their work beforehand. – leftaroundabout Dec 28 '17 at 18:00
  • Oh, I get it (I think; perhaps, you could confirm?): my newly compiled executable had the benefit of "knowing" where these custom Haskell modules were located on my system, at the time it was compiled/linked. These other Haskell executables, which I'd like to have be able to import these same custom modules, didn't have that same luxury when they were compiled/linked. Is that it? – dbanas Dec 30 '17 at 15:52
  • Yes, that's right. But with cabal (non-sandboxed), there's one global package registry which all projects have access to, so there's generally no more duplication than necessary. – leftaroundabout Dec 30 '17 at 16:27

1 Answer


Each stack project lives in its own sandbox, so modules compiled within a project can only be used by that project. Compiled dependencies (which come from a Stackage snapshot) do sometimes get shared between projects.
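You can see the sandboxing concretely: stack will print the per-project and snapshot package DB locations (the exact paths vary per machine and resolver):

    stack path --local-pkg-db     # this project's private package DB
    stack path --snapshot-pkg-db  # snapshot DB, shared by projects on the same resolver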

Note that you can list a relative path in the other project's packages list (in its stack.yaml) and point it at this package. The package will get built again, but it can be used directly in the other project this way. Why the extra build? Stack has a different model of projects than cabal-install: it does not allow mutations to the package DB to affect how your other projects build.
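For example, a minimal sketch of the consuming project's stack.yaml, assuming the shared package lives at ../my-modules relative to it (both the path and the resolver are illustrative placeholders):

    # stack.yaml of the project that wants to import the shared modules
    resolver: lts-9.21      # illustrative; use whatever snapshot you're on
    packages:
    - .                     # this project itself
    - ../my-modules         # relative path to the locally authored package

With that in place, `stack build` compiles ../my-modules as part of this project, and its modules become importable here.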

One option for sharing such a package is to keep it in a git repository and reference it from a custom snapshot (https://docs.haskellstack.org/en/stable/custom_snapshot/), but that feature is still fairly new.
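For reference, a custom snapshot is roughly a YAML file that extends a resolver with extra packages. The sketch below is only an illustration under that assumption; the name, resolver, URL, and commit are all placeholders, and the linked docs are the authority on the exact format:

    # my-snapshot.yaml
    resolver: lts-9.21
    name: my-shared-snapshot
    packages:
    - git: https://example.com/me/my-modules.git
      commit: 0123456789abcdef0123456789abcdef01234567

Projects would then point the resolver in their own stack.yaml at this file to share the package without re-declaring it.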

mgsloan
  • Thanks for the response. So, if I understand what you're saying, it sounds like modules built for use with Haskell executables that use a dynamic-plugin style of customization are just not good candidates for building/installing with stack; they should be built/installed with cabal instead. Is that a fair summary? – dbanas Dec 28 '17 at 16:34
  • Oh, I did not realize you were trying to dynamically load modules. If you do "stack build pkg-x" in a project, then the modules will be available within the package DBs specified by the GHC_PACKAGE_PATH set by `stack exec`. It sounds like you want more global sharing, but consider that for dynamic plugins to work, dependency versions need to line up, so it should be done within a single stack project... No, you probably shouldn't use cabal for this. Pretty sure the new-build stuff would actually make this quite a bit more awkward, because you would need to specify a particular package id. – mgsloan Dec 30 '17 at 04:22
  • Thanks! I'd like to make sure I understand: so, what I really need to do is rebuild this "foreign" executable, after adding my new custom plugins to its stack project description. Is that right? In other words, attempting to build the executable, using one stack project, and the plugins, using a different stack project, is an inherently flawed approach. If that's correct, it seems awfully fragile and restrictive to potential plugin developers. I thought that this is what a well designed/documented API (for the executable) was intended to prevent. I must be missing something fundamental, here. – dbanas Dec 30 '17 at 15:59
  • The plugins would need to be binary compatible, so yeah, the versions need to match exactly. Mere API compatibility is insufficient. One way around this is to load plugins from source files (see the sketch after these comments). If you are talking about a dev tool that uses plugins, then it should be feasible to run it in one stack environment but use it in another. You'd just need to plumb environment variables like PATH and GHC_PACKAGE_PATH to the right places. – mgsloan Jan 02 '18 at 21:32
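To make the "load plugins from source files" route concrete, here is a minimal sketch using the hint library; the file path plugins/Plugin.hs, the module name Plugin, and the exported function transform are hypothetical placeholders, and this is one possible approach rather than the only one. Because the plugin is compiled at load time against the host's own package DB, the exact-binary-compatibility problem above does not arise:

    -- Minimal source-level plugin loading with the `hint` library.
    -- "plugins/Plugin.hs", the module name Plugin, and the function
    -- `transform` are hypothetical placeholders.
    import Language.Haskell.Interpreter

    main :: IO ()
    main = do
      result <- runInterpreter $ do
        loadModules ["plugins/Plugin.hs"]         -- compile the plugin source now
        setTopLevelModules ["Plugin"]             -- bring its exports into scope
        interpret "transform" (as :: Int -> Int)  -- extract a typed value
      case result of
        Left err -> putStrLn ("Plugin load failed: " ++ show err)
        Right f  -> print (f 42)

Alternatively, per the comments above, running the host tool via `stack exec` from the plugin project makes that project's package DBs visible to it through GHC_PACKAGE_PATH.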