I am a little confused reading about Cabal Hell, since the term is overloaded. I believe that originally, Cabal Hell referred to the diamond dependency problem, which was solved by restricting every build plan to contain at most one version of any given package (two different versions of the same package can't coexist in a single build plan), as explained in this answer.
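For concreteness, this is how I picture the diamond dependency problem (the packages `A`, `B`, and `C` are made up for illustration):

```
-- A.cabal (hypothetical): A needs B-1.0 directly, plus C
library
  build-depends: B ==1.0.*, C ==1.0.*

-- C.cabal (hypothetical): C needs B-2.0
library
  build-depends: B ==2.0.*
```

Here `A` would need both `B-1.0` and `B-2.0` in the same build plan, which the single-version rule forbids, so the solver rejects the plan instead of mixing types from two versions of `B`.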
However, the term is also used in various other contexts, such as destructive re-installations, incorrect package dependency bounds (lower/upper version bounds), inconsistent environments, and so on (or seemingly any other error reported by Cabal).
In particular, I am confused about 1) destructive re-installations and 2) inconsistent environments. What do they mean, and how does `cabal new-build` solve these problems? Is it just sandboxing, like `cabal sandbox`? And what role does `ghc-pkg` play here?
Any references, or a simple example where these problems can be reproduced, would be much appreciated.
Regarding "destructive re-installations": If I am not wrong, GHC has a package manager of itself (ghc-pkg
), and the packages are installed as dynamically linkable libraries i.e: base
depends on ghc-prim
, so if ghc-prim
is removed it will break base
, am I right? And since GHC only allows one instance of a package with the same version, cabal install
might register a newer build of the same (package, version)
such that it breaks the dependents of the unregistered package. If the above understanding regarding "destructive re-installations" are correct; how does cabal new-build
help here?
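For what it's worth, here is roughly how I would expect to reproduce the breakage with the old-style `cabal install` (the package names `foo` and `foo-dependent` are made up for illustration; I believe `--force-reinstalls` is the flag that overrides cabal's reinstall warning):

```
$ cabal install foo-dependent      # pulls in and registers foo-1.0, then foo-dependent
$ ghc-pkg field base depends       # sanity check: e.g. shows that base depends on ghc-prim
$ cabal install foo-1.0 --reinstall --force-reinstalls   # re-registers a new build of foo-1.0
$ ghc-pkg check                    # should now report foo-dependent as broken
```

If that is accurate, `ghc-pkg check` would only detect the damage after the fact, not prevent it.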