
I don't understand how FP compilers make code that deals with immutable data structures fast, keep it from blowing up the stack, etc.

For example, an insert operation on a tree seemingly has to copy the whole tree before adding the new node and return the copy, versus the imperative counterpart, which only needs to attach a pointer for the new node. If the insert operation is run millions of times, it would consume a lot of memory, and the copying would get slower and slower as the tree grows. How do FP compilers actually optimize this?

romerun

4 Answers


You don't have to copy the whole tree to make a change; you can share most of the structure. See e.g. the diagrams in this blog, or this talk by Rich Hickey on Clojure (see discussion of hash tries about halfway through).
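To make the sharing concrete, here is a minimal sketch of a persistent binary search tree in Python (the names `Node`, `insert`, and `to_list` are illustrative, not from any particular library). Inserting copies only the nodes on the path from the root down to the new leaf; every subtree off that path is shared with the old tree by reference:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Node:
    left: Optional["Node"]
    value: int
    right: Optional["Node"]

def insert(tree: Optional[Node], x: int) -> Node:
    """Return a new tree containing x; the old tree is untouched."""
    if tree is None:
        return Node(None, x, None)
    if x < tree.value:
        # Rebuild this node with a new left child; right subtree is shared.
        return Node(insert(tree.left, x), tree.value, tree.right)
    if x > tree.value:
        # Rebuild this node with a new right child; left subtree is shared.
        return Node(tree.left, tree.value, insert(tree.right, x))
    return tree  # already present: share the whole tree

def to_list(tree: Optional[Node]) -> list:
    """In-order traversal, i.e. the sorted contents."""
    if tree is None:
        return []
    return to_list(tree.left) + [tree.value] + to_list(tree.right)
```

After `t2 = insert(t, 1)` on a tree rooted at 5, `t2.right is t.right` holds: the whole right subtree is literally the same object in both versions. So an insert into a balanced tree of n nodes allocates only O(log n) new nodes, not a full copy.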

Brian

The compiler won't really optimize this; it is something you have to program for specifically when coding. Techniques for doing this are explained in Chris Okasaki's excellent Purely Functional Data Structures (book, thesis).
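As a taste of the kind of design that book teaches, here is a sketch of one of its simplest examples, a purely functional FIFO queue built from two immutable lists (tuples here); the function names are illustrative. Enqueues cons onto a reversed back list, dequeues pop from the front list, and a rebalancing step keeps the invariant, giving amortized O(1) operations with no mutation:

```python
from typing import Optional, Tuple

# A queue is a pair (front, back): front holds elements in dequeue
# order, back holds newly enqueued elements in reverse order.
Queue = Tuple[tuple, tuple]

EMPTY: Queue = ((), ())

def _check(q: Queue) -> Queue:
    # Invariant: front may be empty only if the whole queue is empty.
    front, back = q
    if not front:
        return (tuple(reversed(back)), ())
    return q

def enqueue(q: Queue, x) -> Queue:
    front, back = q
    return _check((front, (x,) + back))

def dequeue(q: Queue) -> Optional[tuple]:
    """Return (element, rest-of-queue), or None if the queue is empty."""
    front, back = q
    if not front:
        return None
    return front[0], _check((front[1:], back))
```

Each `reversed(back)` pass is expensive on its own, but it happens rarely enough that the cost averages out to constant time per operation; that amortized-analysis argument is exactly what the book works through.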

Adam Goode

Take a look at the Zipper data structure.
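For readers who haven't met it, here is a sketch of the simplest zipper, over a list (the tree version works the same way, carrying a path of parent contexts instead of a reversed prefix); the names are illustrative. The focus can be moved and edited in O(1) without rebuilding the rest of the structure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Zipper:
    left: tuple   # elements before the focus, nearest first (reversed)
    focus: object
    right: tuple  # elements after the focus, in order

def from_list(xs) -> Optional[Zipper]:
    """Place the cursor on the first element; None for an empty list."""
    return Zipper((), xs[0], tuple(xs[1:])) if xs else None

def go_right(z: Zipper) -> Optional[Zipper]:
    """Move the cursor one step right, or None at the end."""
    if not z.right:
        return None
    return Zipper((z.focus,) + z.left, z.right[0], z.right[1:])

def modify(z: Zipper, f) -> Zipper:
    """Apply f to the focused element; everything else is shared."""
    return Zipper(z.left, f(z.focus), z.right)

def to_list(z: Zipper) -> list:
    return list(reversed(z.left)) + [z.focus] + list(z.right)
```

Storing the left context reversed is the whole trick: both moving and editing touch only the focus and the nearest neighbour, which is why zippers are a natural fit for the kind of repeated local updates the question asks about.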

Todd Stout

If the garbage collector is doing its job, the older copies of the data structure will be reclaimed when they're no longer being used.

Barry Brown
  • May I ask why this answer is being down-voted? Assuming I want to move the same set through some modifications and keep things functional, wouldn't it be helpful to know that the GC will collect the parts that are no longer in use? – Ashkan Kh. Nazary Feb 25 '11 at 03:01
  • It doesn't answer the question in any way. – devoured elysium Sep 03 '11 at 15:38