50

I'm used to lazy evaluation from Haskell, and find myself getting irritated with eager-by-default languages now that I've used lazy evaluation properly. This is actually quite damaging, as the other languages I use mostly make lazy evaluation very awkward, usually requiring hand-rolled custom iterators and so forth. So just by acquiring some knowledge, I've actually made myself less productive in my original languages. Sigh.

But I hear that AST macros offer another clean way of doing the same thing. I've often heard statements like "lazy evaluation makes macros redundant" and vice versa, mostly from sparring Lisp and Haskell communities.

I've dabbled with macros in various Lisp variants. They just seemed like a really organized way of copying and pasting chunks of code around to be handled at compile time. They certainly weren't the holy grail that Lispers like to think they are. But that's almost certainly because I can't use them properly. Of course, having the macro system work on the same core data structure that the language itself is assembled from is really useful, but it's still basically an organized way of copy-and-pasting code around. I acknowledge that basing a macro system on the same AST as the language, one that allows full runtime alteration, is powerful.

What I want to know is: how can macros be used to do, concisely and succinctly, what lazy evaluation does? If I want to process a file line by line without slurping the whole thing up, I just return a list that's had a line-reading routine mapped over it. It's the perfect example of DWIM (do what I mean). I don't even have to think about it.

I clearly don't get macros. I've used them and not been particularly impressed, given the hype. So there's something I'm missing that I haven't picked up from reading the documentation online. Can someone explain all of this to me?

acfoltzer
Louis
    why do you need to go back? maybe it'd be better / easier to recover whatever you're missing in haskell? there's TH and quasiquoting in haskell. also, please post an example!! – gatoatigrado Aug 13 '11 at 03:31
  • Examples? OK, it's often the trivial examples stacked up that start causing problems. For example, if I set up a massive map/grep/sort pipeline in Perl 5, each stage has to evaluate the entire damn thing and write the **whole** thing into memory before feeding it to the next stage. Rather than storing one computed element in memory at each step, the whole thing needs storing. If you're retrieving something from, let's say, a file or an infinite sequence, it becomes tedious to do, often resorting to state, albeit encapsulated state. – Louis Aug 13 '11 at 15:46
  • I've seen many comments like this ("an organized way of copy-and-pasting code around") about macros, but they are exactly what you say later: "I don't get macros." Macros are compiler/evaluator extensions, code that runs at compile/eval time; with Common Lisp macros you could implement all of Haskell from scratch, or any compiler. So saying "macros are an organized way of copy-and-pasting code around" is like saying compilers are the same; it's like a JavaScripter saying "I've used compilers and not been particularly impressed given the hype", which is preposterous and very dull of you. – kisai Dec 18 '18 at 02:27

5 Answers

64

Lazy evaluation makes macros redundant

This is pure nonsense (not your fault; I've heard it before). It's true that you can use macros to change the order, context, etc. of expression evaluation, but that's the most basic use of macros, and it's really not convenient to simulate a lazy language using ad-hoc macros instead of functions. So if you came at macros from that direction, you would indeed be disappointed.

Macros are for extending the language with new syntactic forms. Some of the specific capabilities of macros are

  1. Affecting the order, context, etc. of expression evaluation.
  2. Creating new binding forms (i.e. affecting the scope an expression is evaluated in).
  3. Performing compile-time computation, including code analysis and transformation.

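As a minimal sketch of capability (1), consider a short-circuiting `my-or` written with standard Scheme `syntax-rules` (a hypothetical example, not taken from the question): it has to be a macro, because an ordinary function would evaluate both of its arguments before being called.

```scheme
;; `my-or` controls *when* its second argument is evaluated,
;; which an ordinary function cannot do in a strict language.
(define-syntax my-or
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))         ; evaluate `a` exactly once
       (if tmp tmp b)))))   ; evaluate `b` only if `a` was false

(my-or #t (error "never reached")) ; => #t
```

Note that `syntax-rules` is hygienic, so the introduced `tmp` cannot accidentally capture a variable used in `b`.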
Macros that do (1) can be pretty simple. For example, in Racket, the exception-handling form, with-handlers, is just a macro that expands into call-with-exception-handler, some conditionals, and some continuation code. It's used like this:

(with-handlers ([exn:fail:network?
                 (lambda (e)
                   (printf "network seems to be broken\n")
                   (cleanup))])
  (do-some-network-stuff))

The macro implements the notion of "predicate-and-handler clauses in the dynamic context of the exception" based on the primitive call-with-exception-handler which handles all exceptions at the point they're raised.

A more sophisticated use of macros is an implementation of an LALR(1) parser generator. Instead of a separate file that needs pre-processing, the parser form is just another kind of expression. It takes a grammar description, computes the tables at compile time, and produces a parser function. The action routines are lexically-scoped, so they can refer to other definitions in the file or even lambda-bound variables. You can even use other language extensions in the action routines.

At the extreme end, Typed Racket is a typed dialect of Racket implemented via macros. It has a sophisticated type system designed to match the idioms of Racket/Scheme code, and it interoperates with untyped modules by protecting typed functions with dynamic software contracts (also implemented via macros). It's implemented by a "typed module" macro that expands, type-checks, and transforms the module body as well as auxiliary macros for attaching type information to definitions, etc.

FWIW, there's also Lazy Racket, a lazy dialect of Racket. It's not implemented by turning every function into a macro, but by rebinding lambda, define, and the function application syntax to macros that create and force promises.

In summary, lazy evaluation and macros have a small point of intersection, but they're extremely different things. And macros are certainly not subsumed by lazy evaluation.

MasterMastic
Ryan Culpepper
    I think this misconception springs mostly from a lack of imagination. Coming from a strict, eager language with poor metaprogramming support, the idea that an *if statement* doesn't need to be built-in seems pretty revolutionary! Which it is, in a way, but there's a lot more to both lazy evaluation *and* macros than just reimplementing control structures because you can. – C. A. McCann Aug 15 '11 at 12:33
  • The LALR(1) parser generator link is broken. I believe the "so294" is [Scott Owens](https://www.cs.kent.ac.uk/people/staff/sao/pubs.html) (now at university of kent). He has a collection of parser tools documented at https://download.racket-lang.org/releases/5.92/doc/parser-tools/LALR_1__Parsers.html. – preferred_anon Oct 18 '22 at 11:08
24

Lazy evaluation can substitute for certain uses of macros (those which delay evaluation to create control constructs) but the converse isn't really true. You can use macros to make delayed evaluation constructs more transparent -- see SRFI 41 (Streams) for an example of how: http://download.plt-scheme.org/doc/4.1.5/html/srfi-std/srfi-41/srfi-41.html

On top of this, you could write your own lazy IO primitives as well.

In my experience, however, pervasively lazy code in a strict language tends to introduce overhead compared to pervasively lazy code in a runtime designed to support it efficiently from the start; which, mind you, is really an implementation issue.
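To make the SRFI 41 idea concrete, here is a minimal sketch in portable Scheme, using the standard `delay`/`force` primitives (a toy, not SRFI 41 itself). The point is that `lazy-cons` must be a macro: a plain function would evaluate the recursive tail eagerly, so building an infinite stream would never return.

```scheme
;; Toy lazy streams via a macro (not SRFI 41 itself).
;; `lazy-cons` must be a macro so that `tail` is delayed,
;; not evaluated before the call.
(define-syntax lazy-cons
  (syntax-rules ()
    ((_ head tail) (cons head (delay tail)))))

(define (lazy-cdr s) (force (cdr s)))

;; An infinite stream of natural numbers, built on demand.
(define (nats-from n) (lazy-cons n (nats-from (+ n 1))))

(car (lazy-cdr (lazy-cdr (nats-from 0)))) ; => 2
```

This is exactly the sense in which macros make delayed-evaluation constructs transparent: the user writes what looks like ordinary `cons`, and the macro inserts the promises.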

sclv
    Re: "the converse isn't really true", I think the rest of your answer clearly contradicts this. There is an intersection between the utility of laziness and macros, but neither is a subset of the other. – acfoltzer Aug 13 '11 at 05:16
  • For making SOME control structures, I would've thought. How do you write a CASE (not a multi-conditional, but a jump table) using lazy evaluation? – Vatine Aug 13 '11 at 08:39
  • I saw a full '80s BASIC written in a Haskell DSL that had GOTO, IF, and everything. Macros could also do this, of course. But if Haskell's lazy evaluation can produce an entire programming language, I imagine it could do stuff like switch statements etc. – Louis Aug 13 '11 at 20:14
    @Louis: AFAIK, Augustuss's BASIC DSL (http://augustss.blogspot.com/2009/02/more-basic-not-that-anybody-should-care.html) is not really about laziness, but mostly about Haskell's light-weight syntax, overloaded Num literals, and monadic bind syntax support. – Peaker Aug 15 '11 at 19:12
23

Laziness is denotative, while macros are not. More precisely, if you add non-strictness to a denotative language, the result is still denotative, but if you add macros, the result isn't denotative. In other words, the meaning of an expression in a lazy pure language is a function solely of the meanings of the component expressions; while macros can yield semantically distinct results from semantically equal arguments.

In this sense, macros are more powerful, while laziness is correspondingly more well-behaved semantically.

Edit: more precisely, macros are non-denotative except with respect to the identity/trivial denotation (where the notion of "denotative" becomes vacuous).
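To make the distinction concrete, here is a hypothetical Scheme sketch: `(+ 1 1)` and `2` have the same denotation, yet a macro that reflects its argument's syntax yields different results for them, so its output is not a function of its argument's meaning.

```scheme
;; A macro operates on syntax, not on meanings: these two calls
;; pass semantically equal arguments but produce different results.
(define-syntax as-syntax
  (syntax-rules ()
    ((_ e) 'e)))   ; return the argument unevaluated, as a datum

(as-syntax (+ 1 1)) ; => (+ 1 1)
(as-syntax 2)       ; => 2
```

No lazy (or strict) function could distinguish these two arguments, which is the sense in which laziness stays denotative while macros do not.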

Conal
  • A pedant writes... that yielding semantically distinct results indicates semantic inequality of the arguments! But the point is a good one: macros (in the fexpr sense) make an expression's syntax part of its semantics, whether that's the meaning you mean or not. – pigworker Aug 14 '11 at 20:03
  • First, fexprs are not macros. Second, macros are programs that run at compile time; they aren't different expression forms (those are indeed fexprs, and a bad idea). So macros don't change the runtime semantics of your language at all. Third, I think the word "compositional" is more common than "denotative" for the concept you want. – Sam Tobin-Hochstadt Aug 14 '11 at 22:08
    Sam: Following Landin (see link above), I'm using "denotative" in a more specific sense than merely "compositional". For instance, macros are syntactically compositional, but denotationally non-compositional. – Conal Aug 14 '11 at 23:55
    pigworker: I sympathize with your POV here, particularly that "macros ... make an expression's syntax part of its semantics", which then destroys all non-trivial equational properties. – Conal Aug 15 '11 at 00:00
    Sam: I'm talking about the overall semantics (syntax --> meaning), not just the runtime aspect. – Conal Aug 15 '11 at 00:02
    I stand corrected on macros-vs-fexprs. Thanks for the clarification. It seems clear that there is more to semantics than "runtime semantics", especially if some kinds of programs run at times other than runtime. – pigworker Aug 15 '11 at 00:02
    [Here is a more direct link](http://conal.net/blog/posts/is-haskell-a-purely-functional-language/#comment-35882) to Landin's recommended term & notion of "denotative". – Conal Aug 15 '11 at 00:19
  • Regard macros as a term rewriting system in your compilation workflow - this way there is a well-defined and well-behaved semantics. – SK-logic Aug 15 '11 at 07:30
  • Conal: if you're describing a semantics that includes macro expansion in your denotational semantics, then your original answer is making a much more basic (but common) mistake, which is thinking that the arguments to macros have semantics except in the sense that they're data (representing syntax). Except in certain cases that do not correspond to any widely-used macro systems, that's the wrong way to think about macros -- instead, they *determine* the semantics of the result by their expansion. – Sam Tobin-Hochstadt Aug 15 '11 at 15:30
  • Conal: the term "compositional" as I've heard it applied to semantics means precisely that: that the meaning of an expression is determined solely by the meaning of its subexpressions. For example, Cartwright and Felleisen define "compositionality" in their Extensible Denotational Language Specifications paper as "the interpretation of a phrase is a function of the interpretations of its sub-phrases". – Sam Tobin-Hochstadt Aug 15 '11 at 15:37
    @Sam: You're basically saying that you're only supposed to reason about the result of macro expansion, and not about the source code. That means that the expressions we actually read & write do not have any meaningful semantics, and cannot be composed (unless we tediously expand all the macros before&after the change and try to analyze the semantics of the pre&post expanded code). This makes denotative programming much less useful. – Peaker Aug 15 '11 at 19:20
  • Sam: (about "... mistake, which is thinking that the arguments to macros have semantics"). Yes and thank you for offering a rephrasing of my original statement that unlike laziness, macros are not denotative. Which is to say that they operate on syntax, not semantics. (Which is not a value judgment. I like macros, BTW.) – Conal Aug 15 '11 at 20:25
  • Sam: I believe you that, when used in the context of denotational semantics, "compositional" may have been used to mean *denotationally* compositional. In a broader context, I appreciate how Landin's "denotative" packs more information. – Conal Aug 15 '11 at 20:29
  • Conal: I don't think I'm agreeing with you -- instead, I think you're making a category mistake. In Template Haskell, TH functions are compositional functions of their input, it's just that their input is abstract syntax trees, not Haskell values. Your complaint seems to be that quotation is not the identity, but instead reifies syntax. – Sam Tobin-Hochstadt Aug 16 '11 at 01:28
    Peaker: that depends on the sort of reasoning you're doing. If you're reasoning about a Haskell program that you're writing, then it's fine to think of "if-then-else" as a part of the language. If you're writing a Haskell static analyzer, then you should follow the definition, which is a macro expansion, and analyze the result of that expansion. – Sam Tobin-Hochstadt Aug 16 '11 at 01:32
  • Sam: Perhaps you're arguing with yourself here. As before, when I look at the objective content of your remarks (the part that interests me), I find no disagreement between us. Instead, I see we're both saying that, whereas the meaning of functions (whether strict or nonstrict) is denotation transformation, the meaning of macros is notation (syntax) transformation. Both are compositional, but only the former is denotative (== denotationally compositional). And thus macros have expressive power beyond lazy functions. – Conal Aug 17 '11 at 07:03
  • Conal: what I think we disagree on is what the denotation of the inputs to macros are. Obviously, there are many different meaning functions for any given input. But I think the appropriate denotation for a TH input is an AST value, whereas you think the appropriate denotation is to further analyze *that* denotation and look at the meaning of that AST considered as a Haskell program. – Sam Tobin-Hochstadt Aug 17 '11 at 14:30
    Sam: I still see you arguing with a figment of your imagination here. I don't think what you say I think, and I wouldn't, since I don't carry around subjective notions like "the appropriate". Nor would I prefer that macro arguments (ASTs) be interpreted. AFAICT, filtering out subjectivity/opinions, we're both saying that macros map syntax (ASTs) to syntax, rather than having a (non-identity) interpretation applied as happens with functions. If you want to say that macros are (vacuously) denotative w.r.t the identity/trivial denotation, okay with me. – Conal Aug 17 '11 at 19:52
  • I amended my answer to add that macros are non-denotative except with respect to the identity/trivial denotation (where the notion of "denotative" becomes vacuous). – Conal Aug 17 '11 at 19:54
  • Conal: of course macros have non-trivial denotations. They're just functions (when written in a pure language). It happens that they're functions on values that represent syntax. But that doesn't make them not have useful denotations. – Sam Tobin-Hochstadt Aug 18 '11 at 02:56
  • Sam: I wasn't talking about *macros* having trivial denotations, but rather the *arguments* to macros (ASTs/syntax). In contrast to the arguments to functions. – Conal Aug 18 '11 at 03:55
  • @Conal: now I think I see what you mean. There are some macro systems that expand inside-out, rather than outside-in, though I think that's almost certainly a bad idea, but it would be (potentially) denotative by your standard. – Sam Tobin-Hochstadt Aug 19 '11 at 03:17
10

Lisp started in the late 50s of the last millennium. See RECURSIVE FUNCTIONS OF SYMBOLIC EXPRESSIONS AND THEIR COMPUTATION BY MACHINE. Macros were not a part of that Lisp. The idea was to compute with symbolic expressions, which can represent all kinds of formulas and programs: mathematical expressions, logical expressions, natural language sentences, computer programs, ...

Later, Lisp macros were invented; they are an application of the above idea to Lisp itself: macros transform Lisp (or Lisp-like) expressions into other Lisp expressions, using the full Lisp language as the transformation language.

You can imagine that with macros you can implement powerful pre-processors and compilers as a user of Lisp.

The typical Lisp dialect uses strict evaluation of arguments: all arguments to functions are evaluated before a function gets executed. Lisp also has several built-in forms which have different evaluation rules. IF is such an example. In Common Lisp IF is a so-called special operator.

But we can define a new Lisp-like (sub-)language that uses lazy evaluation, and we can write macros to transform that language into Lisp. This is one application for macros, but far from the only one.

A (relatively old) example of such a Lisp extension, which uses macros to implement a code transformer providing data structures with lazy evaluation, is the SERIES extension to Common Lisp.
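As a toy illustration of the idea (a hypothetical sketch in Common Lisp, far simpler than what SERIES actually does), a user-level DELAY macro can add memoized promises to the strict language. It must be a macro so that its argument is wrapped in a closure rather than evaluated at the call site.

```lisp
;; Toy memoized promises in Common Lisp (not the SERIES implementation).
;; DELAY must be a macro so EXPR is captured unevaluated.
(defmacro delay (expr)
  `(let ((forced nil) (value nil))
     (lambda ()
       (unless forced
         (setf value ,expr
               forced t))
       value)))

(defun force (promise)
  (funcall promise))

(defparameter *p* (delay (progn (format t "computing~%") 42)))
(force *p*) ; prints "computing", returns 42
(force *p*) ; returns the cached 42 without recomputing
```

SERIES goes much further, fusing whole pipelines at compile time, but the macro-supplied control over evaluation time is the same ingredient.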

Rainer Joswig
5

Macros can be used to handle lazy evaluation, but that's just part of the story. The main point of macros is that, thanks to them, basically nothing in the language is fixed.

If programming is like playing with LEGO bricks, with macros you can also change the shape of the bricks or the material they're built with.

Macros are more than just delayed evaluation. That was available as fexprs (a macro precursor in the history of Lisp). Macros are about program rewriting, of which fexprs are just a special case...

As an example, consider that in my spare time I'm writing a tiny Lisp-to-JavaScript compiler. Originally (in the JavaScript kernel) I only had lambda with support for &rest arguments. Now there's support for keyword arguments too, and that's because I redefined what lambda means in Lisp itself.

I can now write:

(defun foo (x y &key (z 12) w) ...)

and call the function with

(foo 12 34 :w 56)

When executing that call, in the function body the w parameter will be bound to 56 and the z parameter to 12, because it wasn't passed. I'll also get a runtime error if an unsupported keyword argument is passed to the function. I could even add some compile-time checking by redefining what compiling an expression means (i.e. adding checks that "static" function call forms pass the correct parameters to functions).

The central point is that the original (kernel) language had no support for keyword arguments at all, and I was able to add it using the language itself. The result is exactly as if it had been there from the beginning; it's simply part of the language.
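To give a flavor of what such a redefined lambda might expand into (a hypothetical sketch in Scheme-like syntax, not 6502's actual compiler output; keywords are shown as quoted symbols), keyword support can compile down to a plain rest-argument scan:

```scheme
;; Hypothetical expansion sketch: keyword arguments compiled down
;; to a scan over a plain rest list (not the real compiler's output).
(define (keyword-ref args key default)
  (cond ((null? args) default)
        ((eq? (car args) key) (cadr args))
        (else (keyword-ref (cddr args) key default))))

;; (defun foo (x y &key (z 12) w) ...) could expand to roughly:
(define (foo x y . rest)
  (let ((z (keyword-ref rest 'z 12))
        (w (keyword-ref rest 'w #f)))
    (list x y z w)))

(foo 12 34 'w 56) ; => (12 34 12 56)
```

The macro's job is exactly this rewriting: the user keeps writing the pleasant surface syntax, and the expansion produces code the kernel language already understands.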

Syntax is important (even if it's technically possible to just use a Turing machine). Syntax shapes the thoughts you have. Macros (and read macros) give you total control over the syntax.

A key point is that code-rewriting code is not written in a crippled, dumbed-down, brainf**k-like language, as with C++ template metaprogramming (where just making an if is an amazing accomplishment), or in an even dumber less-than-regexp substitution engine like the C preprocessor.

Code-rewriting code uses the same full-blown (and extensible) language. It's lisp all the way down ;-)

Sure, writing macros is harder than writing regular code; but it's "essential complexity" of the problem, not artificial complexity that comes from being forced to use a dumb half-language, as with C++ metaprogramming.

Writing macros is harder because code is a complex thing, and when writing macros you write complex things that themselves build complex things. It's not even that uncommon to go up one more level and write macro-generating macros (that's where the old Lisp joke "I'm writing code that writes code that writes code that I'm being paid for" comes from).

But macro power is simply boundless.

6502
  • Isn't providing an effortless way to add new syntax to a language just asking for maintainability problems in large projects? I mean, DSLs are fine when they're well defined and widespread, like regular expressions and SQL. But I'm not so sure that user-defined languages popping up in every project is a particularly great idea. After all, we use languages because they're good at certain things -- not so we can just implement other languages. I'm probably wrong though. Y'know, this might be an interesting debate for a separate question... – Louis Aug 13 '11 at 19:49
  • I forked the question in that comment to a new question: http://stackoverflow.com/questions/7052963/are-project-specific-dsls-a-liability – Louis Aug 13 '11 at 20:10
  • I don't think that question will survive: SO is not a newsgroup but a Q/A site with just a bit of interaction. My opinion, however, is that e.g. knowing the Java language isn't enough to program in Java. You have to know the framework to make the correct calls in the correct way. In large software the same need for knowledge appears at higher levels too (e.g. if you touch that column of that table you should also touch that other one in that other table; this kind of data goes here, that kind there). If this knowledge or code could be much better expressed with a different syntax, why not do it? – 6502 Aug 13 '11 at 20:48
  • I don't really understand what that statement has got to do with that new question. I mean, if something needs a different syntax, people will turn to different languages or try to implement DSLs. And that's what I'm talking about in that question. – Louis Aug 13 '11 at 20:55
  • @Louis: one could describe any sufficiently advanced use of macros *or* laziness as a user-defined language; in fact you do so yourself in your comment on sclv's answer. The problem you describe has little to do with macros or laziness, and even less to do with comparing the two. It's a problem with high levels of abstraction, namely that a sensible abstraction to one programmer might be incomprehensible to another. It's worth discussion, but as 6502 says, perhaps not on SO. – acfoltzer Aug 14 '11 at 05:34
  • @Louis, no, it is not a maintainability problem per se. If there is a really *effortless* way of extending a language, your maintenance costs will drop down, as you'll be able to get rid of tons of preprocessors and crooked ad hoc code generators that are so typical to each and every decently large industrial codebase. – SK-logic Aug 15 '11 at 07:32
  • Writing macros in fact is not any harder than using higher-order functions. Most people are doing it the wrong way, but it is not an excuse. – SK-logic Aug 15 '11 at 07:35
  • @SK-logic: I think macros are just a bit harder than higher-order functions because of the syntax problem (e.g. unwanted capturing, correct code walking). You also have the same "time levels" problem as with higher-order functions. Clearly they're both more logically complex than regular code, because the output is a complex object (code) and not just e.g. numbers. Numbers can be wrong but they cannot be "buggy"... that's IMO where the jump in complexity is. – 6502 Aug 15 '11 at 10:02
  • @6502, in fact there are ways of turning code rewriting macros into a great tool for simplifying your coding. Of course it requires quite a discipline (unless a proper type system is in place to enforce it), but it is possible to code in a much simpler way than "with numbers". – SK-logic Aug 15 '11 at 11:08
  • googling for "It's lisp all the way down" gives one a *lot* of interesting stuff to read...... :) – Will Ness Jan 01 '15 at 22:54