
As the code at https://github.com/elixir-plug/plug/blob/v1.5.0/lib/plug/builder.ex#L183 shows, the plug definitions are compiled into an AST at the macro expansion phase. But why? Why not just keep the list of plugs around and use Enum.reduce_while/3 or recursion to call the plugs one by one?
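
For concreteness, my understanding is that for plug :a followed by plug :b, the expansion produces roughly this shape of function (a simplified sketch with made-up plug names, assuming the plug library is available):

    defmodule CompiledShape do
      # Made-up function plugs, only here so the sketch compiles.
      def a(conn, _opts), do: conn
      def b(conn, _opts), do: conn

      # Roughly what the builder generates: hard-coded calls,
      # each checking whether the connection was halted.
      def call(conn, opts) do
        case a(conn, opts) do
          %Plug.Conn{halted: true} = conn -> conn
          %Plug.Conn{} = conn -> b(conn, opts)
        end
      end
    end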

Tony Han

1 Answer


Two reasons I can think of:

  1. Performance. Consider these two snippets that do the same thing, except that one uses compiled function calls while the other interprets a list of {module, function} tuples at runtime with Enum.reduce/3 and apply/3:

    defmodule A do
      def add1(x), do: x + 1
      def sub1(x), do: x - 1

      # The pipeline hard-coded as direct function calls,
      # analogous to what Plug.Builder generates at compile time.
      def compiled(x) do
        x
        |> add1()
        |> sub1()
        |> add1()
        |> sub1()
        |> add1()
        |> sub1()
        |> add1()
        |> sub1()
      end

      # The same pipeline kept as data and interpreted at runtime.
      @pipeline [
        {A, :add1},
        {A, :sub1},
        {A, :add1},
        {A, :sub1},
        {A, :add1},
        {A, :sub1},
        {A, :add1},
        {A, :sub1}
      ]
      def runtime(x) do
        Enum.reduce(@pipeline, x, fn {module, function}, acc ->
          apply(module, function, [acc])
        end)
      end
    end
    

    A simple benchmark shows that the runtime implementation is roughly five times slower:

    # Time a million runs of each version; :timer.tc/1 returns
    # {microseconds, result} and elem(0) extracts the time.
    IO.inspect(
      :timer.tc(fn ->
        for _ <- 1..1_000_000, do: A.compiled(123)
        :ok
      end)
      |> elem(0)
    )

    IO.inspect(
      :timer.tc(fn ->
        for _ <- 1..1_000_000, do: A.runtime(123)
        :ok
      end)
      |> elem(0)
    )
    

    Output (microseconds):

    82800
    433198
    
  2. Catching bugs at compile time. If you pass a module that doesn't implement call/2 to plug, you get an error while your code compiles instead of the runtime error you'd otherwise only hit once a request flows through the pipeline. A minimal sketch of such a check follows below.
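
    As an illustration, the builder macro can run that check while the caller is being compiled. This is only a sketch of the idea (MyBuilder and the error messages are made up, not Plug's actual code):

    defmodule MyBuilder do
      defmacro plug(module) do
        # Resolve the alias and make sure the module is loaded,
        # then verify it exports call/2 before generating any code.
        module = Macro.expand(module, __CALLER__)

        case Code.ensure_compiled(module) do
          {:module, ^module} ->
            if not function_exported?(module, :call, 2) do
              raise ArgumentError, "#{inspect(module)} must implement call/2"
            end

          {:error, reason} ->
            raise ArgumentError,
                  "could not load #{inspect(module)}: #{inspect(reason)}"
        end

        # A real builder would accumulate the plug and later
        # compile the whole pipeline into nested calls.
        quote do
          @plugs unquote(module)
        end
      end
    end

    With this, writing plug NotAPlug in a module that imports MyBuilder raises the ArgumentError during compilation rather than on the first request.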

Dogbert