
Suppose I have this macro:

import language.experimental.macros
import scala.reflect.macros.Context

object FooExample {
  def foo[A](xs: A*): Int = macro foo_impl[A]
  // Expands to a literal Int: the number of argument expressions received.
  def foo_impl[A](c: Context)(xs: c.Expr[A]*) = c.literal(xs.size)
}

This works as expected with "real" varargs:

scala> FooExample.foo(1, 2, 3)
res0: Int = 3

But the behavior with a sequence ascribed to the varargs type is confusing to me (in Scala 2.10.0-RC3):

scala> FooExample.foo(List(1, 2, 3): _*)
res1: Int = 1

And to show that nothing fishy is going on with the inferred type:

scala> FooExample.foo[Int](List(1, 2, 3): _*)
res2: Int = 1
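
To see what the macro is actually receiving, I can log the raw argument trees at compile time (a quick throwaway sketch along the same lines; ShowExample and debug are just names for illustration):

object ShowExample {
  def debug[A](xs: A*): Unit = macro debug_impl[A]
  def debug_impl[A](c: Context)(xs: c.Expr[A]*) = {
    import c.universe._
    // Report the raw tree of each argument during expansion.
    xs foreach (x => c.info(c.enclosingPosition, showRaw(x.tree), force = true))
    c.literalUnit
  }
}

I'd expect the ascribed call to report a single Typed(..., Ident(tpnme.WILDCARD_STAR)) tree rather than three Literals.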

I would have expected a compile-time error here, and that's what I want. I've used the following approach in most of the macros I've written:

object BarExample {
  def bar(xs: Int*): Int = macro bar_impl
  def bar_impl(c: Context)(xs: c.Expr[Int]*) = {
    import c.universe._
    c.literal(
      xs.map(_.tree).headOption map {
        case Literal(Constant(x: Int)) => x
        case _ => c.abort(c.enclosingPosition, "bar wants literal arguments!")
      } getOrElse c.abort(c.enclosingPosition, "bar wants arguments!")
    )
  }
}

And this catches the problem at compile time:

scala> BarExample.bar(3, 2, 1)
res3: Int = 3

scala> BarExample.bar(List(3, 2, 1): _*)
<console>:8: error: bar wants literal arguments!
              BarExample.bar(List(3, 2, 1): _*)

This feels like a hack to me, though—it's mixing up one bit of validation (checking that the arguments are literals) with another (confirming that we really have varargs). I can also imagine cases where I don't need the arguments to be literals (or where I want their type to be generic).

I know I could do the following:

object BazExample {
  def baz[A](xs: A*): Int = macro baz_impl[A]
  def baz_impl[A](c: Context)(xs: c.Expr[A]*) = {
    import c.universe._

    // A sequence ascription arrives as a single Typed tree wrapping the sequence.
    xs.toList.map(_.tree) match {
      case Typed(_, Ident(tpnme.WILDCARD_STAR)) :: Nil =>
        c.abort(c.enclosingPosition, "baz wants real varargs!")
      case _ => c.literal(xs.size)
    }
  }
}

But this is an ugly way of handling a very simple (and I'd suppose widely necessary) bit of argument validation. Is there a trick I'm missing here? What's the simplest way that I can make sure that foo(1 :: Nil: _*) in my first example gives a compile-time error?
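
If there's no built-in trick, I suppose the check could at least be factored into a reusable helper along these lines (an untested sketch; VarargsGuard and requireRealVarargs are names I've made up):

object VarargsGuard {
  import scala.reflect.macros.Context

  // Abort expansion if any argument tree is a sequence ascription (: _*).
  def requireRealVarargs(c: Context)(xs: Seq[c.Expr[_]], name: String): Unit = {
    import c.universe._
    xs foreach { x =>
      x.tree match {
        case Typed(_, Ident(tpnme.WILDCARD_STAR)) =>
          c.abort(c.enclosingPosition, name + " wants real varargs!")
        case _ =>
      }
    }
  }
}

Each macro implementation would then call VarargsGuard.requireRealVarargs(c)(xs, "baz") before doing its real work, but that still feels like boilerplate for something the compiler could enforce.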

Travis Brown
  • When you write "I would have expected a compile-time error here", could you please clarify? You'd expect this to be an error, because this is the requirement of your domain? Or this should be an error for all sorts of vararg macros? – Eugene Burmako Dec 02 '12 at 21:08
  • @EugeneBurmako: My concern is that in the ascription case, `xs.head` isn't actually a `c.Expr[A]` at all—it's more like a `c.Expr[Seq[A]]`. Here are [a couple of examples](https://gist.github.com/4191360). – Travis Brown Dec 02 '12 at 22:37

1 Answer

Does this work as expected?

object BarExample {
  def bar(xs: Int*): Int = macro bar_impl
  def bar_impl(c: Context)(xs: c.Expr[Int]*) = {
    import c.universe._
    import scala.collection.immutable.Stack
    // Match on the argument trees as a sequence: expand to the first argument
    // if it is an integer literal, abort otherwise.
    Stack[Tree](xs map (_.tree): _*) match {
      case Stack(Literal(Constant(x: Int)), _*) => c.literal(x)
      case _ => c.abort(c.enclosingPosition, "bar wants integer constant arguments!")
    }
  }
}
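
With a sequence ascription the lone argument's tree is a Typed node rather than a Literal, so the match should fall through to the abort. If I've got it right, a session would look something like:

scala> BarExample.bar(3, 2, 1)
res0: Int = 3

scala> BarExample.bar(List(3, 2, 1): _*)
<console>:8: error: bar wants integer constant arguments!
              BarExample.bar(List(3, 2, 1): _*)
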
idonnie