
I'm trying to compose some functions with compound types in their type parameters:

trait Contains[T]
trait Sentence
trait Token
def sentenceSegmenter[T] = (c: Contains[T]) => null: Contains[T with Sentence]
def tokenizer[T <: Sentence] = (c: Contains[T]) => null: Contains[T with Token]

My main goal is to be able to compose them with something as simple as:

val pipeline = sentenceSegmenter andThen tokenizer

However, that produces a compile error, because Scala infers that the type of tokenizer needs to be Contains[? with Sentence] => ?:

scala> val pipeline = sentenceSegmenter andThen tokenizer
<console>:12: error: polymorphic expression cannot be instantiated to expected type;
 found   : [T <: Sentence]Contains[T] => Contains[T with Token]
 required: Contains[? with Sentence] => ?

       val pipeline = sentenceSegmenter andThen tokenizer
                                                ^

I tried a slightly different definition of tokenizer that more closely matches the type inferred by Scala, but I get a similar error:

scala> def tokenizer[T] = (c: Contains[T with Sentence]) => null: Contains[T with Sentence with Token]
tokenizer: [T]=> Contains[T with Sentence] => Contains[T with Sentence with Token]

scala> val pipeline = sentenceSegmenter andThen tokenizer
<console>:12: error: polymorphic expression cannot be instantiated to expected type;
 found   : [T]Contains[T with Sentence] => Contains[T with Sentence with Token]
 required: Contains[? with Sentence] => ?

       val pipeline = sentenceSegmenter andThen tokenizer
                                                ^

I can get things to compile if I explicitly supply almost any type argument to sentenceSegmenter, or if I create a bogus initial function that has no type parameter:

scala> val pipeline = sentenceSegmenter[Nothing] andThen tokenizer
pipeline: Contains[Nothing] => Contains[Nothing with Sentence with Sentence with Token] = <function1>

scala> val pipeline = sentenceSegmenter[Any] andThen tokenizer
pipeline: Contains[Any] => Contains[Any with Sentence with Sentence with Token] = <function1>

scala> val begin = identity[Contains[Any]] _
begin: Contains[Any] => Contains[Any] = <function1>

scala> val pipeline = begin andThen sentenceSegmenter andThen tokenizer
pipeline: Contains[Any] => Contains[Any with Sentence with Sentence with Token] = <function1>

I wouldn't mind the type Any or Nothing being inferred, since I don't really care what T is. (I mainly care about the with XXX part.) But I'd like it to be inferred, rather than having to specify it explicitly, or supplying it via a bogus initial function.
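For what it's worth, the issue doesn't seem specific to andThen: a plain function value can't stay polymorphic, so merely referencing sentenceSegmenter as a value, with no expected type around, already forces Scala to pick a concrete T (it chooses Nothing). A minimal sketch of that behavior (Scala 2; the anonymous Contains instance is just a throwaway I made up for illustration):

```scala
trait Contains[T]
trait Sentence
trait Token
def sentenceSegmenter[T] = (c: Contains[T]) => null: Contains[T with Sentence]

// With no expected type, eta-expansion instantiates T to Nothing:
val f = sentenceSegmenter

// This ascription compiles, confirming T was fixed at the val binding:
val check: Contains[Nothing] => Contains[Nothing with Sentence] = f
```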

Steve

1 Answer


You can't bind a type parameter with a val: a val fixes its type at the point of definition. Use a def instead, so the type parameter is bound each time it's used:

  def pipeline[T] = sentenceSegmenter[T] andThen tokenizer

Note that you can call pipeline with an inferred type:

scala> new Contains[Sentence] {}
res1: Contains[Sentence] = $anon$1@5aea1d29

scala> pipeline(res1)
res2: Contains[Sentence with Sentence with Token] = null
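Putting it together, a self-contained sketch using the question's original definitions (runnable as a Scala 2 script):

```scala
trait Contains[T]
trait Sentence
trait Token
def sentenceSegmenter[T] = (c: Contains[T]) => null: Contains[T with Sentence]
def tokenizer[T <: Sentence] = (c: Contains[T]) => null: Contains[T with Token]

// A def defers binding of T to the call site, so andThen type-checks:
def pipeline[T] = sentenceSegmenter[T] andThen tokenizer

// T is inferred as Sentence from the argument, as in the REPL session above;
// the result type is Contains[Sentence with Sentence with Token]:
val result = pipeline(new Contains[Sentence] {})
```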
Noah
  • But the `begin` workaround in my code uses a `val` and everything works okay, so it can't just be a `val` vs. `def` issue. Also, in your code, you still have to explicitly specify a type parameter to `sentenceSegmenter` and that's what I'm trying to get rid of... – Steve Jul 11 '13 at 19:57
  • Your `begin` is binding the type `Contains[Any]` and is no different than `val pipeline = sentenceSegmenter[Any] andThen tokenizer`. Note that you can call pipeline without a type parameter. – Noah Jul 11 '13 at 20:17
  • I'm not trying to get rid of the type parameter when calling the pipeline though - it was already not needed by either of my workarounds. I'm trying to get rid of the type parameter when constructing the pipeline, e.g. in the `sentenceSegmenter andThen tokenizer` line. Perhaps I should clarify that I really don't care what `T` gets resolved to. As long as it allows `sentenceSegmenter andThen tokenizer`, that's fine with me. I edited the question a bit to hopefully clarify this. – Steve Jul 11 '13 at 20:35