In my project, I have a constellation like this:

trait F

trait X[A <: F]

def test(x: X[_]): X[_ <: F] = x

Trait X has a type parameter with an upper bound of F. From my understanding, the types X[_] and X[_ <: F] should be equivalent. But scalac 2.12.5 complains that one is not assignable to the other.

$ scalac -Xscript test test.scala 
test.scala:5: error: type mismatch;
 found   : this.X[_$1] where type _$1
 required: this.X[_ <: this.F]
def test(x: X[_]): X[_ <: F] = x
                               ^

I cannot think of a situation where this assignment is making a sound program unsound. What are the reasons that this assignment is rejected? Is there a way that allowing such an assignment (maybe in a more complex example) is problematic?

  • "From my understanding, the types X[_] and X[_ <: F] should be equivalent" They are not. https://www.scala-lang.org/files/archive/spec/2.12/03-types.html#equivalence `Two existential types are equivalent if they have the same number of quantifiers, and, after renaming one list of type quantifiers by another, the quantified types as well as lower and upper bounds of corresponding quantifiers are equivalent.` – Dmytro Mitin Feb 27 '19 at 19:20
  • https://stackoverflow.com/questions/4323140/why-are-the-bounds-of-type-parameters-ignored-when-using-existential-types-in-sc – Dmytro Mitin Feb 27 '19 at 19:28
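
To illustrate the rule quoted in the first comment, here is a small sketch. It is not from the question: the trait Y and the String bound are made up, and Y deliberately has no declared bound so that the quantifier bounds stay explicit. Two existentials are equivalent only when the bounds of their quantifiers are equivalent, and a bare wildcard quantifies over an unbounded type.

import scala.language.existentials  // for the explicit forSome notation

trait Y[A]

// Equivalent per the quoted rule: after renaming b to a, the quantifier bounds match.
def sameType(y: Y[a] forSome { type a <: String }): Y[b] forSome { type b <: String } = y

// Not equivalent: Y[_] quantifies over an unbounded type (upper bound Any),
// so it cannot be used where Y[_ <: String] is expected.
// def narrow(y: Y[_]): Y[_ <: String] = y   // type mismatch, much like the error above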

2 Answers


This assignment isn't really problematic, and the compiler even kind-of knows this, because the following implementation compiles without problems:

trait F
trait X[A <: F]
def test(x: X[_]): X[_ <: F] = x match { case q: X[t] => q }

If you give the type checker some slack by allowing it to infer more precise bounds for the type variable t, it will eventually figure out that t must be a subtype of F, and then allow you to return the value q (which is the same as x) without complaining. It doesn't do this by default for some counter-intuitive reasons that probably have something to do with Java-wildcard interoperability.
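
For what it's worth, the trick can be packaged as a small helper so callers don't have to repeat the match. This is only a sketch on top of the code above; the helper name recoverBound is made up:

trait F
trait X[A <: F]

// Matching on X[t] binds a fresh type variable t, and the compiler re-derives
// the declared bound t <: F for it, so the result conforms to X[_ <: F].
def recoverBound(x: X[_]): X[_ <: F] = x match { case withBound: X[t] => withBound }

// The original method can then simply delegate to the helper.
def test(x: X[_]): X[_ <: F] = recoverBound(x)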

(Undeleted again; my original guess didn't seem too far off, and given Dmytro Mitin's link, it doesn't even seem all that vague by comparison.)

– Andrey Tyukin

I might be wrong, but it's enough to look at the definition of the function itself:

def test(x: X[_]): X[_ <: F] = x

The only information an existential type gives is that some type exists. With this signature, you try to "narrow" the function result.

To put it in a practical way with an example, let's say you have something like this:

def test(x: Option[_]): Option[_ <: String]

and then you call it, passing in an Option[Int]. Would you expect this assignment to be correct?

val result: Option[_ <: String] = test(Some(1): Option[_])
  • `Option[+A]` does not impose any upper bounds on the type parameter `A`, whereas `X[A <: F]` has the `<: F` part in it. – Andrey Tyukin Feb 28 '19 at 00:57
  • That was just a closer-to-real-life example in which covariance doesn't play any role. If you pass `X[_]` into a kind of identity function, you can't have the restriction `<: F` in the return type out of nowhere. E.g., if you pass `X[Int]`, would you expect it to conform to a type of `X[Int <: String]`, if `F` is `String`? – Serhii Shynkarenko Feb 28 '19 at 09:55
  • It's not about variance, it's about the upper type bound. If the trait `X`, by definition, requires its only type parameter to be a subtype of `F`, then it doesn't seem too far-fetched to expect that the compiler either treats `X[_]` as synonymous with `X[_ <: F]`, or forbids `X[_]` (without `<: F`) altogether. But Scala doesn't do this, for whatever reason. That's where all the confusion comes from. – Andrey Tyukin Feb 28 '19 at 15:47
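
As a small illustration of the asymmetry described in the last comment (my own sketch, not from the thread): scalac accepts the unbounded wildcard X[_] even though it rejects any concrete type argument that violates the bound.

trait F
trait X[A <: F]

// Accepted: the wildcard is quantified without X's declared bound.
def ok(x: X[_]): Unit = ()

// Rejected: a concrete type argument must satisfy A <: F.
// def bad(x: X[Int]): Unit = ()   // error: Int does not conform to bound A <: F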