I have the following definitions:
import org.apache.spark.sql.catalyst.ScalaReflection
import ScalaReflection.universe
import universe.TypeTag
import org.apache.spark.sql.types.DataType

def scalaTypesFor(dataType: DataType): Set[TypeTag[_]] = ...
def scalaTypeOpt: Option[TypeTag[_]] = ...
val catalystType = ...
val effective = scalaTypeOpt.map(v => Set(v))
  .getOrElse {
    val default = scalaTypesFor(catalystType)
    default
  }
In this case, both scalaTypesFor and scalaTypeOpt are expected to yield TypeTags with a wildcard type parameter, so the mapped Set(v) and the default set should both have the same type, Set[TypeTag[_]]. However, the compiler gives me the following error:
Error:(29, 51) inferred type arguments [scala.collection.immutable.Set[_117] forSome { type _$2; type _117 >: org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_$2] <: org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_] }] do not conform to method getOrElse's type parameter bounds [B >: scala.collection.immutable.Set[org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_$2]] forSome { type _$2 }]
val effective = scalaTypeOpt.map(v => Set(v)).getOrElse{
^
What's wrong with the type inference here, and how can I fix it?
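
For reference, here is a minimal self-contained sketch of what I believe is the same problem, with the Spark-specific pieces stubbed out (the stub bodies, the object name, and the Any parameter are placeholders, and I'm assuming ScalaReflection.universe is just the ordinary scala.reflect.runtime.universe). The un-annotated version appears to hit the same error for me, while ascribing the lambda result as Set[TypeTag[_]] seems to compile:

import scala.reflect.runtime.universe.TypeTag

object TypeTagInferenceRepro {
  // Placeholder stand-ins for the real Spark-backed definitions.
  def scalaTypesFor(dataType: Any): Set[TypeTag[_]] = Set.empty
  def scalaTypeOpt: Option[TypeTag[_]] = None
  val catalystType: Any = null

  // This fails for me with the same "do not conform to method getOrElse's
  // type parameter bounds" error:
  // val effective = scalaTypeOpt.map(v => Set(v)).getOrElse {
  //   scalaTypesFor(catalystType)
  // }

  // Pinning the element type explicitly makes it compile:
  val effective: Set[TypeTag[_]] =
    scalaTypeOpt.map(v => Set(v): Set[TypeTag[_]]).getOrElse {
      scalaTypesFor(catalystType)
    }
}

If that ascription is indeed the right workaround, I would still like to understand why the compiler cannot infer Set[TypeTag[_]] as the element type on its own.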