How to infer the right type parameter from a projection type?


The program compiles once this implicit conversion is brought into scope:

implicit def f(x: Bar#X): Foo#X = x

Since this conversion is valid for any F <: Foo, I wonder why the compiler does not apply it by itself.
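
For reference, the same view can be written once for any F <: Foo; this is only a sketch of that generalisation (the name fooView is mine, not from the question, and I have not tried to wire it into the original program):

implicit def fooView[F <: Foo](x: F#X): Foo#X = x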

You can also:

trait Foo {
  type X
}
trait Bar extends Foo {
  type X = String
}
class BarImpl extends Bar {
  def getX: X = "hi"
}
def baz[F <: Foo, T <: F#X](clz: F, x: T): Unit = { println("baz worked!") }
val bi = new BarImpl
val x: Bar#X = bi.getX
baz(bi, x)
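
The extra clz: F value argument is what gives inference a handle on F; an equivalent call with the type arguments written explicitly (a sketch using only the definitions just shown) is:

baz[BarImpl, String](bi, x)   // F pinned down by the bi argument, T by x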

but:

def baz2[F <: Foo, T <: F#X](x: T): Unit = { println("baz2 failed!") }
baz2(x)

fails with:

test.scala:22: error: inferred type arguments [Nothing,java.lang.String] do not conform to method baz2's type parameter bounds [F <: this.Foo,T <: F#X]
baz2(x)
^
one error found

I think the issue is that F <: Foo only tells the compiler that F must be a subtype of Foo; when it sees the argument x it has no way to work out which subtype that particular X came from, which is why F gets inferred as Nothing. Your x is just a String, and a String carries no information pointing back to Bar.
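
A quick way to see this, using only the definitions above (these checks are purely illustrative):

implicitly[Bar#X =:= String]   // compiles: Bar#X is statically known to be String
val plain: String = x          // compiles: x is just a String, with no trace of Bar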

Note that:

def baz3[F <: Foo](x: F#X): Unit = { println("baz3 worked!") }
baz3[Bar]("hi")

also works. Defining val x: Bar#X = ??? just means that ??? is restricted to whatever Bar#X happens to be at compile time; the compiler knows Bar#X is String, so the type of x is just a String, no different from any other String.
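
To make that concrete (again purely illustrative, assuming the definitions above):

val y: Bar#X = "any other string"   // any plain String is accepted where a Bar#X is expected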
